hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
791c291bb61fdd398af58d2e8130b7559485a81f | 135 | py | Python | viewer/images/edits/blur.py | Lukasz1928/DICOM-viewer | 778541d85c6e6a96e90c9d1050f3dec2b8387b5d | [
"MIT"
] | null | null | null | viewer/images/edits/blur.py | Lukasz1928/DICOM-viewer | 778541d85c6e6a96e90c9d1050f3dec2b8387b5d | [
"MIT"
] | null | null | null | viewer/images/edits/blur.py | Lukasz1928/DICOM-viewer | 778541d85c6e6a96e90c9d1050f3dec2b8387b5d | [
"MIT"
] | null | null | null | import cv2
def mean(image):
    # NOTE: despite its name, this applies a *median* filter;
    # a true mean (box) blur would be cv2.blur(image, (5, 5)).
    return cv2.medianBlur(image, 5)


def gaussian(image):
    # 5x5 Gaussian kernel; sigma is derived automatically from the kernel size.
    return cv2.GaussianBlur(image, (5, 5), 0)
| 13.5 | 45 | 0.674074 | 20 | 135 | 4.55 | 0.55 | 0.241758 | 0.307692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.06422 | 0.192593 | 135 | 9 | 46 | 15 | 0.770642 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0.2 | 0.4 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
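In the blur.py record above, `mean` actually calls `cv2.medianBlur`, which is a median filter; mean and median filters behave very differently around impulse noise. A dependency-free sketch of that difference on a 1-D signal (the window size 3 and the sample values are illustrative; no OpenCV needed):

```python
from statistics import mean, median

def filter_1d(signal, window, stat):
    """Apply a sliding-window statistic; edge samples are left untouched."""
    half = window // 2
    out = list(signal)
    for i in range(half, len(signal) - half):
        out[i] = stat(signal[i - half:i + half + 1])
    return out

signal = [10, 10, 10, 200, 10, 10, 10]   # one salt-noise spike
print(filter_1d(signal, 3, median))      # spike removed entirely
print(filter_1d(signal, 3, mean))        # spike smeared across the window
```

The median discards the outlier outright, while the mean spreads it over its neighbours, which is why median blurring is the usual choice for salt-and-pepper noise.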
f7016dd197dc84aeab737877e938c7ecfc8970b5 | 5,276 | py | Python | tests/analyzer/test_as_import.py | CAM-Gerlach/unimport | acaebf547274a95a33816e47ec22bb73d8456b17 | [
"MIT"
] | 147 | 2019-09-19T15:43:06.000Z | 2022-03-25T16:42:08.000Z | tests/analyzer/test_as_import.py | CAM-Gerlach/unimport | acaebf547274a95a33816e47ec22bb73d8456b17 | [
"MIT"
] | 154 | 2019-10-31T19:50:18.000Z | 2022-03-29T12:43:00.000Z | tests/analyzer/test_as_import.py | CAM-Gerlach/unimport | acaebf547274a95a33816e47ec22bb73d8456b17 | [
"MIT"
] | 28 | 2019-10-31T18:11:13.000Z | 2021-09-06T08:24:14.000Z | from tests.analyzer.utils import UnusedTestCase
from unimport.statement import Import, ImportFrom


class AsImportTestCase(UnusedTestCase):
    def test_as_import_all_unused_all_cases(self):
        self.assertSourceAfterScanningEqualToExpected(
            """\
from x import y as z
import x
from t import s as ss
from f import a as c, l as k, i as ii
from fo import (bar, i, x as z)
import le as x
""",
            [
                ImportFrom(
                    lineno=1,
                    column=1,
                    name="z",
                    package="x",
                    star=False,
                    suggestions=[],
                ),
                Import(
                    lineno=2,
                    column=1,
                    name="x",
                    package="x",
                ),
                ImportFrom(
                    lineno=3,
                    column=1,
                    name="ss",
                    package="t",
                    star=False,
                    suggestions=[],
                ),
                ImportFrom(
                    lineno=4,
                    column=1,
                    name="c",
                    package="f",
                    star=False,
                    suggestions=[],
                ),
                ImportFrom(
                    lineno=4,
                    column=2,
                    name="k",
                    package="f",
                    star=False,
                    suggestions=[],
                ),
                ImportFrom(
                    lineno=4,
                    column=3,
                    name="ii",
                    package="f",
                    star=False,
                    suggestions=[],
                ),
                ImportFrom(
                    lineno=5,
                    column=1,
                    name="bar",
                    package="fo",
                    star=False,
                    suggestions=[],
                ),
                ImportFrom(
                    lineno=5,
                    column=2,
                    name="i",
                    package="fo",
                    star=False,
                    suggestions=[],
                ),
                ImportFrom(
                    lineno=5,
                    column=3,
                    name="z",
                    package="fo",
                    star=False,
                    suggestions=[],
                ),
                Import(
                    lineno=6,
                    column=1,
                    name="x",
                    package="le",
                ),
            ],
        )

    def test_as_import_one_used_in_function_all_cases(self):
        self.assertSourceAfterScanningEqualToExpected(
            """\
from x import y as z
import x
from t import s as ss
from f import a as c, l as k, i as ii
from fo import (bar, i, x as z)
import le as x
def x(t=x):pass
""",
            [
                ImportFrom(
                    lineno=1,
                    column=1,
                    name="z",
                    package="x",
                    star=False,
                    suggestions=[],
                ),
                Import(
                    lineno=2,
                    column=1,
                    name="x",
                    package="x",
                ),
                ImportFrom(
                    lineno=3,
                    column=1,
                    name="ss",
                    package="t",
                    star=False,
                    suggestions=[],
                ),
                ImportFrom(
                    lineno=4,
                    column=1,
                    name="c",
                    package="f",
                    star=False,
                    suggestions=[],
                ),
                ImportFrom(
                    lineno=4,
                    column=2,
                    name="k",
                    package="f",
                    star=False,
                    suggestions=[],
                ),
                ImportFrom(
                    lineno=4,
                    column=3,
                    name="ii",
                    package="f",
                    star=False,
                    suggestions=[],
                ),
                ImportFrom(
                    lineno=5,
                    column=1,
                    name="bar",
                    package="fo",
                    star=False,
                    suggestions=[],
                ),
                ImportFrom(
                    lineno=5,
                    column=2,
                    name="i",
                    package="fo",
                    star=False,
                    suggestions=[],
                ),
                ImportFrom(
                    lineno=5,
                    column=3,
                    name="z",
                    package="fo",
                    star=False,
                    suggestions=[],
                ),
            ],
        )
| 29.311111 | 60 | 0.288855 | 339 | 5,276 | 4.454277 | 0.156342 | 0.169536 | 0.211921 | 0.238411 | 0.874834 | 0.854305 | 0.854305 | 0.854305 | 0.854305 | 0.854305 | 0 | 0.019368 | 0.628127 | 5,276 | 179 | 61 | 29.47486 | 0.750255 | 0 | 0 | 0.919255 | 0 | 0 | 0.011065 | 0 | 0 | 0 | 0 | 0 | 0.012422 | 1 | 0.012422 | false | 0 | 0.149068 | 0 | 0.167702 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
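The expected records above pair each bound (post-`as`) name with its package and position. A minimal stdlib sketch of that extraction step using `ast` (illustrative only — this is not unimport's actual analyzer, and it ignores columns and usage tracking):

```python
import ast

def import_bindings(source):
    """Yield (lineno, bound_name, package) for every import binding."""
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            for alias in node.names:
                # plain `import pkg as name`: the package is the alias target
                yield node.lineno, alias.asname or alias.name, alias.name
        elif isinstance(node, ast.ImportFrom):
            for alias in node.names:
                yield node.lineno, alias.asname or alias.name, node.module

src = "from x import y as z\nimport le as x\n"
print(list(import_bindings(src)))  # [(1, 'z', 'x'), (2, 'x', 'le')]
```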
e39346c67e0e7719f1e49b5daab3f3980bb20a7a | 388 | py | Python | simulation/lib/numerics/algorithms.py | Secretmud/bachelor_thesis | 6c72987a99d9b70007e18394b5ac3f1c9de416c8 | [
"MIT"
] | 3 | 2021-12-01T17:48:10.000Z | 2022-02-09T06:06:54.000Z | simulation/lib/numerics/algorithms.py | Secretmud/bachelor_thesis | 6c72987a99d9b70007e18394b5ac3f1c9de416c8 | [
"MIT"
] | null | null | null | simulation/lib/numerics/algorithms.py | Secretmud/bachelor_thesis | 6c72987a99d9b70007e18394b5ac3f1c9de416c8 | [
"MIT"
] | null | null | null | import numpy as np


def vec_midpoint(f, a, b, n, g=None, c=None):
    # Composite midpoint rule: sample at the n interval midpoints, scale by h.
    h = (b - a)/n
    x = np.linspace(a + h/2, b - h/2, n)
    if callable(g):
        return np.sum(f(x, g(x, c)))*h
    return np.sum(f(x, g))*h


def trapezoidal(f, a, b, n, g=None, c=None):
    # Composite trapezoidal rule over n intervals: n + 1 sample points,
    # with the two endpoints weighted by 1/2.
    h = (b - a)/n
    x = np.linspace(a, b, n + 1)
    y = f(x, g(x, c)) if callable(g) else f(x, g)
    return (np.sum(y) - 0.5*(y[0] + y[-1]))*h
| 21.555556 | 45 | 0.505155 | 85 | 388 | 2.294118 | 0.270588 | 0.164103 | 0.225641 | 0.246154 | 0.748718 | 0.748718 | 0.748718 | 0.748718 | 0.748718 | 0.748718 | 0 | 0.007143 | 0.278351 | 388 | 17 | 46 | 22.823529 | 0.689286 | 0 | 0 | 0.615385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0 | 0.076923 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 8 |
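A quick sanity check for a composite midpoint rule like `vec_midpoint` above is to integrate x² over [0, 1], whose exact value is 1/3. A pure-Python sketch (NumPy-free; `n = 1000` is an arbitrary resolution — note that the sum must be scaled by the interval width `h`):

```python
def midpoint(f, a, b, n):
    # Composite midpoint rule: evaluate f at each interval midpoint,
    # sum the samples, and scale by the interval width h.
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

approx = midpoint(lambda x: x * x, 0.0, 1.0, 1000)
print(abs(approx - 1 / 3))  # midpoint-rule error is O(h**2), tiny here
```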
e3b2e5258587b697a38478cbf6019ae314e120cc | 153 | py | Python | src/lk_db/ents/both/census/EntEthnicityOfPopulation.py | nuuuwan/lk_db | ac0abfa47ba31b0d4c2c8566b3101b83749bd45d | [
"MIT"
] | null | null | null | src/lk_db/ents/both/census/EntEthnicityOfPopulation.py | nuuuwan/lk_db | ac0abfa47ba31b0d4c2c8566b3101b83749bd45d | [
"MIT"
] | null | null | null | src/lk_db/ents/both/census/EntEthnicityOfPopulation.py | nuuuwan/lk_db | ac0abfa47ba31b0d4c2c8566b3101b83749bd45d | [
"MIT"
] | null | null | null | # Auto Generated - DO NOT EDIT!
from lk_db.ents.both.EntCensusResult import EntCensusResult


class EntEthnicityOfPopulation(EntCensusResult):
    pass
| 19.125 | 59 | 0.803922 | 17 | 153 | 7.176471 | 0.882353 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137255 | 153 | 7 | 60 | 21.857143 | 0.924242 | 0.189542 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 7 |
e3b99fef941553ae5e7e87f24b6e20edff67432d | 157,659 | py | Python | nipy/modalities/fmri/fmristat/tests/FIACdesigns.py | bpinsard/nipy | d49e8292adad6619e3dac710752131b567efe90e | [
"BSD-3-Clause"
] | 236 | 2015-01-09T21:28:37.000Z | 2022-03-27T11:51:58.000Z | nipy/modalities/fmri/fmristat/tests/FIACdesigns.py | bpinsard/nipy | d49e8292adad6619e3dac710752131b567efe90e | [
"BSD-3-Clause"
] | 171 | 2015-03-23T00:31:43.000Z | 2021-11-22T12:43:00.000Z | nipy/modalities/fmri/fmristat/tests/FIACdesigns.py | bpinsard/nipy | d49e8292adad6619e3dac710752131b567efe90e | [
"BSD-3-Clause"
] | 94 | 2015-02-01T12:39:47.000Z | 2022-01-27T06:38:19.000Z | from __future__ import absolute_import
# emacs: -*- mode: python; py-indent-offset: 4; indent-tabs-mode: nil -*-
# vi: set ft=python sts=4 ts=4 sw=4 et:
import numpy as np
# number of scans in design (rows in design matrix)
N_ROWS = 191
# This is the alternative design to the standard, here called
# 'description'
# subj3_evt_fonc1.txt
altdescr = {'block':'''time,sentence,speaker
2.0,SSt,SSp
5.33,SSt,SSp
8.67,SSt,SSp
12.0,SSt,SSp
15.33,SSt,SSp
18.67,SSt,SSp
31.0,SSt,SSp
34.33,SSt,SSp
37.67,SSt,SSp
41.0,SSt,SSp
44.33,SSt,SSp
47.67,SSt,SSp
60.0,SSt,DSp
63.33,SSt,DSp
66.67,SSt,DSp
70.0,SSt,DSp
73.33,SSt,DSp
76.67,SSt,DSp
89.0,DSt,SSp
92.33,DSt,SSp
95.67,DSt,SSp
99.0,DSt,SSp
102.33,DSt,SSp
105.67,DSt,SSp
118.0,DSt,DSp
121.33,DSt,DSp
124.67,DSt,DSp
128.0,DSt,DSp
131.33,DSt,DSp
134.67,DSt,DSp
147.0,DSt,DSp
150.33,DSt,DSp
153.67,DSt,DSp
157.0,DSt,DSp
160.33,DSt,DSp
163.67,DSt,DSp
176.0,DSt,SSp
179.33,DSt,SSp
182.67,DSt,SSp
186.0,DSt,SSp
189.33,DSt,SSp
192.67,DSt,SSp
205.0,SSt,DSp
208.33,SSt,DSp
211.67,SSt,DSp
215.0,SSt,DSp
218.33,SSt,DSp
221.67,SSt,DSp
234.0,SSt,DSp
237.33,SSt,DSp
240.67,SSt,DSp
244.0,SSt,DSp
247.33,SSt,DSp
250.67,SSt,DSp
263.0,DSt,SSp
266.33,DSt,SSp
269.67,DSt,SSp
273.0,DSt,SSp
276.33,DSt,SSp
279.67,DSt,SSp
292.0,SSt,SSp
295.33,SSt,SSp
298.67,SSt,SSp
302.0,SSt,SSp
305.33,SSt,SSp
308.67,SSt,SSp
321.0,DSt,DSp
324.33,DSt,DSp
327.67,DSt,DSp
331.0,DSt,DSp
334.33,DSt,DSp
337.67,DSt,DSp
350.0,SSt,SSp
353.33,SSt,SSp
356.67,SSt,SSp
360.0,SSt,SSp
363.33,SSt,SSp
366.67,SSt,SSp
379.0,DSt,SSp
382.33,DSt,SSp
385.67,DSt,SSp
389.0,DSt,SSp
392.33,DSt,SSp
395.67,DSt,SSp
408.0,SSt,DSp
411.33,SSt,DSp
414.67,SSt,DSp
418.0,SSt,DSp
421.33,SSt,DSp
424.67,SSt,DSp
437.0,DSt,DSp
440.33,DSt,DSp
443.67,DSt,DSp
447.0,DSt,DSp
450.33,DSt,DSp
453.67,DSt,DSp''',
'event':'''time,sentence,speaker
2.0,DSt,DSp
5.33,SSt,SSp
8.67,DSt,SSp
12.0,DSt,DSp
15.33,SSt,DSp
18.67,DSt,SSp
22.0,SSt,SSp
25.33,DSt,SSp
28.67,SSt,SSp
32.0,SSt,DSp
35.33,SSt,SSp
38.67,SSt,DSp
42.0,DSt,DSp
45.33,SSt,SSp
48.67,DSt,SSp
52.0,SSt,DSp
55.33,SSt,SSp
58.67,DSt,SSp
62.0,DSt,DSp
65.33,SSt,SSp
68.67,DSt,SSp
72.0,SSt,SSp
75.33,SSt,DSp
78.67,DSt,DSp
82.0,SSt,SSp
85.33,DSt,DSp
88.67,SSt,SSp
92.0,DSt,DSp
95.33,DSt,SSp
98.67,DSt,DSp
102.0,DSt,SSp
105.33,SSt,SSp
108.67,SSt,DSp
112.0,DSt,DSp
115.33,SSt,DSp
118.67,SSt,SSp
122.0,DSt,SSp
125.33,DSt,DSp
128.67,DSt,SSp
132.0,DSt,DSp
135.33,DSt,SSp
138.67,SSt,DSp
142.0,DSt,DSp
145.33,SSt,SSp
148.67,SSt,DSp
152.0,SSt,SSp
155.33,SSt,DSp
158.67,SSt,SSp
162.0,SSt,DSp
165.33,DSt,SSp
168.67,SSt,SSp
172.0,DSt,SSp
175.33,DSt,DSp
178.67,SSt,SSp
182.0,DSt,DSp
185.33,SSt,DSp
188.67,SSt,SSp
192.0,SSt,DSp
195.33,SSt,SSp
198.67,DSt,DSp
202.0,SSt,DSp
205.33,DSt,SSp
208.67,DSt,DSp
212.0,SSt,SSp
215.33,SSt,DSp
218.67,DSt,SSp
222.0,SSt,SSp
225.33,DSt,DSp
228.67,SSt,SSp
232.0,DSt,DSp
235.33,DSt,SSp
238.67,DSt,DSp
242.0,SSt,SSp
245.33,DSt,DSp
248.67,SSt,DSp
252.0,DSt,SSp
255.33,DSt,DSp
258.67,SSt,DSp
262.0,DSt,DSp
265.33,DSt,SSp
268.67,DSt,DSp
272.0,DSt,SSp
275.33,DSt,DSp
278.67,SSt,SSp
282.0,DSt,SSp
285.33,SSt,DSp
288.67,DSt,DSp
292.0,DSt,SSp
295.33,SSt,DSp
298.67,DSt,DSp
302.0,DSt,SSp
305.33,SSt,DSp
308.67,DSt,SSp
312.0,SSt,DSp
315.33,SSt,SSp
318.67,DSt,SSp
322.0,SSt,SSp
325.33,DSt,DSp
328.67,SSt,SSp
332.0,SSt,DSp
335.33,SSt,SSp
338.67,DSt,SSp
342.0,DSt,DSp
345.33,SSt,DSp
348.67,DSt,DSp
352.0,SSt,DSp
355.33,SSt,SSp
358.67,DSt,DSp
362.0,SSt,DSp
365.33,DSt,SSp
368.67,SSt,DSp
372.0,DSt,SSp
375.33,DSt,DSp
378.67,DSt,SSp
382.0,SSt,SSp
385.33,SSt,DSp
388.67,DSt,SSp
392.0,SSt,SSp
395.33,SSt,DSp
398.67,DSt,SSp
402.0,DSt,DSp
405.33,DSt,SSp
408.67,SSt,SSp
412.0,DSt,SSp
415.33,SSt,SSp
418.67,DSt,DSp
422.0,DSt,SSp
425.33,SSt,DSp
428.67,SSt,SSp
432.0,DSt,DSp
435.33,SSt,DSp
438.67,SSt,SSp
442.0,DSt,DSp
445.33,SSt,DSp
448.67,DSt,SSp
452.0,DSt,DSp
455.33,SSt,DSp
458.67,DSt,SSp
462.0,SSt,SSp
465.33,DSt,DSp
468.67,SSt,SSp'''}
# convert altdescr to recarray for convenience
converters = float, str, str
for key, value in altdescr.items():
    lines = value.split('\n')
    names = lines.pop(0).strip().split(',')
    dtype = np.dtype(list(zip(names, ('f8', 'S3', 'S3'))))
    rec = np.recarray(shape=(len(lines),), dtype=dtype)
    for i, line in enumerate(lines):
        vals = line.strip().split(',')
        for name, val, conv in zip(names, vals, converters):
            rec[i][name] = conv(val)
    altdescr[key] = rec
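The loop above parses the CSV-with-header `altdescr` strings into record arrays. For reference, a stdlib-only sketch of the same parse with `csv.DictReader` (the two-row sample string is illustrative):

```python
import csv
import io

sample = "time,sentence,speaker\n2.0,SSt,SSp\n5.33,DSt,DSp\n"
# Read header-keyed rows, converting the time column to float.
rows = [{**r, 'time': float(r['time'])}
        for r in csv.DictReader(io.StringIO(sample))]
print(rows[0])  # {'time': 2.0, 'sentence': 'SSt', 'speaker': 'SSp'}
```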
# standard analysis onsets
event_dict = {1:'SSt_SSp', 2:'SSt_DSp', 3:'DSt_SSp', 4:'DSt_DSp'}
descriptions = {'event':"""
2.00 4
5.33 1
8.67 3
12.00 4
15.33 2
18.67 3
22.00 1
25.33 3
28.67 1
32.00 2
35.33 1
38.67 2
42.00 4
45.33 1
48.67 3
52.00 2
55.33 1
58.67 3
62.00 4
65.33 1
68.67 3
72.00 1
75.33 2
78.67 4
82.00 1
85.33 4
88.67 1
92.00 4
95.33 3
98.67 4
102.00 3
105.33 1
108.67 2
112.00 4
115.33 2
118.67 1
122.00 3
125.33 4
128.67 3
132.00 4
135.33 3
138.67 2
142.00 4
145.33 1
148.67 2
152.00 1
155.33 2
158.67 1
162.00 2
165.33 3
168.67 1
172.00 3
175.33 4
178.67 1
182.00 4
185.33 2
188.67 1
192.00 2
195.33 1
198.67 4
202.00 2
205.33 3
208.67 4
212.00 1
215.33 2
218.67 3
222.00 1
225.33 4
228.67 1
232.00 4
235.33 3
238.67 4
242.00 1
245.33 4
248.67 2
252.00 3
255.33 4
258.67 2
262.00 4
265.33 3
268.67 4
272.00 3
275.33 4
278.67 1
282.00 3
285.33 2
288.67 4
292.00 3
295.33 2
298.67 4
302.00 3
305.33 2
308.67 3
312.00 2
315.33 1
318.67 3
322.00 1
325.33 4
328.67 1
332.00 2
335.33 1
338.67 3
342.00 4
345.33 2
348.67 4
352.00 2
355.33 1
358.67 4
362.00 2
365.33 3
368.67 2
372.00 3
375.33 4
378.67 3
382.00 1
385.33 2
388.67 3
392.00 1
395.33 2
398.67 3
402.00 4
405.33 3
408.67 1
412.00 3
415.33 1
418.67 4
422.00 3
425.33 2
428.67 1
432.00 4
435.33 2
438.67 1
442.00 4
445.33 2
448.67 3
452.00 4
455.33 2
458.67 3
462.00 1
465.33 4
468.67 1
""",
# subj3_bloc_fonc3.txt
'block':"""
2.00 1
5.33 1
8.67 1
12.00 1
15.33 1
18.67 1
31.00 1
34.33 1
37.67 1
41.00 1
44.33 1
47.67 1
60.00 2
63.33 2
66.67 2
70.00 2
73.33 2
76.67 2
89.00 3
92.33 3
95.67 3
99.00 3
102.33 3
105.67 3
118.00 4
121.33 4
124.67 4
128.00 4
131.33 4
134.67 4
147.00 4
150.33 4
153.67 4
157.00 4
160.33 4
163.67 4
176.00 3
179.33 3
182.67 3
186.00 3
189.33 3
192.67 3
205.00 2
208.33 2
211.67 2
215.00 2
218.33 2
221.67 2
234.00 2
237.33 2
240.67 2
244.00 2
247.33 2
250.67 2
263.00 3
266.33 3
269.67 3
273.00 3
276.33 3
279.67 3
292.00 1
295.33 1
298.67 1
302.00 1
305.33 1
308.67 1
321.00 4
324.33 4
327.67 4
331.00 4
334.33 4
337.67 4
350.00 1
353.33 1
356.67 1
360.00 1
363.33 1
366.67 1
379.00 3
382.33 3
385.67 3
389.00 3
392.33 3
395.67 3
408.00 2
411.33 2
414.67 2
418.00 2
421.33 2
424.67 2
437.00 4
440.33 4
443.67 4
447.00 4
450.33 4
453.67 4
"""}
# convert to record array for convenience
dtype = np.dtype([('time', float), ('event', 'S7')])  # plain float: the np.float alias was removed in NumPy 1.24
for key, txt in descriptions.items():
    vals = np.fromstring(txt, sep='\t').reshape(-1, 2)
    full_def = np.zeros((vals.shape[0],), dtype=dtype)
    for i, row in enumerate(vals):
        full_def[i]['time'] = row[0]
        full_def[i]['event'] = event_dict[row[1]]
    descriptions[key] = full_def
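The loop above reads each whitespace-separated `time code` pair and maps the numeric code through `event_dict`. The same parse without NumPy, as a dependency-free sketch (the two-line sample string is illustrative):

```python
event_dict = {1: 'SSt_SSp', 2: 'SSt_DSp', 3: 'DSt_SSp', 4: 'DSt_DSp'}

def parse_onsets(txt):
    """Parse 'time<TAB>code' lines into (time, event-name) tuples."""
    rows = []
    for line in txt.strip().splitlines():
        time_s, code_s = line.split()
        rows.append((float(time_s), event_dict[int(code_s)]))
    return rows

sample = "2.00\t4\n5.33\t1\n"
print(parse_onsets(sample))  # [(2.0, 'DSt_DSp'), (5.33, 'SSt_SSp')]
```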
# fmristat designs, probably saved from matlab to ascii
fmristat = {'block':
"""
0.0000000000000000e+00 1.7972165294585549e-07 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -7.0891758549395515e-07 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -1.0000000000000000e+00 1.0000000000000000e+00 -1.0000000000000000e+00 -0.0000000000000000e+00 -3.9964378119605467e+07
2.4177346180074877e-02 5.9438222395961854e-03 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -1.7549188213126270e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -9.8947368421052628e-01 9.7905817174515231e-01 -9.6875229625309800e-01 -0.0000000000000000e+00 -3.9964295632754065e+07
2.9724288489710671e-01 6.7859854151592250e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -1.2825364543031706e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -9.7894736842105268e-01 9.5833795013850420e-01 -9.3816241434611469e-01 -0.0000000000000000e+00 -3.9964008717195466e+07
2.5803044610535053e-01 2.0702131854989700e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -2.2538113774670057e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -9.6842105263157896e-01 9.3783933518005547e-01 -9.0822335617436944e-01 -0.0000000000000000e+00 -3.9963382251577556e+07
1.5480779827902719e-02 3.5555300272627366e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -1.5615934487315666e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -9.5789473684210524e-01 9.1756232686980610e-01 -8.7892812363318262e-01 -0.0000000000000000e+00 -3.9962264238541424e+07
-8.5092843112443153e-02 4.1993036162424330e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -1.0730035819227872e-03 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -9.4736842105263153e-01 8.9750692520775610e-01 -8.5026971861787415e-01 -0.0000000000000000e+00 -3.9960523102045782e+07
-6.7344467282778389e-02 3.9648730685263067e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 6.4795239339487262e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -9.3684210526315792e-01 8.7767313019390591e-01 -8.2224114302376450e-01 -0.0000000000000000e+00 -3.9958030461220443e+07
-3.0859121798647051e-02 3.4988988278635952e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 5.1950746816847776e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -9.2631578947368420e-01 8.5806094182825488e-01 -7.9483539874617293e-01 -0.0000000000000000e+00 -3.9954647886381753e+07
-1.0398603799754021e-02 3.0094602992319508e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 6.9200680125590358e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -9.1578947368421049e-01 8.3867036011080320e-01 -7.6804548768041980e-01 -0.0000000000000000e+00 -3.9950228902413361e+07
-2.8261891016884526e-03 1.9842544809308393e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.7957629818392778e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -9.0526315789473688e-01 8.1950138504155134e-01 -7.4186441172182538e-01 -0.0000000000000000e+00 -3.9944630184490673e+07
-6.5240578331303358e-04 4.1333332231519290e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 2.2566882506564701e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -8.9473684210526316e-01 8.0055401662049863e-01 -7.1628517276570935e-01 -0.0000000000000000e+00 -3.9937731831733234e+07
-1.3230474301884534e-04 -8.9013361783352618e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0200288654363485e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -8.8421052631578945e-01 7.8182825484764540e-01 -6.9130077270739165e-01 -0.0000000000000000e+00 -3.9929402377448291e+07
-2.2763919031965966e-05 -1.1818776925247122e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -3.3071814284000439e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -8.7368421052631584e-01 7.6332409972299176e-01 -6.6690421344219286e-01 -0.0000000000000000e+00 -3.9919467169778965e+07
1.1766933934135151e-01 -5.9145813637303944e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -1.1704903409713198e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -8.6315789473684212e-01 7.4504155124653748e-01 -6.4308849686543235e-01 -0.0000000000000000e+00 -3.9907786847505786e+07
3.3915151685855366e-01 7.8487423349090008e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -2.2055846003109769e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -8.5263157894736841e-01 7.2698060941828258e-01 -6.1984662487243036e-01 -0.0000000000000000e+00 -3.9894241441418923e+07
1.5626897220672240e-01 2.5758459445421755e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -2.3727553274059962e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -8.4210526315789469e-01 7.0914127423822704e-01 -5.9717159935850694e-01 -0.0000000000000000e+00 -3.9878693250274450e+07
-4.5889761416029337e-02 3.9139980364646210e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -9.7047116711321060e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -8.3157894736842108e-01 6.9152354570637120e-01 -5.7505642221898245e-01 -0.0000000000000000e+00 -3.9860998616045773e+07
-8.5485466293479639e-02 4.1644987530692945e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 3.5079498059630163e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -8.2105263157894737e-01 6.7412742382271473e-01 -5.5349409534917626e-01 -0.0000000000000000e+00 -3.9841029804515697e+07
-5.1613987307140528e-02 3.7656626544155047e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 6.4995785276060730e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -8.1052631578947365e-01 6.5695290858725752e-01 -5.3247762064440873e-01 -0.0000000000000000e+00 -3.9818654567553706e+07
-2.0586709118519942e-02 3.3320587820969133e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 4.7437405355933771e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -8.0000000000000004e-01 6.4000000000000012e-01 -5.1200000000000012e-01 -0.0000000000000000e+00 -3.9793725269769594e+07
-6.3149871015004810e-03 2.6901760355826837e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0881647626783833e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -7.8947368421052633e-01 6.2326869806094187e-01 -4.9205423531126991e-01 -0.0000000000000000e+00 -3.9766099107882999e+07
-1.5998995910758370e-03 1.3858542822449937e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 2.1519290836378713e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -7.7894736842105261e-01 6.0675900277008310e-01 -4.7263332847353839e-01 -0.0000000000000000e+00 -3.9735647326953202e+07
-3.4949313513852877e-04 -1.9968098640650647e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.9284674322244838e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -7.6842105263157889e-01 5.9047091412742370e-01 -4.5373028138212557e-01 -0.0000000000000000e+00 -3.9702236464460857e+07
-6.7774195098072875e-05 -1.1325358183759204e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 3.7893132438445457e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -7.5789473684210529e-01 5.7440443213296399e-01 -4.3533809593235173e-01 -0.0000000000000000e+00 -3.9665728276365325e+07
5.6890986261887112e-03 -1.0696208513639913e-01 2.4632390461375082e-03 0.0000000000000000e+00 0.0000000000000000e+00 -5.6615568209656034e-02 -7.8171058742844777e-03 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -7.4736842105263157e-01 5.5855955678670355e-01 -4.1744977401953637e-01 -0.0000000000000000e+00 -3.9625947171426475e+07
2.4618481081097995e-01 -6.1272673063231448e-02 4.8820311055440317e-02 0.0000000000000000e+00 0.0000000000000000e+00 -5.8086735980862475e-02 -1.0183399555242575e-01 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -7.3684210526315785e-01 5.4293628808864258e-01 -4.0005831753899979e-01 -0.0000000000000000e+00 -3.9582766803961083e+07
2.9953964554960644e-01 -2.5547441406802145e-02 1.7507033112252593e-01 0.0000000000000000e+00 0.0000000000000000e+00 -3.0993666067196276e-02 -2.1612831441918828e-01 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -7.2631578947368425e-01 5.2753462603878121e-01 -3.8315672838606218e-01 -0.0000000000000000e+00 -3.9536072560850807e+07
5.7109131744450661e-02 -8.4051340693276771e-03 3.2987495754170404e-01 0.0000000000000000e+00 0.0000000000000000e+00 -1.1758010932446473e-02 -1.8320362885086547e-01 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -7.1578947368421053e-01 5.1235457063711909e-01 -3.6673800845604315e-01 -0.0000000000000000e+00 -3.9485728182394587e+07
-7.8184230485700307e-02 -2.2938316718133949e-03 4.1619910711870090e-01 0.0000000000000000e+00 0.0000000000000000e+00 -3.5232574314479738e-03 -2.7189893305956774e-02 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -7.0526315789473681e-01 4.9739612188365645e-01 -3.5079515964426300e-01 -0.0000000000000000e+00 -3.9431584623730086e+07
-7.4737010936004322e-02 -5.3798391262850328e-04 4.0522771221040943e-01 0.0000000000000000e+00 0.0000000000000000e+00 -8.8267150904121708e-04 5.8945294122267095e-02 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -6.9473684210526321e-01 4.8265927977839340e-01 -3.3532118384604176e-01 -0.0000000000000000e+00 -3.9373503040717266e+07
-3.7125986784173719e-02 -1.1129983958096788e-04 3.5823603012410249e-01 0.0000000000000000e+00 0.0000000000000000e+00 -1.9175152461996296e-04 5.6583906662152982e-02 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -6.8421052631578949e-01 4.6814404432132967e-01 -3.2030908295669924e-01 -0.0000000000000000e+00 -3.9311345987254620e+07
-1.3178061904414793e-02 -2.0616437121920034e-05 3.1330364005523526e-01 0.0000000000000000e+00 0.0000000000000000e+00 -3.6864262677767610e-05 5.6795794533862881e-02 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -6.7368421052631577e-01 4.5385041551246535e-01 -3.0575185887155559e-01 -0.0000000000000000e+00 -3.9244973542355932e+07
-3.7205065395061786e-03 -3.5244398305267092e-06 2.2498345090173064e-01 0.0000000000000000e+00 0.0000000000000000e+00 -6.4936311517650639e-06 1.5701149313902130e-01 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -6.6315789473684206e-01 4.3977839335180047e-01 -2.9164251348593084e-01 -0.0000000000000000e+00 -3.9174248924560487e+07
-8.8460183856306875e-04 -5.1822314315757625e-07 7.3882604017788261e-02 0.0000000000000000e+00 0.0000000000000000e+00 -9.7632309208494808e-07 2.2974705727631178e-01 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -6.5263157894736845e-01 4.2592797783933523e-01 -2.7797404869514508e-01 -0.0000000000000000e+00 -3.9099039317975588e+07
-1.8370553324891862e-04 0.0000000000000000e+00 -7.0043906737830369e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.3510500812682708e-01 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -6.4210526315789473e-01 4.1229916897506924e-01 -2.6473946639451817e-01 -0.0000000000000000e+00 -3.9019207925871357e+07
-3.4179300052400812e-05 0.0000000000000000e+00 -1.2022600095442171e-01 1.4728700311903286e-05 0.0000000000000000e+00 0.0000000000000000e+00 -1.3519940771543261e-02 -5.5984441974886700e-05 0.0000000000000000e+00 1.0000000000000000e+00 -6.3157894736842102e-01 3.9889196675900274e-01 -2.5193176847937016e-01 -0.0000000000000000e+00 -3.8934600623322882e+07
6.1977227981245921e-02 0.0000000000000000e+00 -8.9600939014954029e-02 1.1919836474499302e-02 0.0000000000000000e+00 0.0000000000000000e+00 -6.4884679517546728e-02 -3.2555493185429440e-02 0.0000000000000000e+00 1.0000000000000000e+00 -6.2105263157894741e-01 3.8570637119113577e-01 -2.3954395684502119e-01 -0.0000000000000000e+00 -3.8845083451900810e+07
3.2914194698808724e-01 0.0000000000000000e+00 -4.4610650244984916e-02 9.0274941429971764e-02 0.0000000000000000e+00 0.0000000000000000e+00 -4.7586436324094594e-02 -1.5426326998121220e-01 0.0000000000000000e+00 1.0000000000000000e+00 -6.1052631578947369e-01 3.7274238227146816e-01 -2.2756903338679108e-01 -0.0000000000000000e+00 -3.8750522818035603e+07
2.0869631953634510e-01 0.0000000000000000e+00 -1.6794634932976529e-02 2.3929665270590575e-01 0.0000000000000000e+00 0.0000000000000000e+00 -2.1745661626648261e-02 -2.2651021459888929e-01 0.0000000000000000e+00 1.0000000000000000e+00 -5.9999999999999998e-01 3.5999999999999999e-01 -2.1599999999999997e-01 -0.0000000000000000e+00 -3.8650779152144141e+07
-1.8988164869436275e-02 0.0000000000000000e+00 -5.1018176378398936e-03 3.7764046922164829e-01 0.0000000000000000e+00 0.0000000000000000e+00 -7.4407761527462875e-03 -1.2483558646645473e-01 0.0000000000000000e+00 1.0000000000000000e+00 -5.8947368421052626e-01 3.4747922437673123e-01 -2.0482985857996788e-01 -0.0000000000000000e+00 -3.8545724148709796e+07
-8.7172921748424712e-02 0.0000000000000000e+00 -1.3055419048209041e-03 4.2010747575335811e-01 0.0000000000000000e+00 0.0000000000000000e+00 -2.0641445077479018e-03 2.0039325175152698e-02 0.0000000000000000e+00 1.0000000000000000e+00 -5.7894736842105265e-01 3.3518005540166207e-01 -1.9405161102201490e-01 -0.0000000000000000e+00 -3.8435214400087699e+07
-5.9479346491040953e-02 0.0000000000000000e+00 -2.9049537695812611e-04 3.8676128528650705e-01 0.0000000000000000e+00 0.0000000000000000e+00 -4.8684122664065948e-04 6.6822940192205887e-02 0.0000000000000000e+00 1.0000000000000000e+00 -5.6842105263157894e-01 3.2310249307479222e-01 -1.8365825922146084e-01 -0.0000000000000000e+00 -3.8319113881397702e+07
-2.5344894340844023e-02 0.0000000000000000e+00 -5.7519918573519355e-05 3.4173267838996024e-01 0.0000000000000000e+00 0.0000000000000000e+00 -1.0071519927700912e-04 4.8590711578671605e-02 0.0000000000000000e+00 1.0000000000000000e+00 -5.5789473684210522e-01 3.1124653739612185e-01 -1.7364280507362584e-01 -0.0000000000000000e+00 -3.8197280698436156e+07
-8.1360247239318281e-03 0.0000000000000000e+00 -1.0271890012879138e-05 2.8635449602721674e-01 0.0000000000000000e+00 0.0000000000000000e+00 -1.8603553893817027e-05 8.6957892817575433e-02 0.0000000000000000e+00 1.0000000000000000e+00 -5.4736842105263162e-01 2.9961218836565101e-01 -1.6399825047383004e-01 -0.0000000000000000e+00 -3.8069578593331799e+07
-2.1329811439999668e-03 0.0000000000000000e+00 -1.5726052562367482e-06 1.6942030072831732e-01 0.0000000000000000e+00 0.0000000000000000e+00 -2.9207518307516431e-06 1.9932735803028329e-01 0.0000000000000000e+00 1.0000000000000000e+00 -5.3684210526315790e-01 2.8819944598337949e-01 -1.5471759731739321e-01 -0.0000000000000000e+00 -3.7935869323744483e+07
-4.7868580374672127e-04 0.0000000000000000e+00 -2.4393255686410929e-07 9.7342091717260160e-03 0.0000000000000000e+00 0.0000000000000000e+00 -4.6354368728304449e-07 2.1317803332965357e-01 0.0000000000000000e+00 1.0000000000000000e+00 -5.2631578947368418e-01 2.7700831024930744e-01 -1.4579384749963548e-01 -0.0000000000000000e+00 -3.7796023122095928e+07
-9.4885576906897671e-05 0.0000000000000000e+00 0.0000000000000000e+00 -1.0342703349042752e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 6.8989839045089055e-02 0.0000000000000000e+00 1.0000000000000000e+00 -5.1578947368421058e-01 2.6603878116343493e-01 -1.3722000291587699e-01 -0.0000000000000000e+00 -3.7649889965446725e+07
4.6549987495046264e-04 0.0000000000000000e+00 0.0000000000000000e+00 -1.1370885399932430e-01 7.7849954623234125e-04 0.0000000000000000e+00 0.0000000000000000e+00 -4.6780737173119882e-02 -2.6406827272240662e-03 1.0000000000000000e+00 -5.0526315789473686e-01 2.5529085872576179e-01 -1.2898906546143754e-01 -0.0000000000000000e+00 -3.7497316265670031e+07
1.8286923577271125e-01 0.0000000000000000e+00 0.0000000000000000e+00 -7.0505792246302060e-02 3.3178956057083007e-02 0.0000000000000000e+00 0.0000000000000000e+00 -6.2077599161047464e-02 -7.6128408846089957e-02 1.0000000000000000e+00 -4.9473684210526314e-01 2.4476454293628808e-01 -1.2109403703163725e-01 -0.0000000000000000e+00 -3.7338169264406279e+07
3.2815365238396688e-01 0.0000000000000000e+00 0.0000000000000000e+00 -3.1075379328076348e-02 1.4440180014681392e-01 0.0000000000000000e+00 0.0000000000000000e+00 -3.6293517323374769e-02 -1.9992737502479768e-01 1.0000000000000000e+00 -4.8421052631578948e-01 2.3445983379501387e-01 -1.1352791952179618e-01 -0.0000000000000000e+00 -3.7172326619055934e+07
1.0472734947167857e-01 0.0000000000000000e+00 0.0000000000000000e+00 -1.0670392954156069e-02 3.0145227582186795e-01 0.0000000000000000e+00 0.0000000000000000e+00 -1.4579521804191179e-02 -2.0462195254207211e-01 1.0000000000000000e+00 -4.7368421052631576e-01 2.2437673130193903e-01 -1.0628371482723427e-01 -0.0000000000000000e+00 -3.6999660033267289e+07
-6.5414735178786182e-02 0.0000000000000000e+00 0.0000000000000000e+00 -3.0133373947628865e-03 4.0816930917890015e-01 0.0000000000000000e+00 0.0000000000000000e+00 -4.5547919780519388e-03 -5.7811109467925295e-02 1.0000000000000000e+00 -4.6315789473684210e-01 2.1451523545706372e-01 -9.9354424843271616e-02 -0.0000000000000000e+00 -3.6820030059284538e+07
-8.1039296440989866e-02 0.0000000000000000e+00 0.0000000000000000e+00 -7.2674473172627254e-04 4.1235355967765930e-01 0.0000000000000000e+00 0.0000000000000000e+00 -1.1786336808960263e-03 4.9499274042558168e-02 1.0000000000000000e+00 -4.5263157894736844e-01 2.0487534626038784e-01 -9.2733051465228172e-02 -0.0000000000000000e+00 -3.6633274691983856e+07
-4.4087905632133433e-02 0.0000000000000000e+00 0.0000000000000000e+00 -1.5386856254847681e-04 3.6716665813617538e-01 0.0000000000000000e+00 0.0000000000000000e+00 -2.6278123722066446e-04 6.1468774353944371e-02 1.0000000000000000e+00 -4.4210526315789472e-01 1.9545706371191135e-01 -8.6412596588423957e-02 -0.0000000000000000e+00 -3.6439258355741180e+07
-1.6550422413122053e-02 0.0000000000000000e+00 0.0000000000000000e+00 -2.9267163684834725e-05 3.2392301620679653e-01 0.0000000000000000e+00 0.0000000000000000e+00 -5.2002150652398645e-05 4.9847666173530675e-02 1.0000000000000000e+00 -4.3157894736842106e-01 1.8626038781163437e-01 -8.0386062108179043e-02 -0.0000000000000000e+00 -3.6237849955159605e+07
-4.8645759144098875e-03 0.0000000000000000e+00 0.0000000000000000e+00 -5.0502559169199467e-06 2.4859549248739221e-01 0.0000000000000000e+00 0.0000000000000000e+00 -9.2540988607474148e-06 1.3286328271100897e-01 1.0000000000000000e+00 -4.2105263157894735e-01 1.7728531855955676e-01 -7.4646449919813368e-02 -0.0000000000000000e+00 -3.6028915291220829e+07
-1.1930031026986410e-03 0.0000000000000000e+00 0.0000000000000000e+00 -7.5235207367785176e-07 1.0653571529948751e-01 0.0000000000000000e+00 0.0000000000000000e+00 -1.4109555146948659e-06 2.2584550280153473e-01 1.0000000000000000e+00 -4.1052631578947368e-01 1.6853185595567868e-01 -6.9186761918647033e-02 -0.0000000000000000e+00 -3.5812314029410847e+07
-2.5396202412826985e-04 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -4.6832193791313273e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.6614010275431002e-01 1.0000000000000000e+00 -4.0000000000000002e-01 1.6000000000000003e-01 -6.4000000000000015e-02 -0.0000000000000000e+00 -3.5587902398848765e+07
-4.8220518125413127e-05 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -1.1871326011304753e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0090386571392081e-02 1.0000000000000000e+00 -3.8947368421052631e-01 1.5168975069252078e-01 -5.9079166059192299e-02 -0.0000000000000000e+00 -3.5355541558413237e+07
2.4169016536933870e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -9.2791475308364513e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -8.0004825585489200e-02 1.0000000000000000e+00 -3.7894736842105264e-01 1.4360110803324100e-01 -5.4417261991543966e-02 -0.0000000000000000e+00 -3.5115099058204055e+07
2.9724155612227232e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.5270927425803231e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -1.8135969165363014e-01 1.0000000000000000e+00 -3.6842105263157893e-01 1.3573407202216065e-01 -5.0007289692374973e-02 -0.0000000000000000e+00 -3.4866436158003025e+07
2.5803024806313540e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.8621478092052027e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -2.5150110948521948e-01 1.0000000000000000e+00 -3.5789473684210527e-01 1.2808864265927977e-01 -4.5842251057005394e-02 -0.0000000000000000e+00 -3.4609428135132596e+07
1.5480751995147656e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 3.4898126191507456e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -1.6555402474497252e-01 1.0000000000000000e+00 -3.4736842105263160e-01 1.2066481994459835e-01 -4.1915147980755220e-02 -0.0000000000000000e+00 -3.4343941110639565e+07
-8.5092846828545440e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 4.1819492389675139e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -3.7787005240121178e-03 1.0000000000000000e+00 -3.3684210526315789e-01 1.1346260387811634e-01 -3.8218982358944449e-02 -0.0000000000000000e+00 -3.4069813706143320e+07
-6.7344467282778389e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 3.9609103273326435e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 6.4137932796112615e-02 1.0000000000000000e+00 -3.2631578947368423e-01 1.0648199445983381e-01 -3.4746756086893135e-02 -0.0000000000000000e+00 -3.3786908936772466e+07
-3.0859121798647051e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 3.4980970938405542e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 5.1811469414551475e-02 1.0000000000000000e+00 -3.1578947368421051e-01 9.9722991689750684e-02 -3.1491471059921269e-02 -0.0000000000000000e+00 -3.3495079538397674e+07
-1.0398603799754021e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 3.0093145263337484e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 6.9174443414894288e-02 1.0000000000000000e+00 -3.0526315789473685e-01 9.3185595567867041e-02 -2.8446129173348884e-02 -0.0000000000000000e+00 -3.3194207260061242e+07
-2.8261891016884526e-03 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.9842318427908845e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.7957211533156259e-01 1.0000000000000000e+00 -2.9473684210526313e-01 8.6869806094182808e-02 -2.5603732322495985e-02 -0.0000000000000000e+00 -3.2884149286440846e+07
-6.5240578331303358e-04 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 4.1332976227096016e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 2.2566815140637572e-01 1.0000000000000000e+00 -2.8421052631578947e-01 8.0775623268698055e-02 -2.2957282402682605e-02 -0.0000000000000000e+00 -3.2564763542401820e+07
-1.3230474301884534e-04 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -8.9013361783352618e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0200288654363485e-01 1.0000000000000000e+00 -2.7368421052631581e-01 7.4903047091412753e-02 -2.0499781309228755e-02 -0.0000000000000000e+00 -3.2235910996290851e+07
-2.2763919031965966e-05 0.0000000000000000e+00 0.0000000000000000e+00 1.5963017835031278e-04 -1.1834739943082154e-01 0.0000000000000000e+00 0.0000000000000000e+00 -5.7534210508715421e-04 -3.2496472178913285e-02 1.0000000000000000e+00 -2.6315789473684209e-01 6.9252077562326861e-02 -1.8224230937454435e-02 -0.0000000000000000e+00 -3.1897453852406133e+07
1.1766933934135151e-01 0.0000000000000000e+00 0.0000000000000000e+00 2.0909077918752295e-02 -8.0054891556056232e-02 0.0000000000000000e+00 0.0000000000000000e+00 -5.2504663312063840e-02 -6.4544370785068139e-02 1.0000000000000000e+00 -2.5263157894736843e-01 6.3822714681440448e-02 -1.6123633182679693e-02 -0.0000000000000000e+00 -3.1549256225830898e+07
3.3915151685855366e-01 0.0000000000000000e+00 0.0000000000000000e+00 1.1591503626231370e-01 -3.7427612913223690e-02 0.0000000000000000e+00 0.0000000000000000e+00 -1.7866822805253882e-01 -4.1890231978558873e-02 1.0000000000000000e+00 -2.4210526315789474e-01 5.8614958448753467e-02 -1.4190989940224523e-02 -0.0000000000000000e+00 -3.1191183393587172e+07
1.5626897220672240e-01 0.0000000000000000e+00 0.0000000000000000e+00 2.7102589539944072e-01 -1.3441300945223146e-02 0.0000000000000000e+00 0.0000000000000000e+00 -2.1937554065255879e-01 -1.7899992088040828e-02 1.0000000000000000e+00 -2.3157894736842105e-01 5.3628808864265930e-02 -1.2419303105408952e-02 -0.0000000000000000e+00 -3.0823097233739205e+07
-4.5889761416029337e-02 0.0000000000000000e+00 0.0000000000000000e+00 3.9533355560952921e-01 -3.9337519630671123e-03 0.0000000000000000e+00 0.0000000000000000e+00 -9.1202571282810271e-02 -5.8445454285107907e-03 1.0000000000000000e+00 -2.2105263157894736e-01 4.8864265927977837e-02 -1.0801574573552995e-02 -0.0000000000000000e+00 -3.0444855215506077e+07
-8.5485466293479639e-02 0.0000000000000000e+00 0.0000000000000000e+00 4.1742655305133147e-01 -9.7667774440200706e-04 0.0000000000000000e+00 0.0000000000000000e+00 3.6644079357620526e-02 -1.5645812979903574e-03 1.0000000000000000e+00 -2.1052631578947367e-01 4.4321329639889190e-02 -9.3308062399766710e-03 -0.0000000000000000e+00 -3.0056331771375515e+07
-5.1613987307140528e-02 0.0000000000000000e+00 0.0000000000000000e+00 3.7677806853318929e-01 -2.1180309163878829e-04 0.0000000000000000e+00 0.0000000000000000e+00 6.5354184877135096e-02 -3.5839960107434895e-04 1.0000000000000000e+00 -2.0000000000000001e-01 4.0000000000000008e-02 -8.0000000000000019e-03 -0.0000000000000000e+00 -2.9657376031923760e+07
-2.0586709118519942e-02 0.0000000000000000e+00 0.0000000000000000e+00 3.3324698528225988e-01 -4.1107072568475722e-05 0.0000000000000000e+00 0.0000000000000000e+00 4.7509924533272796e-02 -7.2519177339011736e-05 1.0000000000000000e+00 -1.8947368421052632e-01 3.5900277008310250e-02 -6.8021577489429958e-03 -0.0000000000000000e+00 -2.9247849788508385e+07
-6.3149871015004810e-03 0.0000000000000000e+00 0.0000000000000000e+00 2.6902481763940428e-01 -7.2140811359126060e-06 0.0000000000000000e+00 0.0000000000000000e+00 1.0882962004244967e-01 -1.3143774611327210e-05 1.0000000000000000e+00 -1.7894736842105263e-01 3.2022160664819943e-02 -5.7302813821256742e-03 -0.0000000000000000e+00 -2.8827620597817719e+07
-1.5998995910758370e-03 0.0000000000000000e+00 0.0000000000000000e+00 1.3858651749072567e-01 -1.0892662263073478e-06 0.0000000000000000e+00 0.0000000000000000e+00 2.1519494147264256e-01 -2.0331088554603621e-06 1.0000000000000000e+00 -1.6842105263157894e-01 2.8365650969529085e-02 -4.7773727948680561e-03 -0.0000000000000000e+00 -2.8396549759912662e+07
-3.4949313513852877e-04 0.0000000000000000e+00 0.0000000000000000e+00 -1.9968098640650647e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.9284674322244838e-01 0.0000000000000000e+00 1.0000000000000000e+00 -1.5789473684210525e-01 2.4930747922437671e-02 -3.9364338824901587e-03 -0.0000000000000000e+00 -2.7954504561053500e+07
-6.7774195098072875e-05 0.0000000000000000e+00 0.0000000000000000e+00 -1.1325358183759204e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 3.7893132438445457e-02 0.0000000000000000e+00 1.0000000000000000e+00 -1.4736842105263157e-01 2.1717451523545702e-02 -3.2004665403119982e-03 -0.0000000000000000e+00 -2.7501342752012245e+07
5.6890986261887112e-03 0.0000000000000000e+00 2.4632390461375082e-03 -1.0696208513639913e-01 0.0000000000000000e+00 0.0000000000000000e+00 -7.8171058742844777e-03 -5.6615568209656034e-02 0.0000000000000000e+00 1.0000000000000000e+00 -1.3684210526315790e-01 1.8725761772853188e-02 -2.5624726636535944e-03 -0.0000000000000000e+00 -2.7036898926030204e+07
2.4618481081097995e-01 0.0000000000000000e+00 4.8820311055440317e-02 -6.1272673063231448e-02 0.0000000000000000e+00 0.0000000000000000e+00 -1.0183399555242575e-01 -5.8086735980862475e-02 0.0000000000000000e+00 1.0000000000000000e+00 -1.2631578947368421e-01 1.5955678670360112e-02 -2.0154541478349616e-03 -0.0000000000000000e+00 -2.6561044027604885e+07
2.9953964554960644e-01 0.0000000000000000e+00 1.7507033112252593e-01 -2.5547441406802145e-02 0.0000000000000000e+00 0.0000000000000000e+00 -2.1612831441918828e-01 -3.0993666067196276e-02 0.0000000000000000e+00 1.0000000000000000e+00 -1.1578947368421053e-01 1.3407202216066482e-02 -1.5524128881761190e-03 -0.0000000000000000e+00 -2.6073662987834934e+07
5.7109131744450661e-02 0.0000000000000000e+00 3.2987495754170404e-01 -8.4051340693276771e-03 0.0000000000000000e+00 0.0000000000000000e+00 -1.8320362885086547e-01 -1.1758010932446473e-02 0.0000000000000000e+00 1.0000000000000000e+00 -1.0526315789473684e-01 1.1080332409972297e-02 -1.1663507799970839e-03 -0.0000000000000000e+00 -2.5574620168755420e+07
-7.8184230485700307e-02 0.0000000000000000e+00 4.1619910711870090e-01 -2.2938316718133949e-03 0.0000000000000000e+00 0.0000000000000000e+00 -2.7189893305956774e-02 -3.5232574314479738e-03 0.0000000000000000e+00 1.0000000000000000e+00 -9.4736842105263161e-02 8.9750692520775624e-03 -8.5026971861787448e-04 -0.0000000000000000e+00 -2.5063768014097415e+07
-7.4737010936004322e-02 0.0000000000000000e+00 4.0522771221040943e-01 -5.3798391262850328e-04 0.0000000000000000e+00 0.0000000000000000e+00 5.8945294122267095e-02 -8.8267150904121708e-04 0.0000000000000000e+00 1.0000000000000000e+00 -8.4210526315789472e-02 7.0914127423822712e-03 -5.9717159935850702e-04 -0.0000000000000000e+00 -2.4540960104356453e+07
-3.7125986784173719e-02 0.0000000000000000e+00 3.5823603012410249e-01 -1.1129983958096788e-04 0.0000000000000000e+00 0.0000000000000000e+00 5.6583906662152982e-02 -1.9175152461996296e-04 0.0000000000000000e+00 1.0000000000000000e+00 -7.3684210526315783e-02 5.4293628808864255e-03 -4.0005831753899977e-04 -0.0000000000000000e+00 -2.4006064559022680e+07
-1.3178061904414793e-02 0.0000000000000000e+00 3.1330364005523526e-01 -2.0616437121920034e-05 0.0000000000000000e+00 0.0000000000000000e+00 5.6795794533862881e-02 -3.6864262677767610e-05 0.0000000000000000e+00 1.0000000000000000e+00 -6.3157894736842107e-02 3.9889196675900280e-03 -2.5193176847937020e-04 -0.0000000000000000e+00 -2.3458940367557507e+07
-3.7205065395061786e-03 0.0000000000000000e+00 2.2498345090173064e-01 -3.5244398305267092e-06 0.0000000000000000e+00 0.0000000000000000e+00 1.5701149313902130e-01 -6.4936311517650639e-06 0.0000000000000000e+00 1.0000000000000000e+00 -5.2631578947368418e-02 2.7700831024930744e-03 -1.4579384749963548e-04 -0.0000000000000000e+00 -2.2899451906451862e+07
-8.8460183856306875e-04 0.0000000000000000e+00 7.3882604017788261e-02 -5.1822314315757625e-07 0.0000000000000000e+00 0.0000000000000000e+00 2.2974705727631178e-01 -9.7632309208494808e-07 0.0000000000000000e+00 1.0000000000000000e+00 -4.2105263157894736e-02 1.7728531855955678e-03 -7.4646449919813377e-05 -0.0000000000000000e+00 -2.2327463736216143e+07
-1.8370553324891862e-04 0.0000000000000000e+00 -7.0043906737830369e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.3510500812682708e-01 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -3.1578947368421054e-02 9.9722991689750701e-04 -3.1491471059921275e-05 -0.0000000000000000e+00 -2.1742838455218170e+07
-3.4179300052400812e-05 0.0000000000000000e+00 -1.2021127225410981e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -1.3575925213518148e-02 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -2.1052631578947368e-02 4.4321329639889195e-04 -9.3308062399766721e-06 -0.0000000000000000e+00 -2.1145430968515009e+07
6.1977227981245921e-02 0.0000000000000000e+00 -7.7681102540454725e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -9.7440172702976169e-02 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -1.0526315789473684e-02 1.1080332409972299e-04 -1.1663507799970840e-06 -0.0000000000000000e+00 -2.0535093915450227e+07
3.2914194698808724e-01 0.0000000000000000e+00 4.5664291184986848e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -2.0184970630530680e-01 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -1.9911701609588567e+07
2.0869631953634510e-01 0.0000000000000000e+00 2.2250201777292919e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -2.4825587622553755e-01 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 1.0526315789473684e-02 1.1080332409972299e-04 1.1663507799970840e-06 1.1663507799970840e-06 -1.9275125611193918e+07
-1.8988164869436275e-02 0.0000000000000000e+00 3.7253865158380839e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -1.3227636261920103e-01 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 2.1052631578947368e-02 4.4321329639889195e-04 9.3308062399766721e-06 9.3308062399766721e-06 -1.8625232264776390e+07
-8.7172921748424712e-02 0.0000000000000000e+00 4.1880193384853720e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.7975180667404809e-02 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 3.1578947368421054e-02 9.9722991689750701e-04 3.1491471059921275e-05 3.1491471059921275e-05 -1.7961888448384833e+07
-5.9479346491040953e-02 0.0000000000000000e+00 3.8647078990954892e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 6.6336098965565221e-02 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 4.2105263157894736e-02 1.7728531855955678e-03 7.4646449919813377e-05 7.4646449919813377e-05 -1.7284953087222628e+07
-2.5344894340844023e-02 0.0000000000000000e+00 3.4167515847138674e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 4.8489996379394593e-02 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 5.2631578947368418e-02 2.7700831024930744e-03 1.4579384749963548e-04 1.4579384749963548e-04 -1.6594301290707462e+07
-8.1360247239318281e-03 0.0000000000000000e+00 2.8634422413720390e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 8.6939289263681607e-02 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 6.3157894736842107e-02 3.9889196675900280e-03 2.5193176847937020e-04 2.5193176847937020e-04 -1.5889801689018261e+07
-2.1329811439999668e-03 0.0000000000000000e+00 1.6941872812306108e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.9932443727845253e-01 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 7.3684210526315783e-02 5.4293628808864255e-03 4.0005831753899977e-04 4.0005831753899977e-04 -1.5171332894279920e+07
-4.7868580374672127e-04 0.0000000000000000e+00 9.7339652391691400e-03 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 2.1317756978596630e-01 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 8.4210526315789472e-02 7.0914127423822712e-03 5.9717159935850702e-04 5.9717159935850702e-04 -1.4438759648172289e+07
-9.4885576906897671e-05 0.0000000000000000e+00 -1.0342703349042752e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 6.8989839045089055e-02 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 9.4736842105263161e-02 8.9750692520775624e-03 8.5026971861787448e-04 8.5026971861787448e-04 -1.3691952932217911e+07
4.6549987495046264e-04 0.0000000000000000e+00 -1.1370885399932430e-01 7.7849954623234125e-04 0.0000000000000000e+00 0.0000000000000000e+00 -4.6780737173119882e-02 -2.6406827272240662e-03 0.0000000000000000e+00 1.0000000000000000e+00 1.0526315789473684e-01 1.1080332409972297e-02 1.1663507799970839e-03 1.1663507799970839e-03 -1.2930769741141085e+07
1.8286923577271125e-01 0.0000000000000000e+00 -7.0505792246302060e-02 3.3178956057083007e-02 0.0000000000000000e+00 0.0000000000000000e+00 -6.2077599161047464e-02 -7.6128408846089957e-02 0.0000000000000000e+00 1.0000000000000000e+00 1.1578947368421053e-01 1.3407202216066482e-02 1.5524128881761190e-03 1.5524128881761190e-03 -1.2155081930344526e+07
3.2815365238396688e-01 0.0000000000000000e+00 -3.1075379328076348e-02 1.4440180014681392e-01 0.0000000000000000e+00 0.0000000000000000e+00 -3.6293517323374769e-02 -1.9992737502479768e-01 0.0000000000000000e+00 1.0000000000000000e+00 1.2631578947368421e-01 1.5955678670360112e-02 2.0154541478349616e-03 2.0154541478349616e-03 -1.1364771600352474e+07
1.0472734947167857e-01 0.0000000000000000e+00 -1.0670392954156069e-02 3.0145227582186795e-01 0.0000000000000000e+00 0.0000000000000000e+00 -1.4579521804191179e-02 -2.0462195254207211e-01 0.0000000000000000e+00 1.0000000000000000e+00 1.3684210526315790e-01 1.8725761772853188e-02 2.5624726636535944e-03 2.5624726636535944e-03 -1.0559703577307537e+07
-6.5414735178786182e-02 0.0000000000000000e+00 -3.0133373947628865e-03 4.0816930917890015e-01 0.0000000000000000e+00 0.0000000000000000e+00 -4.5547919780519388e-03 -5.7811109467925295e-02 0.0000000000000000e+00 1.0000000000000000e+00 1.4736842105263157e-01 2.1717451523545702e-02 3.2004665403119982e-03 3.2004665403119982e-03 -9.7397521797204204e+06
-8.1039296440989866e-02 0.0000000000000000e+00 -7.2674473172627254e-04 4.1235355967765930e-01 0.0000000000000000e+00 0.0000000000000000e+00 -1.1786336808960263e-03 4.9499274042558168e-02 0.0000000000000000e+00 1.0000000000000000e+00 1.5789473684210525e-01 2.4930747922437671e-02 3.9364338824901587e-03 3.9364338824901587e-03 -8.9047886006812230e+06
-4.4087905632133433e-02 0.0000000000000000e+00 -1.5386856254847681e-04 3.6716665813617538e-01 0.0000000000000000e+00 0.0000000000000000e+00 -2.6278123722066446e-04 6.1468774353944371e-02 0.0000000000000000e+00 1.0000000000000000e+00 1.6842105263157894e-01 2.8365650969529085e-02 4.7773727948680561e-03 4.7773727948680561e-03 -8.0546743582526855e+06
-1.6550422413122053e-02 0.0000000000000000e+00 -2.9267163684834725e-05 3.2392301620679653e-01 0.0000000000000000e+00 0.0000000000000000e+00 -5.2002150652398645e-05 4.9847666173530675e-02 0.0000000000000000e+00 1.0000000000000000e+00 1.7894736842105263e-01 3.2022160664819943e-02 5.7302813821256742e-03 5.7302813821256742e-03 -7.1892840764447525e+06
-4.8645759144098875e-03 0.0000000000000000e+00 -5.0502559169199467e-06 2.4859549248739221e-01 0.0000000000000000e+00 0.0000000000000000e+00 -9.2540988607474148e-06 1.3286328271100897e-01 0.0000000000000000e+00 1.0000000000000000e+00 1.8947368421052632e-01 3.5900277008310250e-02 6.8021577489429958e-03 6.8021577489429958e-03 -6.3084860786178336e+06
-1.1930031026986410e-03 0.0000000000000000e+00 -7.5235207367785176e-07 1.0653571529948751e-01 0.0000000000000000e+00 0.0000000000000000e+00 -1.4109555146948659e-06 2.2584550280153473e-01 0.0000000000000000e+00 1.0000000000000000e+00 2.0000000000000001e-01 4.0000000000000008e-02 8.0000000000000019e-03 8.0000000000000019e-03 -5.4121575398616791e+06
-2.5396202412826985e-04 0.0000000000000000e+00 0.0000000000000000e+00 -4.6832193791313273e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.6614010275431002e-01 0.0000000000000000e+00 1.0000000000000000e+00 2.1052631578947367e-01 4.4321329639889190e-02 9.3308062399766710e-03 9.3308062399766710e-03 -4.5001672449266911e+06
-4.8220518125413127e-05 1.7972165294585549e-07 0.0000000000000000e+00 -1.1871343983470048e-01 0.0000000000000000e+00 -7.0891758549395515e-07 0.0000000000000000e+00 1.0091095488977575e-02 0.0000000000000000e+00 1.0000000000000000e+00 2.2105263157894736e-01 4.8864265927977837e-02 1.0801574573552995e-02 1.0801574573552995e-02 -3.5723705129204541e+06
2.4169016536933870e-02 5.9438222395961854e-03 0.0000000000000000e+00 -9.8735297547960704e-02 0.0000000000000000e+00 -1.7549188213126270e-02 0.0000000000000000e+00 -6.2455637372362927e-02 0.0000000000000000e+00 1.0000000000000000e+00 2.3157894736842105e-01 5.3628808864265930e-02 1.2419303105408952e-02 1.2419303105408952e-02 -2.6286213762276992e+06
2.9724155612227232e-01 6.7859854151592250e-02 0.0000000000000000e+00 -5.2588926725789020e-02 0.0000000000000000e+00 -1.2825364543031706e-01 0.0000000000000000e+00 -5.3106046223313087e-02 0.0000000000000000e+00 1.0000000000000000e+00 2.4210526315789474e-01 5.8614958448753467e-02 1.4190989940224523e-02 1.4190989940224523e-02 -1.6688142544999197e+06
2.5803024806313540e-01 2.0702131854989700e-01 0.0000000000000000e+00 -2.0806537629376717e-02 0.0000000000000000e+00 -2.2538113774670057e-01 0.0000000000000000e+00 -2.6119971738518863e-02 0.0000000000000000e+00 1.0000000000000000e+00 2.5263157894736843e-01 6.3822714681440448e-02 1.6123633182679693e-02 1.6123633182679693e-02 -6.9281623238137364e+05
1.5480751995147656e-02 3.5555300272627366e-01 0.0000000000000000e+00 -6.5717408111991144e-03 0.0000000000000000e+00 -1.5615934487315666e-01 0.0000000000000000e+00 -9.3946798718158985e-03 0.0000000000000000e+00 1.0000000000000000e+00 2.6315789473684209e-01 6.9252077562326861e-02 1.8224230937454435e-02 1.8224230937454435e-02 2.9950556338893622e+05
-8.5092846828545440e-02 4.1993036162424330e-01 0.0000000000000000e+00 -1.7354377274918880e-03 0.0000000000000000e+00 -1.0730035819227872e-03 0.0000000000000000e+00 -2.7056969420893432e-03 0.0000000000000000e+00 1.0000000000000000e+00 2.7368421052631581e-01 7.4903047091412753e-02 2.0499781309228755e-02 2.0499781309228755e-02 1.3082802235329971e+06
-6.7344467282778389e-02 3.9648730685263067e-01 0.0000000000000000e+00 -3.9627411936631822e-04 0.0000000000000000e+00 6.4795239339487262e-02 0.0000000000000000e+00 -6.5730654337462586e-04 0.0000000000000000e+00 1.0000000000000000e+00 2.8421052631578947e-01 8.0775623268698055e-02 2.2957282402682605e-02 2.2957282402682605e-02 2.3336263527088910e+06
-3.0859121798647051e-02 3.4988988278635952e-01 0.0000000000000000e+00 -8.0173402304067263e-05 0.0000000000000000e+00 5.1950746816847776e-02 0.0000000000000000e+00 -1.3927740229629913e-04 0.0000000000000000e+00 1.0000000000000000e+00 2.9473684210526313e-01 8.6869806094182808e-02 2.5603732322495985e-02 2.5603732322495985e-02 3.3756763413345665e+06
-1.0398603799754021e-02 3.0094602992319508e-01 0.0000000000000000e+00 -1.4577289820257387e-05 0.0000000000000000e+00 6.9200680125590358e-02 0.0000000000000000e+00 -2.6236710696081891e-05 0.0000000000000000e+00 1.0000000000000000e+00 3.0526315789473685e-01 9.3185595567867041e-02 2.8446129173348884e-02 2.8446129173348884e-02 4.4345758976545855e+06
-2.8261891016884526e-03 1.9842544809308393e-01 0.0000000000000000e+00 -2.2638139954898488e-06 0.0000000000000000e+00 1.7957629818392778e-01 0.0000000000000000e+00 -4.1828523652043875e-06 0.0000000000000000e+00 1.0000000000000000e+00 3.1578947368421051e-01 9.9722991689750684e-02 3.1491471059921269e-02 3.1491471059921269e-02 5.5104500529150888e+06
-6.5240578331303358e-04 4.1333332231519290e-02 0.0000000000000000e+00 -3.5600442327298116e-07 0.0000000000000000e+00 2.2566882506564701e-01 0.0000000000000000e+00 -6.7365927127104966e-07 0.0000000000000000e+00 1.0000000000000000e+00 3.2631578947368423e-01 1.0648199445983381e-01 3.4746756086893135e-02 3.4746756086893135e-02 6.6034118503785804e+06
-1.3230474301884534e-04 -8.9013361783352618e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0200288654363485e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 3.3684210526315789e-01 1.1346260387811634e-01 3.8218982358944449e-02 3.8218982358944449e-02 7.7135919904695749e+06
-2.2763919031965966e-05 -1.1834739943082154e-01 0.0000000000000000e+00 0.0000000000000000e+00 1.5963017835031278e-04 -3.2496472178913285e-02 0.0000000000000000e+00 0.0000000000000000e+00 -5.7534210508715421e-04 1.0000000000000000e+00 3.4736842105263160e-01 1.2066481994459835e-01 4.1915147980755220e-02 4.1915147980755220e-02 8.8411533235496357e+06
1.1766933934135151e-01 -8.0054891556056232e-02 0.0000000000000000e+00 0.0000000000000000e+00 2.0909077918752295e-02 -6.4544370785068139e-02 0.0000000000000000e+00 0.0000000000000000e+00 -5.2504663312063840e-02 1.0000000000000000e+00 3.5789473684210527e-01 1.2808864265927977e-01 4.5842251057005394e-02 4.5842251057005394e-02 9.9862234019981027e+06
3.3915151685855366e-01 -3.7427612913223690e-02 0.0000000000000000e+00 0.0000000000000000e+00 1.1591503626231370e-01 -4.1890231978558873e-02 0.0000000000000000e+00 0.0000000000000000e+00 -1.7866822805253882e-01 1.0000000000000000e+00 3.6842105263157893e-01 1.3573407202216065e-01 5.0007289692374973e-02 5.0007289692374973e-02 1.1148932302210726e+07
1.5626897220672240e-01 -1.3441300945223146e-02 0.0000000000000000e+00 0.0000000000000000e+00 2.7102589539944072e-01 -1.7899992088040828e-02 0.0000000000000000e+00 0.0000000000000000e+00 -2.1937554065255879e-01 1.0000000000000000e+00 3.7894736842105264e-01 1.4360110803324100e-01 5.4417261991543966e-02 5.4417261991543966e-02 1.2329372176345021e+07
-4.5889761416029337e-02 -3.9337519630671123e-03 0.0000000000000000e+00 0.0000000000000000e+00 3.9533355560952921e-01 -5.8445454285107907e-03 0.0000000000000000e+00 0.0000000000000000e+00 -9.1202571282810271e-02 1.0000000000000000e+00 3.8947368421052631e-01 1.5168975069252078e-01 5.9079166059192299e-02 5.9079166059192299e-02 1.3527680857389219e+07
-8.5485466293479639e-02 -9.7667774440200706e-04 0.0000000000000000e+00 0.0000000000000000e+00 4.1742655305133147e-01 -1.5645812979903574e-03 0.0000000000000000e+00 0.0000000000000000e+00 3.6644079357620526e-02 1.0000000000000000e+00 4.0000000000000002e-01 1.6000000000000003e-01 6.4000000000000015e-02 6.4000000000000015e-02 1.4744013954299904e+07
-5.1613987307140528e-02 -2.1180309163878829e-04 0.0000000000000000e+00 0.0000000000000000e+00 3.7677806853318929e-01 -3.5839960107434895e-04 0.0000000000000000e+00 0.0000000000000000e+00 6.5354184877135096e-02 1.0000000000000000e+00 4.1052631578947368e-01 1.6853185595567868e-01 6.9186761918647033e-02 6.9186761918647033e-02 1.5978502905917309e+07
-2.0586709118519942e-02 -4.1107072568475722e-05 0.0000000000000000e+00 0.0000000000000000e+00 3.3324698528225988e-01 -7.2519177339011736e-05 0.0000000000000000e+00 0.0000000000000000e+00 4.7509924533272796e-02 1.0000000000000000e+00 4.2105263157894735e-01 1.7728531855955676e-01 7.4646449919813368e-02 7.4646449919813368e-02 1.7231268692723721e+07
-6.3149871015004810e-03 -7.2140811359126060e-06 0.0000000000000000e+00 0.0000000000000000e+00 2.6902481763940428e-01 -1.3143774611327210e-05 0.0000000000000000e+00 0.0000000000000000e+00 1.0882962004244967e-01 1.0000000000000000e+00 4.3157894736842106e-01 1.8626038781163437e-01 8.0386062108179043e-02 8.0386062108179043e-02 1.8502441721045829e+07
-1.5998995910758370e-03 -1.0892662263073478e-06 0.0000000000000000e+00 0.0000000000000000e+00 1.3858651749072567e-01 -2.0331088554603621e-06 0.0000000000000000e+00 0.0000000000000000e+00 2.1519494147264256e-01 1.0000000000000000e+00 4.4210526315789472e-01 1.9545706371191135e-01 8.6412596588423957e-02 8.6412596588423957e-02 1.9792148664706565e+07
-3.4949313513852877e-04 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -1.9968098640650647e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.9284674322244838e-01 1.0000000000000000e+00 4.5263157894736844e-01 2.0487534626038784e-01 9.2733051465228172e-02 9.2733051465228172e-02 2.1100520328300230e+07
-6.7774195098072875e-05 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -1.1325358183759204e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 3.7893132438445457e-02 1.0000000000000000e+00 4.6315789473684210e-01 2.1451523545706372e-01 9.9354424843271616e-02 9.9354424843271616e-02 2.2427701408503219e+07
5.6890986261887112e-03 2.4632390461375082e-03 0.0000000000000000e+00 0.0000000000000000e+00 -1.0696208513639913e-01 -7.8171058742844777e-03 0.0000000000000000e+00 0.0000000000000000e+00 -5.6615568209656034e-02 1.0000000000000000e+00 4.7368421052631576e-01 2.2437673130193903e-01 1.0628371482723427e-01 1.0628371482723427e-01 2.3773807935681403e+07
2.4618481081097995e-01 4.8820311055440317e-02 0.0000000000000000e+00 0.0000000000000000e+00 -6.1272673063231448e-02 -1.0183399555242575e-01 0.0000000000000000e+00 0.0000000000000000e+00 -5.8086735980862475e-02 1.0000000000000000e+00 4.8421052631578948e-01 2.3445983379501387e-01 1.1352791952179618e-01 1.1352791952179618e-01 2.5138965494905464e+07
2.9953964554960644e-01 1.7507033112252593e-01 0.0000000000000000e+00 0.0000000000000000e+00 -2.5547441406802145e-02 -2.1612831441918828e-01 0.0000000000000000e+00 0.0000000000000000e+00 -3.0993666067196276e-02 1.0000000000000000e+00 4.9473684210526314e-01 2.4476454293628808e-01 1.2109403703163725e-01 1.2109403703163725e-01 2.6523310263039686e+07
5.7109131744450661e-02 3.2987495754170404e-01 0.0000000000000000e+00 0.0000000000000000e+00 -8.4051340693276771e-03 -1.8320362885086547e-01 0.0000000000000000e+00 0.0000000000000000e+00 -1.1758010932446473e-02 1.0000000000000000e+00 5.0526315789473686e-01 2.5529085872576179e-01 1.2898906546143754e-01 1.2898906546143754e-01 2.7926978180642635e+07
-7.8184230485700307e-02 4.1619910711870090e-01 0.0000000000000000e+00 0.0000000000000000e+00 -2.2938316718133949e-03 -2.7189893305956774e-02 0.0000000000000000e+00 0.0000000000000000e+00 -3.5232574314479738e-03 1.0000000000000000e+00 5.1578947368421058e-01 2.6603878116343493e-01 1.3722000291587699e-01 1.3722000291587699e-01 2.9350100892358869e+07
-7.4737010936004322e-02 4.0522771221040943e-01 0.0000000000000000e+00 0.0000000000000000e+00 -5.3798391262850328e-04 5.8945294122267095e-02 0.0000000000000000e+00 0.0000000000000000e+00 -8.8267150904121708e-04 1.0000000000000000e+00 5.2631578947368418e-01 2.7700831024930744e-01 1.4579384749963548e-01 1.4579384749963548e-01 3.0792791682040647e+07
-3.7125986784173719e-02 3.5823603012410249e-01 0.0000000000000000e+00 0.0000000000000000e+00 -1.1129983958096788e-04 5.6583906662152982e-02 0.0000000000000000e+00 0.0000000000000000e+00 -1.9175152461996296e-04 1.0000000000000000e+00 5.3684210526315790e-01 2.8819944598337949e-01 1.5471759731739321e-01 1.5471759731739321e-01 3.2255188890072003e+07
-1.3178061904414793e-02 3.1330364005523526e-01 0.0000000000000000e+00 0.0000000000000000e+00 -2.0616437121920034e-05 5.6795794533862881e-02 0.0000000000000000e+00 0.0000000000000000e+00 -3.6864262677767610e-05 1.0000000000000000e+00 5.4736842105263162e-01 2.9961218836565101e-01 1.6399825047383004e-01 1.6399825047383004e-01 3.3737424698033422e+07
-3.7205065395061786e-03 2.2498345090173064e-01 0.0000000000000000e+00 0.0000000000000000e+00 -3.5244398305267092e-06 1.5701149313902130e-01 0.0000000000000000e+00 0.0000000000000000e+00 -6.4936311517650639e-06 1.0000000000000000e+00 5.5789473684210522e-01 3.1124653739612185e-01 1.7364280507362584e-01 1.7364280507362584e-01 3.5239632326350734e+07
-8.8460183856306875e-04 7.3882604017788261e-02 0.0000000000000000e+00 0.0000000000000000e+00 -5.1822314315757625e-07 2.2974705727631178e-01 0.0000000000000000e+00 0.0000000000000000e+00 -9.7632309208494808e-07 1.0000000000000000e+00 5.6842105263157894e-01 3.2310249307479222e-01 1.8365825922146084e-01 1.8365825922146084e-01 3.6761934819881216e+07
-1.8370553324891862e-04 -7.0043906737830369e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.3510500812682708e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 5.7894736842105265e-01 3.3518005540166207e-01 1.9405161102201490e-01 1.9405161102201490e-01 3.8304448683193475e+07
-3.4179300052400812e-05 -1.2022600095442171e-01 0.0000000000000000e+00 1.4728700311903286e-05 0.0000000000000000e+00 -1.3519940771543261e-02 0.0000000000000000e+00 -5.5984441974886700e-05 0.0000000000000000e+00 1.0000000000000000e+00 5.8947368421052626e-01 3.4747922437673123e-01 2.0482985857996788e-01 2.0482985857996788e-01 3.9867328474246711e+07
6.1977227981245921e-02 -8.9600939014954029e-02 0.0000000000000000e+00 1.1919836474499302e-02 0.0000000000000000e+00 -6.4884679517546728e-02 0.0000000000000000e+00 -3.2555493185429440e-02 0.0000000000000000e+00 1.0000000000000000e+00 5.9999999999999998e-01 3.5999999999999999e-01 2.1599999999999997e-01 2.1599999999999997e-01 4.1450727994291037e+07
3.2914194698808724e-01 -4.4610650244984916e-02 0.0000000000000000e+00 9.0274941429971764e-02 0.0000000000000000e+00 -4.7586436324094594e-02 0.0000000000000000e+00 -1.5426326998121220e-01 0.0000000000000000e+00 1.0000000000000000e+00 6.1052631578947369e-01 3.7274238227146816e-01 2.2756903338679108e-01 2.2756903338679108e-01 4.3054756450479880e+07
2.0869631953634510e-01 -1.6794634932976529e-02 0.0000000000000000e+00 2.3929665270590575e-01 0.0000000000000000e+00 -2.1745661626648261e-02 0.0000000000000000e+00 -2.2651021459888929e-01 0.0000000000000000e+00 1.0000000000000000e+00 6.2105263157894741e-01 3.8570637119113577e-01 2.3954395684502119e-01 2.3954395684502119e-01 4.4679530292131960e+07
-1.8988164869436275e-02 -5.1018176378398936e-03 0.0000000000000000e+00 3.7764046922164829e-01 0.0000000000000000e+00 -7.4407761527462875e-03 0.0000000000000000e+00 -1.2483558646645473e-01 0.0000000000000000e+00 1.0000000000000000e+00 6.3157894736842102e-01 3.9889196675900274e-01 2.5193176847937016e-01 2.5193176847937016e-01 4.6325184495605588e+07
-8.7172921748424712e-02 -1.3055419048209041e-03 0.0000000000000000e+00 4.2010747575335811e-01 0.0000000000000000e+00 -2.0641445077479018e-03 0.0000000000000000e+00 2.0039325175152698e-02 0.0000000000000000e+00 1.0000000000000000e+00 6.4210526315789473e-01 4.1229916897506924e-01 2.6473946639451817e-01 2.6473946639451817e-01 4.7991846162983879e+07
-5.9479346491040953e-02 -2.9049537695812611e-04 0.0000000000000000e+00 3.8676128528650705e-01 0.0000000000000000e+00 -4.8684122664065948e-04 0.0000000000000000e+00 6.6822940192205887e-02 0.0000000000000000e+00 1.0000000000000000e+00 6.5263157894736845e-01 4.2592797783933523e-01 2.7797404869514508e-01 2.7797404869514508e-01 4.9679642121459484e+07
-2.5344894340844023e-02 -5.7519918573519355e-05 0.0000000000000000e+00 3.4173267838996024e-01 0.0000000000000000e+00 -1.0071519927700912e-04 0.0000000000000000e+00 4.8590711578671605e-02 0.0000000000000000e+00 1.0000000000000000e+00 6.6315789473684206e-01 4.3977839335180047e-01 2.9164251348593084e-01 2.9164251348593084e-01 5.1388713588894203e+07
-8.1360247239318281e-03 -1.0271890012879138e-05 0.0000000000000000e+00 2.8635449602721674e-01 0.0000000000000000e+00 -1.8603553893817027e-05 0.0000000000000000e+00 8.6957892817575433e-02 0.0000000000000000e+00 1.0000000000000000e+00 6.7368421052631577e-01 4.5385041551246535e-01 3.0575185887155559e-01 3.0575185887155559e-01 5.3119190209192812e+07
-2.1329811439999668e-03 -1.5726052562367482e-06 0.0000000000000000e+00 1.6942030072831732e-01 0.0000000000000000e+00 -2.9207518307516431e-06 0.0000000000000000e+00 1.9932735803028329e-01 0.0000000000000000e+00 1.0000000000000000e+00 6.8421052631578949e-01 4.6814404432132967e-01 3.2030908295669924e-01 3.2030908295669924e-01 5.4871193086425975e+07
-4.7868580374672127e-04 -2.4393255686410929e-07 0.0000000000000000e+00 9.7342091717260160e-03 0.0000000000000000e+00 -4.6354368728304449e-07 0.0000000000000000e+00 2.1317803332965357e-01 0.0000000000000000e+00 1.0000000000000000e+00 6.9473684210526321e-01 4.8265927977839340e-01 3.3532118384604176e-01 3.3532118384604176e-01 5.6644850124556869e+07
-9.4885576906897671e-05 0.0000000000000000e+00 0.0000000000000000e+00 -1.0342703349042752e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 6.8989839045089055e-02 0.0000000000000000e+00 1.0000000000000000e+00 7.0526315789473681e-01 4.9739612188365645e-01 3.5079515964426300e-01 3.5079515964426300e-01 5.8440306078570858e+07
4.6549987495046264e-04 0.0000000000000000e+00 7.7849954623234125e-04 -1.1370885399932430e-01 0.0000000000000000e+00 0.0000000000000000e+00 -2.6406827272240662e-03 -4.6780737173119882e-02 0.0000000000000000e+00 1.0000000000000000e+00 7.1578947368421053e-01 5.1235457063711909e-01 3.6673800845604315e-01 3.6673800845604315e-01 6.0257698738376126e+07
1.8286923577271125e-01 0.0000000000000000e+00 3.3178956057083007e-02 -7.0505792246302060e-02 0.0000000000000000e+00 0.0000000000000000e+00 -7.6128408846089957e-02 -6.2077599161047464e-02 0.0000000000000000e+00 1.0000000000000000e+00 7.2631578947368425e-01 5.2753462603878121e-01 3.8315672838606218e-01 3.8315672838606218e-01 6.2097159447418898e+07
3.2815365238396688e-01 0.0000000000000000e+00 1.4440180014681392e-01 -3.1075379328076348e-02 0.0000000000000000e+00 0.0000000000000000e+00 -1.9992737502479768e-01 -3.6293517323374769e-02 0.0000000000000000e+00 1.0000000000000000e+00 7.3684210526315785e-01 5.4293628808864258e-01 4.0005831753899979e-01 4.0005831753899979e-01 6.3958803792432591e+07
1.0472734947167857e-01 0.0000000000000000e+00 3.0145227582186795e-01 -1.0670392954156069e-02 0.0000000000000000e+00 0.0000000000000000e+00 -2.0462195254207211e-01 -1.4579521804191179e-02 0.0000000000000000e+00 1.0000000000000000e+00 7.4736842105263157e-01 5.5855955678670355e-01 4.1744977401953637e-01 4.1744977401953637e-01 6.5842753707203761e+07
-6.5414735178786182e-02 0.0000000000000000e+00 4.0816930917890015e-01 -3.0133373947628865e-03 0.0000000000000000e+00 0.0000000000000000e+00 -5.7811109467925295e-02 -4.5547919780519388e-03 0.0000000000000000e+00 1.0000000000000000e+00 7.5789473684210529e-01 5.7440443213296399e-01 4.3533809593235173e-01 4.3533809593235173e-01 6.7749140587336496e+07
-8.1039296440989866e-02 0.0000000000000000e+00 4.1235355967765930e-01 -7.2674473172627254e-04 0.0000000000000000e+00 0.0000000000000000e+00 4.9499274042558168e-02 -1.1786336808960263e-03 0.0000000000000000e+00 1.0000000000000000e+00 7.6842105263157889e-01 5.9047091412742370e-01 4.5373028138212557e-01 4.5373028138212557e-01 6.9678108008850276e+07
-4.4087905632133433e-02 0.0000000000000000e+00 3.6716665813617538e-01 -1.5386856254847681e-04 0.0000000000000000e+00 0.0000000000000000e+00 6.1468774353944371e-02 -2.6278123722066446e-04 0.0000000000000000e+00 1.0000000000000000e+00 7.7894736842105261e-01 6.0675900277008310e-01 4.7263332847353839e-01 4.7263332847353839e-01 7.1629787035486892e+07
-1.6550422413122053e-02 0.0000000000000000e+00 3.2392301620679653e-01 -2.9267163684834725e-05 0.0000000000000000e+00 0.0000000000000000e+00 4.9847666173530675e-02 -5.2002150652398645e-05 0.0000000000000000e+00 1.0000000000000000e+00 7.8947368421052633e-01 6.2326869806094187e-01 4.9205423531126991e-01 4.9205423531126991e-01 7.3604299627690956e+07
-4.8645759144098875e-03 0.0000000000000000e+00 2.4859549248739221e-01 -5.0502559169199467e-06 0.0000000000000000e+00 0.0000000000000000e+00 1.3286328271100897e-01 -9.2540988607474148e-06 0.0000000000000000e+00 1.0000000000000000e+00 8.0000000000000004e-01 6.4000000000000012e-01 5.1200000000000012e-01 5.1200000000000012e-01 7.5601781557119071e+07
-1.1930031026986410e-03 0.0000000000000000e+00 1.0653571529948751e-01 -7.5235207367785176e-07 0.0000000000000000e+00 0.0000000000000000e+00 2.2584550280153473e-01 -1.4109555146948659e-06 0.0000000000000000e+00 1.0000000000000000e+00 8.1052631578947365e-01 6.5695290858725752e-01 5.3247762064440873e-01 5.3247762064440873e-01 7.7622359293640539e+07
-2.5396202412826985e-04 0.0000000000000000e+00 -4.6832193791313273e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.6614010275431002e-01 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 8.2105263157894737e-01 6.7412742382271473e-01 5.5349409534917626e-01 5.5349409534917626e-01 7.9666160208791256e+07
-4.8220518125413127e-05 0.0000000000000000e+00 -1.1871343983470048e-01 0.0000000000000000e+00 1.7972165294585549e-07 0.0000000000000000e+00 1.0091095488977575e-02 0.0000000000000000e+00 -7.0891758549395515e-07 1.0000000000000000e+00 8.3157894736842108e-01 6.9152354570637120e-01 5.7505642221898245e-01 5.7505642221898245e-01 8.1733321542241797e+07
2.4169016536933870e-02 0.0000000000000000e+00 -9.8735297547960704e-02 0.0000000000000000e+00 5.9438222395961854e-03 0.0000000000000000e+00 -6.2455637372362927e-02 0.0000000000000000e+00 -1.7549188213126270e-02 1.0000000000000000e+00 8.4210526315789469e-01 7.0914127423822704e-01 5.9717159935850694e-01 5.9717159935850694e-01 8.3823983214631885e+07
2.9724155612227232e-01 0.0000000000000000e+00 -5.2588926725789020e-02 0.0000000000000000e+00 6.7859854151592250e-02 0.0000000000000000e+00 -5.3106046223313087e-02 0.0000000000000000e+00 -1.2825364543031706e-01 1.0000000000000000e+00 8.5263157894736841e-01 7.2698060941828258e-01 6.1984662487243036e-01 6.1984662487243036e-01 8.5938269240893766e+07
2.5803024806313540e-01 0.0000000000000000e+00 -2.0806537629376717e-02 0.0000000000000000e+00 2.0702131854989700e-01 0.0000000000000000e+00 -2.6119971738518863e-02 0.0000000000000000e+00 -2.2538113774670057e-01 1.0000000000000000e+00 8.6315789473684212e-01 7.4504155124653748e-01 6.4308849686543235e-01 6.4308849686543235e-01 8.8076292616174176e+07
1.5480751995147656e-02 0.0000000000000000e+00 -6.5717408111991144e-03 0.0000000000000000e+00 3.5555300272627366e-01 0.0000000000000000e+00 -9.3946798718158985e-03 0.0000000000000000e+00 -1.5615934487315666e-01 1.0000000000000000e+00 8.7368421052631584e-01 7.6332409972299176e-01 6.6690421344219286e-01 6.6690421344219286e-01 9.0238174707291275e+07
-8.5092846828545440e-02 0.0000000000000000e+00 -1.7354377274918880e-03 0.0000000000000000e+00 4.1993036162424330e-01 0.0000000000000000e+00 -2.7056969420893432e-03 0.0000000000000000e+00 -1.0730035819227872e-03 1.0000000000000000e+00 8.8421052631578945e-01 7.8182825484764540e-01 6.9130077270739165e-01 6.9130077270739165e-01 9.2424069093082637e+07
-6.7344467282778389e-02 0.0000000000000000e+00 -3.9627411936631822e-04 0.0000000000000000e+00 3.9648730685263067e-01 0.0000000000000000e+00 -6.5730654337462586e-04 0.0000000000000000e+00 6.4795239339487262e-02 1.0000000000000000e+00 8.9473684210526316e-01 8.0055401662049863e-01 7.1628517276570935e-01 7.1628517276570935e-01 9.4634114440171003e+07
-3.0859121798647051e-02 0.0000000000000000e+00 -8.0173402304067263e-05 0.0000000000000000e+00 3.4988988278635952e-01 0.0000000000000000e+00 -1.3927740229629913e-04 0.0000000000000000e+00 5.1950746816847776e-02 1.0000000000000000e+00 9.0526315789473688e-01 8.1950138504155134e-01 7.4186441172182538e-01 7.4186441172182538e-01 9.6868439125350088e+07
-1.0398603799754021e-02 0.0000000000000000e+00 -1.4577289820257387e-05 0.0000000000000000e+00 3.0094602992319508e-01 0.0000000000000000e+00 -2.6236710696081891e-05 0.0000000000000000e+00 6.9200680125590358e-02 1.0000000000000000e+00 9.1578947368421049e-01 8.3867036011080320e-01 7.6804548768041980e-01 7.6804548768041980e-01 9.9127162507476807e+07
-2.8261891016884526e-03 0.0000000000000000e+00 -2.2638139954898488e-06 0.0000000000000000e+00 1.9842544809308393e-01 0.0000000000000000e+00 -4.1828523652043875e-06 0.0000000000000000e+00 1.7957629818392778e-01 1.0000000000000000e+00 9.2631578947368420e-01 8.5806094182825488e-01 7.9483539874617293e-01 7.9483539874617293e-01 1.0141041149911594e+08
-6.5240578331303358e-04 0.0000000000000000e+00 -3.5600442327298116e-07 0.0000000000000000e+00 4.1333332231519290e-02 0.0000000000000000e+00 -6.7365927127104966e-07 0.0000000000000000e+00 2.2566882506564701e-01 1.0000000000000000e+00 9.3684210526315792e-01 8.7767313019390591e-01 8.2224114302376450e-01 8.2224114302376450e-01 1.0371832633873191e+08
-1.3230474301884534e-04 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -8.9013361783352618e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0200288654363485e-01 1.0000000000000000e+00 9.4736842105263153e-01 8.9750692520775610e-01 8.5026971861787415e-01 8.5026971861787415e-01 1.0605103777870619e+08
-2.4138787293590782e-05 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -1.1834739943082154e-01 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -3.2496472178913285e-02 1.0000000000000000e+00 9.5789473684210524e-01 9.1756232686980610e-01 8.7892812363318262e-01 8.7892812363318262e-01 1.0840866954560232e+08
-4.0326882503041127e-06 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -8.0054891556056232e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -6.4544370785068139e-02 1.0000000000000000e+00 9.6842105263157896e-01 9.3783933518005547e-01 9.0822335617436944e-01 9.0822335617436944e-01 1.1079135118806735e+08
-6.2527588338813119e-07 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -3.7427612913223690e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -4.1890231978558873e-02 1.0000000000000000e+00 9.7894736842105268e-01 9.5833795013850420e-01 9.3816241434611469e-01 9.3816241434611469e-01 1.1319921057647294e+08
-9.0937220696952162e-08 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -1.3441300945223146e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -1.7899992088040828e-02 1.0000000000000000e+00 9.8947368421052628e-01 9.7905817174515231e-01 9.6875229625309800e-01 9.6875229625309800e-01 1.1563238092136064e+08
-1.2510669177246911e-08 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -3.9337519630671123e-03 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -5.8445454285107907e-03 1.0000000000000000e+00 1.0000000000000000e+00 1.0000000000000000e+00 1.0000000000000000e+00 1.0000000000000000e+00 1.1809099716453132e+08
""",
'event':"""0.0000000000000000e+00 1.7972165294585549e-07 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -7.0891758549395515e-07 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -1.0000000000000000e+00 1.0000000000000000e+00 -1.0000000000000000e+00 -0.0000000000000000e+00 -3.9964378119605467e+07
2.4177346180074877e-02 5.9438222395961854e-03 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -1.7549188213126270e-02 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 1.0000000000000000e+00 -9.8947368421052628e-01 9.7905817174515231e-01 -9.6875229625309800e-01 -0.0000000000000000e+00 -3.9964295632754065e+07
2.9724288489710671e-01 6.6691877390710710e-02 0.0000000000000000e+00 1.1679767608815418e-03 0.0000000000000000e+00 -1.2437298949140786e-01 0.0000000000000000e+00 -3.8806559389091872e-03 0.0000000000000000e+00 1.0000000000000000e+00 -9.7894736842105268e-01 9.5833795013850420e-01 -9.3816241434611469e-01 -0.0000000000000000e+00 -3.9964008717195466e+07
2.5803044610535053e-01 1.6919887664583222e-01 0.0000000000000000e+00 3.7738862474685671e-02 8.3579429379114777e-05 -1.4115599824277203e-01 0.0000000000000000e+00 -8.3918434149865523e-02 -3.0670535406300835e-04 1.0000000000000000e+00 -9.6842105263157896e-01 9.3783933518005547e-01 -9.0822335617436944e-01 -0.0000000000000000e+00 -3.9963382251577556e+07
1.5480779827902719e-02 2.0115527889004484e-01 1.7972165294585549e-07 1.3671413001462854e-01 1.7683414099947323e-02 5.0427887160567904e-02 -7.0891758549395515e-07 -1.6094050000148924e-01 -4.5646023114649811e-02 1.0000000000000000e+00 -9.5789473684210524e-01 9.1756232686980610e-01 -8.7892812363318262e-01 -0.0000000000000000e+00 -3.9962264238541424e+07
-8.5092843112443153e-02 1.0826831140941834e-01 5.9438222395961854e-03 2.0428516922367393e-01 1.0143305875155482e-01 1.9784723228238235e-01 -1.7549188213126270e-02 -2.7801642355785985e-02 -1.5356940529539287e-01 1.0000000000000000e+00 -9.4736842105263153e-01 8.9750692520775610e-01 -8.5026971861787415e-01 -0.0000000000000000e+00 -3.9960523102045782e+07
-6.7344467282778389e-02 -1.5192826604774678e-02 6.6691877390710710e-02 1.5192563333963566e-01 1.9306262272705899e-01 1.1100114773388517e-01 -1.2437298949140786e-01 1.7226230721222197e-01 -9.4095226115212052e-02 1.0000000000000000e+00 -9.3684210526315792e-01 8.7767313019390591e-01 -8.2224114302376450e-01 -0.0000000000000000e+00 -3.9958030461220443e+07
-3.0859121798647051e-02 -5.9611929791310761e-02 1.6919887664583222e-01 5.7816980815322772e-02 1.8256953454589439e-01 -3.1066893522055071e-03 -1.4115599824277203e-01 7.1597287908986007e-02 1.2430944114877630e-01 1.0000000000000000e+00 -9.2631578947368420e-01 8.5806094182825488e-01 -7.9483539874617293e-01 -0.0000000000000000e+00 -3.9954647886381753e+07
-1.0398603799754021e-02 -2.7212216744210212e-02 2.0115527889004484e-01 8.2514000683269279e-02 6.2172560915691455e-02 -8.0658471772627749e-02 5.0427887160567904e-02 -1.3400726359739601e-01 1.8779179630281093e-01 1.0000000000000000e+00 -9.1578947368421049e-01 8.3867036011080320e-01 -7.6804548768041980e-01 -0.0000000000000000e+00 -3.9950228902413361e+07
-2.8261891016884526e-03 8.0209005576718964e-02 1.0826831140941834e-01 1.5740056826346380e-01 -4.0075556165366157e-02 -1.7757047085784922e-01 1.9784723228238235e-01 -7.7253408640873558e-02 6.5434351891749037e-02 1.0000000000000000e+00 -9.0526315789473688e-01 8.1950138504155134e-01 -7.4186441172182538e-01 -0.0000000000000000e+00 -3.9944630184490673e+07
-6.5240578331303358e-04 1.8671873839475647e-01 -1.5192826604774678e-02 1.8910188123201388e-01 -5.8371983911825132e-02 -1.0812787798155156e-01 1.1100114773388517e-01 2.2488780400294686e-02 -2.2042096632510385e-02 1.0000000000000000e+00 -8.9473684210526316e-01 8.0055401662049863e-01 -7.1628517276570935e-01 -0.0000000000000000e+00 -3.9937731831733234e+07
-1.3230474301884534e-04 2.1815823789069552e-01 -5.9611929791310761e-02 1.7834064442664524e-01 -3.6309461213591215e-02 3.7153406706898165e-02 -3.1066893522055071e-03 2.8801918432358464e-04 -3.3403546593305647e-02 1.0000000000000000e+00 -8.8421052631578945e-01 7.8182825484764540e-01 -6.9130077270739165e-01 -0.0000000000000000e+00 -3.9929402377448291e+07
-2.4138787293590782e-05 1.9836539362542779e-01 -2.7212216744210212e-02 1.4361396913677585e-01 -1.5388981806849873e-02 2.6005752171882255e-02 -8.0658471772627749e-02 7.2486302997612467e-02 -1.8697604146125976e-02 1.0000000000000000e+00 -8.7368421052631584e-01 7.6332409972299176e-01 -6.6690421344219286e-01 -0.0000000000000000e+00 -3.9919467169778965e+07
-4.0326882503041127e-06 1.7004278105755538e-01 8.0209005576718964e-02 5.4586736964392375e-02 -5.0386096958458283e-03 1.9894509103233610e-02 -1.7757047085784922e-01 1.6459230660036478e-01 -7.0993673209910615e-03 1.0000000000000000e+00 -8.6315789473684212e-01 7.4504155124653748e-01 -6.4308849686543235e-01 -0.0000000000000000e+00 -3.9907786847505786e+07
-6.2527588338813119e-07 1.5905651853639177e-01 1.8671873839475647e-01 -4.3729264683742025e-02 -1.3582822198243730e-03 2.9690484600776099e-02 -1.0812787798155156e-01 8.1402624180322181e-02 -2.1023200711287751e-03 1.0000000000000000e+00 -8.5263157894736841e-01 7.2698060941828258e-01 -6.1984662487243036e-01 -0.0000000000000000e+00 -3.9894241441418923e+07
-9.0937220696952162e-08 1.5296388884063916e-01 2.1815823789069552e-01 -7.0669011124139827e-02 -2.3041555027059071e-04 -1.9050509451844708e-02 3.7153406706898165e-02 -1.6936932112187754e-02 -8.2546236142740852e-04 1.0000000000000000e+00 -8.4210526315789469e-01 7.0914127423822704e-01 -5.9717159935850694e-01 -0.0000000000000000e+00 -3.9878693250274450e+07
-1.2510669177246911e-08 1.3156558369691071e-01 1.9836521390377485e-01 -4.8243261570625216e-02 1.7619353944818051e-02 5.8662420655810149e-02 2.6006461089467747e-02 -3.9900023154498407e-02 -4.5757060072061179e-02 1.0000000000000000e+00 -8.3157894736842108e-01 6.9152354570637120e-01 -5.7505642221898245e-01 -0.0000000000000000e+00 -3.9860998616045773e+07
-1.6393707816718330e-09 5.6345100753362388e-02 1.6409895881795919e-01 -2.2078293686281818e-02 1.0142127575894735e-01 1.4129609867630374e-01 3.7443697316359881e-02 -2.5355446281368068e-02 -1.5359059365486832e-01 1.0000000000000000e+00 -8.2105263157894737e-01 6.7412742382271473e-01 -5.5349409534917626e-01 -0.0000000000000000e+00 -3.9841029804515697e+07
0.0000000000000000e+00 2.1793115828660697e-02 9.2364641145681062e-02 -6.5326696738188961e-03 1.9306063649818450e-01 -4.4755355088080082e-02 1.5406347409218396e-01 -1.4349982160473887e-02 -9.4098902995715231e-02 1.0000000000000000e+00 -8.1052631578947365e-01 6.5695290858725752e-01 -5.3247762064440873e-01 -0.0000000000000000e+00 -3.9818654567553706e+07
0.0000000000000000e+00 9.8253021885668562e-02 -1.6151408375813942e-02 3.5551552001175248e-02 1.8256922374919074e-01 -1.5854644388003486e-01 1.2179878343686432e-01 -8.7221277924167465e-02 1.2430885212834518e-01 1.0000000000000000e+00 -8.0000000000000004e-01 6.4000000000000012e-01 -5.1200000000000012e-01 -0.0000000000000000e+00 -3.9793725269769594e+07
0.0000000000000000e+00 1.5285476691215422e-01 -5.1906460814839753e-02 1.3618602296187249e-01 6.2172560915691455e-02 1.0428163142958963e-02 -3.7410780701822063e-02 -1.6179738022522952e-01 1.8779179630281093e-01 1.0000000000000000e+00 -7.8947368421052633e-01 6.2326869806094187e-01 -4.9205423531126991e-01 -0.0000000000000000e+00 -3.9766099107882999e+07
0.0000000000000000e+00 9.2123146236351552e-02 4.3566025855902690e-02 2.0417342571709904e-01 -4.0075556165366157e-02 1.5492344253726803e-01 -1.9257135068834524e-01 -2.7992687684244611e-02 6.5434351891749037e-02 1.0000000000000000e+00 -7.7894736842105261e-01 6.0675900277008310e-01 -4.7263332847353839e-01 -0.0000000000000000e+00 -3.9735647326953202e+07
0.0000000000000000e+00 4.3796418122361118e-02 1.6335668776978365e-01 1.5190460181838772e-01 -5.8371983911825132e-02 -2.3844844859590569e-02 -1.2547873944576943e-01 1.7222491478578517e-01 -2.2042096632510385e-02 1.0000000000000000e+00 -7.6842105263157889e-01 5.9047091412742370e-01 -4.5373028138212557e-01 -0.0000000000000000e+00 -3.9702236464460857e+07
0.0000000000000000e+00 1.0731574615492825e-01 1.7131918900642060e-01 5.7813335883083829e-02 -3.6225881784212099e-02 -1.4725941503564760e-01 1.0971897950965598e-01 7.1590601234367446e-02 -3.3710251947368652e-02 1.0000000000000000e+00 -7.5789473684210529e-01 5.7440443213296399e-01 -4.3533809593235173e-01 -0.0000000000000000e+00 -3.9665728276365325e+07
0.0000000000000000e+00 1.5573172071478419e-01 5.8767500060305407e-02 8.2513236906691323e-02 2.2944322930974499e-03 1.4557849361264199e-02 1.8280522986076542e-01 -1.3400765344253551e-01 -6.4343627260775790e-02 1.0000000000000000e+00 -7.4736842105263157e-01 5.5855955678670355e-01 -4.1744977401953637e-01 -0.0000000000000000e+00 -3.9625947171426475e+07
0.0000000000000000e+00 9.2876336967603779e-02 -4.0940490403193273e-02 1.5145674602386761e-01 9.6394449055709000e-02 1.5610593317834112e-01 6.4060815922217310e-02 -5.9704220427747284e-02 -1.6066877261638393e-01 1.0000000000000000e+00 -7.3684210526315785e-01 5.4293628808864258e-01 -4.0005831753899979e-01 -0.0000000000000000e+00 -3.9582766803961083e+07
0.0000000000000000e+00 4.3966158171504059e-02 -5.8562755482215997e-02 1.2357798060218469e-01 1.9170434050723462e-01 -2.3561230111389816e-02 -2.2363103807147941e-02 1.4298111395279336e-01 -9.6197546186340827e-02 1.0000000000000000e+00 -7.2631578947368425e-01 5.2753462603878121e-01 -3.8315672838606218e-01 -0.0000000000000000e+00 -3.9536072560850807e+07
0.0000000000000000e+00 1.0743314279239796e-01 -3.6346923353920750e-02 4.6880630255498698e-02 1.8225553956624468e-01 -1.4750697456160872e-01 -3.3469379096026097e-02 5.7525583277230091e-02 1.2379068414141191e-01 1.0000000000000000e+00 -7.1578947368421053e-01 5.1235457063711909e-01 -3.6673800845604315e-01 -0.0000000000000000e+00 -3.9485728182394587e+07
0.0000000000000000e+00 1.7342100106436448e-01 -1.5395432111407833e-02 7.9172820261359550e-02 6.2108500760562180e-02 -3.1076518586638784e-02 -1.8710358075597799e-02 -1.3888208416444467e-01 1.8768075934539957e-01 1.0000000000000000e+00 -7.0526315789473681e-01 4.9739612188365645e-01 -3.5079515964426300e-01 -0.0000000000000000e+00 -3.9431584623730086e+07
0.0000000000000000e+00 1.8836666274578873e-01 9.0412327752404948e-04 1.5060359477864796e-01 -4.0087339157973620e-02 2.0087749204929978e-02 -2.4650588642972794e-02 -6.1056568037803557e-02 6.5413163532273583e-02 1.0000000000000000e+00 -6.9473684210526321e-01 4.8265927977839340e-01 -3.3532118384604176e-01 -0.0000000000000000e+00 -3.9373503040717266e+07
0.0000000000000000e+00 1.7033690350785233e-01 6.5333595170886341e-02 1.2222121849978677e-01 -5.7205993379818069e-02 6.7165332648059933e-03 -1.2647530956253664e-01 1.4654443959756816e-01 -2.5926429451922754e-02 1.0000000000000000e+00 -6.8421052631578949e-01 4.6814404432132967e-01 -3.2030908295669924e-01 -0.0000000000000000e+00 -3.9311345987254620e+07
0.0000000000000000e+00 1.2080380069246013e-01 1.6888488166618251e-01 9.1046164371871487e-03 1.4290904643907976e-03 1.1795846482993963e-01 -1.4167475525013642e-01 1.4137877394480627e-01 -1.1732256976360228e-01 1.0000000000000000e+00 -6.7368421052631577e-01 4.5385041551246535e-01 -3.0575185887155559e-01 -0.0000000000000000e+00 -3.9244973542355932e+07
0.0000000000000000e+00 3.4438283090011089e-02 2.0109121873491556e-01 -5.7547939779479883e-02 1.2132532792943161e-01 1.0628739055560422e-01 5.0316850203156537e-02 2.2046370825158240e-02 -1.7963881306520071e-01 1.0000000000000000e+00 -6.6315789473684206e-01 4.3977839335180047e-01 -2.9164251348593084e-01 -0.0000000000000000e+00 -3.9174248924560487e+07
0.0000000000000000e+00 4.0022795171004232e-02 1.0825652841681087e-01 -5.3682663711252271e-02 2.0519038176742430e-01 -1.1232513118570334e-01 1.9782604392290690e-01 -3.3256958790873037e-02 -5.2450197889903322e-02 1.0000000000000000e+00 -6.5263157894736845e-01 4.2592797783933523e-01 -2.7797404869514508e-01 -0.0000000000000000e+00 -3.9099039317975588e+07
0.0000000000000000e+00 1.2832572296168343e-01 -1.5194812833649152e-02 -2.8536438078967347e-02 2.1609125174964045e-01 -1.3020736704049873e-01 1.1099747085338199e-01 -2.9598523553562982e-02 4.9667653588594499e-02 1.0000000000000000e+00 -6.4210526315789473e-01 4.1229916897506924e-01 -2.6473946639451817e-01 -0.0000000000000000e+00 -3.9019207925871357e+07
0.0000000000000000e+00 1.8184513174486536e-01 -5.9695820017393535e-02 -1.0973501903449952e-02 1.8904657943619874e-01 3.7431734389739568e-03 -2.8005730185736164e-03 -1.4136948114045254e-02 1.3534261454652096e-02 1.0000000000000000e+00 -6.3157894736842102e-01 3.9889196675900274e-01 -2.5193176847937016e-01 -0.0000000000000000e+00 -3.8934600623322882e+07
0.0000000000000000e+00 1.8297564804199995e-01 -4.4895630844157534e-02 -3.3474510048147358e-03 1.6457432378185066e-01 7.3077581806167691e-03 -3.5012448657977931e-02 -4.8882834141059702e-03 3.1604772410185444e-02 1.0000000000000000e+00 -6.2105263157894741e-01 3.8570637119113577e-01 -2.3954395684502119e-01 -0.0000000000000000e+00 -3.8845083451900810e+07
0.0000000000000000e+00 1.5906034912211336e-01 -2.1224053174835857e-02 5.0895817281502252e-03 1.5686116396855940e-01 3.0344329995368825e-02 -2.4001065562456333e-02 -1.8903568932038005e-02 1.2354060555552715e-02 1.0000000000000000e+00 -6.1052631578947369e-01 3.7274238227146816e-01 -2.2756903338679108e-01 -0.0000000000000000e+00 -3.8750522818035603e+07
0.0000000000000000e+00 9.1006358925856679e-02 -7.5118610931840489e-03 6.6503092049194315e-02 1.5068813391684041e-01 1.5196115402105520e-01 -1.0151995927430326e-02 -1.2469031978554224e-01 -1.6259604460167863e-02 1.0000000000000000e+00 -5.9999999999999998e-01 3.5999999999999999e-01 -2.1599999999999997e-01 -0.0000000000000000e+00 -3.8650779152144141e+07
0.0000000000000000e+00 -1.6548982784842764e-02 -2.1501591298845458e-03 1.6924530473158547e-01 1.4967622644336245e-01 1.2158673178356291e-01 -3.2376002920126139e-03 -1.4152794707912439e-01 2.3518729348581235e-02 1.0000000000000000e+00 -5.8947368421052626e-01 3.4747922437673123e-01 -2.0482985857996788e-01 -0.0000000000000000e+00 -3.8545724148709796e+07
0.0000000000000000e+00 -6.9653755348263402e-02 -5.2147702654514955e-04 2.1883206296378127e-01 1.5065005938590567e-01 8.1234965378308831e-03 -8.4483521185395067e-04 4.7698190340317581e-03 -1.3036681841290393e-02 1.0000000000000000e+00 -5.7894736842105265e-01 3.3518005540166207e-01 -1.9405161102201490e-01 -0.0000000000000000e+00 -3.8435214400087699e+07
0.0000000000000000e+00 -5.1934993648663420e-02 -1.1065424034858019e-04 2.0970028089474685e-01 1.4213240863825227e-01 -5.6572321965554090e-02 -1.8901221960316596e-04 4.4275793878134018e-02 1.2279296363450445e-02 1.0000000000000000e+00 -5.6842105263157894e-01 3.2310249307479222e-01 -1.8365825922146084e-01 -0.0000000000000000e+00 -3.8319113881397702e+07
0.0000000000000000e+00 3.6983956204560908e-02 1.1469452396336173e-03 1.7786979612228432e-01 8.4685026232228552e-02 -1.5576017970246842e-01 -3.9180483653459804e-03 1.6905921618673117e-02 1.4363154029705605e-01 1.0000000000000000e+00 -5.5789473684210522e-01 3.1124653739612185e-01 -1.7364280507362584e-01 -0.0000000000000000e+00 -3.8197280698436156e+07
0.0000000000000000e+00 1.5794822030965477e-01 3.7735217542446728e-02 1.2287402532520451e-01 -1.8335073917085422e-02 -1.5574704890232346e-01 -8.3925120824484084e-02 1.2150945715063380e-01 1.1850262633718094e-01 1.0000000000000000e+00 -5.4736842105263162e-01 2.9961218836565101e-01 -1.6399825047383004e-01 -0.0000000000000000e+00 -3.8069578593331799e+07
0.0000000000000000e+00 1.9775021803465878e-01 1.3671372568135648e-01 1.7276930071533921e-02 -5.2433983812670809e-02 4.5441320718522388e-02 -1.6094230768179973e-01 1.5277934764483300e-01 -3.8266562162837352e-02 1.0000000000000000e+00 -5.3684210526315790e-01 2.8819944598337949e-01 -1.5471759731739321e-01 -0.0000000000000000e+00 -3.7935869323744483e+07
0.0000000000000000e+00 1.0740337717159122e-01 2.1022899146327012e-01 -6.1299609340202010e-02 4.3454282349327800e-02 1.9647369631285061e-01 -4.5350830568912259e-02 4.1433286329292704e-02 -1.9276239601680384e-01 1.0000000000000000e+00 -5.2631578947368418e-01 2.7700831024930744e-01 -1.4579384749963548e-01 -0.0000000000000000e+00 -3.7796023122095928e+07
0.0000000000000000e+00 -1.4215621414284001e-02 2.1744953396946481e-01 -6.5883845005009184e-02 1.6333565624853574e-01 1.0679948462033843e-01 5.1769973659723287e-02 -3.2194092559940712e-02 -1.2551613187220623e-01 1.0000000000000000e+00 -5.1578947368421058e-01 2.6603878116343493e-01 -1.3722000291587699e-01 -0.0000000000000000e+00 -3.7649889965446725e+07
0.0000000000000000e+00 -2.1994108886333741e-02 1.8927699498646933e-01 -3.8376040914096646e-02 1.7131554407418167e-01 -8.6784250650728478e-02 1.4359723816079500e-02 -3.6947852239381268e-02 1.0971229283503742e-01 1.0000000000000000e+00 -5.0526315789473686e-01 2.5529085872576179e-01 -1.2898906546143754e-01 -0.0000000000000000e+00 -3.7497316265670031e+07
0.0000000000000000e+00 9.1811869144260094e-02 1.4695496983703263e-01 1.7729552665523007e-03 5.8767095727033349e-02 -1.9596499367135350e-01 7.7361832482246623e-02 -6.5188462472629743e-02 1.8280342218045492e-01 1.0000000000000000e+00 -4.9473684210526314e-01 2.4476454293628808e-01 -1.2109403703163725e-01 -0.0000000000000000e+00 -3.7338169264406279e+07
0.0000000000000000e+00 1.8306002678261177e-01 5.5439888209612029e-02 9.6283794815360410e-02 -3.4996668163597089e-02 -5.1804741027097784e-02 1.6594465421042104e-01 -1.6085778483598709e-01 4.6511627709091036e-02 1.0000000000000000e+00 -4.8421052631578948e-01 2.3445983379501387e-01 -1.1352791952179618e-01 -0.0000000000000000e+00 -3.7172326619055934e+07
0.0000000000000000e+00 1.4324579548557007e-01 -4.3540479342225637e-02 1.9285128574686822e-01 8.1291219084947131e-03 1.6599096722370082e-01 8.1719954474456569e-02 -1.0011559455168681e-01 -1.4673609329855580e-01 1.0000000000000000e+00 -4.7368421052631576e-01 2.2437673130193903e-01 -1.0628371482723427e-01 -0.0000000000000000e+00 -3.6999660033267289e+07
0.0000000000000000e+00 1.7927959210752555e-02 -7.0631859780513950e-02 2.1999075710869140e-01 1.3293553272129058e-01 1.5227812176683891e-01 -1.6871688629898421e-02 3.9865563316927824e-02 -1.7493208269286115e-01 1.0000000000000000e+00 -4.6315789473684210e-01 2.1451523545706372e-01 -9.9354424843271616e-02 -0.0000000000000000e+00 -3.6820030059284538e+07
0.0000000000000000e+00 -5.4721786079557348e-02 -4.8236631544414314e-02 1.9882222644191866e-01 2.0344308115693138e-01 2.6089110109824766e-02 -3.9887978142612079e-02 2.6738451663599842e-02 -1.3927785112094214e-02 1.0000000000000000e+00 -4.5263157894736844e-01 2.0487534626038784e-01 -9.2733051465228172e-02 -0.0000000000000000e+00 -3.6633274691983856e+07
0.0000000000000000e+00 -5.2939077440154893e-02 -2.2077204420055511e-02 1.7014165230529651e-01 2.0466167119890102e-01 -3.2091590291564463e-02 -2.5353413172512607e-02 2.0062332963361327e-02 3.7176426557142955e-02 1.0000000000000000e+00 -4.4210526315789472e-01 1.9545706371191135e-01 -8.6412596588423957e-02 -0.0000000000000000e+00 -3.6439258355741180e+07
0.0000000000000000e+00 -2.8368684258698883e-02 -6.5326696738188961e-03 1.5907556382876520e-01 1.7651151390245995e-01 -2.9318585685865400e-02 -1.4349982160473887e-02 2.9724200146709720e-02 1.4803601547544343e-02 1.0000000000000000e+00 -4.3157894736842106e-01 1.8626038781163437e-01 -8.0386062108179043e-02 -0.0000000000000000e+00 -3.6237849955159605e+07
0.0000000000000000e+00 -1.0939995492063020e-02 3.5551552001175248e-02 1.5296722297617443e-01 1.2264360977493392e-01 -1.4078391306374482e-02 -8.7221277924167465e-02 -1.9044411797657257e-02 1.2068399478920640e-01 1.0000000000000000e+00 -4.2105263157894735e-01 1.7728531855955676e-01 -7.4646449919813368e-02 -0.0000000000000000e+00 -3.6028915291220829e+07
0.0000000000000000e+00 -3.3414050335288385e-03 1.3618602296187249e-01 1.3156598803018277e-01 3.4896284016351968e-02 -4.8773371649446359e-03 -1.6179738022522952e-01 5.8664228336120644e-02 1.0702228757277181e-01 1.0000000000000000e+00 -4.1052631578947368e-01 1.6853185595567868e-01 -6.9186761918647033e-02 -0.0000000000000000e+00 -3.5812314029410847e+07
0.0000000000000000e+00 5.0906709943765328e-03 2.0417342571709904e-01 5.0401278513766204e-02 4.0121666418745344e-02 -1.8901535823182544e-02 -2.7992687684244611e-02 1.5884528688943000e-01 -1.1215730732557562e-01 1.0000000000000000e+00 -4.0000000000000002e-01 1.6000000000000003e-01 -6.4000000000000015e-02 -0.0000000000000000e+00 -3.5587902398848765e+07
0.0000000000000000e+00 6.6503092049194315e-02 1.5190460181838772e-01 -4.4898761562050013e-02 1.2717679149317535e-01 -1.2469031978554224e-01 1.7222491478578517e-01 7.9617634403327781e-02 -1.2629299555565593e-01 1.0000000000000000e+00 -3.8947368421052631e-01 1.5168975069252078e-01 -5.9079166059192299e-02 -0.0000000000000000e+00 -3.5355541558413237e+07
0.0000000000000000e+00 1.6924530473158547e-01 5.7813335883083829e-02 -7.0945854760163657e-02 1.4410960340571496e-01 -1.4152794707912439e-01 7.1590601234367446e-02 -1.7390445637262820e-02 8.7667705243026917e-02 1.0000000000000000e+00 -3.7894736842105264e-01 1.4360110803324100e-01 -5.4417261991543966e-02 -0.0000000000000000e+00 -3.5115099058204055e+07
0.0000000000000000e+00 2.1883206296378127e-01 8.2513416628344272e-02 -4.8300691699543583e-02 4.6262102082296433e-02 4.7698190340317581e-03 -1.3400836236012101e-01 -3.9999015100023447e-02 1.6824935694483101e-01 1.0000000000000000e+00 -3.6842105263157893e-01 1.3573407202216065e-01 -5.0007289692374973e-02 -0.0000000000000000e+00 -3.4866436158003025e+07
0.0000000000000000e+00 2.0970028089474685e-01 1.5740056826346380e-01 -2.2088987412662978e-02 -4.5224820101560564e-02 4.4275793878134018e-02 -7.7253408640873558e-02 -2.5374601531988065e-02 5.8145972351154811e-02 1.0000000000000000e+00 -3.5789473684210527e-01 1.2808864265927977e-01 -4.5842251057005394e-02 -0.0000000000000000e+00 -3.4609428135132596e+07
0.0000000000000000e+00 1.7903777288316586e-01 1.8910188123201388e-01 -7.7026326635749130e-03 -5.9751297652897432e-02 1.3025265679763930e-02 2.2488780400294686e-02 -1.0473003102067882e-02 -2.4181809130075956e-02 1.0000000000000000e+00 -3.4736842105263160e-01 1.2066481994459835e-01 -4.1915147980755220e-02 -0.0000000000000000e+00 -3.4343941110639565e+07
0.0000000000000000e+00 1.6061288779989019e-01 1.7842422385602436e-01 -2.1876212702140792e-03 -3.6627101125479865e-02 3.7591023000768276e-02 -1.8686169739423706e-05 -3.3034327947330652e-03 -3.3928990275288611e-02 1.0000000000000000e+00 -3.3684210526315789e-01 1.1346260387811634e-01 -3.8218982358944449e-02 -0.0000000000000000e+00 -3.4069813706143320e+07
0.0000000000000000e+00 1.5399106008616245e-01 1.6129738323672316e-01 -5.2792733110310866e-04 -1.5453626016904153e-02 -8.1611523566562372e-03 2.6840279882962656e-02 -8.5758914132577559e-04 -1.8809739866262339e-02 1.0000000000000000e+00 -3.2631578947368423e-01 1.0648199445983381e-01 -3.4746756086893135e-02 -0.0000000000000000e+00 -3.3786908936772466e+07
0.0000000000000000e+00 1.4298555988347192e-01 1.5601979571594721e-01 5.8320787330212983e-03 -5.0503926884532951e-03 1.3631643973506719e-02 1.1022901304971910e-02 -1.7740233541584896e-02 -7.1205556804665180e-03 1.0000000000000000e+00 -3.1578947368421051e-01 9.9722991689750684e-02 -3.1491471059921269e-02 -0.0000000000000000e+00 -3.3495079538397674e+07
0.0000000000000000e+00 8.6041788334626473e-02 1.4933335804331696e-01 6.6670845869462786e-02 -1.3602684486988478e-03 1.4006821465228125e-01 -1.2692601934889872e-02 -1.2441038191784466e-01 -2.1059969516319570e-03 1.0000000000000000e+00 -3.0526315789473685e-01 9.3185595567867041e-02 -2.8446129173348884e-02 -0.0000000000000000e+00 -3.3194207260061242e+07
0.0000000000000000e+00 1.9357360471847010e-02 1.1190052342175456e-01 1.6927881114297239e-01 -3.1430577635336443e-04 3.4956141023667744e-02 1.0737250903658854e-01 -1.4146939027145361e-01 -5.1934602779551791e-04 1.0000000000000000e+00 -2.9473684210526313e-01 8.6869806094182808e-02 -2.5603732322495985e-02 -0.0000000000000000e+00 -3.2884149286440846e+07
0.0000000000000000e+00 6.6603362128221322e-02 1.3929299345066239e-02 2.1883810893506717e-01 -6.3880433476327442e-05 -1.5354899403779046e-01 1.4789177314831253e-01 4.7807652831930933e-03 -1.1174587499686113e-04 1.0000000000000000e+00 -2.8421052631578947e-01 8.0775623268698055e-02 -2.2957282402682605e-02 -0.0000000000000000e+00 -3.2564763542401820e+07
0.0000000000000000e+00 1.4630748208767322e-01 -6.2153849851647971e-02 2.0970137016097318e-01 5.9320392469887186e-03 -6.6992599968341504e-02 4.0078905610380966e-02 4.4277826986989482e-02 -1.7570376572601729e-02 1.0000000000000000e+00 -2.7368421052631581e-01 7.4903047091412753e-02 -2.0499781309228755e-02 -0.0000000000000000e+00 -3.2235910996290851e+07
0.0000000000000000e+00 1.2219866686111240e-01 -6.6072630346525565e-02 1.7786979612228432e-01 6.6689891161836240e-02 1.4084140145522778e-01 -3.2511422854075087e-02 1.6905921618673117e-02 -1.2437666637191104e-01 1.0000000000000000e+00 -2.6315789473684209e-01 6.9252077562326861e-02 -1.8224230937454435e-02 -0.0000000000000000e+00 -3.1897453852406133e+07
0.0000000000000000e+00 4.6562990343610049e-02 -3.8496771687101639e-02 1.2287402532520451e-01 1.6928214527850768e-01 5.7000139595247135e-02 -3.6706390367607596e-02 1.2150945715063380e-01 -1.4146329261726617e-01 1.0000000000000000e+00 -2.5263157894736843e-01 6.3822714681440448e-02 -1.6123633182679693e-02 -0.0000000000000000e+00 -3.1549256225830898e+07
0.0000000000000000e+00 7.9108176051305282e-02 -1.5916909137952982e-02 1.7276930071533921e-02 2.1883869298999215e-01 -1.3899421988458102e-01 -1.9555193287451748e-02 1.5277934764483300e-01 4.7818640459180925e-03 1.0000000000000000e+00 -2.4210526315789474e-01 5.8614958448753467e-02 -1.4190989940224523e-02 -0.0000000000000000e+00 -3.1191183393587172e+07
0.0000000000000000e+00 1.5059181178604050e-01 7.9346903717546994e-04 -6.1299609340202010e-02 2.0970137016097318e-01 -6.1077756397279012e-02 -2.4839600862575958e-02 4.1433286329292704e-02 4.4277826986989482e-02 1.0000000000000000e+00 -2.3157894736842105e-01 5.3628808864265930e-02 -1.2419303105408952e-02 -0.0000000000000000e+00 -3.0823097233739205e+07
0.0000000000000000e+00 1.2338720903179384e-01 6.5312563649638417e-02 -6.5883845005009184e-02 1.7786979612228432e-01 1.4266010677815580e-01 -1.2651270198897344e-01 -3.2194092559940712e-02 1.6905921618673117e-02 1.0000000000000000e+00 -2.2105263157894736e-01 4.8864265927977837e-02 -1.0801574573552995e-02 -0.0000000000000000e+00 -3.0444855215506077e+07
0.0000000000000000e+00 4.6843168115169163e-02 1.6896481616332268e-01 -3.8459620343475762e-02 1.2287402532520451e-01 5.7459750774509627e-02 -1.4198814727881801e-01 -3.6641146885318263e-02 1.2150945715063380e-01 1.0000000000000000e+00 -2.1052631578947367e-01 4.4321329639889190e-02 -9.3308062399766710e-03 -0.0000000000000000e+00 -3.0056331771375515e+07
0.0000000000000000e+00 7.9166369956801597e-02 2.1877404877993789e-01 -1.5910458833395022e-02 1.7276930071533921e-02 -1.3889483809391651e-01 4.6697283257817260e-03 -1.9542439357979925e-02 1.5277934764483300e-01 1.0000000000000000e+00 -2.0000000000000001e-01 4.0000000000000008e-02 -8.0000000000000019e-03 -0.0000000000000000e+00 -2.9657376031923760e+07
0.0000000000000000e+00 1.5654632775201785e-01 2.0968958716836569e-01 -5.1492639361944087e-03 -6.1299609340202010e-02 -7.8607789359785296e-02 4.4256638627514028e-02 -7.2883795405942276e-03 4.1433286329292704e-02 1.0000000000000000e+00 -1.8947368421052632e-01 3.5900277008310250e-02 -6.8021577489429958e-03 -0.0000000000000000e+00 -2.9247849788508385e+07
0.0000000000000000e+00 1.8891309589049748e-01 1.7786780989340983e-01 -1.3793137410722976e-03 -6.4715868244127644e-02 2.2171450106160298e-02 1.6902244738169939e-02 -2.1397124975655688e-03 -3.6074748498849903e-02 1.0000000000000000e+00 -1.7894736842105263e-01 3.2022160664819943e-02 -5.7302813821256742e-03 -0.0000000000000000e+00 -2.8827620597817719e+07
0.0000000000000000e+00 1.7830349308301938e-01 1.2295729395787997e-01 -3.1763991188864764e-04 -7.2075786879009091e-04 2.2277570203424468e-04 1.2120216277613968e-01 -5.2544368198296078e-04 -1.2055958103518379e-01 1.0000000000000000e+00 -1.6842105263157894e-01 2.8365650969529085e-02 -4.7773727948680561e-03 -0.0000000000000000e+00 -2.8396549759912662e+07
0.0000000000000000e+00 1.4360733911056495e-01 3.4960344171481243e-02 -6.4464488401335120e-05 1.2080367118123352e-01 7.2474257985726140e-02 1.0713332453018319e-01 -1.1284463772185742e-04 -1.8048293935946916e-01 1.0000000000000000e+00 -1.5789473684210525e-01 2.4930747922437671e-02 -3.9364338824901587e-03 -0.0000000000000000e+00 -2.7954504561053500e+07
0.0000000000000000e+00 5.4585647698166068e-02 4.0133449411352815e-02 5.9320392469887186e-03 1.9913590528747951e-01 1.6459027349150931e-01 -1.1213611896610017e-01 -1.7570376572601729e-02 -3.5090021896380212e-02 1.0000000000000000e+00 -1.4736842105263157e-01 2.1717451523545702e-02 -3.2004665403119982e-03 -0.0000000000000000e+00 -2.7501342752012245e+07
0.0000000000000000e+00 -4.3729264683742025e-02 1.2717877772204980e-01 6.6689891161836240e-02 1.5054631959856335e-01 8.1402624180322181e-02 -1.2628931867515275e-01 -1.2437666637191104e-01 1.7012259471465641e-01 1.0000000000000000e+00 -1.3684210526315790e-01 1.8725761772853188e-02 -2.5624726636535944e-03 -0.0000000000000000e+00 -2.7036898926030204e+07
0.0000000000000000e+00 -7.0585431694760711e-02 1.4410991420241864e-01 1.6919856584912857e-01 5.7499340903434129e-02 -1.7243637466250763e-02 8.7668294263458041e-02 -1.4115658726320315e-01 7.1071844227003050e-02 1.0000000000000000e+00 -1.2631578947368421e-01 1.5955678670360112e-02 -2.0154541478349616e-03 -0.0000000000000000e+00 -2.6561044027604885e+07
0.0000000000000000e+00 -3.0559847470677894e-02 4.6262281803949382e-02 2.0115527889004484e-01 8.2449176751562048e-02 -8.5546046269148218e-02 1.6824864802724551e-01 5.0427887160567904e-02 -1.3411869039994689e-01 1.0000000000000000e+00 -1.1578947368421053e-01 1.3407202216066482e-02 -1.5524128881761190e-03 -0.0000000000000000e+00 -2.6073662987834934e+07
0.0000000000000000e+00 7.9354765065273003e-02 -3.9280997861964380e-02 1.0826831140941834e-01 1.5144496303126015e-01 -1.7892485157676094e-01 4.0596784138028544e-02 1.9784723228238235e-01 -5.9725408787222739e-02 1.0000000000000000e+00 -1.0526315789473684e-01 1.1080332409972297e-02 -1.1663507799970839e-03 -0.0000000000000000e+00 -2.5574620168755420e+07
0.0000000000000000e+00 1.8536197629235854e-01 6.9405797378132780e-03 -1.4024849843893137e-02 1.2240801761242869e-01 -1.0456455233677675e-01 -1.4855479862148382e-01 1.0712049179497599e-01 1.4685809301119937e-01 1.0000000000000000e+00 -9.4736842105263161e-02 8.9750692520775624e-03 -8.5026971861787448e-04 -0.0000000000000000e+00 -2.5063768014097415e+07
0.0000000000000000e+00 1.8046580350176308e-01 1.3257177552035235e-01 -2.1956646746004206e-02 9.1414569841093644e-03 1.2069989202041136e-01 -1.7508498851806065e-01 -8.6718418148008028e-02 1.4144342840666449e-01 1.0000000000000000e+00 -8.4210526315789472e-02 7.0914127423822712e-03 -5.9717159935850702e-04 -0.0000000000000000e+00 -2.4540960104356453e+07
0.0000000000000000e+00 7.9327867962882714e-02 1.8570165287314069e-01 9.1818499170470996e-02 -5.7541130031616032e-02 1.4128889296442082e-01 3.1618147294305565e-02 -1.9595294865946716e-01 2.2057706919459078e-02 1.0000000000000000e+00 -7.3684210526315783e-02 5.4293628808864255e-03 -4.0005831753899977e-04 -0.0000000000000000e+00 -2.4006064559022680e+07
0.0000000000000000e+00 6.1245759079613778e-02 1.0321791872096504e-01 1.8306111604883807e-01 -4.7737752205429779e-02 -8.8326098732102454e-02 1.9072667660191583e-01 -5.1802707918242319e-02 -5.0804113895143846e-02 1.0000000000000000e+00 -6.3157894736842107e-02 3.9889196675900280e-03 -2.5193176847937020e-04 -0.0000000000000000e+00 -2.3458940367557507e+07
0.0000000000000000e+00 1.3583758405486746e-01 -1.6553095053473525e-02 1.4324579548557007e-01 3.8155439311743367e-02 -1.2005537111306841e-01 1.0889515078225322e-01 1.6599096722370082e-01 -1.5397151304497084e-01 1.0000000000000000e+00 -5.2631578947368418e-02 2.7700831024930744e-03 -1.4579384749963548e-04 -0.0000000000000000e+00 -2.2899451906451862e+07
0.0000000000000000e+00 1.8399529087474989e-01 -6.0009814997043241e-02 1.7927959210752555e-02 1.5830895417176138e-01 6.9807737309865725e-03 -3.3193300259380166e-03 1.5227812176683891e-01 -1.5559965171088030e-01 1.0000000000000000e+00 -4.2105263157894736e-02 1.7728531855955678e-03 -7.4646449919813377e-05 -0.0000000000000000e+00 -2.2327463736216143e+07
0.0000000000000000e+00 1.8349712506854510e-01 -4.4959690999286810e-02 -5.4721606357904400e-02 2.1549106226352449e-01 8.1525933924707084e-03 -3.5123485615389298e-02 2.6088401192239274e-02 -1.0571045060238371e-04 1.0000000000000000e+00 -3.1578947368421054e-02 9.9722991689750701e-04 -3.1491471059921275e-05 -0.0000000000000000e+00 -2.1742838455218170e+07
0.0000000000000000e+00 1.5917100336246193e-01 -2.1235836167443324e-02 -4.6995255200558708e-02 2.0884712964952720e-01 3.0533342214971990e-02 -2.4022253921931792e-02 -4.9640778504690730e-02 4.2923446268077758e-02 1.0000000000000000e+00 -2.1052631578947368e-02 4.4321329639889195e-04 -9.3308062399766721e-06 -0.0000000000000000e+00 -2.1145430968515009e+07
0.0000000000000000e+00 9.1027390447104617e-02 -7.5138473220585237e-03 3.8323193132011824e-02 1.7884898754164946e-01 1.5199854644749200e-01 -1.0155672807933508e-02 -1.5369157517727328e-01 1.2707935385629556e-02 1.0000000000000000e+00 -1.0526315789473684e-02 1.1080332409972299e-04 -1.1663507799970840e-06 -0.0000000000000000e+00 -2.0535093915450227e+07
0.0000000000000000e+00 -1.6461758423224705e-02 -2.1504699265882049e-03 1.5825888115376921e-01 1.6057573645626430e-01 1.2128671310411847e-01 -3.2381893124437317e-03 -1.5523438954914651e-01 3.7525779518478949e-02 1.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00 -1.9911701609588567e+07
0.0000000000000000e+00 -5.1969936915044021e-02 -5.2147702654514955e-04 1.9781369413486305e-01 1.5398460978160450e-01 -3.7520718896508438e-02 -8.4483521185395067e-04 4.5551258913208763e-02 -8.1739062861280725e-03 1.0000000000000000e+00 1.0526315789473684e-02 1.1080332409972299e-04 1.1663507799970840e-06 1.1663507799970840e-06 -1.9275125611193918e+07
0.0000000000000000e+00 4.3554242863295220e-02 -1.1065424034858019e-04 1.0741516016419869e-01 1.4892829285684181e-01 -1.9259253904782070e-01 -1.8901221960316596e-04 1.9649488467232606e-01 -3.9195773484750167e-03 1.0000000000000000e+00 2.1052631578947368e-02 4.4321329639889195e-04 9.3308062399766721e-06 9.3308062399766721e-06 -1.8625232264776390e+07
0.0000000000000000e+00 1.6335470154090917e-01 1.1469452396336173e-03 -1.5381611946291066e-02 1.5156568896445566e-01 -1.2548241632627261e-01 -3.9180483653459804e-03 1.1068381743975080e-01 1.9575881099782574e-02 1.0000000000000000e+00 3.1578947368421054e-02 9.9722991689750701e-04 3.1491471059921275e-05 3.1491471059921275e-05 -1.7961888448384833e+07
0.0000000000000000e+00 1.7131887820971695e-01 3.7735217542446728e-02 -5.9649081134936638e-02 1.5081737464299355e-01 1.0971839048922487e-01 -8.3925120824484084e-02 -3.1719328344948406e-03 -2.2281423069238762e-02 1.0000000000000000e+00 4.2105263157894736e-02 1.7728531855955678e-03 7.4646449919813377e-05 7.4646449919813377e-05 -1.7284953087222628e+07
0.0000000000000000e+00 5.8767500060305407e-02 1.3671354595970353e-01 -2.7218846770421114e-02 1.3104469072529057e-01 1.8280522986076542e-01 -1.6094159876421424e-01 -8.0670516784514076e-02 5.7818684206681195e-02 1.0000000000000000e+00 5.2631578947368418e-02 2.7700831024930744e-03 1.4579384749963548e-04 1.4579384749963548e-04 -1.6594301290707462e+07
0.0000000000000000e+00 -4.0940490403193273e-02 2.0428516922367393e-01 8.0207916310492663e-02 5.6234446513013805e-02 6.4060815922217310e-02 -2.7801642355785985e-02 -1.7757250396670465e-01 1.4110708645670056e-01 1.0000000000000000e+00 6.3157894736842107e-02 3.9889196675900280e-03 2.5193176847937020e-04 2.5193176847937020e-04 -1.5889801689018261e+07
0.0000000000000000e+00 -5.8562755482215997e-02 1.5192563333963566e-01 1.8555076163387493e-01 2.1772084307412773e-02 -2.2363103807147941e-02 1.7226230721222197e-01 -1.0424722204264238e-01 -4.4792747514516867e-02 1.0000000000000000e+00 7.3684210526315783e-02 5.4293628808864255e-03 4.0005831753899977e-04 4.0005831753899977e-04 -1.5171332894279920e+07
0.0000000000000000e+00 -3.6346923353920750e-02 5.7816980815322772e-02 1.8041937541600986e-01 9.8332956382808728e-02 -3.3469379096026097e-02 7.1597287908986007e-02 1.2107184085676369e-01 -1.5885983590871644e-01 1.0000000000000000e+00 8.4210526315789472e-02 7.0914127423822712e-03 5.9717159935850702e-04 5.9717159935850702e-04 -1.4438759648172289e+07
0.0000000000000000e+00 -1.5395611833060778e-02 8.2513820961616330e-02 6.1651263610799256e-02 1.7053741723552357e-01 -1.8709649158012306e-02 -1.3400655467981051e-01 1.8694625217337149e-01 -3.5218249816830346e-02 1.0000000000000000e+00 9.4736842105263161e-02 8.9750692520775624e-03 8.5026971861787448e-04 8.5026971861787448e-04 -1.3691952932217911e+07
0.0000000000000000e+00 -5.0396989620721359e-03 1.5145674602386761e-01 -3.4242388166118555e-02 1.8761238274831019e-01 -7.1014004298465220e-03 -5.9704220427747284e-02 4.7696151459019592e-02 1.8903225455001421e-02 1.0000000000000000e+00 1.0526315789473684e-01 1.1080332409972297e-02 1.1663507799970839e-03 1.1663507799970839e-03 -1.2930769741141085e+07
0.0000000000000000e+00 -1.3582822198243730e-03 1.2241000384130316e-01 8.2988619576376543e-03 1.7133514021959093e-01 -2.1023200711287751e-03 1.4686176989170255e-01 -1.4645247855035504e-01 2.5522625776960533e-03 1.0000000000000000e+00 1.1578947368421053e-01 1.3407202216066482e-02 1.5524128881761190e-03 1.5524128881761190e-03 -1.2155081930344526e+07
0.0000000000000000e+00 -3.1399497964970549e-04 9.1417677808130239e-03 1.3296934992938117e-01 1.5842526652967609e-01 -5.1875700736440022e-04 1.4144401742709561e-01 -1.7487293686475924e-01 3.4287590206035223e-02 1.0000000000000000e+00 1.2631578947368421e-01 1.5955678670360112e-02 2.0154541478349616e-03 2.0154541478349616e-03 -1.1364771600352474e+07
0.0000000000000000e+00 -6.4060155129273298e-05 -5.7541309753268981e-02 2.0344912712821728e-01 1.5346313275505935e-01 -1.1103695741136719e-04 2.2058415837044570e-02 -1.3916838862932879e-02 -9.0187414979820118e-03 1.0000000000000000e+00 1.3684210526315790e-01 1.8725761772853188e-02 2.5624726636535944e-03 2.5624726636535944e-03 -1.0559703577307537e+07
0.0000000000000000e+00 -1.1782992607466748e-05 -5.3681574445025963e-02 2.0466276046512732e-01 1.4881763861649322e-01 -2.1188359475456707e-05 -3.3254925682017572e-02 3.7178459665998420e-02 -4.1085895680781741e-03 1.0000000000000000e+00 1.4736842105263157e-01 2.1717451523545702e-02 3.2004665403119982e-03 3.2004665403119982e-03 -9.7397521797204204e+06
0.0000000000000000e+00 1.1659905320070671e-03 -2.8536438078967347e-02 1.7651151390245995e-01 1.5154465744320772e-01 -3.8843328194123690e-03 -2.9598523553562982e-02 1.4803601547544343e-02 1.9538488673345775e-02 1.0000000000000000e+00 1.5789473684210525e-01 2.4930747922437671e-02 3.9364338824901587e-03 3.9364338824901587e-03 -8.9047886006812230e+06
0.0000000000000000e+00 3.7738551677982013e-02 -1.0973501903449952e-02 1.2264360977493392e-01 1.5081372971075463e-01 -8.3919023170296647e-02 -1.4136948114045254e-02 1.2068399478920640e-01 -2.2288109743857323e-02 1.0000000000000000e+00 1.6842105263157894e-01 2.8365650969529085e-02 4.7773727948680561e-03 4.7773727948680561e-03 -8.0546743582526855e+06
0.0000000000000000e+00 1.3671413001462854e-01 -3.3474510048147358e-03 3.4896284016351968e-02 1.3104392694871261e-01 -1.6094050000148924e-01 -4.8882834141059702e-03 1.0702228757277181e-01 5.7818294361541699e-02 1.0000000000000000e+00 1.7894736842105263e-01 3.2022160664819943e-02 5.7302813821256742e-03 5.7302813821256742e-03 -7.1892840764447525e+06
0.0000000000000000e+00 2.0428516922367393e-01 5.0895817281502252e-03 4.0121666418745344e-02 5.0290624273417621e-02 -2.7801642355785985e-02 -1.8903568932038005e-02 -1.1215730732557562e-01 1.5865627466982682e-01 1.0000000000000000e+00 1.8947368421052632e-01 3.5900277008310250e-02 6.8021577489429958e-03 6.8021577489429958e-03 -6.3084860786178336e+06
0.0000000000000000e+00 1.5075765657875412e-01 6.6503092049194315e-02 1.2717679149317535e-01 -4.3751816322416398e-02 1.7614296315113115e-01 -1.2469031978554224e-01 -1.2629299555565593e-01 7.5699586037981811e-02 1.0000000000000000e+00 2.0000000000000001e-01 4.0000000000000008e-02 8.0000000000000019e-03 8.0000000000000019e-03 -5.4121575398616791e+06
0.0000000000000000e+00 2.0078118340637101e-02 1.6916172530220636e-01 1.4419318283509408e-01 -3.3210637217716936e-02 1.5551572205885153e-01 -1.4122124172506137e-01 8.7360999888963911e-02 -1.0131556646174690e-01 1.0000000000000000e+00 2.1052631578947367e-01 4.4321329639889190e-02 9.3308062399766710e-03 9.3308062399766710e-03 -4.5001672449266911e+06
0.0000000000000000e+00 -5.4200309053012201e-02 2.0114882858548688e-01 6.3945516182243756e-02 8.8412854260159948e-02 2.6933945321678716e-02 5.0415133231096074e-02 1.2260333383018120e-01 -2.0094061386423767e-01 1.0000000000000000e+00 2.2105263157894736e-01 4.8864265927977837e-02 1.0801574573552995e-02 1.0801574573552995e-02 -3.5723705129204541e+06
0.0000000000000000e+00 -5.2828423199806310e-02 1.1421104438278823e-01 5.6208238649994260e-02 1.8219618181101094e-01 -3.1902578071961299e-02 1.8029601096040063e-01 -9.5423432944238051e-02 -5.3176243887774047e-02 1.0000000000000000e+00 2.3157894736842105e-01 5.3628808864265930e-02 1.2419303105408952e-02 1.2419303105408952e-02 -2.6286213762276992e+06
0.0000000000000000e+00 -2.8347652737450959e-02 5.1499050785936029e-02 1.3331132507416155e-01 1.4422300067606075e-01 -2.9281193259428608e-02 -1.3371841757522693e-02 -1.1827703524528801e-01 1.6178930411015408e-01 1.0000000000000000e+00 2.4210526315789474e-01 5.8614958448753467e-02 1.4190989940224523e-02 1.4190989940224523e-02 -1.6688142544999197e+06
0.0000000000000000e+00 -1.0936350559824077e-02 1.0950336742514234e-01 1.4602601284979363e-01 5.5629359545108691e-02 -1.4071704631755921e-02 -1.4395598224091452e-01 9.0073745519424694e-02 6.8293855114252955e-02 1.0000000000000000e+00 2.5263157894736843e-01 6.3822714681440448e-02 1.6123633182679693e-02 1.6123633182679693e-02 -6.9281623238137364e+05
0.0000000000000000e+00 -3.3410007002567768e-03 1.5625982776754024e-01 6.4402348998734621e-02 8.1985713908860280e-02 -4.8755294846341453e-03 1.5414729585004479e-02 1.2333603332189878e-01 -1.3486343490355079e-01 1.0000000000000000e+00 2.6315789473684209e-01 6.9252077562326861e-02 1.8224230937454435e-02 1.8224230937454435e-02 2.9950556338893622e+05
0.0000000000000000e+00 -8.5315124521965287e-04 9.2988080474178669e-02 5.6307109897735373e-02 1.5134500251729271e-01 -1.3523476100562744e-03 1.5629697850679977e-01 -9.5255609084110349e-02 -5.9895265756205913e-02 1.0000000000000000e+00 2.7368421052631581e-01 7.4903047091412753e-02 2.0499781309228755e-02 2.0499781309228755e-02 1.3082802235329971e+06
0.0000000000000000e+00 -1.8878534151638908e-04 4.3987189692751984e-02 1.3449834712741654e-01 1.2238897232005523e-01 -3.1733029413437370e-04 -2.3523837684953017e-02 -1.2212397563826358e-01 1.4682437746526575e-01 1.0000000000000000e+00 2.8421052631578947e-01 8.0775623268698055e-02 2.2957282402682605e-02 2.2957282402682605e-02 2.3336263527088910e+06
0.0000000000000000e+00 -3.7151343625874653e-05 1.0743678772463691e-01 1.8368463003063548e-01 9.1381228485740811e-03 -6.5243482289333469e-05 -1.4750028788699016e-01 6.4681143778096140e-03 1.4143733075247705e-01 1.0000000000000000e+00 2.9473684210526313e-01 8.6869806094182808e-02 2.5603732322495985e-02 2.5603732322495985e-02 3.3756763413345665e+06
0.0000000000000000e+00 -6.4503045579590711e-06 1.7342158511928946e-01 1.8343364896834086e-01 -5.7541893808193988e-02 -1.2753929471824879e-05 -3.1075419823913791e-02 8.0426551977843541e-03 2.2057317074319575e-02 1.0000000000000000e+00 3.0526315789473685e-01 9.3185595567867041e-02 2.8446129173348884e-02 2.8446129173348884e-02 4.4345758976545855e+06
0.0000000000000000e+00 5.9427329733698778e-03 1.8836666274578873e-01 1.5915922036985447e-01 -5.3681574445025963e-02 -1.7551221321981732e-02 2.0087749204929978e-02 3.0512153855496535e-02 -3.3254925682017572e-02 1.0000000000000000e+00 3.1578947368421051e-01 9.9722991689750684e-02 3.1491471059921269e-02 3.1491471059921269e-02 5.5104500529150888e+06
0.0000000000000000e+00 6.6691877390710710e-02 1.7033690350785233e-01 9.2193380979111672e-02 -2.8536438078967347e-02 -1.2437298949140786e-01 6.7165332648059933e-03 1.4811421362807964e-01 -2.9598523553562982e-02 1.0000000000000000e+00 3.2631578947368423e-01 1.0648199445983381e-01 3.4746756086893135e-02 3.4746756086893135e-02 6.6034118503785804e+06
0.0000000000000000e+00 1.6928245607521133e-01 1.2072022126308102e-01 2.1193213825378192e-02 -1.0973501903449952e-02 -1.4146270359683505e-01 1.1826517018400264e-01 3.7674395287884840e-02 -1.4136948114045254e-02 1.0000000000000000e+00 3.3684210526315789e-01 1.1346260387811634e-01 3.8218982358944449e-02 3.8218982358944449e-02 7.7135919904695749e+06
0.0000000000000000e+00 2.1883869298999215e-01 1.6754868990063766e-02 6.7060778999637194e-02 -3.3474510048147358e-03 4.7818640459180925e-03 1.5193341367025404e-01 -1.5281519578334787e-01 -4.8882834141059702e-03 1.0000000000000000e+00 3.4736842105263160e-01 1.2066481994459835e-01 4.1915147980755220e-02 4.1915147980755220e-02 8.8411533235496357e+06
0.0000000000000000e+00 2.0970137016097318e-01 -6.1410263580550593e-02 1.4640635333541432e-01 5.0895817281502252e-03 4.4277826986989482e-02 4.1244274109689540e-02 -6.6824776108213801e-02 -1.8903568932038005e-02 1.0000000000000000e+00 3.5789473684210527e-01 1.2808864265927977e-01 4.5842251057005394e-02 4.5842251057005394e-02 9.9862234019981027e+06
0.0000000000000000e+00 1.7903777288316586e-01 -6.5904876526257108e-02 1.2104973539260432e-01 6.6503092049194315e-02 1.3025265679763930e-02 -3.2231484986377505e-02 1.4475577294007058e-01 -1.2469031978554224e-01 1.0000000000000000e+00 3.6842105263157893e-01 1.3573407202216065e-01 5.0007289692374973e-02 5.0007289692374973e-02 1.1148932302210726e+07
0.0000000000000000e+00 1.6061288779989019e-01 -3.8379685846335589e-02 8.8274620044596595e-03 1.6916172530220636e-01 3.7591023000768276e-02 -3.6954538913999829e-02 1.4092467139930009e-01 -1.4122124172506137e-01 1.0000000000000000e+00 3.7894736842105264e-01 1.4360110803324100e-01 5.4417261991543966e-02 5.4417261991543966e-02 1.2329372176345021e+07
0.0000000000000000e+00 1.5399123980781540e-01 1.7723712116272938e-03 -5.7605369908398249e-02 2.0114864886383393e-01 -8.1618612742417312e-03 -6.5189561235354729e-02 2.1947378879633203e-02 5.0415842148681569e-02 1.0000000000000000e+00 3.8947368421052631e-01 1.5168975069252078e-01 5.9079166059192299e-02 5.9079166059192299e-02 1.3527680857389219e+07
0.0000000000000000e+00 1.4892938212306811e-01 9.6283794815360410e-02 -5.3693357437633427e-02 1.0826722214319204e-01 -3.9175442396195519e-03 -1.6085778483598709e-01 -3.3276114041493027e-02 1.9784519917352689e-01 1.0000000000000000e+00 4.0000000000000002e-01 1.6000000000000003e-01 6.4000000000000015e-02 6.4000000000000015e-02 1.4744013954299904e+07
0.0000000000000000e+00 1.5156568896445566e-01 1.9168330898598668e-01 -2.7370447546960278e-02 -1.5192826604774678e-02 1.9575881099782574e-02 -9.6234938612777626e-02 -3.3482856372975350e-02 1.1100114773388517e-01 1.0000000000000000e+00 4.1052631578947368e-01 1.6853185595567868e-01 6.9186761918647033e-02 6.9186761918647033e-02 1.5978502905917309e+07
0.0000000000000000e+00 1.5081737464299355e-01 1.8225189463400573e-01 2.6765049774532063e-02 -5.9611929791310761e-02 -2.2281423069238762e-02 1.2378399746679335e-01 -9.8055971284341889e-02 -3.1066893522055071e-03 1.0000000000000000e+00 4.2105263157894735e-01 1.7728531855955676e-01 7.4646449919813368e-02 7.4646449919813368e-02 1.7231268692723721e+07
0.0000000000000000e+00 1.3104451100363762e-01 6.2108096427290121e-02 1.3336649928816086e-01 -2.7212216744210212e-02 5.7819393124266691e-02 1.8767895166508908e-01 -1.6582807449800971e-01 -8.0658471772627749e-02 1.0000000000000000e+00 4.3157894736842106e-01 1.8626038781163437e-01 8.0386062108179043e-02 8.0386062108179043e-02 1.8502441721045829e+07
0.0000000000000000e+00 5.0290624273417621e-02 -3.4143516918377435e-02 2.0343092871222798e-01 8.0209005576718964e-02 1.5865627466982682e-01 4.7863975319147309e-02 -2.9156023074697720e-02 -1.7757047085784922e-01 1.0000000000000000e+00 4.4210526315789472e-01 1.9545706371191135e-01 8.6412596588423957e-02 8.6412596588423957e-02 1.9792148664706565e+07
0.0000000000000000e+00 -4.4919793083297938e-02 8.3179072500111015e-03 1.5056887123723772e-01 1.8671873839475647e-01 7.9580241976890995e-02 -1.4641876300442142e-01 1.7582563285699679e-01 -1.0812787798155156e-01 1.0000000000000000e+00 4.5263157894736844e-01 2.0487534626038784e-01 9.2733051465228172e-02 9.2733051465228172e-02 2.1100520328300230e+07
0.0000000000000000e+00 -7.0949499692402607e-02 1.3297268406491647e-01 2.0040966997011227e-02 2.1815823789069552e-01 -1.7397132311881381e-02 -1.7486683921057181e-01 1.5545047857656219e-01 3.7153406706898165e-02 1.0000000000000000e+00 4.6315789473684210e-01 2.1451523545706372e-01 9.9354424843271616e-02 9.9354424843271616e-02 2.2427701408503219e+07
0.0000000000000000e+00 -4.8301096032815641e-02 2.0344971118314231e-01 -5.4206939079223103e-02 1.9836521390377485e-01 -4.0000822780333935e-02 -1.3915740100207880e-02 2.6921900309792385e-02 2.6006461089467747e-02 1.0000000000000000e+00 4.7368421052631576e-01 2.2437673130193903e-01 1.0628371482723427e-01 1.0628371482723427e-01 2.3773807935681403e+07
0.0000000000000000e+00 -1.6145165173066793e-02 2.0466276046512732e-01 -5.2829512466032617e-02 1.6409895881795919e-01 -4.2923789745114335e-02 3.7178459665998420e-02 -3.1904611180816757e-02 3.7443697316359881e-02 1.0000000000000000e+00 4.8421052631578948e-01 2.3445983379501387e-01 1.1352791952179618e-01 1.1352791952179618e-01 2.5138965494905464e+07
0.0000000000000000e+00 5.8989244727135799e-02 1.7651151390245995e-01 -2.8347652737450959e-02 9.3532617906562601e-02 -1.3484599259347574e-01 1.4803601547544343e-02 -2.9281193259428608e-02 1.5018281815327478e-01 1.0000000000000000e+00 4.9473684210526314e-01 2.4476454293628808e-01 1.2109403703163725e-01 1.2109403703163725e-01 2.6523310263039686e+07
0.0000000000000000e+00 1.6701125537561815e-01 1.2264360977493392e-01 -1.0936350559824077e-02 2.1503874669492613e-02 -1.4445943103750508e-01 1.2068399478920640e-01 -1.4071704631755921e-02 3.8187054641061799e-02 1.0000000000000000e+00 5.0526315789473686e-01 2.5529085872576179e-01 1.2898906546143754e-01 1.2898906546143754e-01 2.7926978180642635e+07
0.0000000000000000e+00 2.0062717183728879e-01 3.4896284016351968e-02 -3.3408209786038308e-03 6.7124255099841462e-02 4.9571006936827623e-02 1.0702228757277181e-01 -4.8762384022196393e-03 -1.5270525758866149e-01 1.0000000000000000e+00 5.1578947368421058e-01 2.6603878116343493e-01 1.3722000291587699e-01 1.3722000291587699e-01 2.9350100892358869e+07
0.0000000000000000e+00 1.0815656790284345e-01 4.0121666418745344e-02 5.0906709943765328e-03 1.4641813632802181e-01 1.9765618695392373e-01 -1.1215730732557562e-01 -1.8901535823182544e-02 -6.6803587748738347e-02 1.0000000000000000e+00 5.2631578947368418e-01 2.7700831024930744e-01 1.4579384749963548e-01 1.4579384749963548e-01 3.0792791682040647e+07
0.0000000000000000e+00 -1.5213858126022602e-02 1.2834476825405688e-01 6.6503092049194315e-02 1.2105172162147879e-01 1.1096375530744837e-01 -1.3017365149456511e-01 -1.2469031978554224e-01 1.4475944982057376e-01 1.0000000000000000e+00 5.3684210526315790e-01 2.8819944598337949e-01 1.5471759731739321e-01 1.5471759731739321e-01 3.2255188890072003e+07
0.0000000000000000e+00 -5.9699154152928820e-02 1.8184846588040063e-01 1.6924530473158547e-01 8.8277728011633191e-03 -2.8066706727610594e-03 3.7492710931613937e-03 -1.4152794707912439e-01 1.4092526041973122e-01 1.0000000000000000e+00 5.4736842105263162e-01 2.9961218836565101e-01 1.6399825047383004e-01 1.6399825047383004e-01 3.3737424698033422e+07
0.0000000000000000e+00 -4.4896214899082541e-02 1.8297623209692498e-01 2.1883206296378127e-01 -5.7605190186745300e-02 -3.5013547420702930e-02 7.3088569433417683e-03 4.7698190340317581e-03 2.1946669962047711e-02 1.0000000000000000e+00 5.5789473684210522e-01 3.1124653739612185e-01 1.7364280507362584e-01 1.7364280507362584e-01 3.5239632326350734e+07
0.0000000000000000e+00 -2.1224053174835857e-02 1.5906034912211336e-01 2.0970028089474685e-01 -4.7749535198037242e-02 -2.4001065562456333e-02 3.0344329995368825e-02 4.4275793878134018e-02 -5.0825302254619301e-02 1.0000000000000000e+00 5.6842105263157894e-01 3.2310249307479222e-01 1.8365825922146084e-01 1.8365825922146084e-01 3.6761934819881216e+07
0.0000000000000000e+00 -7.5118610931840489e-03 9.1006358925856679e-02 1.7903777288316586e-01 3.8153453082868889e-02 -1.0151995927430326e-02 1.5196115402105520e-01 1.3025265679763930e-02 -1.5397518992547402e-01 1.0000000000000000e+00 5.7894736842105265e-01 3.3518005540166207e-01 1.9405161102201490e-01 1.9405161102201490e-01 3.8304448683193475e+07
0.0000000000000000e+00 -2.0665797005054313e-03 -1.6548982784842764e-02 1.6061288779989019e-01 1.5822506394567862e-01 -3.5443056460756225e-03 1.2158673178356291e-01 3.7591023000768276e-02 -1.5529353537724841e-01 1.0000000000000000e+00 5.8947368421052626e-01 3.4747922437673123e-01 2.0482985857996788e-01 2.0482985857996788e-01 3.9867328474246711e+07
0.0000000000000000e+00 1.7161937073402172e-02 -6.9653755348263402e-02 1.5399106008616245e-01 1.9780764816357715e-01 -4.6490858326503764e-02 8.1234965378308831e-03 -8.1611523566562372e-03 4.5540312664047428e-02 1.0000000000000000e+00 5.9999999999999998e-01 3.5999999999999999e-01 2.1599999999999997e-01 2.1599999999999997e-01 4.1450727994291037e+07
0.0000000000000000e+00 1.0132240451120625e-01 -5.1934993648663420e-02 1.4298555988347192e-01 1.0741407089797238e-01 -1.5375841751499603e-01 -5.6572321965554090e-02 1.3631643973506719e-02 1.9649285156347063e-01 1.0000000000000000e+00 6.1052631578947369e-01 3.7274238227146816e-01 2.2756903338679108e-01 2.2756903338679108e-01 4.3054756450479880e+07
0.0000000000000000e+00 1.9304159120581105e-01 3.6983956204560908e-02 8.6041788334626473e-02 -1.5381611946291066e-02 -9.4132618541648852e-02 -1.5576017970246842e-01 1.4006821465228125e-01 1.1068381743975080e-01 1.0000000000000000e+00 6.2105263157894741e-01 3.8570637119113577e-01 2.3954395684502119e-01 2.3954395684502119e-01 4.4679530292131960e+07
0.0000000000000000e+00 1.8264946904303456e-01 1.5794822030965477e-01 1.9357360471847010e-02 -5.9732660564315754e-02 1.2399604912009474e-01 -1.5574704890232346e-01 3.4956141023667744e-02 -2.8652274804318320e-03 1.0000000000000000e+00 6.3157894736842102e-01 3.9889196675900274e-01 2.5193176847937016e-01 2.5193176847937016e-01 4.6325184495605588e+07
0.0000000000000000e+00 7.9855390960713771e-02 1.9775039775631173e-01 6.6603362128221322e-02 -4.4902260870368436e-02 1.4214467442543610e-01 4.5440611800936892e-02 -1.5354899403779046e-01 -3.5024493669864265e-02 1.0000000000000000e+00 6.4210526315789473e-01 4.1229916897506924e-01 2.6473946639451817e-01 2.6473946639451817e-01 4.7991846162983879e+07
0.0000000000000000e+00 6.1357502586188668e-02 1.1334719941118741e-01 1.4630748208767322e-01 -2.1225142441062165e-02 -8.8135053403643832e-02 1.7892450809972435e-01 -6.6992599968341504e-02 -2.4003098671311795e-02 1.0000000000000000e+00 6.5263157894736845e-01 4.2592797783933523e-01 2.7797404869514508e-01 2.7797404869514508e-01 4.9679642121459484e+07
0.0000000000000000e+00 1.3469063881523385e-01 5.1308279215545170e-02 1.2219866686111240e-01 -7.5118610931840489e-03 -1.1613732274772244e-01 -1.3692848932160245e-02 1.4084140145522778e-01 -1.0151995927430326e-02 1.0000000000000000e+00 6.6315789473684206e-01 4.3977839335180047e-01 2.9164251348593084e-01 2.9164251348593084e-01 5.1388713588894203e+07
0.0000000000000000e+00 1.4626007333230318e-01 1.0946590528481281e-01 4.6562990343610049e-02 -2.0665797005054313e-03 9.0905894555470657e-02 -1.4402181474363498e-01 5.7000139595247135e-02 -3.5443056460756225e-03 1.0000000000000000e+00 6.7368421052631577e-01 4.5385041551246535e-01 3.0575185887155559e-01 3.0575185887155559e-01 5.3119190209192812e+07
0.0000000000000000e+00 4.6783579108841580e-02 1.5625301801967639e-01 7.9108355772958230e-02 1.7161937073402172e-02 1.6909419215668495e-01 1.5403393490703639e-02 -1.3899492880216652e-01 -4.6490858326503764e-02 1.0000000000000000e+00 6.8421052631578949e-01 4.6814404432132967e-01 3.2030908295669924e-01 3.2030908295669924e-01 5.4871193086425975e+07
0.0000000000000000e+00 -4.5114165861211988e-02 8.7043168968356177e-02 1.5653563402563669e-01 1.0132240451120625e-01 5.8334984570757975e-02 1.7384413361107057e-01 -7.8626944610405286e-02 -1.5375841751499603e-01 1.0000000000000000e+00 6.9473684210526321e-01 4.8265927977839340e-01 3.3532118384604176e-01 3.3532118384604176e-01 5.6644850124556869e+07
0.0000000000000000e+00 -5.8562289370767968e-02 -2.2704687697958727e-02 1.8891110966162300e-01 1.9304159120581105e-01 -2.8025072642548346e-02 1.0084915180645485e-01 2.2167773225657120e-02 -9.4132618541648852e-02 1.0000000000000000e+00 7.0526315789473681e-01 4.9739612188365645e-01 3.5079515964426300e-01 3.5079515964426300e-01 5.8440306078570858e+07
0.0000000000000000e+00 1.1154062814447488e-03 -6.1845668350574423e-02 1.7838676171569481e-01 1.8256588961365544e-01 -1.1784073775053558e-01 -6.0375842901551121e-03 -8.4518672459887964e-05 1.2430275447415774e-01 1.0000000000000000e+00 7.1578947368421053e-01 5.1235457063711909e-01 3.6673800845604315e-01 3.6673800845604315e-01 6.0257698738376126e+07
0.0000000000000000e+00 1.2126126777430234e-01 -4.5417107870702682e-02 1.6129075321051228e-01 6.2171976860766448e-02 -1.7974985002261207e-01 -3.5857283869831884e-02 2.6828234871076329e-02 1.8779069754008593e-01 1.0000000000000000e+00 7.2631578947368425e-01 5.2753462603878121e-01 3.8315672838606218e-01 3.8315672838606218e-01 6.2097159447418898e+07
0.0000000000000000e+00 2.0517859877481684e-01 -2.1334707415184437e-02 1.5601870644972088e-01 -4.0075556165366157e-02 -5.2471386249378776e-02 -2.4190077782059498e-02 1.1020868196116446e-02 6.5434351891749037e-02 1.0000000000000000e+00 7.3684210526315785e-01 5.4293628808864258e-01 4.0005831753899979e-01 4.0005831753899979e-01 6.3958803792432591e+07
0.0000000000000000e+00 2.1608926552076596e-01 -7.5328926144319734e-03 1.4933335804331696e-01 -5.7204007150943592e-02 4.9663976708091320e-02 -1.0189388353867119e-02 -1.2692601934889872e-02 -2.5922752571419572e-02 1.0000000000000000e+00 7.4736842105263157e-01 5.5855955678670355e-01 4.1744977401953637e-01 4.1744977401953637e-01 6.5842753707203761e+07
0.0000000000000000e+00 1.8896268921011594e-01 -2.1538040621234878e-03 1.1198410285113368e-01 1.4294012610944554e-03 1.3840377788283981e-02 -3.2442869666311747e-03 1.0706580368252554e-01 -1.1732198074317117e-01 1.0000000000000000e+00 7.5789473684210529e-01 5.7440443213296399e-01 4.3533809593235173e-01 4.3533809593235173e-01 6.7749140587336496e+07
0.0000000000000000e+00 1.4689090968190335e-01 -5.2188135981721137e-04 3.1612713445013561e-02 1.2132514820777866e-01 7.7250795524835256e-02 -8.4664289216444095e-04 1.0224575003366272e-01 -1.7963810414761522e-01 1.0000000000000000e+00 7.6842105263157889e-01 5.9047091412742370e-01 4.5373028138212557e-01 4.5373028138212557e-01 6.9678108008850276e+07
0.0000000000000000e+00 5.5428105217004565e-02 5.8331679992476050e-03 3.9279208899906853e-02 1.9924655952782810e-01 1.6592346585094558e-01 -1.7738200432729435e-02 -1.1349049968501190e-01 -3.4901009676777048e-02 1.0000000000000000e+00 7.7894736842105261e-01 6.0675900277008310e-01 4.7263332847353839e-01 4.7263332847353839e-01 7.1629787035486892e+07
0.0000000000000000e+00 -4.2374488810218575e-02 6.6670845869462786e-02 1.2698999238053343e-01 1.4939937435892975e-01 7.7835621655044193e-02 -1.2441038191784466e-01 -1.2660664896928714e-01 1.7404064308000236e-01 1.0000000000000000e+00 7.8947368421052633e-01 6.2326869806094187e-01 4.9205423531126991e-01 4.9205423531126991e-01 7.3604299627690956e+07
0.0000000000000000e+00 -3.2893308102531944e-02 1.6919523171359327e-01 1.4407276285879275e-01 1.9847702790366510e-02 -1.0079071180019505e-01 -1.4116268491739059e-01 8.7603050781168701e-02 1.5469025969742412e-01 1.0000000000000000e+00 8.0000000000000004e-01 6.4000000000000012e-01 5.1200000000000012e-01 5.1200000000000012e-01 7.5601781557119071e+07
0.0000000000000000e+00 8.8477498470214216e-02 2.0115487455677278e-01 4.6255472056085531e-02 -3.6580955108194153e-02 -2.0082847814410132e-01 5.0426079480257409e-02 1.6823731193294467e-01 -1.8823114750382463e-02 1.0000000000000000e+00 8.1052631578947365e-01 6.5695290858725752e-01 5.3247762064440873e-01 5.3247762064440873e-01 7.7622359293640539e+07
0.0000000000000000e+00 1.8220796480361842e-01 1.1421213364901453e-01 -4.5225909367786872e-02 4.8592852559141052e-02 -5.3155055528298592e-02 1.8029804406925609e-01 5.8143939242299353e-02 -1.8549317172682961e-01 1.0000000000000000e+00 8.2105263157894737e-01 6.7412742382271473e-01 5.5349409534917626e-01 5.5349409534917626e-01 7.9666160208791256e+07
0.0000000000000000e+00 1.4422498690493521e-01 5.1499050785936029e-02 -5.9751297652897432e-02 1.6471298376073357e-01 1.6179298099065725e-01 -1.3371841757522693e-02 -2.4181809130075956e-02 -1.2338009625514385e-01 1.0000000000000000e+00 8.3157894736842108e-01 6.9152354570637120e-01 5.7505642221898245e-01 5.7505642221898245e-01 8.1733321542241797e+07
0.0000000000000000e+00 5.5629670341812348e-02 1.0950336742514234e-01 -3.6627101125479865e-02 1.7171645261874577e-01 6.8294444134684051e-02 -1.4395598224091452e-01 -3.3928990275288611e-02 1.0993044214252626e-01 1.0000000000000000e+00 8.4210526315789469e-01 7.0914127423822704e-01 5.9717159935850694e-01 5.9717159935850694e-01 8.3823983214631885e+07
0.0000000000000000e+00 8.1985713908860280e-02 1.5625982776754024e-01 -1.5453626016904153e-02 7.6514974315381998e-02 -1.3486343490355079e-01 1.5414729585004479e-02 -1.8809739866262339e-02 1.3727024370352697e-01 1.0000000000000000e+00 8.5263157894736841e-01 7.2698060941828258e-01 6.1984662487243036e-01 6.1984662487243036e-01 8.5938269240893766e+07
0.0000000000000000e+00 1.5134500251729271e-01 9.2988080474178669e-02 -5.0503926884532951e-03 6.0504351340969015e-02 -5.9895265756205913e-02 1.5629697850679977e-01 -7.1205556804665180e-03 -8.9487401013700105e-02 1.0000000000000000e+00 8.6315789473684212e-01 7.4504155124653748e-01 6.4308849686543235e-01 6.4308849686543235e-01 8.8076292616174176e+07
0.0000000000000000e+00 1.2238897232005523e-01 4.3987189692751984e-02 -1.9229168781730597e-04 1.3450185347371746e-01 1.4682437746526575e-01 -2.3523837684953017e-02 -5.9866528905411446e-03 -1.1645465304185681e-01 1.0000000000000000e+00 8.7368421052631584e-01 7.6332409972299176e-01 6.6690421344219286e-01 6.6690421344219286e-01 9.0238174707291275e+07
0.0000000000000000e+00 9.1381228485740811e-03 1.0735320829525780e-01 3.7424556698332306e-02 1.4630650141805643e-01 1.4143733075247705e-01 -1.4719358253292714e-01 -8.4437780177661043e-02 9.0533945719118311e-02 1.0000000000000000e+00 8.8421052631578945e-01 7.8182825484764540e-01 6.9130077270739165e-01 6.9130077270739165e-01 9.2424069093082637e+07
0.0000000000000000e+00 -5.7541893808193988e-02 1.5573835074099510e-01 1.3665006985949926e-01 6.4460363182578001e-02 2.2057317074319575e-02 1.4569894373150526e-02 -1.6105153695890059e-01 1.2343612403014882e-01 1.0000000000000000e+00 8.9473684210526316e-01 8.0055401662049863e-01 7.1628517276570935e-01 7.1628517276570935e-01 9.4634114440171003e+07
0.0000000000000000e+00 -5.3681574445025963e-02 9.2877426233830093e-02 2.0427338623106647e-01 5.6317803624116536e-02 -3.3254925682017572e-02 1.5610796628719659e-01 -2.7822830715261444e-02 -9.5236453833490359e-02 1.0000000000000000e+00 9.0526315789473688e-01 8.1950138504155134e-01 7.4186441172182538e-01 7.4186441172182538e-01 9.6868439125350088e+07
0.0000000000000000e+00 -2.8536438078967347e-02 4.3966158171504059e-02 1.5192364711076117e-01 1.3333235659540948e-01 -2.9598523553562982e-02 -2.3561230111389816e-02 1.7225863033171879e-01 -1.1823964281885121e-01 1.0000000000000000e+00 9.1578947368421049e-01 8.3867036011080320e-01 7.6804548768041980e-01 7.6804548768041980e-01 9.9127162507476807e+07
0.0000000000000000e+00 -1.0889922474070838e-02 1.0734956336301885e-01 5.7816670018619114e-02 1.4594607835265347e-01 -1.4443653468108263e-02 -1.4720026920754570e-01 7.1596698888554883e-02 9.0387137548106261e-02 1.0000000000000000e+00 9.2631578947368420e-01 8.5806094182825488e-01 7.9483539874617293e-01 7.9483539874617293e-01 1.0141041149911594e+08
0.0000000000000000e+00 1.4335783373479641e-02 1.5573758696441714e-01 8.2513820961616330e-02 4.6719698675365254e-02 -5.0533597611170288e-02 1.4569504528011028e-02 -1.3400655467981051e-01 1.6898244628168810e-01 1.0000000000000000e+00 9.3684210526315792e-01 8.7767313019390591e-01 8.2224114302376450e-01 8.2224114302376450e-01 1.0371832633873191e+08
0.0000000000000000e+00 1.0057881824010886e-01 8.6933603994233902e-02 1.5145674602386761e-01 -3.9182126614223267e-02 -1.5492378601430459e-01 1.7365715450032285e-01 -5.9704220427747284e-02 4.0764607998156246e-02 1.0000000000000000e+00 9.4736842105263153e-01 8.9750692520775610e-01 8.5026971861787415e-01 8.5026971861787415e-01 1.0605103777870619e+08
0.0000000000000000e+00 1.9404181414642413e-01 -2.2725719219206651e-02 1.2241000384130316e-01 6.9596250301867321e-03 -9.8293212348255610e-02 1.0081175938001805e-01 1.4686176989170255e-01 -1.4852108307555020e-01 1.0000000000000000e+00 9.5789473684210524e-01 9.1756232686980610e-01 8.7892812363318262e-01 8.7892812363318262e-01 1.0840866954560232e+08
0.0000000000000000e+00 2.2027124567695419e-01 -6.1849313282813366e-02 9.1417677808130239e-03 1.3257510965588765e-01 4.0325763516621441e-02 -6.0442709647736733e-03 1.4144401742709561e-01 -1.7507889086387318e-01 1.0000000000000000e+00 9.6842105263157896e-01 9.3783933518005547e-01 9.0822335617436944e-01 9.0822335617436944e-01 1.1079135118806735e+08
0.0000000000000000e+00 1.9888006090410909e-01 -4.5417691925627689e-02 -5.7541309753268981e-02 1.8570223692806570e-01 2.6839251289435345e-02 -3.5858382632556876e-02 2.2058415837044570e-02 3.1619246057030564e-02 1.0000000000000000e+00 9.7894736842105268e-01 9.5833795013850420e-01 9.3816241434611469e-01 9.3816241434611469e-01 1.1319921057647294e+08
0.0000000000000000e+00 1.6420852379208145e-01 -2.1334707415184437e-02 -5.3681574445025963e-02 1.0321791872096504e-01 3.7630676427107587e-02 -2.4190077782059498e-02 -3.3254925682017572e-02 1.9072667660191583e-01 1.0000000000000000e+00 9.8947368421052628e-01 9.7905817174515231e-01 9.6875229625309800e-01 9.6875229625309800e-01 1.1563238092136064e+08
0.0000000000000000e+00 9.2385672666928986e-02 -7.5328926144319734e-03 -2.8536438078967347e-02 -1.6553095053473525e-02 1.5410086651862076e-01 -1.0189388353867119e-02 -2.9598523553562982e-02 1.0889515078225322e-01 1.0000000000000000e+00 1.0000000000000000e+00 1.0000000000000000e+00 1.0000000000000000e+00 1.0000000000000000e+00 1.1809099716453132e+08
"""}
# convert to arrays for convenience
for key, val in fmristat.items():
    fmristat[key] = np.fromstring(val, sep='\t').reshape(N_ROWS, -1).T
# time vector: frame midpoints, assuming a TR of 2.5 s (offset = TR / 2)
time_vector = np.arange(N_ROWS) * 2.5 + 1.25
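The conversion loop above parses each whitespace-separated data string into a 2-D array whose columns are the original text rows. A minimal sketch of the same parse on a toy string (the sample values and shape here are made up for illustration, standing in for the full fmristat data):

```python
import numpy as np

# Tiny tab/newline-separated block in place of the full data string.
sample = "1.0\t2.0\t3.0\n4.0\t5.0\t6.0"

# np.fromstring in text mode skips surrounding whitespace, so the
# newlines between rows are absorbed along with the tab separators.
flat = np.fromstring(sample, sep='\t')

# Equivalent parse without np.fromstring, for newer NumPy versions:
flat_alt = np.array(sample.split(), dtype=float)
assert np.allclose(flat, flat_alt)

# As in the loop above: text rows become array columns after the transpose.
cols = flat.reshape(2, -1).T
assert cols.shape == (3, 2)
```

The transpose is what lets downstream code index each measured quantity as a column vector rather than slicing rows out of the raw text layout.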
# This is a generated file, so don't change it manually.
# To re-generate it, run 'python gen-licenses.py' from the project root.
LICENSES = [
{
"Name": "adal",
"Version": "1.2.2",
"Summary": "The ADAL for Python library makes it easy for python application to authenticate to Azure Active Directory (AAD) in order to access AAD protected web resources.",
"Home-page": "https://github.com/AzureAD/azure-activedirectory-library-for-python",
"Author": "Microsoft Corporation",
"License": "Other",
"License URL": "https://api.github.com/repos/azuread/azure-activedirectory-library-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) Microsoft Corporation. \nAll rights reserved.\n\nThis code is licensed under the MIT License.\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files(the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and / or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions :\n\nThe above copyright notice and this permission notice shall be included in\nall copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN\nTHE SOFTWARE."
},
{
"Name": "ansible",
"Version": "2.8.8",
"Summary": "Radically simple IT automation",
"Home-page": "https://ansible.com/",
"Author": "Ansible, Inc.",
"License": "GPLv3+"
},
{
"Name": "antlr4-python3-runtime",
"Version": "4.7.2",
"Summary": "ANTLR 4.7.2 runtime for Python 3.6.3",
"Home-page": "http://www.antlr.org",
"Author": "Eric Vergnaud, Terence Parr, Sam Harwell",
"License": "BSD"
},
{
"Name": "applicationinsights",
"Version": "0.11.7",
"Summary": "This project extends the Application Insights API surface to support Python.",
"Home-page": "https://github.com/Microsoft/ApplicationInsights-Python",
"Author": "Microsoft",
"License": "MIT License",
"License URL": "https://api.github.com/repos/microsoft/applicationinsights-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2018 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "argcomplete",
"Version": "1.10.0",
"Summary": "Bash tab completion for argparse",
"Home-page": "https://github.com/kislyuk/argcomplete",
"Author": "Andrey Kislyuk",
"License": "Apache License 2.0",
"License URL": "https://api.github.com/repos/kislyuk/argcomplete/license",
        "License repo": "\n Apache License\n Version 2.0, January 2004\n http://www.apache.org/licenses/\n\n TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n 1. Definitions.\n\n \"License\" shall mean the terms and conditions for use, reproduction,\n and distribution as defined by Sections 1 through 9 of this document.\n\n \"Licensor\" shall mean the copyright owner or entity authorized by\n the copyright owner that is granting the License.\n\n \"Legal Entity\" shall mean the union of the acting entity and all\n other entities that control, are controlled by, or are under common\n control with that entity. For the purposes of this definition,\n \"control\" means (i) the power, direct or indirect, to cause the\n direction or management of such entity, whether by contract or\n otherwise, or (ii) ownership of fifty percent (50%) or more of the\n outstanding shares, or (iii) beneficial ownership of such entity.\n\n \"You\" (or \"Your\") shall mean an individual or Legal Entity\n exercising permissions granted by this License.\n\n \"Source\" form shall mean the preferred form for making modifications,\n including but not limited to software source code, documentation\n source, and configuration files.\n\n \"Object\" form shall mean any form resulting from mechanical\n transformation or translation of a Source form, including but\n not limited to compiled object code, generated documentation,\n and conversions to other media types.\n\n \"Work\" shall mean the work of authorship, whether in Source or\n Object form, made available under the License, as indicated by a\n copyright notice that is included in or attached to the work\n (an example is provided in the Appendix below).\n\n \"Derivative Works\" shall mean any work, whether in Source or Object\n form, that is based on (or derived from) the Work and for which the\n editorial revisions, annotations, elaborations, or other modifications\n represent, as a whole, an original work of authorship. For the purposes\n of this License, Derivative Works shall not include works that remain\n separable from, or merely link (or bind by name) to the interfaces of,\n the Work and Derivative Works thereof.\n\n \"Contribution\" shall mean any work of authorship, including\n the original version of the Work and any modifications or additions\n to that Work or Derivative Works thereof, that is intentionally\n submitted to Licensor for inclusion in the Work by the copyright owner\n or by an individual or Legal Entity authorized to submit on behalf of\n the copyright owner. For the purposes of this definition, \"submitted\"\n means any form of electronic, verbal, or written communication sent\n to the Licensor or its representatives, including but not limited to\n communication on electronic mailing lists, source code control systems,\n and issue tracking systems that are managed by, or on behalf of, the\n Licensor for the purpose of discussing and improving the Work, but\n excluding communication that is conspicuously marked or otherwise\n designated in writing by the copyright owner as \"Not a Contribution.\"\n\n \"Contributor\" shall mean Licensor and any individual or Legal Entity\n on behalf of whom a Contribution has been received by Licensor and\n subsequently incorporated within the Work.\n\n 2. Grant of Copyright License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n copyright license to reproduce, prepare Derivative Works of,\n publicly display, publicly perform, sublicense, and distribute the\n Work and such Derivative Works in Source or Object form.\n\n 3. Grant of Patent License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n (except as stated in this section) patent license to make, have made,\n use, offer to sell, sell, import, and otherwise transfer the Work,\n where such license applies only to those patent claims licensable\n by such Contributor that are necessarily infringed by their\n Contribution(s) alone or by combination of their Contribution(s)\n with the Work to which such Contribution(s) was submitted. If You\n institute patent litigation against any entity (including a\n cross-claim or counterclaim in a lawsuit) alleging that the Work\n or a Contribution incorporated within the Work constitutes direct\n or contributory patent infringement, then any patent licenses\n granted to You under this License for that Work shall terminate\n as of the date such litigation is filed.\n\n 4. Redistribution. You may reproduce and distribute copies of the\n Work or Derivative Works thereof in any medium, with or without\n modifications, and in Source or Object form, provided that You\n meet the following conditions:\n\n (a) You must give any other recipients of the Work or\n Derivative Works a copy of this License; and\n\n (b) You must cause any modified files to carry prominent notices\n stating that You changed the files; and\n\n (c) You must retain, in the Source form of any Derivative Works\n that You distribute, all copyright, patent, trademark, and\n attribution notices from the Source form of the Work,\n excluding those notices that do not pertain to any part of\n the Derivative Works; and\n\n (d) If the Work includes a \"NOTICE\" text file as part of its\n distribution, then any Derivative Works that You distribute must\n include a readable copy of the attribution notices contained\n within such NOTICE file, excluding those notices that do not\n pertain to any part of the Derivative Works, in at least one\n of 
the following places: within a NOTICE text file distributed\n as part of the Derivative Works; within the Source form or\n documentation, if provided along with the Derivative Works; or,\n within a display generated by the Derivative Works, if and\n wherever such third-party notices normally appear. The contents\n of the NOTICE file are for informational purposes only and\n do not modify the License. You may add Your own attribution\n notices within Derivative Works that You distribute, alongside\n or as an addendum to the NOTICE text from the Work, provided\n that such additional attribution notices cannot be construed\n as modifying the License.\n\n You may add Your own copyright statement to Your modifications and\n may provide additional or different license terms and conditions\n for use, reproduction, or distribution of Your modifications, or\n for any such Derivative Works as a whole, provided Your use,\n reproduction, and distribution of the Work otherwise complies with\n the conditions stated in this License.\n\n 5. Submission of Contributions. Unless You explicitly state otherwise,\n any Contribution intentionally submitted for inclusion in the Work\n by You to the Licensor shall be under the terms and conditions of\n this License, without any additional terms or conditions.\n Notwithstanding the above, nothing herein shall supersede or modify\n the terms of any separate license agreement you may have executed\n with Licensor regarding such Contributions.\n\n 6. Trademarks. This License does not grant permission to use the trade\n names, trademarks, service marks, or product names of the Licensor,\n except as required for reasonable and customary use in describing the\n origin of the Work and reproducing the content of the NOTICE file.\n\n 7. Disclaimer of Warranty. 
Unless required by applicable law or\n agreed to in writing, Licensor provides the Work (and each\n Contributor provides its Contributions) on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n implied, including, without limitation, any warranties or conditions\n of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n PARTICULAR PURPOSE. You are solely responsible for determining the\n appropriateness of using or redistributing the Work and assume any\n risks associated with Your exercise of permissions under this License.\n\n 8. Limitation of Liability. In no event and under no legal theory,\n whether in tort (including negligence), contract, or otherwise,\n unless required by applicable law (such as deliberate and grossly\n negligent acts) or agreed to in writing, shall any Contributor be\n liable to You for damages, including any direct, indirect, special,\n incidental, or consequential damages of any character arising as a\n result of this License or out of the use or inability to use the\n Work (including but not limited to damages for loss of goodwill,\n work stoppage, computer failure or malfunction, or any and all\n other commercial damages or losses), even if such Contributor\n has been advised of the possibility of such damages.\n\n 9. Accepting Warranty or Additional Liability. While redistributing\n the Work or Derivative Works thereof, You may choose to offer,\n and charge a fee for, acceptance of support, warranty, indemnity,\n or other liability obligations and/or rights consistent with this\n License. 
However, in accepting such obligations, You may act only\n on Your own behalf and on Your sole responsibility, not on behalf\n of any other Contributor, and only if You agree to indemnify,\n defend, and hold each Contributor harmless for any liability\n incurred by, or claims asserted against, such Contributor by reason\n of your accepting any such warranty or additional liability.\n\n END OF TERMS AND CONDITIONS\n",
"License text": " Apache License\n Version 2.0, January 2004\n http://www.apache.org/licenses/\n\n TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n 1. Definitions.\n\n \"License\" shall mean the terms and conditions for use, reproduction,\n and distribution as defined by Sections 1 through 9 of this document.\n\n \"Licensor\" shall mean the copyright owner or entity authorized by\n the copyright owner that is granting the License.\n\n \"Legal Entity\" shall mean the union of the acting entity and all\n other entities that control, are controlled by, or are under common\n control with that entity. For the purposes of this definition,\n \"control\" means (i) the power, direct or indirect, to cause the\n direction or management of such entity, whether by contract or\n otherwise, or (ii) ownership of fifty percent (50%) or more of the\n outstanding shares, or (iii) beneficial ownership of such entity.\n\n \"You\" (or \"Your\") shall mean an individual or Legal Entity\n exercising permissions granted by this License.\n\n \"Source\" form shall mean the preferred form for making modifications,\n including but not limited to software source code, documentation\n source, and configuration files.\n\n \"Object\" form shall mean any form resulting from mechanical\n transformation or translation of a Source form, including but\n not limited to compiled object code, generated documentation,\n and conversions to other media types.\n\n \"Work\" shall mean the work of authorship, whether in Source or\n Object form, made available under the License, as indicated by a\n copyright notice that is included in or attached to the work\n (an example is provided in the Appendix below).\n\n \"Derivative Works\" shall mean any work, whether in Source or Object\n form, that is based on (or derived from) the Work and for which the\n editorial revisions, annotations, elaborations, or other modifications\n represent, as a whole, an original work of authorship. 
For the purposes\n of this License, Derivative Works shall not include works that remain\n separable from, or merely link (or bind by name) to the interfaces of,\n the Work and Derivative Works thereof.\n\n \"Contribution\" shall mean any work of authorship, including\n the original version of the Work and any modifications or additions\n to that Work or Derivative Works thereof, that is intentionally\n submitted to Licensor for inclusion in the Work by the copyright owner\n or by an individual or Legal Entity authorized to submit on behalf of\n the copyright owner. For the purposes of this definition, \"submitted\"\n means any form of electronic, verbal, or written communication sent\n to the Licensor or its representatives, including but not limited to\n communication on electronic mailing lists, source code control systems,\n and issue tracking systems that are managed by, or on behalf of, the\n Licensor for the purpose of discussing and improving the Work, but\n excluding communication that is conspicuously marked or otherwise\n designated in writing by the copyright owner as \"Not a Contribution.\"\n\n \"Contributor\" shall mean Licensor and any individual or Legal Entity\n on behalf of whom a Contribution has been received by Licensor and\n subsequently incorporated within the Work.\n\n 2. Grant of Copyright License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n copyright license to reproduce, prepare Derivative Works of,\n publicly display, publicly perform, sublicense, and distribute the\n Work and such Derivative Works in Source or Object form.\n\n 3. Grant of Patent License. 
Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n (except as stated in this section) patent license to make, have made,\n use, offer to sell, sell, import, and otherwise transfer the Work,\n where such license applies only to those patent claims licensable\n by such Contributor that are necessarily infringed by their\n Contribution(s) alone or by combination of their Contribution(s)\n with the Work to which such Contribution(s) was submitted. If You\n institute patent litigation against any entity (including a\n cross-claim or counterclaim in a lawsuit) alleging that the Work\n or a Contribution incorporated within the Work constitutes direct\n or contributory patent infringement, then any patent licenses\n granted to You under this License for that Work shall terminate\n as of the date such litigation is filed.\n\n 4. Redistribution. You may reproduce and distribute copies of the\n Work or Derivative Works thereof in any medium, with or without\n modifications, and in Source or Object form, provided that You\n meet the following conditions:\n\n (a) You must give any other recipients of the Work or\n Derivative Works a copy of this License; and\n\n (b) You must cause any modified files to carry prominent notices\n stating that You changed the files; and\n\n (c) You must retain, in the Source form of any Derivative Works\n that You distribute, all copyright, patent, trademark, and\n attribution notices from the Source form of the Work,\n excluding those notices that do not pertain to any part of\n the Derivative Works; and\n\n (d) If the Work includes a \"NOTICE\" text file as part of its\n distribution, then any Derivative Works that You distribute must\n include a readable copy of the attribution notices contained\n within such NOTICE file, excluding those notices that do not\n pertain to any part of the Derivative Works, in at least one\n of 
the following places: within a NOTICE text file distributed\n as part of the Derivative Works; within the Source form or\n documentation, if provided along with the Derivative Works; or,\n within a display generated by the Derivative Works, if and\n wherever such third-party notices normally appear. The contents\n of the NOTICE file are for informational purposes only and\n do not modify the License. You may add Your own attribution\n notices within Derivative Works that You distribute, alongside\n or as an addendum to the NOTICE text from the Work, provided\n that such additional attribution notices cannot be construed\n as modifying the License.\n\n You may add Your own copyright statement to Your modifications and\n may provide additional or different license terms and conditions\n for use, reproduction, or distribution of Your modifications, or\n for any such Derivative Works as a whole, provided Your use,\n reproduction, and distribution of the Work otherwise complies with\n the conditions stated in this License.\n\n 5. Submission of Contributions. Unless You explicitly state otherwise,\n any Contribution intentionally submitted for inclusion in the Work\n by You to the Licensor shall be under the terms and conditions of\n this License, without any additional terms or conditions.\n Notwithstanding the above, nothing herein shall supersede or modify\n the terms of any separate license agreement you may have executed\n with Licensor regarding such Contributions.\n\n 6. Trademarks. This License does not grant permission to use the trade\n names, trademarks, service marks, or product names of the Licensor,\n except as required for reasonable and customary use in describing the\n origin of the Work and reproducing the content of the NOTICE file.\n\n 7. Disclaimer of Warranty. 
Unless required by applicable law or\n agreed to in writing, Licensor provides the Work (and each\n Contributor provides its Contributions) on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n implied, including, without limitation, any warranties or conditions\n of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n PARTICULAR PURPOSE. You are solely responsible for determining the\n appropriateness of using or redistributing the Work and assume any\n risks associated with Your exercise of permissions under this License.\n\n 8. Limitation of Liability. In no event and under no legal theory,\n whether in tort (including negligence), contract, or otherwise,\n unless required by applicable law (such as deliberate and grossly\n negligent acts) or agreed to in writing, shall any Contributor be\n liable to You for damages, including any direct, indirect, special,\n incidental, or consequential damages of any character arising as a\n result of this License or out of the use or inability to use the\n Work (including but not limited to damages for loss of goodwill,\n work stoppage, computer failure or malfunction, or any and all\n other commercial damages or losses), even if such Contributor\n has been advised of the possibility of such damages.\n\n 9. Accepting Warranty or Additional Liability. While redistributing\n the Work or Derivative Works thereof, You may choose to offer,\n and charge a fee for, acceptance of support, warranty, indemnity,\n or other liability obligations and/or rights consistent with this\n License. 
However, in accepting such obligations, You may act only\n on Your own behalf and on Your sole responsibility, not on behalf\n of any other Contributor, and only if You agree to indemnify,\n defend, and hold each Contributor harmless for any liability\n incurred by, or claims asserted against, such Contributor by reason\n of your accepting any such warranty or additional liability.\n\n END OF TERMS AND CONDITIONS\n\n APPENDIX: How to apply the Apache License to your work.\n\n To apply the Apache License to your work, attach the following\n boilerplate notice, with the fields enclosed by brackets \"[]\"\n replaced with your own identifying information. (Don't include\n the brackets!) The text should be enclosed in the appropriate\n comment syntax for the file format. We also recommend that a\n file or class name and description of purpose be included on the\n same \"printed page\" as the copyright notice for easier\n identification within third-party archives.\n\n Copyright [yyyy] [name of copyright owner]\n\n Licensed under the Apache License, Version 2.0 (the \"License\");\n you may not use this file except in compliance with the License.\n You may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n See the License for the specific language governing permissions and\n limitations under the License.\n"
},
{
"Name": "astroid",
"Version": "2.3.3",
"Summary": "An abstract syntax tree for Python with inference support.",
"Home-page": "https://github.com/PyCQA/astroid",
"Author": "Python Code Quality Authority",
"License": "GNU Lesser General Public License v2.1",
"License URL": "https://api.github.com/repos/pycqa/astroid/license",
"License repo": "\t\t GNU GENERAL PUBLIC LICENSE\n\t\t Version 2, June 1991\n\n Copyright (C) 1989, 1991 Free Software Foundation, Inc.,\n 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA\n Everyone is permitted to copy and distribute verbatim copies\n of this license document, but changing it is not allowed.\n\n\t\t\t Preamble\n\n The licenses for most software are designed to take away your\nfreedom to share and change it. By contrast, the GNU General Public\nLicense is intended to guarantee your freedom to share and change free\nsoftware--to make sure the software is free for all its users. This\nGeneral Public License applies to most of the Free Software\nFoundation's software and to any other program whose authors commit to\nusing it. (Some other Free Software Foundation software is covered by\nthe GNU Lesser General Public License instead.) You can apply it to\nyour programs, too.\n\n When we speak of free software, we are referring to freedom, not\nprice. Our General Public Licenses are designed to make sure that you\nhave the freedom to distribute copies of free software (and charge for\nthis service if you wish), that you receive source code or can get it\nif you want it, that you can change the software or use pieces of it\nin new free programs; and that you know you can do these things.\n\n To protect your rights, we need to make restrictions that forbid\nanyone to deny you these rights or to ask you to surrender the rights.\nThese restrictions translate to certain responsibilities for you if you\ndistribute copies of the software, or if you modify it.\n\n For example, if you distribute copies of such a program, whether\ngratis or for a fee, you must give the recipients all the rights that\nyou have. You must make sure that they, too, receive or can get the\nsource code. 
And you must show them these terms so they know their\nrights.\n\n We protect your rights with two steps: (1) copyright the software, and\n(2) offer you this license which gives you legal permission to copy,\ndistribute and/or modify the software.\n\n Also, for each author's protection and ours, we want to make certain\nthat everyone understands that there is no warranty for this free\nsoftware. If the software is modified by someone else and passed on, we\nwant its recipients to know that what they have is not the original, so\nthat any problems introduced by others will not reflect on the original\nauthors' reputations.\n\n Finally, any free program is threatened constantly by software\npatents. We wish to avoid the danger that redistributors of a free\nprogram will individually obtain patent licenses, in effect making the\nprogram proprietary. To prevent this, we have made it clear that any\npatent must be licensed for everyone's free use or not licensed at all.\n\n The precise terms and conditions for copying, distribution and\nmodification follow.\n\n\t\t GNU GENERAL PUBLIC LICENSE\n TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION\n\n 0. This License applies to any program or other work which contains\na notice placed by the copyright holder saying it may be distributed\nunder the terms of this General Public License. The \"Program\", below,\nrefers to any such program or work, and a \"work based on the Program\"\nmeans either the Program or any derivative work under copyright law:\nthat is to say, a work containing the Program or a portion of it,\neither verbatim or with modifications and/or translated into another\nlanguage. (Hereinafter, translation is included without limitation in\nthe term \"modification\".) Each licensee is addressed as \"you\".\n\nActivities other than copying, distribution and modification are not\ncovered by this License; they are outside its scope. 
The act of\nrunning the Program is not restricted, and the output from the Program\nis covered only if its contents constitute a work based on the\nProgram (independent of having been made by running the Program).\nWhether that is true depends on what the Program does.\n\n 1. You may copy and distribute verbatim copies of the Program's\nsource code as you receive it, in any medium, provided that you\nconspicuously and appropriately publish on each copy an appropriate\ncopyright notice and disclaimer of warranty; keep intact all the\nnotices that refer to this License and to the absence of any warranty;\nand give any other recipients of the Program a copy of this License\nalong with the Program.\n\nYou may charge a fee for the physical act of transferring a copy, and\nyou may at your option offer warranty protection in exchange for a fee.\n\n 2. You may modify your copy or copies of the Program or any portion\nof it, thus forming a work based on the Program, and copy and\ndistribute such modifications or work under the terms of Section 1\nabove, provided that you also meet all of these conditions:\n\n a) You must cause the modified files to carry prominent notices\n stating that you changed the files and the date of any change.\n\n b) You must cause any work that you distribute or publish, that in\n whole or in part contains or is derived from the Program or any\n part thereof, to be licensed as a whole at no charge to all third\n parties under the terms of this License.\n\n c) If the modified program normally reads commands interactively\n when run, you must cause it, when started running for such\n interactive use in the most ordinary way, to print or display an\n announcement including an appropriate copyright notice and a\n notice that there is no warranty (or else, saying that you provide\n a warranty) and that users may redistribute the program under\n these conditions, and telling the user how to view a copy of this\n License. 
(Exception: if the Program itself is interactive but\n does not normally print such an announcement, your work based on\n the Program is not required to print an announcement.)\n\nThese requirements apply to the modified work as a whole. If\nidentifiable sections of that work are not derived from the Program,\nand can be reasonably considered independent and separate works in\nthemselves, then this License, and its terms, do not apply to those\nsections when you distribute them as separate works. But when you\ndistribute the same sections as part of a whole which is a work based\non the Program, the distribution of the whole must be on the terms of\nthis License, whose permissions for other licensees extend to the\nentire whole, and thus to each and every part regardless of who wrote it.\n\nThus, it is not the intent of this section to claim rights or contest\nyour rights to work written entirely by you; rather, the intent is to\nexercise the right to control the distribution of derivative or\ncollective works based on the Program.\n\nIn addition, mere aggregation of another work not based on the Program\nwith the Program (or with a work based on the Program) on a volume of\na storage or distribution medium does not bring the other work under\nthe scope of this License.\n\n 3. 
You may copy and distribute the Program (or a work based on it,\nunder Section 2) in object code or executable form under the terms of\nSections 1 and 2 above provided that you also do one of the following:\n\n a) Accompany it with the complete corresponding machine-readable\n source code, which must be distributed under the terms of Sections\n 1 and 2 above on a medium customarily used for software interchange; or,\n\n b) Accompany it with a written offer, valid for at least three\n years, to give any third party, for a charge no more than your\n cost of physically performing source distribution, a complete\n machine-readable copy of the corresponding source code, to be\n distributed under the terms of Sections 1 and 2 above on a medium\n customarily used for software interchange; or,\n\n c) Accompany it with the information you received as to the offer\n to distribute corresponding source code. (This alternative is\n allowed only for noncommercial distribution and only if you\n received the program in object code or executable form with such\n an offer, in accord with Subsection b above.)\n\nThe source code for a work means the preferred form of the work for\nmaking modifications to it. For an executable work, complete source\ncode means all the source code for all modules it contains, plus any\nassociated interface definition files, plus the scripts used to\ncontrol compilation and installation of the executable. 
However, as a\nspecial exception, the source code distributed need not include\nanything that is normally distributed (in either source or binary\nform) with the major components (compiler, kernel, and so on) of the\noperating system on which the executable runs, unless that component\nitself accompanies the executable.\n\nIf distribution of executable or object code is made by offering\naccess to copy from a designated place, then offering equivalent\naccess to copy the source code from the same place counts as\ndistribution of the source code, even though third parties are not\ncompelled to copy the source along with the object code.\n\n 4. You may not copy, modify, sublicense, or distribute the Program\nexcept as expressly provided under this License. Any attempt\notherwise to copy, modify, sublicense or distribute the Program is\nvoid, and will automatically terminate your rights under this License.\nHowever, parties who have received copies, or rights, from you under\nthis License will not have their licenses terminated so long as such\nparties remain in full compliance.\n\n 5. You are not required to accept this License, since you have not\nsigned it. However, nothing else grants you permission to modify or\ndistribute the Program or its derivative works. These actions are\nprohibited by law if you do not accept this License. Therefore, by\nmodifying or distributing the Program (or any work based on the\nProgram), you indicate your acceptance of this License to do so, and\nall its terms and conditions for copying, distributing or modifying\nthe Program or works based on it.\n\n 6. Each time you redistribute the Program (or any work based on the\nProgram), the recipient automatically receives a license from the\noriginal licensor to copy, distribute or modify the Program subject to\nthese terms and conditions. 
You may not impose any further\nrestrictions on the recipients' exercise of the rights granted herein.\nYou are not responsible for enforcing compliance by third parties to\nthis License.\n\n 7. If, as a consequence of a court judgment or allegation of patent\ninfringement or for any other reason (not limited to patent issues),\nconditions are imposed on you (whether by court order, agreement or\notherwise) that contradict the conditions of this License, they do not\nexcuse you from the conditions of this License. If you cannot\ndistribute so as to satisfy simultaneously your obligations under this\nLicense and any other pertinent obligations, then as a consequence you\nmay not distribute the Program at all. For example, if a patent\nlicense would not permit royalty-free redistribution of the Program by\nall those who receive copies directly or indirectly through you, then\nthe only way you could satisfy both it and this License would be to\nrefrain entirely from distribution of the Program.\n\nIf any portion of this section is held invalid or unenforceable under\nany particular circumstance, the balance of the section is intended to\napply and the section as a whole is intended to apply in other\ncircumstances.\n\nIt is not the purpose of this section to induce you to infringe any\npatents or other property right claims or to contest validity of any\nsuch claims; this section has the sole purpose of protecting the\nintegrity of the free software distribution system, which is\nimplemented by public license practices. Many people have made\ngenerous contributions to the wide range of software distributed\nthrough that system in reliance on consistent application of that\nsystem; it is up to the author/donor to decide if he or she is willing\nto distribute software through any other system and a licensee cannot\nimpose that choice.\n\nThis section is intended to make thoroughly clear what is believed to\nbe a consequence of the rest of this License.\n\n 8. 
If the distribution and/or use of the Program is restricted in\ncertain countries either by patents or by copyrighted interfaces, the\noriginal copyright holder who places the Program under this License\nmay add an explicit geographical distribution limitation excluding\nthose countries, so that distribution is permitted only in or among\ncountries not thus excluded. In such case, this License incorporates\nthe limitation as if written in the body of this License.\n\n 9. The Free Software Foundation may publish revised and/or new versions\nof the General Public License from time to time. Such new versions will\nbe similar in spirit to the present version, but may differ in detail to\naddress new problems or concerns.\n\nEach version is given a distinguishing version number. If the Program\nspecifies a version number of this License which applies to it and \"any\nlater version\", you have the option of following the terms and conditions\neither of that version or of any later version published by the Free\nSoftware Foundation. If the Program does not specify a version number of\nthis License, you may choose any version ever published by the Free Software\nFoundation.\n\n 10. If you wish to incorporate parts of the Program into other free\nprograms whose distribution conditions are different, write to the author\nto ask for permission. For software which is copyrighted by the Free\nSoftware Foundation, write to the Free Software Foundation; we sometimes\nmake exceptions for this. Our decision will be guided by the two goals\nof preserving the free status of all derivatives of our free software and\nof promoting the sharing and reuse of software generally.\n\n\t\t\t NO WARRANTY\n\n 11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY\nFOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. 
EXCEPT WHEN\nOTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES\nPROVIDE THE PROGRAM \"AS IS\" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED\nOR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF\nMERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS\nTO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE\nPROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,\nREPAIR OR CORRECTION.\n\n 12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING\nWILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR\nREDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,\nINCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING\nOUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED\nTO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY\nYOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER\nPROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE\nPOSSIBILITY OF SUCH DAMAGES.\n\n\t\t END OF TERMS AND CONDITIONS\n\n\t How to Apply These Terms to Your New Programs\n\n If you develop a new program, and you want it to be of the greatest\npossible use to the public, the best way to achieve this is to make it\nfree software which everyone can redistribute and change under these terms.\n\n To do so, attach the following notices to the program. 
It is safest\nto attach them to the start of each source file to most effectively\nconvey the exclusion of warranty; and each file should have at least\nthe \"copyright\" line and a pointer to where the full notice is found.\n\n <one line to give the program's name and a brief idea of what it does.>\n Copyright (C) <year> <name of author>\n\n This program is free software; you can redistribute it and/or modify\n it under the terms of the GNU General Public License as published by\n the Free Software Foundation; either version 2 of the License, or\n (at your option) any later version.\n\n This program is distributed in the hope that it will be useful,\n but WITHOUT ANY WARRANTY; without even the implied warranty of\n MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n GNU General Public License for more details.\n\n You should have received a copy of the GNU General Public License along\n with this program; if not, write to the Free Software Foundation, Inc.,\n 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.\n\nAlso add information on how to contact you by electronic and paper mail.\n\nIf the program is interactive, make it output a short notice like this\nwhen it starts in an interactive mode:\n\n Gnomovision version 69, Copyright (C) year name of author\n Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.\n This is free software, and you are welcome to redistribute it\n under certain conditions; type `show c' for details.\n\nThe hypothetical commands `show w' and `show c' should show the appropriate\nparts of the General Public License. Of course, the commands you use may\nbe called something other than `show w' and `show c'; they could even be\nmouse-clicks or menu items--whatever suits your program.\n\nYou should also get your employer (if you work as a programmer) or your\nschool, if any, to sign a \"copyright disclaimer\" for the program, if\nnecessary. 
Here is a sample; alter the names:\n\n Yoyodyne, Inc., hereby disclaims all copyright interest in the program\n `Gnomovision' (which makes passes at compilers) written by James Hacker.\n\n <signature of Ty Coon>, 1 April 1989\n Ty Coon, President of Vice\n\nThis General Public License does not permit incorporating your program into\nproprietary programs. If your program is a subroutine library, you may\nconsider it more useful to permit linking proprietary applications with the\nlibrary. If this is what you want to do, use the GNU Lesser General\nPublic License instead of this License.\n",
"License text": " GNU LESSER GENERAL PUBLIC LICENSE\n Version 2.1, February 1999\n\n Copyright (C) 1991, 1999 Free Software Foundation, Inc.\n 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA\n Everyone is permitted to copy and distribute verbatim copies\n of this license document, but changing it is not allowed.\n\n[This is the first released version of the Lesser GPL. It also counts\n as the successor of the GNU Library Public License, version 2, hence\n the version number 2.1.]\n\n Preamble\n\n The licenses for most software are designed to take away your\nfreedom to share and change it. By contrast, the GNU General Public\nLicenses are intended to guarantee your freedom to share and change\nfree software--to make sure the software is free for all its users.\n\n This license, the Lesser General Public License, applies to some\nspecially designated software packages--typically libraries--of the\nFree Software Foundation and other authors who decide to use it. You\ncan use it too, but we suggest you first think carefully about whether\nthis license or the ordinary General Public License is the better\nstrategy to use in any particular case, based on the explanations below.\n\n When we speak of free software, we are referring to freedom of use,\nnot price. Our General Public Licenses are designed to make sure that\nyou have the freedom to distribute copies of free software (and charge\nfor this service if you wish); that you receive source code or can get\nit if you want it; that you can change the software and use pieces of\nit in new free programs; and that you are informed that you can do\nthese things.\n\n To protect your rights, we need to make restrictions that forbid\ndistributors to deny you these rights or to ask you to surrender these\nrights. 
These restrictions translate to certain responsibilities for\nyou if you distribute copies of the library or if you modify it.\n\n For example, if you distribute copies of the library, whether gratis\nor for a fee, you must give the recipients all the rights that we gave\nyou. You must make sure that they, too, receive or can get the source\ncode. If you link other code with the library, you must provide\ncomplete object files to the recipients, so that they can relink them\nwith the library after making changes to the library and recompiling\nit. And you must show them these terms so they know their rights.\n\n We protect your rights with a two-step method: (1) we copyright the\nlibrary, and (2) we offer you this license, which gives you legal\npermission to copy, distribute and/or modify the library.\n\n To protect each distributor, we want to make it very clear that\nthere is no warranty for the free library. Also, if the library is\nmodified by someone else and passed on, the recipients should know\nthat what they have is not the original version, so that the original\nauthor's reputation will not be affected by problems that might be\nintroduced by others.\n\n Finally, software patents pose a constant threat to the existence of\nany free program. We wish to make sure that a company cannot\neffectively restrict the users of a free program by obtaining a\nrestrictive license from a patent holder. Therefore, we insist that\nany patent license obtained for a version of the library must be\nconsistent with the full freedom of use specified in this license.\n\n Most GNU software, including some libraries, is covered by the\nordinary GNU General Public License. This license, the GNU Lesser\nGeneral Public License, applies to certain designated libraries, and\nis quite different from the ordinary General Public License. 
We use\nthis license for certain libraries in order to permit linking those\nlibraries into non-free programs.\n\n When a program is linked with a library, whether statically or using\na shared library, the combination of the two is legally speaking a\ncombined work, a derivative of the original library. The ordinary\nGeneral Public License therefore permits such linking only if the\nentire combination fits its criteria of freedom. The Lesser General\nPublic License permits more lax criteria for linking other code with\nthe library.\n\n We call this license the \"Lesser\" General Public License because it\ndoes Less to protect the user's freedom than the ordinary General\nPublic License. It also provides other free software developers Less\nof an advantage over competing non-free programs. These disadvantages\nare the reason we use the ordinary General Public License for many\nlibraries. However, the Lesser license provides advantages in certain\nspecial circumstances.\n\n For example, on rare occasions, there may be a special need to\nencourage the widest possible use of a certain library, so that it becomes\na de-facto standard. To achieve this, non-free programs must be\nallowed to use the library. A more frequent case is that a free\nlibrary does the same job as widely used non-free libraries. In this\ncase, there is little to gain by limiting the free library to free\nsoftware only, so we use the Lesser General Public License.\n\n In other cases, permission to use a particular library in non-free\nprograms enables a greater number of people to use a large body of\nfree software. 
For example, permission to use the GNU C Library in\nnon-free programs enables many more people to use the whole GNU\noperating system, as well as its variant, the GNU/Linux operating\nsystem.\n\n Although the Lesser General Public License is Less protective of the\nusers' freedom, it does ensure that the user of a program that is\nlinked with the Library has the freedom and the wherewithal to run\nthat program using a modified version of the Library.\n\n The precise terms and conditions for copying, distribution and\nmodification follow. Pay close attention to the difference between a\n\"work based on the library\" and a \"work that uses the library\". The\nformer contains code derived from the library, whereas the latter must\nbe combined with the library in order to run.\n\n GNU LESSER GENERAL PUBLIC LICENSE\n TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION\n\n 0. This License Agreement applies to any software library or other\nprogram which contains a notice placed by the copyright holder or\nother authorized party saying it may be distributed under the terms of\nthis Lesser General Public License (also called \"this License\").\nEach licensee is addressed as \"you\".\n\n A \"library\" means a collection of software functions and/or data\nprepared so as to be conveniently linked with application programs\n(which use some of those functions and data) to form executables.\n\n The \"Library\", below, refers to any such software library or work\nwhich has been distributed under these terms. A \"work based on the\nLibrary\" means either the Library or any derivative work under\ncopyright law: that is to say, a work containing the Library or a\nportion of it, either verbatim or with modifications and/or translated\nstraightforwardly into another language. (Hereinafter, translation is\nincluded without limitation in the term \"modification\".)\n\n \"Source code\" for a work means the preferred form of the work for\nmaking modifications to it. 
For a library, complete source code means\nall the source code for all modules it contains, plus any associated\ninterface definition files, plus the scripts used to control compilation\nand installation of the library.\n\n Activities other than copying, distribution and modification are not\ncovered by this License; they are outside its scope. The act of\nrunning a program using the Library is not restricted, and output from\nsuch a program is covered only if its contents constitute a work based\non the Library (independent of the use of the Library in a tool for\nwriting it). Whether that is true depends on what the Library does\nand what the program that uses the Library does.\n\n 1. You may copy and distribute verbatim copies of the Library's\ncomplete source code as you receive it, in any medium, provided that\nyou conspicuously and appropriately publish on each copy an\nappropriate copyright notice and disclaimer of warranty; keep intact\nall the notices that refer to this License and to the absence of any\nwarranty; and distribute a copy of this License along with the\nLibrary.\n\n You may charge a fee for the physical act of transferring a copy,\nand you may at your option offer warranty protection in exchange for a\nfee.\n\n 2. 
You may modify your copy or copies of the Library or any portion\nof it, thus forming a work based on the Library, and copy and\ndistribute such modifications or work under the terms of Section 1\nabove, provided that you also meet all of these conditions:\n\n a) The modified work must itself be a software library.\n\n b) You must cause the files modified to carry prominent notices\n stating that you changed the files and the date of any change.\n\n c) You must cause the whole of the work to be licensed at no\n charge to all third parties under the terms of this License.\n\n d) If a facility in the modified Library refers to a function or a\n table of data to be supplied by an application program that uses\n the facility, other than as an argument passed when the facility\n is invoked, then you must make a good faith effort to ensure that,\n in the event an application does not supply such function or\n table, the facility still operates, and performs whatever part of\n its purpose remains meaningful.\n\n (For example, a function in a library to compute square roots has\n a purpose that is entirely well-defined independent of the\n application. Therefore, Subsection 2d requires that any\n application-supplied function or table used by this function must\n be optional: if the application does not supply it, the square\n root function must still compute square roots.)\n\nThese requirements apply to the modified work as a whole. If\nidentifiable sections of that work are not derived from the Library,\nand can be reasonably considered independent and separate works in\nthemselves, then this License, and its terms, do not apply to those\nsections when you distribute them as separate works. 
But when you\ndistribute the same sections as part of a whole which is a work based\non the Library, the distribution of the whole must be on the terms of\nthis License, whose permissions for other licensees extend to the\nentire whole, and thus to each and every part regardless of who wrote\nit.\n\nThus, it is not the intent of this section to claim rights or contest\nyour rights to work written entirely by you; rather, the intent is to\nexercise the right to control the distribution of derivative or\ncollective works based on the Library.\n\nIn addition, mere aggregation of another work not based on the Library\nwith the Library (or with a work based on the Library) on a volume of\na storage or distribution medium does not bring the other work under\nthe scope of this License.\n\n 3. You may opt to apply the terms of the ordinary GNU General Public\nLicense instead of this License to a given copy of the Library. To do\nthis, you must alter all the notices that refer to this License, so\nthat they refer to the ordinary GNU General Public License, version 2,\ninstead of to this License. (If a newer version than version 2 of the\nordinary GNU General Public License has appeared, then you can specify\nthat version instead if you wish.) Do not make any other change in\nthese notices.\n\n Once this change is made in a given copy, it is irreversible for\nthat copy, so the ordinary GNU General Public License applies to all\nsubsequent copies and derivative works made from that copy.\n\n This option is useful when you wish to copy part of the code of\nthe Library into a program that is not a library.\n\n 4. 
You may copy and distribute the Library (or a portion or\nderivative of it, under Section 2) in object code or executable form\nunder the terms of Sections 1 and 2 above provided that you accompany\nit with the complete corresponding machine-readable source code, which\nmust be distributed under the terms of Sections 1 and 2 above on a\nmedium customarily used for software interchange.\n\n If distribution of object code is made by offering access to copy\nfrom a designated place, then offering equivalent access to copy the\nsource code from the same place satisfies the requirement to\ndistribute the source code, even though third parties are not\ncompelled to copy the source along with the object code.\n\n 5. A program that contains no derivative of any portion of the\nLibrary, but is designed to work with the Library by being compiled or\nlinked with it, is called a \"work that uses the Library\". Such a\nwork, in isolation, is not a derivative work of the Library, and\ntherefore falls outside the scope of this License.\n\n However, linking a \"work that uses the Library\" with the Library\ncreates an executable that is a derivative of the Library (because it\ncontains portions of the Library), rather than a \"work that uses the\nlibrary\". The executable is therefore covered by this License.\nSection 6 states terms for distribution of such executables.\n\n When a \"work that uses the Library\" uses material from a header file\nthat is part of the Library, the object code for the work may be a\nderivative work of the Library even though the source code is not.\nWhether this is true is especially significant if the work can be\nlinked without the Library, or if the work is itself a library. 
The\nthreshold for this to be true is not precisely defined by law.\n\n If such an object file uses only numerical parameters, data\nstructure layouts and accessors, and small macros and small inline\nfunctions (ten lines or less in length), then the use of the object\nfile is unrestricted, regardless of whether it is legally a derivative\nwork. (Executables containing this object code plus portions of the\nLibrary will still fall under Section 6.)\n\n Otherwise, if the work is a derivative of the Library, you may\ndistribute the object code for the work under the terms of Section 6.\nAny executables containing that work also fall under Section 6,\nwhether or not they are linked directly with the Library itself.\n\n 6. As an exception to the Sections above, you may also combine or\nlink a \"work that uses the Library\" with the Library to produce a\nwork containing portions of the Library, and distribute that work\nunder terms of your choice, provided that the terms permit\nmodification of the work for the customer's own use and reverse\nengineering for debugging such modifications.\n\n You must give prominent notice with each copy of the work that the\nLibrary is used in it and that the Library and its use are covered by\nthis License. You must supply a copy of this License. If the work\nduring execution displays copyright notices, you must include the\ncopyright notice for the Library among them, as well as a reference\ndirecting the user to the copy of this License. 
Also, you must do one\nof these things:\n\n a) Accompany the work with the complete corresponding\n machine-readable source code for the Library including whatever\n changes were used in the work (which must be distributed under\n Sections 1 and 2 above); and, if the work is an executable linked\n with the Library, with the complete machine-readable \"work that\n uses the Library\", as object code and/or source code, so that the\n user can modify the Library and then relink to produce a modified\n executable containing the modified Library. (It is understood\n that the user who changes the contents of definitions files in the\n Library will not necessarily be able to recompile the application\n to use the modified definitions.)\n\n b) Use a suitable shared library mechanism for linking with the\n Library. A suitable mechanism is one that (1) uses at run time a\n copy of the library already present on the user's computer system,\n rather than copying library functions into the executable, and (2)\n will operate properly with a modified version of the library, if\n the user installs one, as long as the modified version is\n interface-compatible with the version that the work was made with.\n\n c) Accompany the work with a written offer, valid for at\n least three years, to give the same user the materials\n specified in Subsection 6a, above, for a charge no more\n than the cost of performing this distribution.\n\n d) If distribution of the work is made by offering access to copy\n from a designated place, offer equivalent access to copy the above\n specified materials from the same place.\n\n e) Verify that the user has already received a copy of these\n materials or that you have already sent this user a copy.\n\n For an executable, the required form of the \"work that uses the\nLibrary\" must include any data and utility programs needed for\nreproducing the executable from it. 
However, as a special exception,\nthe materials to be distributed need not include anything that is\nnormally distributed (in either source or binary form) with the major\ncomponents (compiler, kernel, and so on) of the operating system on\nwhich the executable runs, unless that component itself accompanies\nthe executable.\n\n It may happen that this requirement contradicts the license\nrestrictions of other proprietary libraries that do not normally\naccompany the operating system. Such a contradiction means you cannot\nuse both them and the Library together in an executable that you\ndistribute.\n\n 7. You may place library facilities that are a work based on the\nLibrary side-by-side in a single library together with other library\nfacilities not covered by this License, and distribute such a combined\nlibrary, provided that the separate distribution of the work based on\nthe Library and of the other library facilities is otherwise\npermitted, and provided that you do these two things:\n\n a) Accompany the combined library with a copy of the same work\n based on the Library, uncombined with any other library\n facilities. This must be distributed under the terms of the\n Sections above.\n\n b) Give prominent notice with the combined library of the fact\n that part of it is a work based on the Library, and explaining\n where to find the accompanying uncombined form of the same work.\n\n 8. You may not copy, modify, sublicense, link with, or distribute\nthe Library except as expressly provided under this License. Any\nattempt otherwise to copy, modify, sublicense, link with, or\ndistribute the Library is void, and will automatically terminate your\nrights under this License. However, parties who have received copies,\nor rights, from you under this License will not have their licenses\nterminated so long as such parties remain in full compliance.\n\n 9. You are not required to accept this License, since you have not\nsigned it. 
However, nothing else grants you permission to modify or\ndistribute the Library or its derivative works. These actions are\nprohibited by law if you do not accept this License. Therefore, by\nmodifying or distributing the Library (or any work based on the\nLibrary), you indicate your acceptance of this License to do so, and\nall its terms and conditions for copying, distributing or modifying\nthe Library or works based on it.\n\n 10. Each time you redistribute the Library (or any work based on the\nLibrary), the recipient automatically receives a license from the\noriginal licensor to copy, distribute, link with or modify the Library\nsubject to these terms and conditions. You may not impose any further\nrestrictions on the recipients' exercise of the rights granted herein.\nYou are not responsible for enforcing compliance by third parties with\nthis License.\n\n 11. If, as a consequence of a court judgment or allegation of patent\ninfringement or for any other reason (not limited to patent issues),\nconditions are imposed on you (whether by court order, agreement or\notherwise) that contradict the conditions of this License, they do not\nexcuse you from the conditions of this License. If you cannot\ndistribute so as to satisfy simultaneously your obligations under this\nLicense and any other pertinent obligations, then as a consequence you\nmay not distribute the Library at all. 
For example, if a patent\nlicense would not permit royalty-free redistribution of the Library by\nall those who receive copies directly or indirectly through you, then\nthe only way you could satisfy both it and this License would be to\nrefrain entirely from distribution of the Library.\n\nIf any portion of this section is held invalid or unenforceable under any\nparticular circumstance, the balance of the section is intended to apply,\nand the section as a whole is intended to apply in other circumstances.\n\nIt is not the purpose of this section to induce you to infringe any\npatents or other property right claims or to contest validity of any\nsuch claims; this section has the sole purpose of protecting the\nintegrity of the free software distribution system which is\nimplemented by public license practices. Many people have made\ngenerous contributions to the wide range of software distributed\nthrough that system in reliance on consistent application of that\nsystem; it is up to the author/donor to decide if he or she is willing\nto distribute software through any other system and a licensee cannot\nimpose that choice.\n\nThis section is intended to make thoroughly clear what is believed to\nbe a consequence of the rest of this License.\n\n 12. If the distribution and/or use of the Library is restricted in\ncertain countries either by patents or by copyrighted interfaces, the\noriginal copyright holder who places the Library under this License may add\nan explicit geographical distribution limitation excluding those countries,\nso that distribution is permitted only in or among countries not thus\nexcluded. In such case, this License incorporates the limitation as if\nwritten in the body of this License.\n\n 13. 
The Free Software Foundation may publish revised and/or new\nversions of the Lesser General Public License from time to time.\nSuch new versions will be similar in spirit to the present version,\nbut may differ in detail to address new problems or concerns.\n\nEach version is given a distinguishing version number. If the Library\nspecifies a version number of this License which applies to it and\n\"any later version\", you have the option of following the terms and\nconditions either of that version or of any later version published by\nthe Free Software Foundation. If the Library does not specify a\nlicense version number, you may choose any version ever published by\nthe Free Software Foundation.\n\n 14. If you wish to incorporate parts of the Library into other free\nprograms whose distribution conditions are incompatible with these,\nwrite to the author to ask for permission. For software which is\ncopyrighted by the Free Software Foundation, write to the Free\nSoftware Foundation; we sometimes make exceptions for this. Our\ndecision will be guided by the two goals of preserving the free status\nof all derivatives of our free software and of promoting the sharing\nand reuse of software generally.\n\n NO WARRANTY\n\n 15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO\nWARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW.\nEXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR\nOTHER PARTIES PROVIDE THE LIBRARY \"AS IS\" WITHOUT WARRANTY OF ANY\nKIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE\nIMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR\nPURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE\nLIBRARY IS WITH YOU. SHOULD THE LIBRARY PROVE DEFECTIVE, YOU ASSUME\nTHE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.\n\n 16. 
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN\nWRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY\nAND/OR REDISTRIBUTE THE LIBRARY AS PERMITTED ABOVE, BE LIABLE TO YOU\nFOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR\nCONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE\nLIBRARY (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING\nRENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A\nFAILURE OF THE LIBRARY TO OPERATE WITH ANY OTHER SOFTWARE), EVEN IF\nSUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH\nDAMAGES.\n\n END OF TERMS AND CONDITIONS\n\n How to Apply These Terms to Your New Libraries\n\n If you develop a new library, and you want it to be of the greatest\npossible use to the public, we recommend making it free software that\neveryone can redistribute and change. You can do so by permitting\nredistribution under these terms (or, alternatively, under the terms of the\nordinary General Public License).\n\n To apply these terms, attach the following notices to the library. It is\nsafest to attach them to the start of each source file to most effectively\nconvey the exclusion of warranty; and each file should have at least the\n\"copyright\" line and a pointer to where the full notice is found.\n\n <one line to give the library's name and a brief idea of what it does.>\n Copyright (C) <year> <name of author>\n\n This library is free software; you can redistribute it and/or\n modify it under the terms of the GNU Lesser General Public\n License as published by the Free Software Foundation; either\n version 2.1 of the License, or (at your option) any later version.\n\n This library is distributed in the hope that it will be useful,\n but WITHOUT ANY WARRANTY; without even the implied warranty of\n MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the GNU\n Lesser General Public License for more details.\n\n You should have received a copy of the GNU Lesser General Public\n License along with this library; if not, write to the Free Software\n Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301\n USA\n\nAlso add information on how to contact you by electronic and paper mail.\n\nYou should also get your employer (if you work as a programmer) or your\nschool, if any, to sign a \"copyright disclaimer\" for the library, if\nnecessary. Here is a sample; alter the names:\n\n Yoyodyne, Inc., hereby disclaims all copyright interest in the\n library `Frob' (a library for tweaking knobs) written by James Random\n Hacker.\n\n <signature of Ty Coon>, 1 April 1990\n Ty Coon, President of Vice\n\nThat's all there is to it!\n"
},
{
"Name": "attrs",
"Version": "19.3.0",
"Summary": "Classes Without Boilerplate",
"Home-page": "https://www.attrs.org/",
"Author": "Hynek Schlawack",
"License": "MIT"
},
{
"Name": "azure-batch",
"Version": "6.0.0",
"Summary": "Microsoft Azure Batch Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli",
"Version": "2.0.67",
"Summary": "Microsoft Azure Command-Line Tools",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-acr",
"Version": "2.2.9",
"Summary": "Microsoft Azure Command-Line Tools ACR Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-acs",
"Version": "2.4.4",
"Summary": "Microsoft Azure Command-Line Tools ACS Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-advisor",
"Version": "2.0.1",
"Summary": "Microsoft Azure Command-Line Tools Advisor Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-ams",
"Version": "0.4.7",
"Summary": "Microsoft Azure Command-Line Tools AMS Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-appservice",
"Version": "0.2.21",
"Summary": "Microsoft Azure Command-Line Tools AppService Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-backup",
"Version": "1.2.5",
"Summary": "Microsoft Azure Command-Line Tools Recovery Services Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-batch",
"Version": "4.0.3",
"Summary": "Microsoft Azure Command-Line Tools Batch Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-batchai",
"Version": "0.4.10",
"Summary": "Microsoft Azure Batch AI Client Command-Line Tools",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-billing",
"Version": "0.2.2",
"Summary": "Microsoft Azure Command-Line Tools Billing Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-botservice",
"Version": "0.2.2",
"Summary": "Microsoft Azure Command-Line Tools Bot Services Command Module",
"Home-page": "https://github.com/azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-cdn",
"Version": "0.2.4",
"Summary": "Microsoft Azure Command-Line Tools Content Delivery Network (CDN) Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-cloud",
"Version": "2.1.1",
"Summary": "Microsoft Azure Command-Line Tools Cloud Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-cognitiveservices",
"Version": "0.2.6",
"Summary": "Microsoft Azure Command-Line Tools Cognitive Services Command Module",
"Home-page": "https://github.com/azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-command-modules-nspkg",
"Version": "2.0.2",
"Summary": "Microsoft Azure CLI Command Modules Namespace Package",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-configure",
"Version": "2.0.24",
"Summary": "Microsoft Azure Command-Line Tools Configure Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-consumption",
"Version": "0.4.4",
"Summary": "Microsoft Azure Command-Line Tools Consumption Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-container",
"Version": "0.3.18",
"Summary": "Microsoft Azure Command-Line Tools container Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-core",
"Version": "2.0.67",
"Summary": "Microsoft Azure Command-Line Tools Core Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-cosmosdb",
"Version": "0.2.11",
"Summary": "Microsoft Azure Command-Line Tools Cosmos DB Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-deploymentmanager",
"Version": "0.1.1",
"Summary": "Microsoft Azure Command-Line Tools Deployment Manager Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-dla",
"Version": "0.2.6",
"Summary": "Microsoft Azure Command-Line Tools Data Lake Analytics Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-dls",
"Version": "0.1.10",
"Summary": "Microsoft Azure Command-Line Tools Data Lake Store Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-dms",
"Version": "0.1.4",
"Summary": "Microsoft Azure Command-Line Tools for the Data Migration Service (DMS) Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-eventgrid",
"Version": "0.2.4",
"Summary": "Microsoft Azure Command-Line Tools EventGrid Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-eventhubs",
"Version": "0.3.7",
"Summary": "Microsoft Azure Command-Line Tools Event Hubs Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-extension",
"Version": "0.2.5",
"Summary": "Microsoft Azure Command-Line Tools Extension Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-feedback",
"Version": "2.2.1",
"Summary": "Microsoft Azure Command-Line Tools Feedback Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-find",
"Version": "0.3.4",
"Summary": "Intelligent querying for CLI Example information.",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-hdinsight",
"Version": "0.3.5",
"Summary": "Microsoft Azure Command-Line Tools HDInsight Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-interactive",
"Version": "0.4.5",
"Summary": "Microsoft Azure Command-Line Interactive Shell",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-iot",
"Version": "0.3.11",
"Summary": "Microsoft Azure Command-Line Tools IoT Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-iotcentral",
"Version": "0.1.7",
"Summary": "Microsoft Azure Command-Line Tools IoT Central Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-keyvault",
"Version": "2.2.16",
"Summary": "Microsoft Azure Command-Line Tools Keyvault Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-kusto",
"Version": "0.2.3",
"Summary": "Microsoft Azure Command-Line Tools KUSTO Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-lab",
"Version": "0.1.8",
"Summary": "Microsoft Azure Command-Line Tools DevTestLabs Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-maps",
"Version": "0.3.5",
"Summary": "Microsoft Azure Command-Line Tools Maps Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-monitor",
"Version": "0.2.15",
"Summary": "Microsoft Azure Command-Line Tools Monitor Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-natgateway",
"Version": "0.1.1",
"Summary": "Microsoft Azure Command-Line Tools NatGateway Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-network",
"Version": "2.5.2",
"Summary": "Microsoft Azure Command-Line Tools Network Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-nspkg",
"Version": "3.0.3",
"Summary": "Microsoft Azure CLI Namespace Package",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-policyinsights",
"Version": "0.1.4",
"Summary": "Microsoft Azure Command-Line Tools Policy Insights Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-privatedns",
"Version": "1.0.2",
"Summary": "Microsoft Azure Command-Line Tools Network PrivateDns Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-profile",
"Version": "2.1.5",
"Summary": "Microsoft Azure Command-Line Tools Profile Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-rdbms",
"Version": "0.3.12",
"Summary": "Microsoft Azure Command-Line Tools MySQL and PostgreSQL Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-redis",
"Version": "0.4.4",
"Summary": "Microsoft Azure Command-Line Tools Redis Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-relay",
"Version": "0.1.5",
"Summary": "Microsoft Azure Command-Line Tools Relay Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-reservations",
"Version": "0.4.3",
"Summary": "Microsoft Azure Command-Line Tools Reservations Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-resource",
"Version": "2.1.16",
"Summary": "Microsoft Azure Command-Line Tools Resource Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-role",
"Version": "2.6.4",
"Summary": "Microsoft Azure Command-Line Tools Role Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-search",
"Version": "0.1.2",
"Summary": "Microsoft Azure Command-Line Tools Search Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-security",
"Version": "0.1.2",
"Summary": "Microsoft Azure Command-Line Tools Azure Security Center",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-servicebus",
"Version": "0.3.6",
"Summary": "Microsoft Azure Command-Line Tools Service Bus Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-servicefabric",
"Version": "0.1.20",
"Summary": "Microsoft Azure Service Fabric Command-Line Tools",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-signalr",
"Version": "1.0.1",
"Summary": "Microsoft Azure Command-Line Tools SignalR Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-sql",
"Version": "2.2.5",
"Summary": "Microsoft Azure Command-Line Tools SQL Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-sqlvm",
"Version": "0.2.0",
"Summary": "Microsoft Azure Command-Line Tools SQL virtual machine Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-storage",
"Version": "2.4.3",
"Summary": "Microsoft Azure Command-Line Tools Storage Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-telemetry",
"Version": "1.0.2",
"Summary": "Microsoft Azure CLI Telemetry Package",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cli-vm",
"Version": "2.2.23",
"Summary": "Microsoft Azure Command-Line Tools VM Command Module",
"Home-page": "https://github.com/Azure/azure-cli",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-cli/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-common",
"Version": "1.1.23",
"Summary": "Microsoft Azure Client Library for Python (Common)",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-cosmos",
"Version": "3.1.2",
"Summary": "Azure Cosmos Python SDK",
"Home-page": "https://github.com/Azure/azure-documentdb-python",
"Author": "Microsoft",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-documentdb-python/license",
"License repo": "The MIT License (MIT)\nCopyright (c) 2014 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-datalake-store",
"Version": "0.0.39",
"Summary": "Azure Data Lake Store Filesystem Client Library for Python",
"Home-page": "https://github.com/Azure/azure-data-lake-store-python",
"Author": "Microsoft Corporation",
"License": "Other",
"License URL": "https://api.github.com/repos/azure/azure-data-lake-store-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-functions-devops-build",
"Version": "0.0.22",
"Summary": "Python package for integrating Azure Functions with Azure DevOps. Specifically made for the Azure CLI",
"Home-page": "https://github.com/Azure/azure-functions-devops-build",
"Author": "Oliver Dolk, Hanzhang Zeng",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-functions-devops-build/license",
"License repo": " MIT License\n\n Copyright (c) Microsoft Corporation. All rights reserved.\n\n Permission is hereby granted, free of charge, to any person obtaining a copy\n of this software and associated documentation files (the \"Software\"), to deal\n in the Software without restriction, including without limitation the rights\n to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n copies of the Software, and to permit persons to whom the Software is\n furnished to do so, subject to the following conditions:\n\n The above copyright notice and this permission notice shall be included in all\n copies or substantial portions of the Software.\n\n THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n SOFTWARE\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-graphrbac",
"Version": "0.60.0",
"Summary": "Microsoft Azure Graph RBAC Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-keyvault",
"Version": "1.1.0",
"Summary": "Microsoft Azure Key Vault Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-advisor",
"Version": "2.0.1",
"Summary": "Microsoft Azure Advisor Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-applicationinsights",
"Version": "0.1.1",
"Summary": "Microsoft Azure Application Insights Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-authorization",
"Version": "0.50.0",
"Summary": "Microsoft Azure Authorization Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-batch",
"Version": "6.0.0",
"Summary": "Microsoft Azure Batch Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-batchai",
"Version": "2.0.0",
"Summary": "Microsoft Azure Batch AI Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-billing",
"Version": "0.2.0",
"Summary": "Microsoft Azure Billing Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-botservice",
"Version": "0.2.0",
"Summary": "Microsoft Azure Bot Service Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-cdn",
"Version": "3.1.0",
"Summary": "Microsoft Azure CDN Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-cognitiveservices",
"Version": "3.0.0",
"Summary": "Microsoft Azure Cognitive Services Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-compute",
"Version": "5.0.0",
"Summary": "Microsoft Azure Compute Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-consumption",
"Version": "2.0.0",
"Summary": "Microsoft Azure Consumption Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-containerinstance",
"Version": "1.4.0",
"Summary": "Microsoft Azure Container Instance Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-containerregistry",
"Version": "2.8.0",
"Summary": "Microsoft Azure Container Registry Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-containerservice",
"Version": "5.2.0",
"Summary": "Microsoft Azure Container Service Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-cosmosdb",
"Version": "0.6.1",
"Summary": "Microsoft Azure Cosmos DB Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-datalake-analytics",
"Version": "0.2.1",
"Summary": "Microsoft Azure Data Lake Analytics Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-datalake-nspkg",
"Version": "3.0.1",
"Summary": "Microsoft Azure Data Lake Management Namespace Package [Internal]",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-datalake-store",
"Version": "0.5.0",
"Summary": "Microsoft Azure Data Lake Store Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-datamigration",
"Version": "0.1.0",
"Summary": "Microsoft Azure Data Migration Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-deploymentmanager",
"Version": "0.1.0",
"Summary": "Microsoft Azure Deployment Manager Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-devtestlabs",
"Version": "2.2.0",
"Summary": "Microsoft Azure DevTestLabs Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-dns",
"Version": "2.1.0",
"Summary": "Microsoft Azure DNS Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-eventgrid",
"Version": "2.2.0",
"Summary": "Microsoft Azure EventGrid Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-eventhub",
"Version": "2.6.0",
"Summary": "Microsoft Azure EventHub Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-hdinsight",
"Version": "0.2.1",
"Summary": "Microsoft Azure HDInsight Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-imagebuilder",
"Version": "0.2.1",
"Summary": "Microsoft Azure Image Builder Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-iotcentral",
"Version": "1.0.0",
"Summary": "Microsoft Azure IoTCentral Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-iothub",
"Version": "0.8.2",
"Summary": "Microsoft Azure IoTHub Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-iothubprovisioningservices",
"Version": "0.2.0",
"Summary": "Microsoft Azure IoTHub Provisioning Services Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-keyvault",
"Version": "1.1.0",
"Summary": "Microsoft Azure Key Vault Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-kusto",
"Version": "0.3.0",
"Summary": "Microsoft Azure Kusto Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-loganalytics",
"Version": "0.2.0",
"Summary": "Microsoft Azure Log Analytics Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-managementgroups",
"Version": "0.1.0",
"Summary": "Microsoft Azure Management Groups Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-maps",
"Version": "0.1.0",
"Summary": "Microsoft Azure Maps Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-marketplaceordering",
"Version": "0.1.0",
"Summary": "Microsoft Azure Market Place Ordering Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-media",
"Version": "1.1.1",
"Summary": "Microsoft Azure Media Services Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-monitor",
"Version": "0.5.2",
"Summary": "Microsoft Azure Monitor Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-msi",
"Version": "0.2.0",
"Summary": "Microsoft Azure MSI Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-network",
"Version": "3.0.0",
"Summary": "Microsoft Azure Network Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-nspkg",
"Version": "3.0.2",
"Summary": "Microsoft Azure Resource Management Namespace Package [Internal]",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-policyinsights",
"Version": "0.3.1",
"Summary": "Microsoft Azure Policy Insights Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-privatedns",
"Version": "0.1.0",
"Summary": "Microsoft Azure DNS Private Zones Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-rdbms",
"Version": "1.8.0",
"Summary": "Microsoft Azure RDBMS Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-recoveryservices",
"Version": "0.1.1",
"Summary": "Microsoft Azure Recovery Services Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-recoveryservicesbackup",
"Version": "0.1.2",
"Summary": "Microsoft Azure Recovery Services Backup Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-redis",
"Version": "6.0.0",
"Summary": "Microsoft Azure Redis Cache Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-relay",
"Version": "0.1.0",
"Summary": "Microsoft Azure Relay Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-reservations",
"Version": "0.3.1",
"Summary": "Microsoft Azure Reservations Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-resource",
"Version": "2.1.0",
"Summary": "Microsoft Azure Resource Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-search",
"Version": "2.0.0",
"Summary": "Microsoft Azure Search Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-security",
"Version": "0.1.0",
"Summary": "Microsoft Azure Security Center Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-servicebus",
"Version": "0.6.0",
"Summary": "Microsoft Azure Service Bus Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-servicefabric",
"Version": "0.2.0",
"Summary": "Microsoft Azure Service Fabric Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-signalr",
"Version": "0.1.1",
"Summary": "Microsoft Azure SignalR Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-sql",
"Version": "0.12.0",
"Summary": "Microsoft Azure SQL Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-sqlvirtualmachine",
"Version": "0.3.0",
"Summary": "Microsoft Azure SQL Virtual Machine Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-storage",
"Version": "3.3.0",
"Summary": "Microsoft Azure Storage Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-trafficmanager",
"Version": "0.51.0",
"Summary": "Microsoft Azure Traffic Manager Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-mgmt-web",
"Version": "0.42.0",
"Summary": "Microsoft Azure Web Apps Management Client Library for Python",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-multiapi-storage",
"Version": "0.2.3",
"Summary": "Microsoft Azure Storage Client Library for Python with multi API version support.",
"Home-page": "https://github.com/Azure/azure-multiapi-storage-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-multiapi-storage-python/license",
"License repo": "MIT License\n\nCopyright (c) 2017 Microsoft Corporation\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-nspkg",
"Version": "3.0.2",
"Summary": "Microsoft Azure Namespace Package [Internal]",
"Home-page": "https://github.com/Azure/azure-sdk-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-sdk-for-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2016 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-storage-blob",
"Version": "1.3.1",
"Summary": "Microsoft Azure Storage Blob Client Library for Python",
"Home-page": "https://github.com/Azure/azure-storage-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-storage-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2017 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-storage-common",
"Version": "1.4.2",
"Summary": "Microsoft Azure Storage Common Client Library for Python",
"Home-page": "https://github.com/Azure/azure-storage-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-storage-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2017 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "azure-storage-nspkg",
"Version": "3.1.0",
"Summary": "Microsoft Azure Storage Namespace Package [Internal]",
"Home-page": "https://github.com/Azure/azure-storage-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/azure-storage-python/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2017 Microsoft\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "bcrypt",
"Version": "3.1.7",
"Summary": "Modern password hashing for your software and your servers",
"Home-page": "https://github.com/pyca/bcrypt/",
"Author": "The Python Cryptographic Authority developers",
"License": "Apache License 2.0",
"License URL": "https://api.github.com/repos/pyca/bcrypt/license",
"License repo": " Apache License\n Version 2.0, January 2004\n http://www.apache.org/licenses/\n\nTERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n1. Definitions.\n\n \"License\" shall mean the terms and conditions for use, reproduction,\n and distribution as defined by Sections 1 through 9 of this document.\n\n \"Licensor\" shall mean the copyright owner or entity authorized by\n the copyright owner that is granting the License.\n\n \"Legal Entity\" shall mean the union of the acting entity and all\n other entities that control, are controlled by, or are under common\n control with that entity. For the purposes of this definition,\n \"control\" means (i) the power, direct or indirect, to cause the\n direction or management of such entity, whether by contract or\n otherwise, or (ii) ownership of fifty percent (50%) or more of the\n outstanding shares, or (iii) beneficial ownership of such entity.\n\n \"You\" (or \"Your\") shall mean an individual or Legal Entity\n exercising permissions granted by this License.\n\n \"Source\" form shall mean the preferred form for making modifications,\n including but not limited to software source code, documentation\n source, and configuration files.\n\n \"Object\" form shall mean any form resulting from mechanical\n transformation or translation of a Source form, including but\n not limited to compiled object code, generated documentation,\n and conversions to other media types.\n\n \"Work\" shall mean the work of authorship, whether in Source or\n Object form, made available under the License, as indicated by a\n copyright notice that is included in or attached to the work\n (an example is provided in the Appendix below).\n\n \"Derivative Works\" shall mean any work, whether in Source or Object\n form, that is based on (or derived from) the Work and for which the\n editorial revisions, annotations, elaborations, or other modifications\n represent, as a whole, an original work of authorship. For the purposes\n of this License, Derivative Works shall not include works that remain\n separable from, or merely link (or bind by name) to the interfaces of,\n the Work and Derivative Works thereof.\n\n \"Contribution\" shall mean any work of authorship, including\n the original version of the Work and any modifications or additions\n to that Work or Derivative Works thereof, that is intentionally\n submitted to Licensor for inclusion in the Work by the copyright owner\n or by an individual or Legal Entity authorized to submit on behalf of\n the copyright owner. For the purposes of this definition, \"submitted\"\n means any form of electronic, verbal, or written communication sent\n to the Licensor or its representatives, including but not limited to\n communication on electronic mailing lists, source code control systems,\n and issue tracking systems that are managed by, or on behalf of, the\n Licensor for the purpose of discussing and improving the Work, but\n excluding communication that is conspicuously marked or otherwise\n designated in writing by the copyright owner as \"Not a Contribution.\"\n\n \"Contributor\" shall mean Licensor and any individual or Legal Entity\n on behalf of whom a Contribution has been received by Licensor and\n subsequently incorporated within the Work.\n\n2. Grant of Copyright License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n copyright license to reproduce, prepare Derivative Works of,\n publicly display, publicly perform, sublicense, and distribute the\n Work and such Derivative Works in Source or Object form.\n\n3. Grant of Patent License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n (except as stated in this section) patent license to make, have made,\n use, offer to sell, sell, import, and otherwise transfer the Work,\n where such license applies only to those patent claims licensable\n by such Contributor that are necessarily infringed by their\n Contribution(s) alone or by combination of their Contribution(s)\n with the Work to which such Contribution(s) was submitted. If You\n institute patent litigation against any entity (including a\n cross-claim or counterclaim in a lawsuit) alleging that the Work\n or a Contribution incorporated within the Work constitutes direct\n or contributory patent infringement, then any patent licenses\n granted to You under this License for that Work shall terminate\n as of the date such litigation is filed.\n\n4. Redistribution. You may reproduce and distribute copies of the\n Work or Derivative Works thereof in any medium, with or without\n modifications, and in Source or Object form, provided that You\n meet the following conditions:\n\n (a) You must give any other recipients of the Work or\n Derivative Works a copy of this License; and\n\n (b) You must cause any modified files to carry prominent notices\n stating that You changed the files; and\n\n (c) You must retain, in the Source form of any Derivative Works\n that You distribute, all copyright, patent, trademark, and\n attribution notices from the Source form of the Work,\n excluding those notices that do not pertain to any part of\n the Derivative Works; and\n\n (d) If the Work includes a \"NOTICE\" text file as part of its\n distribution, then any Derivative Works that You distribute must\n include a readable copy of the attribution notices contained\n within such NOTICE file, excluding those notices that do not\n pertain to any part of the Derivative Works, in at least one\n of the following places: within a NOTICE text file distributed\n as part of the Derivative Works; within the Source form or\n documentation, if provided along with the Derivative Works; or,\n within a display generated by the Derivative Works, if and\n wherever such third-party notices normally appear. The contents\n of the NOTICE file are for informational purposes only and\n do not modify the License. You may add Your own attribution\n notices within Derivative Works that You distribute, alongside\n or as an addendum to the NOTICE text from the Work, provided\n that such additional attribution notices cannot be construed\n as modifying the License.\n\n You may add Your own copyright statement to Your modifications and\n may provide additional or different license terms and conditions\n for use, reproduction, or distribution of Your modifications, or\n for any such Derivative Works as a whole, provided Your use,\n reproduction, and distribution of the Work otherwise complies with\n the conditions stated in this License.\n\n5. Submission of Contributions. Unless You explicitly state otherwise,\n any Contribution intentionally submitted for inclusion in the Work\n by You to the Licensor shall be under the terms and conditions of\n this License, without any additional terms or conditions.\n Notwithstanding the above, nothing herein shall supersede or modify\n the terms of any separate license agreement you may have executed\n with Licensor regarding such Contributions.\n\n6. Trademarks. This License does not grant permission to use the trade\n names, trademarks, service marks, or product names of the Licensor,\n except as required for reasonable and customary use in describing the\n origin of the Work and reproducing the content of the NOTICE file.\n\n7. Disclaimer of Warranty. Unless required by applicable law or\n agreed to in writing, Licensor provides the Work (and each\n Contributor provides its Contributions) on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n implied, including, without limitation, any warranties or conditions\n of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n PARTICULAR PURPOSE. You are solely responsible for determining the\n appropriateness of using or redistributing the Work and assume any\n risks associated with Your exercise of permissions under this License.\n\n8. Limitation of Liability. In no event and under no legal theory,\n whether in tort (including negligence), contract, or otherwise,\n unless required by applicable law (such as deliberate and grossly\n negligent acts) or agreed to in writing, shall any Contributor be\n liable to You for damages, including any direct, indirect, special,\n incidental, or consequential damages of any character arising as a\n result of this License or out of the use or inability to use the\n Work (including but not limited to damages for loss of goodwill,\n work stoppage, computer failure or malfunction, or any and all\n other commercial damages or losses), even if such Contributor\n has been advised of the possibility of such damages.\n\n9. Accepting Warranty or Additional Liability. While redistributing\n the Work or Derivative Works thereof, You may choose to offer,\n and charge a fee for, acceptance of support, warranty, indemnity,\n or other liability obligations and/or rights consistent with this\n License. However, in accepting such obligations, You may act only\n on Your own behalf and on Your sole responsibility, not on behalf\n of any other Contributor, and only if You agree to indemnify,\n defend, and hold each Contributor harmless for any liability\n incurred by, or claims asserted against, such Contributor by reason\n of your accepting any such warranty or additional liability.\n\nEND OF TERMS AND CONDITIONS\n\nAPPENDIX: How to apply the Apache License to your work.\n\n To apply the Apache License to your work, attach the following\n boilerplate notice, with the fields enclosed by brackets \"[]\"\n replaced with your own identifying information. (Don't include\n the brackets!) The text should be enclosed in the appropriate\n comment syntax for the file format. We also recommend that a\n file or class name and description of purpose be included on the\n same \"printed page\" as the copyright notice for easier\n identification within third-party archives.\n\nCopyright [yyyy] [name of copyright owner]\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n",
"License text": " Apache License\n Version 2.0, January 2004\n http://www.apache.org/licenses/\n\n TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n 1. Definitions.\n\n \"License\" shall mean the terms and conditions for use, reproduction,\n and distribution as defined by Sections 1 through 9 of this document.\n\n \"Licensor\" shall mean the copyright owner or entity authorized by\n the copyright owner that is granting the License.\n\n \"Legal Entity\" shall mean the union of the acting entity and all\n other entities that control, are controlled by, or are under common\n control with that entity. For the purposes of this definition,\n \"control\" means (i) the power, direct or indirect, to cause the\n direction or management of such entity, whether by contract or\n otherwise, or (ii) ownership of fifty percent (50%) or more of the\n outstanding shares, or (iii) beneficial ownership of such entity.\n\n \"You\" (or \"Your\") shall mean an individual or Legal Entity\n exercising permissions granted by this License.\n\n \"Source\" form shall mean the preferred form for making modifications,\n including but not limited to software source code, documentation\n source, and configuration files.\n\n \"Object\" form shall mean any form resulting from mechanical\n transformation or translation of a Source form, including but\n not limited to compiled object code, generated documentation,\n and conversions to other media types.\n\n \"Work\" shall mean the work of authorship, whether in Source or\n Object form, made available under the License, as indicated by a\n copyright notice that is included in or attached to the work\n (an example is provided in the Appendix below).\n\n \"Derivative Works\" shall mean any work, whether in Source or Object\n form, that is based on (or derived from) the Work and for which the\n editorial revisions, annotations, elaborations, or other modifications\n represent, as a whole, an original work of authorship. For the purposes\n of this License, Derivative Works shall not include works that remain\n separable from, or merely link (or bind by name) to the interfaces of,\n the Work and Derivative Works thereof.\n\n \"Contribution\" shall mean any work of authorship, including\n the original version of the Work and any modifications or additions\n to that Work or Derivative Works thereof, that is intentionally\n submitted to Licensor for inclusion in the Work by the copyright owner\n or by an individual or Legal Entity authorized to submit on behalf of\n the copyright owner. For the purposes of this definition, \"submitted\"\n means any form of electronic, verbal, or written communication sent\n to the Licensor or its representatives, including but not limited to\n communication on electronic mailing lists, source code control systems,\n and issue tracking systems that are managed by, or on behalf of, the\n Licensor for the purpose of discussing and improving the Work, but\n excluding communication that is conspicuously marked or otherwise\n designated in writing by the copyright owner as \"Not a Contribution.\"\n\n \"Contributor\" shall mean Licensor and any individual or Legal Entity\n on behalf of whom a Contribution has been received by Licensor and\n subsequently incorporated within the Work.\n\n 2. Grant of Copyright License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n copyright license to reproduce, prepare Derivative Works of,\n publicly display, publicly perform, sublicense, and distribute the\n Work and such Derivative Works in Source or Object form.\n\n 3. Grant of Patent License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n (except as stated in this section) patent license to make, have made,\n use, offer to sell, sell, import, and otherwise transfer the Work,\n where such license applies only to those patent claims licensable\n by such Contributor that are necessarily infringed by their\n Contribution(s) alone or by combination of their Contribution(s)\n with the Work to which such Contribution(s) was submitted. If You\n institute patent litigation against any entity (including a\n cross-claim or counterclaim in a lawsuit) alleging that the Work\n or a Contribution incorporated within the Work constitutes direct\n or contributory patent infringement, then any patent licenses\n granted to You under this License for that Work shall terminate\n as of the date such litigation is filed.\n\n 4. Redistribution. You may reproduce and distribute copies of the\n Work or Derivative Works thereof in any medium, with or without\n modifications, and in Source or Object form, provided that You\n meet the following conditions:\n\n (a) You must give any other recipients of the Work or\n Derivative Works a copy of this License; and\n\n (b) You must cause any modified files to carry prominent notices\n stating that You changed the files; and\n\n (c) You must retain, in the Source form of any Derivative Works\n that You distribute, all copyright, patent, trademark, and\n attribution notices from the Source form of the Work,\n excluding those notices that do not pertain to any part of\n the Derivative Works; and\n\n (d) If the Work includes a \"NOTICE\" text file as part of its\n distribution, then any Derivative Works that You distribute must\n include a readable copy of the attribution notices contained\n within such NOTICE file, excluding those notices that do not\n pertain to any part of the Derivative Works, in at least one\n of the following places: within a NOTICE text file distributed\n as part of the Derivative Works; within the Source form or\n documentation, if provided along with the Derivative Works; or,\n within a display generated by the Derivative Works, if and\n wherever such third-party notices normally appear. The contents\n of the NOTICE file are for informational purposes only and\n do not modify the License. You may add Your own attribution\n notices within Derivative Works that You distribute, alongside\n or as an addendum to the NOTICE text from the Work, provided\n that such additional attribution notices cannot be construed\n as modifying the License.\n\n You may add Your own copyright statement to Your modifications and\n may provide additional or different license terms and conditions\n for use, reproduction, or distribution of Your modifications, or\n for any such Derivative Works as a whole, provided Your use,\n reproduction, and distribution of the Work otherwise complies with\n the conditions stated in this License.\n\n 5. Submission of Contributions. Unless You explicitly state otherwise,\n any Contribution intentionally submitted for inclusion in the Work\n by You to the Licensor shall be under the terms and conditions of\n this License, without any additional terms or conditions.\n Notwithstanding the above, nothing herein shall supersede or modify\n the terms of any separate license agreement you may have executed\n with Licensor regarding such Contributions.\n\n 6. Trademarks. This License does not grant permission to use the trade\n names, trademarks, service marks, or product names of the Licensor,\n except as required for reasonable and customary use in describing the\n origin of the Work and reproducing the content of the NOTICE file.\n\n 7. Disclaimer of Warranty. Unless required by applicable law or\n agreed to in writing, Licensor provides the Work (and each\n Contributor provides its Contributions) on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n implied, including, without limitation, any warranties or conditions\n of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n PARTICULAR PURPOSE. You are solely responsible for determining the\n appropriateness of using or redistributing the Work and assume any\n risks associated with Your exercise of permissions under this License.\n\n 8. Limitation of Liability. In no event and under no legal theory,\n whether in tort (including negligence), contract, or otherwise,\n unless required by applicable law (such as deliberate and grossly\n negligent acts) or agreed to in writing, shall any Contributor be\n liable to You for damages, including any direct, indirect, special,\n incidental, or consequential damages of any character arising as a\n result of this License or out of the use or inability to use the\n Work (including but not limited to damages for loss of goodwill,\n work stoppage, computer failure or malfunction, or any and all\n other commercial damages or losses), even if such Contributor\n has been advised of the possibility of such damages.\n\n 9. Accepting Warranty or Additional Liability. While redistributing\n the Work or Derivative Works thereof, You may choose to offer,\n and charge a fee for, acceptance of support, warranty, indemnity,\n or other liability obligations and/or rights consistent with this\n License. However, in accepting such obligations, You may act only\n on Your own behalf and on Your sole responsibility, not on behalf\n of any other Contributor, and only if You agree to indemnify,\n defend, and hold each Contributor harmless for any liability\n incurred by, or claims asserted against, such Contributor by reason\n of your accepting any such warranty or additional liability.\n\n END OF TERMS AND CONDITIONS\n\n APPENDIX: How to apply the Apache License to your work.\n\n To apply the Apache License to your work, attach the following\n boilerplate notice, with the fields enclosed by brackets \"[]\"\n replaced with your own identifying information. (Don't include\n the brackets!) The text should be enclosed in the appropriate\n comment syntax for the file format. We also recommend that a\n file or class name and description of purpose be included on the\n same \"printed page\" as the copyright notice for easier\n identification within third-party archives.\n\n Copyright [yyyy] [name of copyright owner]\n\n Licensed under the Apache License, Version 2.0 (the \"License\");\n you may not use this file except in compliance with the License.\n You may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n See the License for the specific language governing permissions and\n limitations under the License.\n"
},
{
"Name": "bleach",
"Version": "3.1.4",
"Summary": "An easy safelist-based HTML-sanitizing tool.",
"Home-page": "https://github.com/mozilla/bleach",
"License": "Other",
"License URL": "https://api.github.com/repos/mozilla/bleach/license",
"License repo": "Copyright (c) 2014-2017, Mozilla Foundation\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n"
},
{
"Name": "boto3",
"Version": "1.10.9",
"Summary": "The AWS SDK for Python",
"Home-page": "https://github.com/boto/boto3",
"Author": "Amazon Web Services",
"License": "Apache License 2.0",
"License URL": "https://api.github.com/repos/boto/boto3/license",
"License repo": "\n Apache License\n Version 2.0, January 2004\n http://www.apache.org/licenses/\n\n TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n 1. Definitions.\n\n \"License\" shall mean the terms and conditions for use, reproduction,\n and distribution as defined by Sections 1 through 9 of this document.\n\n \"Licensor\" shall mean the copyright owner or entity authorized by\n the copyright owner that is granting the License.\n\n \"Legal Entity\" shall mean the union of the acting entity and all\n other entities that control, are controlled by, or are under common\n control with that entity. For the purposes of this definition,\n \"control\" means (i) the power, direct or indirect, to cause the\n direction or management of such entity, whether by contract or\n otherwise, or (ii) ownership of fifty percent (50%) or more of the\n outstanding shares, or (iii) beneficial ownership of such entity.\n\n \"You\" (or \"Your\") shall mean an individual or Legal Entity\n exercising permissions granted by this License.\n\n \"Source\" form shall mean the preferred form for making modifications,\n including but not limited to software source code, documentation\n source, and configuration files.\n\n \"Object\" form shall mean any form resulting from mechanical\n transformation or translation of a Source form, including but\n not limited to compiled object code, generated documentation,\n and conversions to other media types.\n\n \"Work\" shall mean the work of authorship, whether in Source or\n Object form, made available under the License, as indicated by a\n copyright notice that is included in or attached to the work\n (an example is provided in the Appendix below).\n\n \"Derivative Works\" shall mean any work, whether in Source or Object\n form, that is based on (or derived from) the Work and for which the\n editorial revisions, annotations, elaborations, or other modifications\n represent, as a whole, an original work of authorship. For the purposes\n of this License, Derivative Works shall not include works that remain\n separable from, or merely link (or bind by name) to the interfaces of,\n the Work and Derivative Works thereof.\n\n \"Contribution\" shall mean any work of authorship, including\n the original version of the Work and any modifications or additions\n to that Work or Derivative Works thereof, that is intentionally\n submitted to Licensor for inclusion in the Work by the copyright owner\n or by an individual or Legal Entity authorized to submit on behalf of\n the copyright owner. For the purposes of this definition, \"submitted\"\n means any form of electronic, verbal, or written communication sent\n to the Licensor or its representatives, including but not limited to\n communication on electronic mailing lists, source code control systems,\n and issue tracking systems that are managed by, or on behalf of, the\n Licensor for the purpose of discussing and improving the Work, but\n excluding communication that is conspicuously marked or otherwise\n designated in writing by the copyright owner as \"Not a Contribution.\"\n\n \"Contributor\" shall mean Licensor and any individual or Legal Entity\n on behalf of whom a Contribution has been received by Licensor and\n subsequently incorporated within the Work.\n\n 2. Grant of Copyright License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n copyright license to reproduce, prepare Derivative Works of,\n publicly display, publicly perform, sublicense, and distribute the\n Work and such Derivative Works in Source or Object form.\n\n 3. Grant of Patent License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n (except as stated in this section) patent license to make, have made,\n use, offer to sell, sell, import, and otherwise transfer the Work,\n where such license applies only to those patent claims licensable\n by such Contributor that are necessarily infringed by their\n Contribution(s) alone or by combination of their Contribution(s)\n with the Work to which such Contribution(s) was submitted. If You\n institute patent litigation against any entity (including a\n cross-claim or counterclaim in a lawsuit) alleging that the Work\n or a Contribution incorporated within the Work constitutes direct\n or contributory patent infringement, then any patent licenses\n granted to You under this License for that Work shall terminate\n as of the date such litigation is filed.\n\n 4. Redistribution. You may reproduce and distribute copies of the\n Work or Derivative Works thereof in any medium, with or without\n modifications, and in Source or Object form, provided that You\n meet the following conditions:\n\n (a) You must give any other recipients of the Work or\n Derivative Works a copy of this License; and\n\n (b) You must cause any modified files to carry prominent notices\n stating that You changed the files; and\n\n (c) You must retain, in the Source form of any Derivative Works\n that You distribute, all copyright, patent, trademark, and\n attribution notices from the Source form of the Work,\n excluding those notices that do not pertain to any part of\n the Derivative Works; and\n\n (d) If the Work includes a \"NOTICE\" text file as part of its\n distribution, then any Derivative Works that You distribute must\n include a readable copy of the attribution notices contained\n within such NOTICE file, excluding those notices that do not\n pertain to any part of the Derivative Works, in at least one\n of 
the following places: within a NOTICE text file distributed\n as part of the Derivative Works; within the Source form or\n documentation, if provided along with the Derivative Works; or,\n within a display generated by the Derivative Works, if and\n wherever such third-party notices normally appear. The contents\n of the NOTICE file are for informational purposes only and\n do not modify the License. You may add Your own attribution\n notices within Derivative Works that You distribute, alongside\n or as an addendum to the NOTICE text from the Work, provided\n that such additional attribution notices cannot be construed\n as modifying the License.\n\n You may add Your own copyright statement to Your modifications and\n may provide additional or different license terms and conditions\n for use, reproduction, or distribution of Your modifications, or\n for any such Derivative Works as a whole, provided Your use,\n reproduction, and distribution of the Work otherwise complies with\n the conditions stated in this License.\n\n 5. Submission of Contributions. Unless You explicitly state otherwise,\n any Contribution intentionally submitted for inclusion in the Work\n by You to the Licensor shall be under the terms and conditions of\n this License, without any additional terms or conditions.\n Notwithstanding the above, nothing herein shall supersede or modify\n the terms of any separate license agreement you may have executed\n with Licensor regarding such Contributions.\n\n 6. Trademarks. This License does not grant permission to use the trade\n names, trademarks, service marks, or product names of the Licensor,\n except as required for reasonable and customary use in describing the\n origin of the Work and reproducing the content of the NOTICE file.\n\n 7. Disclaimer of Warranty. 
Unless required by applicable law or\n agreed to in writing, Licensor provides the Work (and each\n Contributor provides its Contributions) on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n implied, including, without limitation, any warranties or conditions\n of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n PARTICULAR PURPOSE. You are solely responsible for determining the\n appropriateness of using or redistributing the Work and assume any\n risks associated with Your exercise of permissions under this License.\n\n 8. Limitation of Liability. In no event and under no legal theory,\n whether in tort (including negligence), contract, or otherwise,\n unless required by applicable law (such as deliberate and grossly\n negligent acts) or agreed to in writing, shall any Contributor be\n liable to You for damages, including any direct, indirect, special,\n incidental, or consequential damages of any character arising as a\n result of this License or out of the use or inability to use the\n Work (including but not limited to damages for loss of goodwill,\n work stoppage, computer failure or malfunction, or any and all\n other commercial damages or losses), even if such Contributor\n has been advised of the possibility of such damages.\n\n 9. Accepting Warranty or Additional Liability. While redistributing\n the Work or Derivative Works thereof, You may choose to offer,\n and charge a fee for, acceptance of support, warranty, indemnity,\n or other liability obligations and/or rights consistent with this\n License. 
However, in accepting such obligations, You may act only\n on Your own behalf and on Your sole responsibility, not on behalf\n of any other Contributor, and only if You agree to indemnify,\n defend, and hold each Contributor harmless for any liability\n incurred by, or claims asserted against, such Contributor by reason\n of your accepting any such warranty or additional liability.\n\n END OF TERMS AND CONDITIONS\n",
"License text": " Apache License\n Version 2.0, January 2004\n http://www.apache.org/licenses/\n\n TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n 1. Definitions.\n\n \"License\" shall mean the terms and conditions for use, reproduction,\n and distribution as defined by Sections 1 through 9 of this document.\n\n \"Licensor\" shall mean the copyright owner or entity authorized by\n the copyright owner that is granting the License.\n\n \"Legal Entity\" shall mean the union of the acting entity and all\n other entities that control, are controlled by, or are under common\n control with that entity. For the purposes of this definition,\n \"control\" means (i) the power, direct or indirect, to cause the\n direction or management of such entity, whether by contract or\n otherwise, or (ii) ownership of fifty percent (50%) or more of the\n outstanding shares, or (iii) beneficial ownership of such entity.\n\n \"You\" (or \"Your\") shall mean an individual or Legal Entity\n exercising permissions granted by this License.\n\n \"Source\" form shall mean the preferred form for making modifications,\n including but not limited to software source code, documentation\n source, and configuration files.\n\n \"Object\" form shall mean any form resulting from mechanical\n transformation or translation of a Source form, including but\n not limited to compiled object code, generated documentation,\n and conversions to other media types.\n\n \"Work\" shall mean the work of authorship, whether in Source or\n Object form, made available under the License, as indicated by a\n copyright notice that is included in or attached to the work\n (an example is provided in the Appendix below).\n\n \"Derivative Works\" shall mean any work, whether in Source or Object\n form, that is based on (or derived from) the Work and for which the\n editorial revisions, annotations, elaborations, or other modifications\n represent, as a whole, an original work of authorship. 
For the purposes\n of this License, Derivative Works shall not include works that remain\n separable from, or merely link (or bind by name) to the interfaces of,\n the Work and Derivative Works thereof.\n\n \"Contribution\" shall mean any work of authorship, including\n the original version of the Work and any modifications or additions\n to that Work or Derivative Works thereof, that is intentionally\n submitted to Licensor for inclusion in the Work by the copyright owner\n or by an individual or Legal Entity authorized to submit on behalf of\n the copyright owner. For the purposes of this definition, \"submitted\"\n means any form of electronic, verbal, or written communication sent\n to the Licensor or its representatives, including but not limited to\n communication on electronic mailing lists, source code control systems,\n and issue tracking systems that are managed by, or on behalf of, the\n Licensor for the purpose of discussing and improving the Work, but\n excluding communication that is conspicuously marked or otherwise\n designated in writing by the copyright owner as \"Not a Contribution.\"\n\n \"Contributor\" shall mean Licensor and any individual or Legal Entity\n on behalf of whom a Contribution has been received by Licensor and\n subsequently incorporated within the Work.\n\n 2. Grant of Copyright License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n copyright license to reproduce, prepare Derivative Works of,\n publicly display, publicly perform, sublicense, and distribute the\n Work and such Derivative Works in Source or Object form.\n\n 3. Grant of Patent License. 
Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n (except as stated in this section) patent license to make, have made,\n use, offer to sell, sell, import, and otherwise transfer the Work,\n where such license applies only to those patent claims licensable\n by such Contributor that are necessarily infringed by their\n Contribution(s) alone or by combination of their Contribution(s)\n with the Work to which such Contribution(s) was submitted. If You\n institute patent litigation against any entity (including a\n cross-claim or counterclaim in a lawsuit) alleging that the Work\n or a Contribution incorporated within the Work constitutes direct\n or contributory patent infringement, then any patent licenses\n granted to You under this License for that Work shall terminate\n as of the date such litigation is filed.\n\n 4. Redistribution. You may reproduce and distribute copies of the\n Work or Derivative Works thereof in any medium, with or without\n modifications, and in Source or Object form, provided that You\n meet the following conditions:\n\n (a) You must give any other recipients of the Work or\n Derivative Works a copy of this License; and\n\n (b) You must cause any modified files to carry prominent notices\n stating that You changed the files; and\n\n (c) You must retain, in the Source form of any Derivative Works\n that You distribute, all copyright, patent, trademark, and\n attribution notices from the Source form of the Work,\n excluding those notices that do not pertain to any part of\n the Derivative Works; and\n\n (d) If the Work includes a \"NOTICE\" text file as part of its\n distribution, then any Derivative Works that You distribute must\n include a readable copy of the attribution notices contained\n within such NOTICE file, excluding those notices that do not\n pertain to any part of the Derivative Works, in at least one\n of 
the following places: within a NOTICE text file distributed\n as part of the Derivative Works; within the Source form or\n documentation, if provided along with the Derivative Works; or,\n within a display generated by the Derivative Works, if and\n wherever such third-party notices normally appear. The contents\n of the NOTICE file are for informational purposes only and\n do not modify the License. You may add Your own attribution\n notices within Derivative Works that You distribute, alongside\n or as an addendum to the NOTICE text from the Work, provided\n that such additional attribution notices cannot be construed\n as modifying the License.\n\n You may add Your own copyright statement to Your modifications and\n may provide additional or different license terms and conditions\n for use, reproduction, or distribution of Your modifications, or\n for any such Derivative Works as a whole, provided Your use,\n reproduction, and distribution of the Work otherwise complies with\n the conditions stated in this License.\n\n 5. Submission of Contributions. Unless You explicitly state otherwise,\n any Contribution intentionally submitted for inclusion in the Work\n by You to the Licensor shall be under the terms and conditions of\n this License, without any additional terms or conditions.\n Notwithstanding the above, nothing herein shall supersede or modify\n the terms of any separate license agreement you may have executed\n with Licensor regarding such Contributions.\n\n 6. Trademarks. This License does not grant permission to use the trade\n names, trademarks, service marks, or product names of the Licensor,\n except as required for reasonable and customary use in describing the\n origin of the Work and reproducing the content of the NOTICE file.\n\n 7. Disclaimer of Warranty. 
Unless required by applicable law or\n agreed to in writing, Licensor provides the Work (and each\n Contributor provides its Contributions) on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n implied, including, without limitation, any warranties or conditions\n of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n PARTICULAR PURPOSE. You are solely responsible for determining the\n appropriateness of using or redistributing the Work and assume any\n risks associated with Your exercise of permissions under this License.\n\n 8. Limitation of Liability. In no event and under no legal theory,\n whether in tort (including negligence), contract, or otherwise,\n unless required by applicable law (such as deliberate and grossly\n negligent acts) or agreed to in writing, shall any Contributor be\n liable to You for damages, including any direct, indirect, special,\n incidental, or consequential damages of any character arising as a\n result of this License or out of the use or inability to use the\n Work (including but not limited to damages for loss of goodwill,\n work stoppage, computer failure or malfunction, or any and all\n other commercial damages or losses), even if such Contributor\n has been advised of the possibility of such damages.\n\n 9. Accepting Warranty or Additional Liability. While redistributing\n the Work or Derivative Works thereof, You may choose to offer,\n and charge a fee for, acceptance of support, warranty, indemnity,\n or other liability obligations and/or rights consistent with this\n License. 
However, in accepting such obligations, You may act only\n on Your own behalf and on Your sole responsibility, not on behalf\n of any other Contributor, and only if You agree to indemnify,\n defend, and hold each Contributor harmless for any liability\n incurred by, or claims asserted against, such Contributor by reason\n of your accepting any such warranty or additional liability.\n\n END OF TERMS AND CONDITIONS\n\n APPENDIX: How to apply the Apache License to your work.\n\n To apply the Apache License to your work, attach the following\n boilerplate notice, with the fields enclosed by brackets \"[]\"\n replaced with your own identifying information. (Don't include\n the brackets!) The text should be enclosed in the appropriate\n comment syntax for the file format. We also recommend that a\n file or class name and description of purpose be included on the\n same \"printed page\" as the copyright notice for easier\n identification within third-party archives.\n\n Copyright [yyyy] [name of copyright owner]\n\n Licensed under the Apache License, Version 2.0 (the \"License\");\n you may not use this file except in compliance with the License.\n You may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n See the License for the specific language governing permissions and\n limitations under the License.\n"
},
{
"Name": "botocore",
"Version": "1.13.9",
"Summary": "Low-level, data-driven core of boto 3.",
"Home-page": "https://github.com/boto/botocore",
"Author": "Amazon Web Services",
"License": "Apache License 2.0",
"License URL": "https://api.github.com/repos/boto/botocore/license",
"License repo": "\n Apache License\n Version 2.0, January 2004\n http://www.apache.org/licenses/\n\n TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n 1. Definitions.\n\n \"License\" shall mean the terms and conditions for use, reproduction,\n and distribution as defined by Sections 1 through 9 of this document.\n\n \"Licensor\" shall mean the copyright owner or entity authorized by\n the copyright owner that is granting the License.\n\n \"Legal Entity\" shall mean the union of the acting entity and all\n other entities that control, are controlled by, or are under common\n control with that entity. For the purposes of this definition,\n \"control\" means (i) the power, direct or indirect, to cause the\n direction or management of such entity, whether by contract or\n otherwise, or (ii) ownership of fifty percent (50%) or more of the\n outstanding shares, or (iii) beneficial ownership of such entity.\n\n \"You\" (or \"Your\") shall mean an individual or Legal Entity\n exercising permissions granted by this License.\n\n \"Source\" form shall mean the preferred form for making modifications,\n including but not limited to software source code, documentation\n source, and configuration files.\n\n \"Object\" form shall mean any form resulting from mechanical\n transformation or translation of a Source form, including but\n not limited to compiled object code, generated documentation,\n and conversions to other media types.\n\n \"Work\" shall mean the work of authorship, whether in Source or\n Object form, made available under the License, as indicated by a\n copyright notice that is included in or attached to the work\n (an example is provided in the Appendix below).\n\n \"Derivative Works\" shall mean any work, whether in Source or Object\n form, that is based on (or derived from) the Work and for which the\n editorial revisions, annotations, elaborations, or other modifications\n represent, as a whole, an original work of authorship. 
For the purposes\n of this License, Derivative Works shall not include works that remain\n separable from, or merely link (or bind by name) to the interfaces of,\n the Work and Derivative Works thereof.\n\n \"Contribution\" shall mean any work of authorship, including\n the original version of the Work and any modifications or additions\n to that Work or Derivative Works thereof, that is intentionally\n submitted to Licensor for inclusion in the Work by the copyright owner\n or by an individual or Legal Entity authorized to submit on behalf of\n the copyright owner. For the purposes of this definition, \"submitted\"\n means any form of electronic, verbal, or written communication sent\n to the Licensor or its representatives, including but not limited to\n communication on electronic mailing lists, source code control systems,\n and issue tracking systems that are managed by, or on behalf of, the\n Licensor for the purpose of discussing and improving the Work, but\n excluding communication that is conspicuously marked or otherwise\n designated in writing by the copyright owner as \"Not a Contribution.\"\n\n \"Contributor\" shall mean Licensor and any individual or Legal Entity\n on behalf of whom a Contribution has been received by Licensor and\n subsequently incorporated within the Work.\n\n 2. Grant of Copyright License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n copyright license to reproduce, prepare Derivative Works of,\n publicly display, publicly perform, sublicense, and distribute the\n Work and such Derivative Works in Source or Object form.\n\n 3. Grant of Patent License. 
Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n (except as stated in this section) patent license to make, have made,\n use, offer to sell, sell, import, and otherwise transfer the Work,\n where such license applies only to those patent claims licensable\n by such Contributor that are necessarily infringed by their\n Contribution(s) alone or by combination of their Contribution(s)\n with the Work to which such Contribution(s) was submitted. If You\n institute patent litigation against any entity (including a\n cross-claim or counterclaim in a lawsuit) alleging that the Work\n or a Contribution incorporated within the Work constitutes direct\n or contributory patent infringement, then any patent licenses\n granted to You under this License for that Work shall terminate\n as of the date such litigation is filed.\n\n 4. Redistribution. You may reproduce and distribute copies of the\n Work or Derivative Works thereof in any medium, with or without\n modifications, and in Source or Object form, provided that You\n meet the following conditions:\n\n (a) You must give any other recipients of the Work or\n Derivative Works a copy of this License; and\n\n (b) You must cause any modified files to carry prominent notices\n stating that You changed the files; and\n\n (c) You must retain, in the Source form of any Derivative Works\n that You distribute, all copyright, patent, trademark, and\n attribution notices from the Source form of the Work,\n excluding those notices that do not pertain to any part of\n the Derivative Works; and\n\n (d) If the Work includes a \"NOTICE\" text file as part of its\n distribution, then any Derivative Works that You distribute must\n include a readable copy of the attribution notices contained\n within such NOTICE file, excluding those notices that do not\n pertain to any part of the Derivative Works, in at least one\n of 
the following places: within a NOTICE text file distributed\n as part of the Derivative Works; within the Source form or\n documentation, if provided along with the Derivative Works; or,\n within a display generated by the Derivative Works, if and\n wherever such third-party notices normally appear. The contents\n of the NOTICE file are for informational purposes only and\n do not modify the License. You may add Your own attribution\n notices within Derivative Works that You distribute, alongside\n or as an addendum to the NOTICE text from the Work, provided\n that such additional attribution notices cannot be construed\n as modifying the License.\n\n You may add Your own copyright statement to Your modifications and\n may provide additional or different license terms and conditions\n for use, reproduction, or distribution of Your modifications, or\n for any such Derivative Works as a whole, provided Your use,\n reproduction, and distribution of the Work otherwise complies with\n the conditions stated in this License.\n\n 5. Submission of Contributions. Unless You explicitly state otherwise,\n any Contribution intentionally submitted for inclusion in the Work\n by You to the Licensor shall be under the terms and conditions of\n this License, without any additional terms or conditions.\n Notwithstanding the above, nothing herein shall supersede or modify\n the terms of any separate license agreement you may have executed\n with Licensor regarding such Contributions.\n\n 6. Trademarks. This License does not grant permission to use the trade\n names, trademarks, service marks, or product names of the Licensor,\n except as required for reasonable and customary use in describing the\n origin of the Work and reproducing the content of the NOTICE file.\n\n 7. Disclaimer of Warranty. 
Unless required by applicable law or\n agreed to in writing, Licensor provides the Work (and each\n Contributor provides its Contributions) on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n implied, including, without limitation, any warranties or conditions\n of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n PARTICULAR PURPOSE. You are solely responsible for determining the\n appropriateness of using or redistributing the Work and assume any\n risks associated with Your exercise of permissions under this License.\n\n 8. Limitation of Liability. In no event and under no legal theory,\n whether in tort (including negligence), contract, or otherwise,\n unless required by applicable law (such as deliberate and grossly\n negligent acts) or agreed to in writing, shall any Contributor be\n liable to You for damages, including any direct, indirect, special,\n incidental, or consequential damages of any character arising as a\n result of this License or out of the use or inability to use the\n Work (including but not limited to damages for loss of goodwill,\n work stoppage, computer failure or malfunction, or any and all\n other commercial damages or losses), even if such Contributor\n has been advised of the possibility of such damages.\n\n 9. Accepting Warranty or Additional Liability. While redistributing\n the Work or Derivative Works thereof, You may choose to offer,\n and charge a fee for, acceptance of support, warranty, indemnity,\n or other liability obligations and/or rights consistent with this\n License. 
However, in accepting such obligations, You may act only\n on Your own behalf and on Your sole responsibility, not on behalf\n of any other Contributor, and only if You agree to indemnify,\n defend, and hold each Contributor harmless for any liability\n incurred by, or claims asserted against, such Contributor by reason\n of your accepting any such warranty or additional liability.\n\n END OF TERMS AND CONDITIONS\n",
"License text": " Apache License\n Version 2.0, January 2004\n http://www.apache.org/licenses/\n\n TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n 1. Definitions.\n\n \"License\" shall mean the terms and conditions for use, reproduction,\n and distribution as defined by Sections 1 through 9 of this document.\n\n \"Licensor\" shall mean the copyright owner or entity authorized by\n the copyright owner that is granting the License.\n\n \"Legal Entity\" shall mean the union of the acting entity and all\n other entities that control, are controlled by, or are under common\n control with that entity. For the purposes of this definition,\n \"control\" means (i) the power, direct or indirect, to cause the\n direction or management of such entity, whether by contract or\n otherwise, or (ii) ownership of fifty percent (50%) or more of the\n outstanding shares, or (iii) beneficial ownership of such entity.\n\n \"You\" (or \"Your\") shall mean an individual or Legal Entity\n exercising permissions granted by this License.\n\n \"Source\" form shall mean the preferred form for making modifications,\n including but not limited to software source code, documentation\n source, and configuration files.\n\n \"Object\" form shall mean any form resulting from mechanical\n transformation or translation of a Source form, including but\n not limited to compiled object code, generated documentation,\n and conversions to other media types.\n\n \"Work\" shall mean the work of authorship, whether in Source or\n Object form, made available under the License, as indicated by a\n copyright notice that is included in or attached to the work\n (an example is provided in the Appendix below).\n\n \"Derivative Works\" shall mean any work, whether in Source or Object\n form, that is based on (or derived from) the Work and for which the\n editorial revisions, annotations, elaborations, or other modifications\n represent, as a whole, an original work of authorship. 
For the purposes\n of this License, Derivative Works shall not include works that remain\n separable from, or merely link (or bind by name) to the interfaces of,\n the Work and Derivative Works thereof.\n\n \"Contribution\" shall mean any work of authorship, including\n the original version of the Work and any modifications or additions\n to that Work or Derivative Works thereof, that is intentionally\n submitted to Licensor for inclusion in the Work by the copyright owner\n or by an individual or Legal Entity authorized to submit on behalf of\n the copyright owner. For the purposes of this definition, \"submitted\"\n means any form of electronic, verbal, or written communication sent\n to the Licensor or its representatives, including but not limited to\n communication on electronic mailing lists, source code control systems,\n and issue tracking systems that are managed by, or on behalf of, the\n Licensor for the purpose of discussing and improving the Work, but\n excluding communication that is conspicuously marked or otherwise\n designated in writing by the copyright owner as \"Not a Contribution.\"\n\n \"Contributor\" shall mean Licensor and any individual or Legal Entity\n on behalf of whom a Contribution has been received by Licensor and\n subsequently incorporated within the Work.\n\n 2. Grant of Copyright License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n copyright license to reproduce, prepare Derivative Works of,\n publicly display, publicly perform, sublicense, and distribute the\n Work and such Derivative Works in Source or Object form.\n\n 3. Grant of Patent License. 
Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n (except as stated in this section) patent license to make, have made,\n use, offer to sell, sell, import, and otherwise transfer the Work,\n where such license applies only to those patent claims licensable\n by such Contributor that are necessarily infringed by their\n Contribution(s) alone or by combination of their Contribution(s)\n with the Work to which such Contribution(s) was submitted. If You\n institute patent litigation against any entity (including a\n cross-claim or counterclaim in a lawsuit) alleging that the Work\n or a Contribution incorporated within the Work constitutes direct\n or contributory patent infringement, then any patent licenses\n granted to You under this License for that Work shall terminate\n as of the date such litigation is filed.\n\n 4. Redistribution. You may reproduce and distribute copies of the\n Work or Derivative Works thereof in any medium, with or without\n modifications, and in Source or Object form, provided that You\n meet the following conditions:\n\n (a) You must give any other recipients of the Work or\n Derivative Works a copy of this License; and\n\n (b) You must cause any modified files to carry prominent notices\n stating that You changed the files; and\n\n (c) You must retain, in the Source form of any Derivative Works\n that You distribute, all copyright, patent, trademark, and\n attribution notices from the Source form of the Work,\n excluding those notices that do not pertain to any part of\n the Derivative Works; and\n\n (d) If the Work includes a \"NOTICE\" text file as part of its\n distribution, then any Derivative Works that You distribute must\n include a readable copy of the attribution notices contained\n within such NOTICE file, excluding those notices that do not\n pertain to any part of the Derivative Works, in at least one\n of 
the following places: within a NOTICE text file distributed\n as part of the Derivative Works; within the Source form or\n documentation, if provided along with the Derivative Works; or,\n within a display generated by the Derivative Works, if and\n wherever such third-party notices normally appear. The contents\n of the NOTICE file are for informational purposes only and\n do not modify the License. You may add Your own attribution\n notices within Derivative Works that You distribute, alongside\n or as an addendum to the NOTICE text from the Work, provided\n that such additional attribution notices cannot be construed\n as modifying the License.\n\n You may add Your own copyright statement to Your modifications and\n may provide additional or different license terms and conditions\n for use, reproduction, or distribution of Your modifications, or\n for any such Derivative Works as a whole, provided Your use,\n reproduction, and distribution of the Work otherwise complies with\n the conditions stated in this License.\n\n 5. Submission of Contributions. Unless You explicitly state otherwise,\n any Contribution intentionally submitted for inclusion in the Work\n by You to the Licensor shall be under the terms and conditions of\n this License, without any additional terms or conditions.\n Notwithstanding the above, nothing herein shall supersede or modify\n the terms of any separate license agreement you may have executed\n with Licensor regarding such Contributions.\n\n 6. Trademarks. This License does not grant permission to use the trade\n names, trademarks, service marks, or product names of the Licensor,\n except as required for reasonable and customary use in describing the\n origin of the Work and reproducing the content of the NOTICE file.\n\n 7. Disclaimer of Warranty. 
Unless required by applicable law or\n agreed to in writing, Licensor provides the Work (and each\n Contributor provides its Contributions) on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n implied, including, without limitation, any warranties or conditions\n of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n PARTICULAR PURPOSE. You are solely responsible for determining the\n appropriateness of using or redistributing the Work and assume any\n risks associated with Your exercise of permissions under this License.\n\n 8. Limitation of Liability. In no event and under no legal theory,\n whether in tort (including negligence), contract, or otherwise,\n unless required by applicable law (such as deliberate and grossly\n negligent acts) or agreed to in writing, shall any Contributor be\n liable to You for damages, including any direct, indirect, special,\n incidental, or consequential damages of any character arising as a\n result of this License or out of the use or inability to use the\n Work (including but not limited to damages for loss of goodwill,\n work stoppage, computer failure or malfunction, or any and all\n other commercial damages or losses), even if such Contributor\n has been advised of the possibility of such damages.\n\n 9. Accepting Warranty or Additional Liability. While redistributing\n the Work or Derivative Works thereof, You may choose to offer,\n and charge a fee for, acceptance of support, warranty, indemnity,\n or other liability obligations and/or rights consistent with this\n License. 
However, in accepting such obligations, You may act only\n on Your own behalf and on Your sole responsibility, not on behalf\n of any other Contributor, and only if You agree to indemnify,\n defend, and hold each Contributor harmless for any liability\n incurred by, or claims asserted against, such Contributor by reason\n of your accepting any such warranty or additional liability.\n\n END OF TERMS AND CONDITIONS\n\n APPENDIX: How to apply the Apache License to your work.\n\n To apply the Apache License to your work, attach the following\n boilerplate notice, with the fields enclosed by brackets \"[]\"\n replaced with your own identifying information. (Don't include\n the brackets!) The text should be enclosed in the appropriate\n comment syntax for the file format. We also recommend that a\n file or class name and description of purpose be included on the\n same \"printed page\" as the copyright notice for easier\n identification within third-party archives.\n\n Copyright [yyyy] [name of copyright owner]\n\n Licensed under the Apache License, Version 2.0 (the \"License\");\n you may not use this file except in compliance with the License.\n You may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n See the License for the specific language governing permissions and\n limitations under the License.\n"
},
{
"Name": "certifi",
"Version": "2019.9.11",
"Summary": "Python package for providing Mozilla's CA Bundle.",
"Home-page": "https://certifi.io/",
"Author": "Kenneth Reitz",
"License": "MPL-2.0"
},
{
"Name": "cffi",
"Version": "1.13.2",
"Summary": "Foreign Function Interface for Python calling C code.",
"Home-page": "http://cffi.readthedocs.org",
"Author": "Armin Rigo, Maciej Fijalkowski",
"License": "MIT"
},
{
"Name": "chardet",
"Version": "3.0.4",
"Summary": "Universal encoding detector for Python 2 and 3",
"Home-page": "https://github.com/chardet/chardet",
"Author": "Daniel Blanchard",
"License": "GNU Lesser General Public License v2.1",
"License URL": "https://api.github.com/repos/chardet/chardet/license",
"License repo": "\t\t GNU LESSER GENERAL PUBLIC LICENSE\n\t\t Version 2.1, February 1999\n\n Copyright (C) 1991, 1999 Free Software Foundation, Inc.\n 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA\n Everyone is permitted to copy and distribute verbatim copies\n of this license document, but changing it is not allowed.\n\n[This is the first released version of the Lesser GPL. It also counts\n as the successor of the GNU Library Public License, version 2, hence\n the version number 2.1.]\n\n\t\t\t Preamble\n\n The licenses for most software are designed to take away your\nfreedom to share and change it. By contrast, the GNU General Public\nLicenses are intended to guarantee your freedom to share and change\nfree software--to make sure the software is free for all its users.\n\n This license, the Lesser General Public License, applies to some\nspecially designated software packages--typically libraries--of the\nFree Software Foundation and other authors who decide to use it. You\ncan use it too, but we suggest you first think carefully about whether\nthis license or the ordinary General Public License is the better\nstrategy to use in any particular case, based on the explanations below.\n\n When we speak of free software, we are referring to freedom of use,\nnot price. Our General Public Licenses are designed to make sure that\nyou have the freedom to distribute copies of free software (and charge\nfor this service if you wish); that you receive source code or can get\nit if you want it; that you can change the software and use pieces of\nit in new free programs; and that you are informed that you can do\nthese things.\n\n To protect your rights, we need to make restrictions that forbid\ndistributors to deny you these rights or to ask you to surrender these\nrights. 
These restrictions translate to certain responsibilities for\nyou if you distribute copies of the library or if you modify it.\n\n For example, if you distribute copies of the library, whether gratis\nor for a fee, you must give the recipients all the rights that we gave\nyou. You must make sure that they, too, receive or can get the source\ncode. If you link other code with the library, you must provide\ncomplete object files to the recipients, so that they can relink them\nwith the library after making changes to the library and recompiling\nit. And you must show them these terms so they know their rights.\n\n We protect your rights with a two-step method: (1) we copyright the\nlibrary, and (2) we offer you this license, which gives you legal\npermission to copy, distribute and/or modify the library.\n\n To protect each distributor, we want to make it very clear that\nthere is no warranty for the free library. Also, if the library is\nmodified by someone else and passed on, the recipients should know\nthat what they have is not the original version, so that the original\nauthor's reputation will not be affected by problems that might be\nintroduced by others.\n\f\n Finally, software patents pose a constant threat to the existence of\nany free program. We wish to make sure that a company cannot\neffectively restrict the users of a free program by obtaining a\nrestrictive license from a patent holder. Therefore, we insist that\nany patent license obtained for a version of the library must be\nconsistent with the full freedom of use specified in this license.\n\n Most GNU software, including some libraries, is covered by the\nordinary GNU General Public License. This license, the GNU Lesser\nGeneral Public License, applies to certain designated libraries, and\nis quite different from the ordinary General Public License. 
We use\nthis license for certain libraries in order to permit linking those\nlibraries into non-free programs.\n\n When a program is linked with a library, whether statically or using\na shared library, the combination of the two is legally speaking a\ncombined work, a derivative of the original library. The ordinary\nGeneral Public License therefore permits such linking only if the\nentire combination fits its criteria of freedom. The Lesser General\nPublic License permits more lax criteria for linking other code with\nthe library.\n\n We call this license the \"Lesser\" General Public License because it\ndoes Less to protect the user's freedom than the ordinary General\nPublic License. It also provides other free software developers Less\nof an advantage over competing non-free programs. These disadvantages\nare the reason we use the ordinary General Public License for many\nlibraries. However, the Lesser license provides advantages in certain\nspecial circumstances.\n\n For example, on rare occasions, there may be a special need to\nencourage the widest possible use of a certain library, so that it becomes\na de-facto standard. To achieve this, non-free programs must be\nallowed to use the library. A more frequent case is that a free\nlibrary does the same job as widely used non-free libraries. In this\ncase, there is little to gain by limiting the free library to free\nsoftware only, so we use the Lesser General Public License.\n\n In other cases, permission to use a particular library in non-free\nprograms enables a greater number of people to use a large body of\nfree software. 
For example, permission to use the GNU C Library in\nnon-free programs enables many more people to use the whole GNU\noperating system, as well as its variant, the GNU/Linux operating\nsystem.\n\n Although the Lesser General Public License is Less protective of the\nusers' freedom, it does ensure that the user of a program that is\nlinked with the Library has the freedom and the wherewithal to run\nthat program using a modified version of the Library.\n\n The precise terms and conditions for copying, distribution and\nmodification follow. Pay close attention to the difference between a\n\"work based on the library\" and a \"work that uses the library\". The\nformer contains code derived from the library, whereas the latter must\nbe combined with the library in order to run.\n\f\n\t\t GNU LESSER GENERAL PUBLIC LICENSE\n TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION\n\n 0. This License Agreement applies to any software library or other\nprogram which contains a notice placed by the copyright holder or\nother authorized party saying it may be distributed under the terms of\nthis Lesser General Public License (also called \"this License\").\nEach licensee is addressed as \"you\".\n\n A \"library\" means a collection of software functions and/or data\nprepared so as to be conveniently linked with application programs\n(which use some of those functions and data) to form executables.\n\n The \"Library\", below, refers to any such software library or work\nwhich has been distributed under these terms. A \"work based on the\nLibrary\" means either the Library or any derivative work under\ncopyright law: that is to say, a work containing the Library or a\nportion of it, either verbatim or with modifications and/or translated\nstraightforwardly into another language. (Hereinafter, translation is\nincluded without limitation in the term \"modification\".)\n\n \"Source code\" for a work means the preferred form of the work for\nmaking modifications to it. 
For a library, complete source code means\nall the source code for all modules it contains, plus any associated\ninterface definition files, plus the scripts used to control compilation\nand installation of the library.\n\n Activities other than copying, distribution and modification are not\ncovered by this License; they are outside its scope. The act of\nrunning a program using the Library is not restricted, and output from\nsuch a program is covered only if its contents constitute a work based\non the Library (independent of the use of the Library in a tool for\nwriting it). Whether that is true depends on what the Library does\nand what the program that uses the Library does.\n \n 1. You may copy and distribute verbatim copies of the Library's\ncomplete source code as you receive it, in any medium, provided that\nyou conspicuously and appropriately publish on each copy an\nappropriate copyright notice and disclaimer of warranty; keep intact\nall the notices that refer to this License and to the absence of any\nwarranty; and distribute a copy of this License along with the\nLibrary.\n\n You may charge a fee for the physical act of transferring a copy,\nand you may at your option offer warranty protection in exchange for a\nfee.\n\f\n 2. 
You may modify your copy or copies of the Library or any portion\nof it, thus forming a work based on the Library, and copy and\ndistribute such modifications or work under the terms of Section 1\nabove, provided that you also meet all of these conditions:\n\n a) The modified work must itself be a software library.\n\n b) You must cause the files modified to carry prominent notices\n stating that you changed the files and the date of any change.\n\n c) You must cause the whole of the work to be licensed at no\n charge to all third parties under the terms of this License.\n\n d) If a facility in the modified Library refers to a function or a\n table of data to be supplied by an application program that uses\n the facility, other than as an argument passed when the facility\n is invoked, then you must make a good faith effort to ensure that,\n in the event an application does not supply such function or\n table, the facility still operates, and performs whatever part of\n its purpose remains meaningful.\n\n (For example, a function in a library to compute square roots has\n a purpose that is entirely well-defined independent of the\n application. Therefore, Subsection 2d requires that any\n application-supplied function or table used by this function must\n be optional: if the application does not supply it, the square\n root function must still compute square roots.)\n\nThese requirements apply to the modified work as a whole. If\nidentifiable sections of that work are not derived from the Library,\nand can be reasonably considered independent and separate works in\nthemselves, then this License, and its terms, do not apply to those\nsections when you distribute them as separate works. 
But when you\ndistribute the same sections as part of a whole which is a work based\non the Library, the distribution of the whole must be on the terms of\nthis License, whose permissions for other licensees extend to the\nentire whole, and thus to each and every part regardless of who wrote\nit.\n\nThus, it is not the intent of this section to claim rights or contest\nyour rights to work written entirely by you; rather, the intent is to\nexercise the right to control the distribution of derivative or\ncollective works based on the Library.\n\nIn addition, mere aggregation of another work not based on the Library\nwith the Library (or with a work based on the Library) on a volume of\na storage or distribution medium does not bring the other work under\nthe scope of this License.\n\n 3. You may opt to apply the terms of the ordinary GNU General Public\nLicense instead of this License to a given copy of the Library. To do\nthis, you must alter all the notices that refer to this License, so\nthat they refer to the ordinary GNU General Public License, version 2,\ninstead of to this License. (If a newer version than version 2 of the\nordinary GNU General Public License has appeared, then you can specify\nthat version instead if you wish.) Do not make any other change in\nthese notices.\n\f\n Once this change is made in a given copy, it is irreversible for\nthat copy, so the ordinary GNU General Public License applies to all\nsubsequent copies and derivative works made from that copy.\n\n This option is useful when you wish to copy part of the code of\nthe Library into a program that is not a library.\n\n 4. 
You may copy and distribute the Library (or a portion or\nderivative of it, under Section 2) in object code or executable form\nunder the terms of Sections 1 and 2 above provided that you accompany\nit with the complete corresponding machine-readable source code, which\nmust be distributed under the terms of Sections 1 and 2 above on a\nmedium customarily used for software interchange.\n\n If distribution of object code is made by offering access to copy\nfrom a designated place, then offering equivalent access to copy the\nsource code from the same place satisfies the requirement to\ndistribute the source code, even though third parties are not\ncompelled to copy the source along with the object code.\n\n 5. A program that contains no derivative of any portion of the\nLibrary, but is designed to work with the Library by being compiled or\nlinked with it, is called a \"work that uses the Library\". Such a\nwork, in isolation, is not a derivative work of the Library, and\ntherefore falls outside the scope of this License.\n\n However, linking a \"work that uses the Library\" with the Library\ncreates an executable that is a derivative of the Library (because it\ncontains portions of the Library), rather than a \"work that uses the\nlibrary\". The executable is therefore covered by this License.\nSection 6 states terms for distribution of such executables.\n\n When a \"work that uses the Library\" uses material from a header file\nthat is part of the Library, the object code for the work may be a\nderivative work of the Library even though the source code is not.\nWhether this is true is especially significant if the work can be\nlinked without the Library, or if the work is itself a library. 
The\nthreshold for this to be true is not precisely defined by law.\n\n If such an object file uses only numerical parameters, data\nstructure layouts and accessors, and small macros and small inline\nfunctions (ten lines or less in length), then the use of the object\nfile is unrestricted, regardless of whether it is legally a derivative\nwork. (Executables containing this object code plus portions of the\nLibrary will still fall under Section 6.)\n\n Otherwise, if the work is a derivative of the Library, you may\ndistribute the object code for the work under the terms of Section 6.\nAny executables containing that work also fall under Section 6,\nwhether or not they are linked directly with the Library itself.\n\f\n 6. As an exception to the Sections above, you may also combine or\nlink a \"work that uses the Library\" with the Library to produce a\nwork containing portions of the Library, and distribute that work\nunder terms of your choice, provided that the terms permit\nmodification of the work for the customer's own use and reverse\nengineering for debugging such modifications.\n\n You must give prominent notice with each copy of the work that the\nLibrary is used in it and that the Library and its use are covered by\nthis License. You must supply a copy of this License. If the work\nduring execution displays copyright notices, you must include the\ncopyright notice for the Library among them, as well as a reference\ndirecting the user to the copy of this License. 
Also, you must do one\nof these things:\n\n a) Accompany the work with the complete corresponding\n machine-readable source code for the Library including whatever\n changes were used in the work (which must be distributed under\n Sections 1 and 2 above); and, if the work is an executable linked\n with the Library, with the complete machine-readable \"work that\n uses the Library\", as object code and/or source code, so that the\n user can modify the Library and then relink to produce a modified\n executable containing the modified Library. (It is understood\n that the user who changes the contents of definitions files in the\n Library will not necessarily be able to recompile the application\n to use the modified definitions.)\n\n b) Use a suitable shared library mechanism for linking with the\n Library. A suitable mechanism is one that (1) uses at run time a\n copy of the library already present on the user's computer system,\n rather than copying library functions into the executable, and (2)\n will operate properly with a modified version of the library, if\n the user installs one, as long as the modified version is\n interface-compatible with the version that the work was made with.\n\n c) Accompany the work with a written offer, valid for at\n least three years, to give the same user the materials\n specified in Subsection 6a, above, for a charge no more\n than the cost of performing this distribution.\n\n d) If distribution of the work is made by offering access to copy\n from a designated place, offer equivalent access to copy the above\n specified materials from the same place.\n\n e) Verify that the user has already received a copy of these\n materials or that you have already sent this user a copy.\n\n For an executable, the required form of the \"work that uses the\nLibrary\" must include any data and utility programs needed for\nreproducing the executable from it. 
However, as a special exception,\nthe materials to be distributed need not include anything that is\nnormally distributed (in either source or binary form) with the major\ncomponents (compiler, kernel, and so on) of the operating system on\nwhich the executable runs, unless that component itself accompanies\nthe executable.\n\n It may happen that this requirement contradicts the license\nrestrictions of other proprietary libraries that do not normally\naccompany the operating system. Such a contradiction means you cannot\nuse both them and the Library together in an executable that you\ndistribute.\n\f\n 7. You may place library facilities that are a work based on the\nLibrary side-by-side in a single library together with other library\nfacilities not covered by this License, and distribute such a combined\nlibrary, provided that the separate distribution of the work based on\nthe Library and of the other library facilities is otherwise\npermitted, and provided that you do these two things:\n\n a) Accompany the combined library with a copy of the same work\n based on the Library, uncombined with any other library\n facilities. This must be distributed under the terms of the\n Sections above.\n\n b) Give prominent notice with the combined library of the fact\n that part of it is a work based on the Library, and explaining\n where to find the accompanying uncombined form of the same work.\n\n 8. You may not copy, modify, sublicense, link with, or distribute\nthe Library except as expressly provided under this License. Any\nattempt otherwise to copy, modify, sublicense, link with, or\ndistribute the Library is void, and will automatically terminate your\nrights under this License. However, parties who have received copies,\nor rights, from you under this License will not have their licenses\nterminated so long as such parties remain in full compliance.\n\n 9. You are not required to accept this License, since you have not\nsigned it. 
However, nothing else grants you permission to modify or\ndistribute the Library or its derivative works. These actions are\nprohibited by law if you do not accept this License. Therefore, by\nmodifying or distributing the Library (or any work based on the\nLibrary), you indicate your acceptance of this License to do so, and\nall its terms and conditions for copying, distributing or modifying\nthe Library or works based on it.\n\n 10. Each time you redistribute the Library (or any work based on the\nLibrary), the recipient automatically receives a license from the\noriginal licensor to copy, distribute, link with or modify the Library\nsubject to these terms and conditions. You may not impose any further\nrestrictions on the recipients' exercise of the rights granted herein.\nYou are not responsible for enforcing compliance by third parties with\nthis License.\n\f\n 11. If, as a consequence of a court judgment or allegation of patent\ninfringement or for any other reason (not limited to patent issues),\nconditions are imposed on you (whether by court order, agreement or\notherwise) that contradict the conditions of this License, they do not\nexcuse you from the conditions of this License. If you cannot\ndistribute so as to satisfy simultaneously your obligations under this\nLicense and any other pertinent obligations, then as a consequence you\nmay not distribute the Library at all. 
For example, if a patent\nlicense would not permit royalty-free redistribution of the Library by\nall those who receive copies directly or indirectly through you, then\nthe only way you could satisfy both it and this License would be to\nrefrain entirely from distribution of the Library.\n\nIf any portion of this section is held invalid or unenforceable under any\nparticular circumstance, the balance of the section is intended to apply,\nand the section as a whole is intended to apply in other circumstances.\n\nIt is not the purpose of this section to induce you to infringe any\npatents or other property right claims or to contest validity of any\nsuch claims; this section has the sole purpose of protecting the\nintegrity of the free software distribution system which is\nimplemented by public license practices. Many people have made\ngenerous contributions to the wide range of software distributed\nthrough that system in reliance on consistent application of that\nsystem; it is up to the author/donor to decide if he or she is willing\nto distribute software through any other system and a licensee cannot\nimpose that choice.\n\nThis section is intended to make thoroughly clear what is believed to\nbe a consequence of the rest of this License.\n\n 12. If the distribution and/or use of the Library is restricted in\ncertain countries either by patents or by copyrighted interfaces, the\noriginal copyright holder who places the Library under this License may add\nan explicit geographical distribution limitation excluding those countries,\nso that distribution is permitted only in or among countries not thus\nexcluded. In such case, this License incorporates the limitation as if\nwritten in the body of this License.\n\n 13. 
The Free Software Foundation may publish revised and/or new\nversions of the Lesser General Public License from time to time.\nSuch new versions will be similar in spirit to the present version,\nbut may differ in detail to address new problems or concerns.\n\nEach version is given a distinguishing version number. If the Library\nspecifies a version number of this License which applies to it and\n\"any later version\", you have the option of following the terms and\nconditions either of that version or of any later version published by\nthe Free Software Foundation. If the Library does not specify a\nlicense version number, you may choose any version ever published by\nthe Free Software Foundation.\n\f\n 14. If you wish to incorporate parts of the Library into other free\nprograms whose distribution conditions are incompatible with these,\nwrite to the author to ask for permission. For software which is\ncopyrighted by the Free Software Foundation, write to the Free\nSoftware Foundation; we sometimes make exceptions for this. Our\ndecision will be guided by the two goals of preserving the free status\nof all derivatives of our free software and of promoting the sharing\nand reuse of software generally.\n\n\t\t\t NO WARRANTY\n\n 15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO\nWARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW.\nEXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR\nOTHER PARTIES PROVIDE THE LIBRARY \"AS IS\" WITHOUT WARRANTY OF ANY\nKIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE\nIMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR\nPURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE\nLIBRARY IS WITH YOU. SHOULD THE LIBRARY PROVE DEFECTIVE, YOU ASSUME\nTHE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.\n\n 16. 
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN\nWRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY\nAND/OR REDISTRIBUTE THE LIBRARY AS PERMITTED ABOVE, BE LIABLE TO YOU\nFOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR\nCONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE\nLIBRARY (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING\nRENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A\nFAILURE OF THE LIBRARY TO OPERATE WITH ANY OTHER SOFTWARE), EVEN IF\nSUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH\nDAMAGES.\n\n\t\t END OF TERMS AND CONDITIONS\n\f\n How to Apply These Terms to Your New Libraries\n\n If you develop a new library, and you want it to be of the greatest\npossible use to the public, we recommend making it free software that\neveryone can redistribute and change. You can do so by permitting\nredistribution under these terms (or, alternatively, under the terms of the\nordinary General Public License).\n\n To apply these terms, attach the following notices to the library. It is\nsafest to attach them to the start of each source file to most effectively\nconvey the exclusion of warranty; and each file should have at least the\n\"copyright\" line and a pointer to where the full notice is found.\n\n <one line to give the library's name and a brief idea of what it does.>\n Copyright (C) <year> <name of author>\n\n This library is free software; you can redistribute it and/or\n modify it under the terms of the GNU Lesser General Public\n License as published by the Free Software Foundation; either\n version 2.1 of the License, or (at your option) any later version.\n\n This library is distributed in the hope that it will be useful,\n but WITHOUT ANY WARRANTY; without even the implied warranty of\n MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the GNU\n Lesser General Public License for more details.\n\n You should have received a copy of the GNU Lesser General Public\n License along with this library; if not, write to the Free Software\n Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA\n\nAlso add information on how to contact you by electronic and paper mail.\n\nYou should also get your employer (if you work as a programmer) or your\nschool, if any, to sign a \"copyright disclaimer\" for the library, if\nnecessary. Here is a sample; alter the names:\n\n Yoyodyne, Inc., hereby disclaims all copyright interest in the\n library `Frob' (a library for tweaking knobs) written by James Random Hacker.\n\n <signature of Ty Coon>, 1 April 1990\n Ty Coon, President of Vice\n\nThat's all there is to it!\n\n\n",
"License text": " GNU LESSER GENERAL PUBLIC LICENSE\n Version 2.1, February 1999\n\n Copyright (C) 1991, 1999 Free Software Foundation, Inc.\n 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA\n Everyone is permitted to copy and distribute verbatim copies\n of this license document, but changing it is not allowed.\n\n[This is the first released version of the Lesser GPL. It also counts\n as the successor of the GNU Library Public License, version 2, hence\n the version number 2.1.]\n\n Preamble\n\n The licenses for most software are designed to take away your\nfreedom to share and change it. By contrast, the GNU General Public\nLicenses are intended to guarantee your freedom to share and change\nfree software--to make sure the software is free for all its users.\n\n This license, the Lesser General Public License, applies to some\nspecially designated software packages--typically libraries--of the\nFree Software Foundation and other authors who decide to use it. You\ncan use it too, but we suggest you first think carefully about whether\nthis license or the ordinary General Public License is the better\nstrategy to use in any particular case, based on the explanations below.\n\n When we speak of free software, we are referring to freedom of use,\nnot price. Our General Public Licenses are designed to make sure that\nyou have the freedom to distribute copies of free software (and charge\nfor this service if you wish); that you receive source code or can get\nit if you want it; that you can change the software and use pieces of\nit in new free programs; and that you are informed that you can do\nthese things.\n\n To protect your rights, we need to make restrictions that forbid\ndistributors to deny you these rights or to ask you to surrender these\nrights. 
These restrictions translate to certain responsibilities for\nyou if you distribute copies of the library or if you modify it.\n\n For example, if you distribute copies of the library, whether gratis\nor for a fee, you must give the recipients all the rights that we gave\nyou. You must make sure that they, too, receive or can get the source\ncode. If you link other code with the library, you must provide\ncomplete object files to the recipients, so that they can relink them\nwith the library after making changes to the library and recompiling\nit. And you must show them these terms so they know their rights.\n\n We protect your rights with a two-step method: (1) we copyright the\nlibrary, and (2) we offer you this license, which gives you legal\npermission to copy, distribute and/or modify the library.\n\n To protect each distributor, we want to make it very clear that\nthere is no warranty for the free library. Also, if the library is\nmodified by someone else and passed on, the recipients should know\nthat what they have is not the original version, so that the original\nauthor's reputation will not be affected by problems that might be\nintroduced by others.\n\n Finally, software patents pose a constant threat to the existence of\nany free program. We wish to make sure that a company cannot\neffectively restrict the users of a free program by obtaining a\nrestrictive license from a patent holder. Therefore, we insist that\nany patent license obtained for a version of the library must be\nconsistent with the full freedom of use specified in this license.\n\n Most GNU software, including some libraries, is covered by the\nordinary GNU General Public License. This license, the GNU Lesser\nGeneral Public License, applies to certain designated libraries, and\nis quite different from the ordinary General Public License. 
We use\nthis license for certain libraries in order to permit linking those\nlibraries into non-free programs.\n\n When a program is linked with a library, whether statically or using\na shared library, the combination of the two is legally speaking a\ncombined work, a derivative of the original library. The ordinary\nGeneral Public License therefore permits such linking only if the\nentire combination fits its criteria of freedom. The Lesser General\nPublic License permits more lax criteria for linking other code with\nthe library.\n\n We call this license the \"Lesser\" General Public License because it\ndoes Less to protect the user's freedom than the ordinary General\nPublic License. It also provides other free software developers Less\nof an advantage over competing non-free programs. These disadvantages\nare the reason we use the ordinary General Public License for many\nlibraries. However, the Lesser license provides advantages in certain\nspecial circumstances.\n\n For example, on rare occasions, there may be a special need to\nencourage the widest possible use of a certain library, so that it becomes\na de-facto standard. To achieve this, non-free programs must be\nallowed to use the library. A more frequent case is that a free\nlibrary does the same job as widely used non-free libraries. In this\ncase, there is little to gain by limiting the free library to free\nsoftware only, so we use the Lesser General Public License.\n\n In other cases, permission to use a particular library in non-free\nprograms enables a greater number of people to use a large body of\nfree software. 
For example, permission to use the GNU C Library in\nnon-free programs enables many more people to use the whole GNU\noperating system, as well as its variant, the GNU/Linux operating\nsystem.\n\n Although the Lesser General Public License is Less protective of the\nusers' freedom, it does ensure that the user of a program that is\nlinked with the Library has the freedom and the wherewithal to run\nthat program using a modified version of the Library.\n\n The precise terms and conditions for copying, distribution and\nmodification follow. Pay close attention to the difference between a\n\"work based on the library\" and a \"work that uses the library\". The\nformer contains code derived from the library, whereas the latter must\nbe combined with the library in order to run.\n\n GNU LESSER GENERAL PUBLIC LICENSE\n TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION\n\n 0. This License Agreement applies to any software library or other\nprogram which contains a notice placed by the copyright holder or\nother authorized party saying it may be distributed under the terms of\nthis Lesser General Public License (also called \"this License\").\nEach licensee is addressed as \"you\".\n\n A \"library\" means a collection of software functions and/or data\nprepared so as to be conveniently linked with application programs\n(which use some of those functions and data) to form executables.\n\n The \"Library\", below, refers to any such software library or work\nwhich has been distributed under these terms. A \"work based on the\nLibrary\" means either the Library or any derivative work under\ncopyright law: that is to say, a work containing the Library or a\nportion of it, either verbatim or with modifications and/or translated\nstraightforwardly into another language. (Hereinafter, translation is\nincluded without limitation in the term \"modification\".)\n\n \"Source code\" for a work means the preferred form of the work for\nmaking modifications to it. 
For a library, complete source code means\nall the source code for all modules it contains, plus any associated\ninterface definition files, plus the scripts used to control compilation\nand installation of the library.\n\n Activities other than copying, distribution and modification are not\ncovered by this License; they are outside its scope. The act of\nrunning a program using the Library is not restricted, and output from\nsuch a program is covered only if its contents constitute a work based\non the Library (independent of the use of the Library in a tool for\nwriting it). Whether that is true depends on what the Library does\nand what the program that uses the Library does.\n\n 1. You may copy and distribute verbatim copies of the Library's\ncomplete source code as you receive it, in any medium, provided that\nyou conspicuously and appropriately publish on each copy an\nappropriate copyright notice and disclaimer of warranty; keep intact\nall the notices that refer to this License and to the absence of any\nwarranty; and distribute a copy of this License along with the\nLibrary.\n\n You may charge a fee for the physical act of transferring a copy,\nand you may at your option offer warranty protection in exchange for a\nfee.\n\n 2. 
You may modify your copy or copies of the Library or any portion\nof it, thus forming a work based on the Library, and copy and\ndistribute such modifications or work under the terms of Section 1\nabove, provided that you also meet all of these conditions:\n\n a) The modified work must itself be a software library.\n\n b) You must cause the files modified to carry prominent notices\n stating that you changed the files and the date of any change.\n\n c) You must cause the whole of the work to be licensed at no\n charge to all third parties under the terms of this License.\n\n d) If a facility in the modified Library refers to a function or a\n table of data to be supplied by an application program that uses\n the facility, other than as an argument passed when the facility\n is invoked, then you must make a good faith effort to ensure that,\n in the event an application does not supply such function or\n table, the facility still operates, and performs whatever part of\n its purpose remains meaningful.\n\n (For example, a function in a library to compute square roots has\n a purpose that is entirely well-defined independent of the\n application. Therefore, Subsection 2d requires that any\n application-supplied function or table used by this function must\n be optional: if the application does not supply it, the square\n root function must still compute square roots.)\n\nThese requirements apply to the modified work as a whole. If\nidentifiable sections of that work are not derived from the Library,\nand can be reasonably considered independent and separate works in\nthemselves, then this License, and its terms, do not apply to those\nsections when you distribute them as separate works. 
But when you\ndistribute the same sections as part of a whole which is a work based\non the Library, the distribution of the whole must be on the terms of\nthis License, whose permissions for other licensees extend to the\nentire whole, and thus to each and every part regardless of who wrote\nit.\n\nThus, it is not the intent of this section to claim rights or contest\nyour rights to work written entirely by you; rather, the intent is to\nexercise the right to control the distribution of derivative or\ncollective works based on the Library.\n\nIn addition, mere aggregation of another work not based on the Library\nwith the Library (or with a work based on the Library) on a volume of\na storage or distribution medium does not bring the other work under\nthe scope of this License.\n\n 3. You may opt to apply the terms of the ordinary GNU General Public\nLicense instead of this License to a given copy of the Library. To do\nthis, you must alter all the notices that refer to this License, so\nthat they refer to the ordinary GNU General Public License, version 2,\ninstead of to this License. (If a newer version than version 2 of the\nordinary GNU General Public License has appeared, then you can specify\nthat version instead if you wish.) Do not make any other change in\nthese notices.\n\n Once this change is made in a given copy, it is irreversible for\nthat copy, so the ordinary GNU General Public License applies to all\nsubsequent copies and derivative works made from that copy.\n\n This option is useful when you wish to copy part of the code of\nthe Library into a program that is not a library.\n\n 4. 
You may copy and distribute the Library (or a portion or\nderivative of it, under Section 2) in object code or executable form\nunder the terms of Sections 1 and 2 above provided that you accompany\nit with the complete corresponding machine-readable source code, which\nmust be distributed under the terms of Sections 1 and 2 above on a\nmedium customarily used for software interchange.\n\n If distribution of object code is made by offering access to copy\nfrom a designated place, then offering equivalent access to copy the\nsource code from the same place satisfies the requirement to\ndistribute the source code, even though third parties are not\ncompelled to copy the source along with the object code.\n\n 5. A program that contains no derivative of any portion of the\nLibrary, but is designed to work with the Library by being compiled or\nlinked with it, is called a \"work that uses the Library\". Such a\nwork, in isolation, is not a derivative work of the Library, and\ntherefore falls outside the scope of this License.\n\n However, linking a \"work that uses the Library\" with the Library\ncreates an executable that is a derivative of the Library (because it\ncontains portions of the Library), rather than a \"work that uses the\nlibrary\". The executable is therefore covered by this License.\nSection 6 states terms for distribution of such executables.\n\n When a \"work that uses the Library\" uses material from a header file\nthat is part of the Library, the object code for the work may be a\nderivative work of the Library even though the source code is not.\nWhether this is true is especially significant if the work can be\nlinked without the Library, or if the work is itself a library. 
The\nthreshold for this to be true is not precisely defined by law.\n\n If such an object file uses only numerical parameters, data\nstructure layouts and accessors, and small macros and small inline\nfunctions (ten lines or less in length), then the use of the object\nfile is unrestricted, regardless of whether it is legally a derivative\nwork. (Executables containing this object code plus portions of the\nLibrary will still fall under Section 6.)\n\n Otherwise, if the work is a derivative of the Library, you may\ndistribute the object code for the work under the terms of Section 6.\nAny executables containing that work also fall under Section 6,\nwhether or not they are linked directly with the Library itself.\n\n 6. As an exception to the Sections above, you may also combine or\nlink a \"work that uses the Library\" with the Library to produce a\nwork containing portions of the Library, and distribute that work\nunder terms of your choice, provided that the terms permit\nmodification of the work for the customer's own use and reverse\nengineering for debugging such modifications.\n\n You must give prominent notice with each copy of the work that the\nLibrary is used in it and that the Library and its use are covered by\nthis License. You must supply a copy of this License. If the work\nduring execution displays copyright notices, you must include the\ncopyright notice for the Library among them, as well as a reference\ndirecting the user to the copy of this License. 
Also, you must do one\nof these things:\n\n a) Accompany the work with the complete corresponding\n machine-readable source code for the Library including whatever\n changes were used in the work (which must be distributed under\n Sections 1 and 2 above); and, if the work is an executable linked\n with the Library, with the complete machine-readable \"work that\n uses the Library\", as object code and/or source code, so that the\n user can modify the Library and then relink to produce a modified\n executable containing the modified Library. (It is understood\n that the user who changes the contents of definitions files in the\n Library will not necessarily be able to recompile the application\n to use the modified definitions.)\n\n b) Use a suitable shared library mechanism for linking with the\n Library. A suitable mechanism is one that (1) uses at run time a\n copy of the library already present on the user's computer system,\n rather than copying library functions into the executable, and (2)\n will operate properly with a modified version of the library, if\n the user installs one, as long as the modified version is\n interface-compatible with the version that the work was made with.\n\n c) Accompany the work with a written offer, valid for at\n least three years, to give the same user the materials\n specified in Subsection 6a, above, for a charge no more\n than the cost of performing this distribution.\n\n d) If distribution of the work is made by offering access to copy\n from a designated place, offer equivalent access to copy the above\n specified materials from the same place.\n\n e) Verify that the user has already received a copy of these\n materials or that you have already sent this user a copy.\n\n For an executable, the required form of the \"work that uses the\nLibrary\" must include any data and utility programs needed for\nreproducing the executable from it. 
However, as a special exception,\nthe materials to be distributed need not include anything that is\nnormally distributed (in either source or binary form) with the major\ncomponents (compiler, kernel, and so on) of the operating system on\nwhich the executable runs, unless that component itself accompanies\nthe executable.\n\n It may happen that this requirement contradicts the license\nrestrictions of other proprietary libraries that do not normally\naccompany the operating system. Such a contradiction means you cannot\nuse both them and the Library together in an executable that you\ndistribute.\n\n 7. You may place library facilities that are a work based on the\nLibrary side-by-side in a single library together with other library\nfacilities not covered by this License, and distribute such a combined\nlibrary, provided that the separate distribution of the work based on\nthe Library and of the other library facilities is otherwise\npermitted, and provided that you do these two things:\n\n a) Accompany the combined library with a copy of the same work\n based on the Library, uncombined with any other library\n facilities. This must be distributed under the terms of the\n Sections above.\n\n b) Give prominent notice with the combined library of the fact\n that part of it is a work based on the Library, and explaining\n where to find the accompanying uncombined form of the same work.\n\n 8. You may not copy, modify, sublicense, link with, or distribute\nthe Library except as expressly provided under this License. Any\nattempt otherwise to copy, modify, sublicense, link with, or\ndistribute the Library is void, and will automatically terminate your\nrights under this License. However, parties who have received copies,\nor rights, from you under this License will not have their licenses\nterminated so long as such parties remain in full compliance.\n\n 9. You are not required to accept this License, since you have not\nsigned it. 
However, nothing else grants you permission to modify or\ndistribute the Library or its derivative works. These actions are\nprohibited by law if you do not accept this License. Therefore, by\nmodifying or distributing the Library (or any work based on the\nLibrary), you indicate your acceptance of this License to do so, and\nall its terms and conditions for copying, distributing or modifying\nthe Library or works based on it.\n\n 10. Each time you redistribute the Library (or any work based on the\nLibrary), the recipient automatically receives a license from the\noriginal licensor to copy, distribute, link with or modify the Library\nsubject to these terms and conditions. You may not impose any further\nrestrictions on the recipients' exercise of the rights granted herein.\nYou are not responsible for enforcing compliance by third parties with\nthis License.\n\n 11. If, as a consequence of a court judgment or allegation of patent\ninfringement or for any other reason (not limited to patent issues),\nconditions are imposed on you (whether by court order, agreement or\notherwise) that contradict the conditions of this License, they do not\nexcuse you from the conditions of this License. If you cannot\ndistribute so as to satisfy simultaneously your obligations under this\nLicense and any other pertinent obligations, then as a consequence you\nmay not distribute the Library at all. 
For example, if a patent\nlicense would not permit royalty-free redistribution of the Library by\nall those who receive copies directly or indirectly through you, then\nthe only way you could satisfy both it and this License would be to\nrefrain entirely from distribution of the Library.\n\nIf any portion of this section is held invalid or unenforceable under any\nparticular circumstance, the balance of the section is intended to apply,\nand the section as a whole is intended to apply in other circumstances.\n\nIt is not the purpose of this section to induce you to infringe any\npatents or other property right claims or to contest validity of any\nsuch claims; this section has the sole purpose of protecting the\nintegrity of the free software distribution system which is\nimplemented by public license practices. Many people have made\ngenerous contributions to the wide range of software distributed\nthrough that system in reliance on consistent application of that\nsystem; it is up to the author/donor to decide if he or she is willing\nto distribute software through any other system and a licensee cannot\nimpose that choice.\n\nThis section is intended to make thoroughly clear what is believed to\nbe a consequence of the rest of this License.\n\n 12. If the distribution and/or use of the Library is restricted in\ncertain countries either by patents or by copyrighted interfaces, the\noriginal copyright holder who places the Library under this License may add\nan explicit geographical distribution limitation excluding those countries,\nso that distribution is permitted only in or among countries not thus\nexcluded. In such case, this License incorporates the limitation as if\nwritten in the body of this License.\n\n 13. 
The Free Software Foundation may publish revised and/or new\nversions of the Lesser General Public License from time to time.\nSuch new versions will be similar in spirit to the present version,\nbut may differ in detail to address new problems or concerns.\n\nEach version is given a distinguishing version number. If the Library\nspecifies a version number of this License which applies to it and\n\"any later version\", you have the option of following the terms and\nconditions either of that version or of any later version published by\nthe Free Software Foundation. If the Library does not specify a\nlicense version number, you may choose any version ever published by\nthe Free Software Foundation.\n\n 14. If you wish to incorporate parts of the Library into other free\nprograms whose distribution conditions are incompatible with these,\nwrite to the author to ask for permission. For software which is\ncopyrighted by the Free Software Foundation, write to the Free\nSoftware Foundation; we sometimes make exceptions for this. Our\ndecision will be guided by the two goals of preserving the free status\nof all derivatives of our free software and of promoting the sharing\nand reuse of software generally.\n\n NO WARRANTY\n\n 15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO\nWARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW.\nEXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR\nOTHER PARTIES PROVIDE THE LIBRARY \"AS IS\" WITHOUT WARRANTY OF ANY\nKIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE\nIMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR\nPURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE\nLIBRARY IS WITH YOU. SHOULD THE LIBRARY PROVE DEFECTIVE, YOU ASSUME\nTHE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.\n\n 16. 
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN\nWRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY\nAND/OR REDISTRIBUTE THE LIBRARY AS PERMITTED ABOVE, BE LIABLE TO YOU\nFOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR\nCONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE\nLIBRARY (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING\nRENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A\nFAILURE OF THE LIBRARY TO OPERATE WITH ANY OTHER SOFTWARE), EVEN IF\nSUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH\nDAMAGES.\n\n END OF TERMS AND CONDITIONS\n\n How to Apply These Terms to Your New Libraries\n\n If you develop a new library, and you want it to be of the greatest\npossible use to the public, we recommend making it free software that\neveryone can redistribute and change. You can do so by permitting\nredistribution under these terms (or, alternatively, under the terms of the\nordinary General Public License).\n\n To apply these terms, attach the following notices to the library. It is\nsafest to attach them to the start of each source file to most effectively\nconvey the exclusion of warranty; and each file should have at least the\n\"copyright\" line and a pointer to where the full notice is found.\n\n <one line to give the library's name and a brief idea of what it does.>\n Copyright (C) <year> <name of author>\n\n This library is free software; you can redistribute it and/or\n modify it under the terms of the GNU Lesser General Public\n License as published by the Free Software Foundation; either\n version 2.1 of the License, or (at your option) any later version.\n\n This library is distributed in the hope that it will be useful,\n but WITHOUT ANY WARRANTY; without even the implied warranty of\n MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the GNU\n Lesser General Public License for more details.\n\n You should have received a copy of the GNU Lesser General Public\n License along with this library; if not, write to the Free Software\n Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301\n USA\n\nAlso add information on how to contact you by electronic and paper mail.\n\nYou should also get your employer (if you work as a programmer) or your\nschool, if any, to sign a \"copyright disclaimer\" for the library, if\nnecessary. Here is a sample; alter the names:\n\n Yoyodyne, Inc., hereby disclaims all copyright interest in the\n library `Frob' (a library for tweaking knobs) written by James Random\n Hacker.\n\n <signature of Ty Coon>, 1 April 1990\n Ty Coon, President of Vice\n\nThat's all there is to it!\n"
},
{
"Name": "colorama",
"Version": "0.4.1",
"Summary": "Cross-platform colored terminal text.",
"Home-page": "https://github.com/tartley/colorama",
"Author": "Jonathan Hartley",
"License": "BSD 3-Clause \"New\" or \"Revised\" License",
"License URL": "https://api.github.com/repos/tartley/colorama/license",
"License repo": "Copyright (c) 2010 Jonathan Hartley\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without\nmodification, are permitted provided that the following conditions are met:\n\n* Redistributions of source code must retain the above copyright notice, this\n list of conditions and the following disclaimer.\n\n* Redistributions in binary form must reproduce the above copyright notice,\n this list of conditions and the following disclaimer in the documentation\n and/or other materials provided with the distribution.\n\n* Neither the name of the copyright holders, nor those of its contributors\n may be used to endorse or promote products derived from this software without\n specific prior written permission.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\" AND\nANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED\nWARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\nDISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE\nFOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\nDAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\nSERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\nCAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\nOR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\nOF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n",
"License text": "BSD 3-Clause License\n\nCopyright (c) [year], [fullname]\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without\nmodification, are permitted provided that the following conditions are met:\n\n1. Redistributions of source code must retain the above copyright notice, this\n list of conditions and the following disclaimer.\n\n2. Redistributions in binary form must reproduce the above copyright notice,\n this list of conditions and the following disclaimer in the documentation\n and/or other materials provided with the distribution.\n\n3. Neither the name of the copyright holder nor the names of its\n contributors may be used to endorse or promote products derived from\n this software without specific prior written permission.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\nAND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\nIMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\nDISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE\nFOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\nDAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\nSERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\nCAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\nOR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\nOF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n"
},
{
"Name": "cryptography",
"Version": "2.8",
"Summary": "cryptography is a package which provides cryptographic recipes and primitives to Python developers.",
"Home-page": "https://github.com/pyca/cryptography",
"Author": "The cryptography developers",
"License": "Other",
"License URL": "https://api.github.com/repos/pyca/cryptography/license",
"License repo": "This software is made available under the terms of *either* of the licenses\nfound in LICENSE.APACHE or LICENSE.BSD. Contributions to cryptography are made\nunder the terms of *both* these licenses.\n\nThe code used in the OpenSSL locking callback and OS random engine is derived\nfrom CPython, and is licensed under the terms of the PSF License Agreement.\n"
},
{
"Name": "docutils",
"Version": "0.15.2",
"Summary": "Docutils -- Python Documentation Utilities",
"Home-page": "http://docutils.sourceforge.net/",
"Author": "David Goodger",
"License": "public domain, Python, 2-Clause BSD, GPL 3 (see COPYING.txt)"
},
{
"Name": "fabric",
"Version": "2.5.0",
"Summary": "High level SSH command execution",
"Home-page": "http://fabfile.org",
"Author": "Jeff Forcier",
"License": "BSD"
},
{
"Name": "humanfriendly",
"Version": "4.18",
"Summary": "Human friendly output for text interfaces using Python",
"Home-page": "https://humanfriendly.readthedocs.io",
"Author": "Peter Odding",
"License": "MIT"
},
{
"Name": "idna",
"Version": "2.8",
"Summary": "Internationalized Domain Names in Applications (IDNA)",
"Home-page": "https://github.com/kjd/idna",
"Author": "Kim Davies",
"License": "Other",
"License URL": "https://api.github.com/repos/kjd/idna/license",
"License repo": "License\n-------\n\nLicense: bsd-3-clause\n\nCopyright (c) 2013-2020, Kim Davies. All rights reserved.\n\nRedistribution and use in source and binary forms, with or without\nmodification, are permitted provided that the following conditions are met:\n\n#. Redistributions of source code must retain the above copyright\n notice, this list of conditions and the following disclaimer.\n\n#. Redistributions in binary form must reproduce the above\n copyright notice, this list of conditions and the following\n disclaimer in the documentation and/or other materials provided with\n the distribution.\n\n#. Neither the name of the copyright holder nor the names of the \n contributors may be used to endorse or promote products derived \n from this software without specific prior written permission.\n\n#. THIS SOFTWARE IS PROVIDED BY THE CONTRIBUTORS \"AS IS\" AND ANY\n EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\n IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR\n PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS OR \n CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, \n SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT \n LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,\n DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY\n THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT\n (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE\n USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH\n DAMAGE.\n"
},
{
"Name": "importlib-metadata",
"Version": "0.23",
"Summary": "Read metadata from Python packages",
"Home-page": "http://importlib-metadata.readthedocs.io/",
"Author": "Barry Warsaw",
"License": "Apache Software License"
},
{
"Name": "invoke",
"Version": "1.3.0",
"Summary": "Pythonic task execution",
"Home-page": "http://docs.pyinvoke.org",
"Author": "Jeff Forcier",
"License": "BSD"
},
{
"Name": "isodate",
"Version": "0.6.0",
"Summary": "An ISO 8601 date/time/duration parser and formatter",
"Home-page": "https://github.com/gweis/isodate/",
"Author": "Gerhard Weis",
"License": "BSD"
},
{
"Name": "isort",
"Version": "4.3.21",
"Summary": "A Python utility / library to sort Python imports.",
"Home-page": "https://github.com/timothycrosley/isort",
"Author": "Timothy Crosley",
"License": "MIT License",
"License URL": "https://api.github.com/repos/timothycrosley/isort/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2013 Timothy Edmund Crosley\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in\nall copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN\nTHE SOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "jeepney",
"Version": "0.4.2",
"Summary": "Low-level, pure Python DBus protocol wrapper.",
"Home-page": "https://gitlab.com/takluyver/jeepney",
"License": "UNKNOWN",
"Author": "Thomas Kluyver"
},
{
"Name": "Jinja2",
"Version": "2.10.3",
"Summary": "A very fast and expressive template engine.",
"Home-page": "https://palletsprojects.com/p/jinja/",
"Author": "Armin Ronacher",
"License": "BSD-3-Clause"
},
{
"Name": "jmespath",
"Version": "0.9.4",
"Summary": "JSON Matching Expressions",
"Home-page": "https://github.com/jmespath/jmespath.py",
"Author": "James Saryerwinnie",
"License": "Other",
"License URL": "https://api.github.com/repos/jmespath/jmespath.py/license",
"License repo": "Copyright (c) 2013 Amazon.com, Inc. or its affiliates. All Rights Reserved\n\nPermission is hereby granted, free of charge, to any person obtaining a\ncopy of this software and associated documentation files (the\n\"Software\"), to deal in the Software without restriction, including\nwithout limitation the rights to use, copy, modify, merge, publish, dis-\ntribute, sublicense, and/or sell copies of the Software, and to permit\npersons to whom the Software is furnished to do so, subject to the fol-\nlowing conditions:\n\nThe above copyright notice and this permission notice shall be included\nin all copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS\nOR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABIL-\nITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT\nSHALL THE AUTHOR BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,\nWHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS\nIN THE SOFTWARE.\n"
},
{
"Name": "jsonschema",
"Version": "3.1.1",
"Summary": "An implementation of JSON Schema validation for Python",
"Home-page": "https://github.com/Julian/jsonschema",
"Author": "Julian Berman",
"License": "MIT License",
"License URL": "https://api.github.com/repos/julian/jsonschema/license",
"License repo": "Copyright (c) 2013 Julian Berman\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in\nall copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN\nTHE SOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "keyring",
"Version": "21.1.0",
"Summary": "Store and access your passwords safely.",
"Home-page": "https://github.com/jaraco/keyring",
"Author": "Kang Zhang",
"License": "MIT License",
"License URL": "https://api.github.com/repos/jaraco/keyring/license",
"License repo": "Copyright Jason R. Coombs\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to\ndeal in the Software without restriction, including without limitation the\nrights to use, copy, modify, merge, publish, distribute, sublicense, and/or\nsell copies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in\nall copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING\nFROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS\nIN THE SOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "knack",
"Version": "0.6.3",
"Summary": "A Command-Line Interface framework",
"Home-page": "https://github.com/microsoft/knack",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/microsoft/knack/license",
"License repo": " MIT License\n\n Copyright (c) Microsoft Corporation. All rights reserved.\n\n Permission is hereby granted, free of charge, to any person obtaining a copy\n of this software and associated documentation files (the \"Software\"), to deal\n in the Software without restriction, including without limitation the rights\n to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n copies of the Software, and to permit persons to whom the Software is\n furnished to do so, subject to the following conditions:\n\n The above copyright notice and this permission notice shall be included in all\n copies or substantial portions of the Software.\n\n THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n SOFTWARE",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "lazy-object-proxy",
"Version": "1.4.3",
"Summary": "A fast and thorough lazy object proxy.",
"Home-page": "https://github.com/ionelmc/python-lazy-object-proxy",
"Author": "Ionel Cristian M\u0103rie\u0219",
"License": "BSD 2-Clause \"Simplified\" License",
"License URL": "https://api.github.com/repos/ionelmc/python-lazy-object-proxy/license",
"License repo": "BSD 2-Clause License\n\nCopyright (c) 2014-2019, Ionel Cristian M\u0103rie\u0219\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without modification, are permitted provided that the\nfollowing conditions are met:\n\n1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following\ndisclaimer.\n\n2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following\ndisclaimer in the documentation and/or other materials provided with the distribution.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\" AND ANY EXPRESS OR IMPLIED WARRANTIES,\nINCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\nDISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,\nSPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\nSERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,\nWHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF\nTHIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n",
"License text": "BSD 2-Clause License\n\nCopyright (c) [year], [fullname]\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without\nmodification, are permitted provided that the following conditions are met:\n\n1. Redistributions of source code must retain the above copyright notice, this\n list of conditions and the following disclaimer.\n\n2. Redistributions in binary form must reproduce the above copyright notice,\n this list of conditions and the following disclaimer in the documentation\n and/or other materials provided with the distribution.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\nAND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\nIMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\nDISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE\nFOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\nDAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\nSERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\nCAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\nOR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\nOF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n"
},
{
"Name": "MarkupSafe",
"Version": "1.1.1",
"Summary": "Safely add untrusted strings to HTML/XML markup.",
"Home-page": "https://palletsprojects.com/p/markupsafe/",
"Author": "Armin Ronacher",
"License": "BSD-3-Clause"
},
{
"Name": "mccabe",
"Version": "0.6.1",
"Summary": "McCabe checker, plugin for flake8",
"Home-page": "https://github.com/pycqa/mccabe",
"Author": "Ian Cordasco",
"License": "Other",
"License URL": "https://api.github.com/repos/pycqa/mccabe/license",
"License repo": "Copyright \u00a9 <year> Ned Batchelder\nCopyright \u00a9 2011-2013 Tarek Ziade <tarek@ziade.org>\nCopyright \u00a9 2013 Florent Xicluna <florent.xicluna@gmail.com>\n\nLicensed under the terms of the Expat License\n\nPermission is hereby granted, free of charge, to any person\nobtaining a copy of this software and associated documentation files\n(the \"Software\"), to deal in the Software without restriction,\nincluding without limitation the rights to use, copy, modify, merge,\npublish, distribute, sublicense, and/or sell copies of the Software,\nand to permit persons to whom the Software is furnished to do so,\nsubject to the following conditions:\n\nThe above copyright notice and this permission notice shall be\nincluded in all copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND,\nEXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF\nMERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND\nNONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS\nBE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN\nACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN\nCONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "mock",
"Version": "3.0.5",
"Summary": "Rolling backport of unittest.mock for all Pythons",
"Home-page": "http://mock.readthedocs.org/en/latest/",
"Author": "Testing Cabal",
"License": "OSI Approved :: BSD License"
},
{
"Name": "more-itertools",
"Version": "7.2.0",
"Summary": "More routines for operating on iterables, beyond itertools",
"Home-page": "https://github.com/erikrose/more-itertools",
"Author": "Erik Rose",
"License": "MIT License",
"License URL": "https://api.github.com/repos/erikrose/more-itertools/license",
"License repo": "Copyright (c) 2012 Erik Rose\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of\nthis software and associated documentation files (the \"Software\"), to deal in\nthe Software without restriction, including without limitation the rights to\nuse, copy, modify, merge, publish, distribute, sublicense, and/or sell copies\nof the Software, and to permit persons to whom the Software is furnished to do\nso, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "msrest",
"Version": "0.6.10",
"Summary": "AutoRest swagger generator Python client runtime.",
"Home-page": "https://github.com/Azure/msrest-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/msrest-for-python/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Azure\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "msrestazure",
"Version": "0.6.2",
"Summary": "AutoRest swagger generator Python client runtime. Azure-specific module.",
"Home-page": "https://github.com/Azure/msrestazure-for-python",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/azure/msrestazure-for-python/license",
"License repo": "MIT License\n\nCopyright (c) 2016 Microsoft Azure\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "oauthlib",
"Version": "3.1.0",
"Summary": "A generic, spec-compliant, thorough implementation of the OAuth request-signing logic",
"Home-page": "https://github.com/oauthlib/oauthlib",
"Author": "The OAuthlib Community",
"License": "BSD 3-Clause \"New\" or \"Revised\" License",
"License URL": "https://api.github.com/repos/oauthlib/oauthlib/license",
"License repo": "Copyright (c) 2019 The OAuthlib Community\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without\nmodification, are permitted provided that the following conditions are met:\n\n 1. Redistributions of source code must retain the above copyright notice,\n this list of conditions and the following disclaimer.\n\n 2. Redistributions in binary form must reproduce the above copyright\n notice, this list of conditions and the following disclaimer in the\n documentation and/or other materials provided with the distribution.\n\n 3. Neither the name of this project nor the names of its contributors may\n be used to endorse or promote products derived from this software without\n specific prior written permission.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\nAND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\nIMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\nDISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE\nFOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\nDAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\nSERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\nCAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\nOR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\nOF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n",
"License text": "BSD 3-Clause License\n\nCopyright (c) [year], [fullname]\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without\nmodification, are permitted provided that the following conditions are met:\n\n1. Redistributions of source code must retain the above copyright notice, this\n list of conditions and the following disclaimer.\n\n2. Redistributions in binary form must reproduce the above copyright notice,\n this list of conditions and the following disclaimer in the documentation\n and/or other materials provided with the distribution.\n\n3. Neither the name of the copyright holder nor the names of its\n contributors may be used to endorse or promote products derived from\n this software without specific prior written permission.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\nAND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\nIMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\nDISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE\nFOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\nDAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\nSERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\nCAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\nOR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\nOF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n"
},
{
"Name": "packaging",
"Version": "20.1",
"Summary": "Core utilities for Python packages",
"Home-page": "https://github.com/pypa/packaging",
"Author": "Donald Stufft and individual contributors",
"License": "Other",
"License URL": "https://api.github.com/repos/pypa/packaging/license",
"License repo": "This software is made available under the terms of *either* of the licenses\nfound in LICENSE.APACHE or LICENSE.BSD. Contributions to this software is made\nunder the terms of *both* these licenses.\n"
},
{
"Name": "paramiko",
"Version": "2.6.0",
"Summary": "SSH2 protocol library",
"Home-page": "https://github.com/paramiko/paramiko/",
"Author": "Jeff Forcier",
"License": "GNU Lesser General Public License v2.1",
"License URL": "https://api.github.com/repos/paramiko/paramiko/license",
"License repo": "\t\t GNU LESSER GENERAL PUBLIC LICENSE\n\t\t Version 2.1, February 1999\n\n Copyright (C) 1991, 1999 Free Software Foundation, Inc.\n 51 Franklin Street, Suite 500, Boston, MA 02110-1335 USA\n Everyone is permitted to copy and distribute verbatim copies\n of this license document, but changing it is not allowed.\n\n[This is the first released version of the Lesser GPL. It also counts\n as the successor of the GNU Library Public License, version 2, hence\n the version number 2.1.]\n\n\t\t\t Preamble\n\n The licenses for most software are designed to take away your\nfreedom to share and change it. By contrast, the GNU General Public\nLicenses are intended to guarantee your freedom to share and change\nfree software--to make sure the software is free for all its users.\n\n This license, the Lesser General Public License, applies to some\nspecially designated software packages--typically libraries--of the\nFree Software Foundation and other authors who decide to use it. You\ncan use it too, but we suggest you first think carefully about whether\nthis license or the ordinary General Public License is the better\nstrategy to use in any particular case, based on the explanations below.\n\n When we speak of free software, we are referring to freedom of use,\nnot price. Our General Public Licenses are designed to make sure that\nyou have the freedom to distribute copies of free software (and charge\nfor this service if you wish); that you receive source code or can get\nit if you want it; that you can change the software and use pieces of\nit in new free programs; and that you are informed that you can do\nthese things.\n\n To protect your rights, we need to make restrictions that forbid\ndistributors to deny you these rights or to ask you to surrender these\nrights. These restrictions translate to certain responsibilities for\nyou if you distribute copies of the library or if you modify it.\n\n For example, if you distribute copies of the library, whether gratis\nor for a fee, you must give the recipients all the rights that we gave\nyou. You must make sure that they, too, receive or can get the source\ncode. If you link other code with the library, you must provide\ncomplete object files to the recipients, so that they can relink them\nwith the library after making changes to the library and recompiling\nit. And you must show them these terms so they know their rights.\n\n We protect your rights with a two-step method: (1) we copyright the\nlibrary, and (2) we offer you this license, which gives you legal\npermission to copy, distribute and/or modify the library.\n\n To protect each distributor, we want to make it very clear that\nthere is no warranty for the free library. Also, if the library is\nmodified by someone else and passed on, the recipients should know\nthat what they have is not the original version, so that the original\nauthor's reputation will not be affected by problems that might be\nintroduced by others.\n\f\n Finally, software patents pose a constant threat to the existence of\nany free program. We wish to make sure that a company cannot\neffectively restrict the users of a free program by obtaining a\nrestrictive license from a patent holder. Therefore, we insist that\nany patent license obtained for a version of the library must be\nconsistent with the full freedom of use specified in this license.\n\n Most GNU software, including some libraries, is covered by the\nordinary GNU General Public License. This license, the GNU Lesser\nGeneral Public License, applies to certain designated libraries, and\nis quite different from the ordinary General Public License. We use\nthis license for certain libraries in order to permit linking those\nlibraries into non-free programs.\n\n When a program is linked with a library, whether statically or using\na shared library, the combination of the two is legally speaking a\ncombined work, a derivative of the original library. The ordinary\nGeneral Public License therefore permits such linking only if the\nentire combination fits its criteria of freedom. The Lesser General\nPublic License permits more lax criteria for linking other code with\nthe library.\n\n We call this license the \"Lesser\" General Public License because it\ndoes Less to protect the user's freedom than the ordinary General\nPublic License. It also provides other free software developers Less\nof an advantage over competing non-free programs. These disadvantages\nare the reason we use the ordinary General Public License for many\nlibraries. However, the Lesser license provides advantages in certain\nspecial circumstances.\n\n For example, on rare occasions, there may be a special need to\nencourage the widest possible use of a certain library, so that it becomes\na de-facto standard. To achieve this, non-free programs must be\nallowed to use the library. A more frequent case is that a free\nlibrary does the same job as widely used non-free libraries. In this\ncase, there is little to gain by limiting the free library to free\nsoftware only, so we use the Lesser General Public License.\n\n In other cases, permission to use a particular library in non-free\nprograms enables a greater number of people to use a large body of\nfree software. For example, permission to use the GNU C Library in\nnon-free programs enables many more people to use the whole GNU\noperating system, as well as its variant, the GNU/Linux operating\nsystem.\n\n Although the Lesser General Public License is Less protective of the\nusers' freedom, it does ensure that the user of a program that is\nlinked with the Library has the freedom and the wherewithal to run\nthat program using a modified version of the Library.\n\n The precise terms and conditions for copying, distribution and\nmodification follow. Pay close attention to the difference between a\n\"work based on the library\" and a \"work that uses the library\". The\nformer contains code derived from the library, whereas the latter must\nbe combined with the library in order to run.\n\f\n\t\t GNU LESSER GENERAL PUBLIC LICENSE\n TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION\n\n 0. This License Agreement applies to any software library or other\nprogram which contains a notice placed by the copyright holder or\nother authorized party saying it may be distributed under the terms of\nthis Lesser General Public License (also called \"this License\").\nEach licensee is addressed as \"you\".\n\n A \"library\" means a collection of software functions and/or data\nprepared so as to be conveniently linked with application programs\n(which use some of those functions and data) to form executables.\n\n The \"Library\", below, refers to any such software library or work\nwhich has been distributed under these terms. A \"work based on the\nLibrary\" means either the Library or any derivative work under\ncopyright law: that is to say, a work containing the Library or a\nportion of it, either verbatim or with modifications and/or translated\nstraightforwardly into another language. (Hereinafter, translation is\nincluded without limitation in the term \"modification\".)\n\n \"Source code\" for a work means the preferred form of the work for\nmaking modifications to it. 
For a library, complete source code means\nall the source code for all modules it contains, plus any associated\ninterface definition files, plus the scripts used to control compilation\nand installation of the library.\n\n Activities other than copying, distribution and modification are not\ncovered by this License; they are outside its scope. The act of\nrunning a program using the Library is not restricted, and output from\nsuch a program is covered only if its contents constitute a work based\non the Library (independent of the use of the Library in a tool for\nwriting it). Whether that is true depends on what the Library does\nand what the program that uses the Library does.\n \n 1. You may copy and distribute verbatim copies of the Library's\ncomplete source code as you receive it, in any medium, provided that\nyou conspicuously and appropriately publish on each copy an\nappropriate copyright notice and disclaimer of warranty; keep intact\nall the notices that refer to this License and to the absence of any\nwarranty; and distribute a copy of this License along with the\nLibrary.\n\n You may charge a fee for the physical act of transferring a copy,\nand you may at your option offer warranty protection in exchange for a\nfee.\n\f\n 2. 
You may modify your copy or copies of the Library or any portion\nof it, thus forming a work based on the Library, and copy and\ndistribute such modifications or work under the terms of Section 1\nabove, provided that you also meet all of these conditions:\n\n a) The modified work must itself be a software library.\n\n b) You must cause the files modified to carry prominent notices\n stating that you changed the files and the date of any change.\n\n c) You must cause the whole of the work to be licensed at no\n charge to all third parties under the terms of this License.\n\n d) If a facility in the modified Library refers to a function or a\n table of data to be supplied by an application program that uses\n the facility, other than as an argument passed when the facility\n is invoked, then you must make a good faith effort to ensure that,\n in the event an application does not supply such function or\n table, the facility still operates, and performs whatever part of\n its purpose remains meaningful.\n\n (For example, a function in a library to compute square roots has\n a purpose that is entirely well-defined independent of the\n application. Therefore, Subsection 2d requires that any\n application-supplied function or table used by this function must\n be optional: if the application does not supply it, the square\n root function must still compute square roots.)\n\nThese requirements apply to the modified work as a whole. If\nidentifiable sections of that work are not derived from the Library,\nand can be reasonably considered independent and separate works in\nthemselves, then this License, and its terms, do not apply to those\nsections when you distribute them as separate works. 
But when you\ndistribute the same sections as part of a whole which is a work based\non the Library, the distribution of the whole must be on the terms of\nthis License, whose permissions for other licensees extend to the\nentire whole, and thus to each and every part regardless of who wrote\nit.\n\nThus, it is not the intent of this section to claim rights or contest\nyour rights to work written entirely by you; rather, the intent is to\nexercise the right to control the distribution of derivative or\ncollective works based on the Library.\n\nIn addition, mere aggregation of another work not based on the Library\nwith the Library (or with a work based on the Library) on a volume of\na storage or distribution medium does not bring the other work under\nthe scope of this License.\n\n 3. You may opt to apply the terms of the ordinary GNU General Public\nLicense instead of this License to a given copy of the Library. To do\nthis, you must alter all the notices that refer to this License, so\nthat they refer to the ordinary GNU General Public License, version 2,\ninstead of to this License. (If a newer version than version 2 of the\nordinary GNU General Public License has appeared, then you can specify\nthat version instead if you wish.) Do not make any other change in\nthese notices.\n\f\n Once this change is made in a given copy, it is irreversible for\nthat copy, so the ordinary GNU General Public License applies to all\nsubsequent copies and derivative works made from that copy.\n\n This option is useful when you wish to copy part of the code of\nthe Library into a program that is not a library.\n\n 4. 
You may copy and distribute the Library (or a portion or\nderivative of it, under Section 2) in object code or executable form\nunder the terms of Sections 1 and 2 above provided that you accompany\nit with the complete corresponding machine-readable source code, which\nmust be distributed under the terms of Sections 1 and 2 above on a\nmedium customarily used for software interchange.\n\n If distribution of object code is made by offering access to copy\nfrom a designated place, then offering equivalent access to copy the\nsource code from the same place satisfies the requirement to\ndistribute the source code, even though third parties are not\ncompelled to copy the source along with the object code.\n\n 5. A program that contains no derivative of any portion of the\nLibrary, but is designed to work with the Library by being compiled or\nlinked with it, is called a \"work that uses the Library\". Such a\nwork, in isolation, is not a derivative work of the Library, and\ntherefore falls outside the scope of this License.\n\n However, linking a \"work that uses the Library\" with the Library\ncreates an executable that is a derivative of the Library (because it\ncontains portions of the Library), rather than a \"work that uses the\nlibrary\". The executable is therefore covered by this License.\nSection 6 states terms for distribution of such executables.\n\n When a \"work that uses the Library\" uses material from a header file\nthat is part of the Library, the object code for the work may be a\nderivative work of the Library even though the source code is not.\nWhether this is true is especially significant if the work can be\nlinked without the Library, or if the work is itself a library. 
The\nthreshold for this to be true is not precisely defined by law.\n\n If such an object file uses only numerical parameters, data\nstructure layouts and accessors, and small macros and small inline\nfunctions (ten lines or less in length), then the use of the object\nfile is unrestricted, regardless of whether it is legally a derivative\nwork. (Executables containing this object code plus portions of the\nLibrary will still fall under Section 6.)\n\n Otherwise, if the work is a derivative of the Library, you may\ndistribute the object code for the work under the terms of Section 6.\nAny executables containing that work also fall under Section 6,\nwhether or not they are linked directly with the Library itself.\n\f\n 6. As an exception to the Sections above, you may also combine or\nlink a \"work that uses the Library\" with the Library to produce a\nwork containing portions of the Library, and distribute that work\nunder terms of your choice, provided that the terms permit\nmodification of the work for the customer's own use and reverse\nengineering for debugging such modifications.\n\n You must give prominent notice with each copy of the work that the\nLibrary is used in it and that the Library and its use are covered by\nthis License. You must supply a copy of this License. If the work\nduring execution displays copyright notices, you must include the\ncopyright notice for the Library among them, as well as a reference\ndirecting the user to the copy of this License. 
Also, you must do one\nof these things:\n\n a) Accompany the work with the complete corresponding\n machine-readable source code for the Library including whatever\n changes were used in the work (which must be distributed under\n Sections 1 and 2 above); and, if the work is an executable linked\n with the Library, with the complete machine-readable \"work that\n uses the Library\", as object code and/or source code, so that the\n user can modify the Library and then relink to produce a modified\n executable containing the modified Library. (It is understood\n that the user who changes the contents of definitions files in the\n Library will not necessarily be able to recompile the application\n to use the modified definitions.)\n\n b) Use a suitable shared library mechanism for linking with the\n Library. A suitable mechanism is one that (1) uses at run time a\n copy of the library already present on the user's computer system,\n rather than copying library functions into the executable, and (2)\n will operate properly with a modified version of the library, if\n the user installs one, as long as the modified version is\n interface-compatible with the version that the work was made with.\n\n c) Accompany the work with a written offer, valid for at\n least three years, to give the same user the materials\n specified in Subsection 6a, above, for a charge no more\n than the cost of performing this distribution.\n\n d) If distribution of the work is made by offering access to copy\n from a designated place, offer equivalent access to copy the above\n specified materials from the same place.\n\n e) Verify that the user has already received a copy of these\n materials or that you have already sent this user a copy.\n\n For an executable, the required form of the \"work that uses the\nLibrary\" must include any data and utility programs needed for\nreproducing the executable from it. 
However, as a special exception,\nthe materials to be distributed need not include anything that is\nnormally distributed (in either source or binary form) with the major\ncomponents (compiler, kernel, and so on) of the operating system on\nwhich the executable runs, unless that component itself accompanies\nthe executable.\n\n It may happen that this requirement contradicts the license\nrestrictions of other proprietary libraries that do not normally\naccompany the operating system. Such a contradiction means you cannot\nuse both them and the Library together in an executable that you\ndistribute.\n\f\n 7. You may place library facilities that are a work based on the\nLibrary side-by-side in a single library together with other library\nfacilities not covered by this License, and distribute such a combined\nlibrary, provided that the separate distribution of the work based on\nthe Library and of the other library facilities is otherwise\npermitted, and provided that you do these two things:\n\n a) Accompany the combined library with a copy of the same work\n based on the Library, uncombined with any other library\n facilities. This must be distributed under the terms of the\n Sections above.\n\n b) Give prominent notice with the combined library of the fact\n that part of it is a work based on the Library, and explaining\n where to find the accompanying uncombined form of the same work.\n\n 8. You may not copy, modify, sublicense, link with, or distribute\nthe Library except as expressly provided under this License. Any\nattempt otherwise to copy, modify, sublicense, link with, or\ndistribute the Library is void, and will automatically terminate your\nrights under this License. However, parties who have received copies,\nor rights, from you under this License will not have their licenses\nterminated so long as such parties remain in full compliance.\n\n 9. You are not required to accept this License, since you have not\nsigned it. 
However, nothing else grants you permission to modify or\ndistribute the Library or its derivative works. These actions are\nprohibited by law if you do not accept this License. Therefore, by\nmodifying or distributing the Library (or any work based on the\nLibrary), you indicate your acceptance of this License to do so, and\nall its terms and conditions for copying, distributing or modifying\nthe Library or works based on it.\n\n 10. Each time you redistribute the Library (or any work based on the\nLibrary), the recipient automatically receives a license from the\noriginal licensor to copy, distribute, link with or modify the Library\nsubject to these terms and conditions. You may not impose any further\nrestrictions on the recipients' exercise of the rights granted herein.\nYou are not responsible for enforcing compliance by third parties with\nthis License.\n\f\n 11. If, as a consequence of a court judgment or allegation of patent\ninfringement or for any other reason (not limited to patent issues),\nconditions are imposed on you (whether by court order, agreement or\notherwise) that contradict the conditions of this License, they do not\nexcuse you from the conditions of this License. If you cannot\ndistribute so as to satisfy simultaneously your obligations under this\nLicense and any other pertinent obligations, then as a consequence you\nmay not distribute the Library at all. 
For example, if a patent\nlicense would not permit royalty-free redistribution of the Library by\nall those who receive copies directly or indirectly through you, then\nthe only way you could satisfy both it and this License would be to\nrefrain entirely from distribution of the Library.\n\nIf any portion of this section is held invalid or unenforceable under any\nparticular circumstance, the balance of the section is intended to apply,\nand the section as a whole is intended to apply in other circumstances.\n\nIt is not the purpose of this section to induce you to infringe any\npatents or other property right claims or to contest validity of any\nsuch claims; this section has the sole purpose of protecting the\nintegrity of the free software distribution system which is\nimplemented by public license practices. Many people have made\ngenerous contributions to the wide range of software distributed\nthrough that system in reliance on consistent application of that\nsystem; it is up to the author/donor to decide if he or she is willing\nto distribute software through any other system and a licensee cannot\nimpose that choice.\n\nThis section is intended to make thoroughly clear what is believed to\nbe a consequence of the rest of this License.\n\n 12. If the distribution and/or use of the Library is restricted in\ncertain countries either by patents or by copyrighted interfaces, the\noriginal copyright holder who places the Library under this License may add\nan explicit geographical distribution limitation excluding those countries,\nso that distribution is permitted only in or among countries not thus\nexcluded. In such case, this License incorporates the limitation as if\nwritten in the body of this License.\n\n 13. 
The Free Software Foundation may publish revised and/or new\nversions of the Lesser General Public License from time to time.\nSuch new versions will be similar in spirit to the present version,\nbut may differ in detail to address new problems or concerns.\n\nEach version is given a distinguishing version number. If the Library\nspecifies a version number of this License which applies to it and\n\"any later version\", you have the option of following the terms and\nconditions either of that version or of any later version published by\nthe Free Software Foundation. If the Library does not specify a\nlicense version number, you may choose any version ever published by\nthe Free Software Foundation.\n\f\n 14. If you wish to incorporate parts of the Library into other free\nprograms whose distribution conditions are incompatible with these,\nwrite to the author to ask for permission. For software which is\ncopyrighted by the Free Software Foundation, write to the Free\nSoftware Foundation; we sometimes make exceptions for this. Our\ndecision will be guided by the two goals of preserving the free status\nof all derivatives of our free software and of promoting the sharing\nand reuse of software generally.\n\n\t\t\t NO WARRANTY\n\n 15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO\nWARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW.\nEXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR\nOTHER PARTIES PROVIDE THE LIBRARY \"AS IS\" WITHOUT WARRANTY OF ANY\nKIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE\nIMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR\nPURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE\nLIBRARY IS WITH YOU. SHOULD THE LIBRARY PROVE DEFECTIVE, YOU ASSUME\nTHE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.\n\n 16. 
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN\nWRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY\nAND/OR REDISTRIBUTE THE LIBRARY AS PERMITTED ABOVE, BE LIABLE TO YOU\nFOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR\nCONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE\nLIBRARY (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING\nRENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A\nFAILURE OF THE LIBRARY TO OPERATE WITH ANY OTHER SOFTWARE), EVEN IF\nSUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH\nDAMAGES.\n\n\t\t END OF TERMS AND CONDITIONS\n\f\n How to Apply These Terms to Your New Libraries\n\n If you develop a new library, and you want it to be of the greatest\npossible use to the public, we recommend making it free software that\neveryone can redistribute and change. You can do so by permitting\nredistribution under these terms (or, alternatively, under the terms of the\nordinary General Public License).\n\n To apply these terms, attach the following notices to the library. It is\nsafest to attach them to the start of each source file to most effectively\nconvey the exclusion of warranty; and each file should have at least the\n\"copyright\" line and a pointer to where the full notice is found.\n\n <one line to give the library's name and a brief idea of what it does.>\n Copyright (C) <year> <name of author>\n\n This library is free software; you can redistribute it and/or\n modify it under the terms of the GNU Lesser General Public\n License as published by the Free Software Foundation; either\n version 2.1 of the License, or (at your option) any later version.\n\n This library is distributed in the hope that it will be useful,\n but WITHOUT ANY WARRANTY; without even the implied warranty of\n MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the GNU\n Lesser General Public License for more details.\n\n You should have received a copy of the GNU Lesser General Public\n License along with this library; if not, write to the Free Software\n Foundation, Inc., 51 Franklin Street, Suite 500, Boston, MA 02110-1335 USA\n\nAlso add information on how to contact you by electronic and paper mail.\n\nYou should also get your employer (if you work as a programmer) or your\nschool, if any, to sign a \"copyright disclaimer\" for the library, if\nnecessary. Here is a sample; alter the names:\n\n Yoyodyne, Inc., hereby disclaims all copyright interest in the\n library `Frob' (a library for tweaking knobs) written by James Random Hacker.\n\n <signature of Ty Coon>, 1 April 1990\n Ty Coon, President of Vice\n\nThat's all there is to it!\n\n\n",
"License text": " GNU LESSER GENERAL PUBLIC LICENSE\n Version 2.1, February 1999\n\n Copyright (C) 1991, 1999 Free Software Foundation, Inc.\n 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA\n Everyone is permitted to copy and distribute verbatim copies\n of this license document, but changing it is not allowed.\n\n[This is the first released version of the Lesser GPL. It also counts\n as the successor of the GNU Library Public License, version 2, hence\n the version number 2.1.]\n\n Preamble\n\n The licenses for most software are designed to take away your\nfreedom to share and change it. By contrast, the GNU General Public\nLicenses are intended to guarantee your freedom to share and change\nfree software--to make sure the software is free for all its users.\n\n This license, the Lesser General Public License, applies to some\nspecially designated software packages--typically libraries--of the\nFree Software Foundation and other authors who decide to use it. You\ncan use it too, but we suggest you first think carefully about whether\nthis license or the ordinary General Public License is the better\nstrategy to use in any particular case, based on the explanations below.\n\n When we speak of free software, we are referring to freedom of use,\nnot price. Our General Public Licenses are designed to make sure that\nyou have the freedom to distribute copies of free software (and charge\nfor this service if you wish); that you receive source code or can get\nit if you want it; that you can change the software and use pieces of\nit in new free programs; and that you are informed that you can do\nthese things.\n\n To protect your rights, we need to make restrictions that forbid\ndistributors to deny you these rights or to ask you to surrender these\nrights. 
These restrictions translate to certain responsibilities for\nyou if you distribute copies of the library or if you modify it.\n\n For example, if you distribute copies of the library, whether gratis\nor for a fee, you must give the recipients all the rights that we gave\nyou. You must make sure that they, too, receive or can get the source\ncode. If you link other code with the library, you must provide\ncomplete object files to the recipients, so that they can relink them\nwith the library after making changes to the library and recompiling\nit. And you must show them these terms so they know their rights.\n\n We protect your rights with a two-step method: (1) we copyright the\nlibrary, and (2) we offer you this license, which gives you legal\npermission to copy, distribute and/or modify the library.\n\n To protect each distributor, we want to make it very clear that\nthere is no warranty for the free library. Also, if the library is\nmodified by someone else and passed on, the recipients should know\nthat what they have is not the original version, so that the original\nauthor's reputation will not be affected by problems that might be\nintroduced by others.\n\n Finally, software patents pose a constant threat to the existence of\nany free program. We wish to make sure that a company cannot\neffectively restrict the users of a free program by obtaining a\nrestrictive license from a patent holder. Therefore, we insist that\nany patent license obtained for a version of the library must be\nconsistent with the full freedom of use specified in this license.\n\n Most GNU software, including some libraries, is covered by the\nordinary GNU General Public License. This license, the GNU Lesser\nGeneral Public License, applies to certain designated libraries, and\nis quite different from the ordinary General Public License. 
We use\nthis license for certain libraries in order to permit linking those\nlibraries into non-free programs.\n\n When a program is linked with a library, whether statically or using\na shared library, the combination of the two is legally speaking a\ncombined work, a derivative of the original library. The ordinary\nGeneral Public License therefore permits such linking only if the\nentire combination fits its criteria of freedom. The Lesser General\nPublic License permits more lax criteria for linking other code with\nthe library.\n\n We call this license the \"Lesser\" General Public License because it\ndoes Less to protect the user's freedom than the ordinary General\nPublic License. It also provides other free software developers Less\nof an advantage over competing non-free programs. These disadvantages\nare the reason we use the ordinary General Public License for many\nlibraries. However, the Lesser license provides advantages in certain\nspecial circumstances.\n\n For example, on rare occasions, there may be a special need to\nencourage the widest possible use of a certain library, so that it becomes\na de-facto standard. To achieve this, non-free programs must be\nallowed to use the library. A more frequent case is that a free\nlibrary does the same job as widely used non-free libraries. In this\ncase, there is little to gain by limiting the free library to free\nsoftware only, so we use the Lesser General Public License.\n\n In other cases, permission to use a particular library in non-free\nprograms enables a greater number of people to use a large body of\nfree software. 
For example, permission to use the GNU C Library in\nnon-free programs enables many more people to use the whole GNU\noperating system, as well as its variant, the GNU/Linux operating\nsystem.\n\n Although the Lesser General Public License is Less protective of the\nusers' freedom, it does ensure that the user of a program that is\nlinked with the Library has the freedom and the wherewithal to run\nthat program using a modified version of the Library.\n\n The precise terms and conditions for copying, distribution and\nmodification follow. Pay close attention to the difference between a\n\"work based on the library\" and a \"work that uses the library\". The\nformer contains code derived from the library, whereas the latter must\nbe combined with the library in order to run.\n\n GNU LESSER GENERAL PUBLIC LICENSE\n TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION\n\n 0. This License Agreement applies to any software library or other\nprogram which contains a notice placed by the copyright holder or\nother authorized party saying it may be distributed under the terms of\nthis Lesser General Public License (also called \"this License\").\nEach licensee is addressed as \"you\".\n\n A \"library\" means a collection of software functions and/or data\nprepared so as to be conveniently linked with application programs\n(which use some of those functions and data) to form executables.\n\n The \"Library\", below, refers to any such software library or work\nwhich has been distributed under these terms. A \"work based on the\nLibrary\" means either the Library or any derivative work under\ncopyright law: that is to say, a work containing the Library or a\nportion of it, either verbatim or with modifications and/or translated\nstraightforwardly into another language. (Hereinafter, translation is\nincluded without limitation in the term \"modification\".)\n\n \"Source code\" for a work means the preferred form of the work for\nmaking modifications to it. 
For a library, complete source code means\nall the source code for all modules it contains, plus any associated\ninterface definition files, plus the scripts used to control compilation\nand installation of the library.\n\n Activities other than copying, distribution and modification are not\ncovered by this License; they are outside its scope. The act of\nrunning a program using the Library is not restricted, and output from\nsuch a program is covered only if its contents constitute a work based\non the Library (independent of the use of the Library in a tool for\nwriting it). Whether that is true depends on what the Library does\nand what the program that uses the Library does.\n\n 1. You may copy and distribute verbatim copies of the Library's\ncomplete source code as you receive it, in any medium, provided that\nyou conspicuously and appropriately publish on each copy an\nappropriate copyright notice and disclaimer of warranty; keep intact\nall the notices that refer to this License and to the absence of any\nwarranty; and distribute a copy of this License along with the\nLibrary.\n\n You may charge a fee for the physical act of transferring a copy,\nand you may at your option offer warranty protection in exchange for a\nfee.\n\n 2. 
You may modify your copy or copies of the Library or any portion\nof it, thus forming a work based on the Library, and copy and\ndistribute such modifications or work under the terms of Section 1\nabove, provided that you also meet all of these conditions:\n\n a) The modified work must itself be a software library.\n\n b) You must cause the files modified to carry prominent notices\n stating that you changed the files and the date of any change.\n\n c) You must cause the whole of the work to be licensed at no\n charge to all third parties under the terms of this License.\n\n d) If a facility in the modified Library refers to a function or a\n table of data to be supplied by an application program that uses\n the facility, other than as an argument passed when the facility\n is invoked, then you must make a good faith effort to ensure that,\n in the event an application does not supply such function or\n table, the facility still operates, and performs whatever part of\n its purpose remains meaningful.\n\n (For example, a function in a library to compute square roots has\n a purpose that is entirely well-defined independent of the\n application. Therefore, Subsection 2d requires that any\n application-supplied function or table used by this function must\n be optional: if the application does not supply it, the square\n root function must still compute square roots.)\n\nThese requirements apply to the modified work as a whole. If\nidentifiable sections of that work are not derived from the Library,\nand can be reasonably considered independent and separate works in\nthemselves, then this License, and its terms, do not apply to those\nsections when you distribute them as separate works. 
But when you\ndistribute the same sections as part of a whole which is a work based\non the Library, the distribution of the whole must be on the terms of\nthis License, whose permissions for other licensees extend to the\nentire whole, and thus to each and every part regardless of who wrote\nit.\n\nThus, it is not the intent of this section to claim rights or contest\nyour rights to work written entirely by you; rather, the intent is to\nexercise the right to control the distribution of derivative or\ncollective works based on the Library.\n\nIn addition, mere aggregation of another work not based on the Library\nwith the Library (or with a work based on the Library) on a volume of\na storage or distribution medium does not bring the other work under\nthe scope of this License.\n\n 3. You may opt to apply the terms of the ordinary GNU General Public\nLicense instead of this License to a given copy of the Library. To do\nthis, you must alter all the notices that refer to this License, so\nthat they refer to the ordinary GNU General Public License, version 2,\ninstead of to this License. (If a newer version than version 2 of the\nordinary GNU General Public License has appeared, then you can specify\nthat version instead if you wish.) Do not make any other change in\nthese notices.\n\n Once this change is made in a given copy, it is irreversible for\nthat copy, so the ordinary GNU General Public License applies to all\nsubsequent copies and derivative works made from that copy.\n\n This option is useful when you wish to copy part of the code of\nthe Library into a program that is not a library.\n\n 4. 
You may copy and distribute the Library (or a portion or\nderivative of it, under Section 2) in object code or executable form\nunder the terms of Sections 1 and 2 above provided that you accompany\nit with the complete corresponding machine-readable source code, which\nmust be distributed under the terms of Sections 1 and 2 above on a\nmedium customarily used for software interchange.\n\n If distribution of object code is made by offering access to copy\nfrom a designated place, then offering equivalent access to copy the\nsource code from the same place satisfies the requirement to\ndistribute the source code, even though third parties are not\ncompelled to copy the source along with the object code.\n\n 5. A program that contains no derivative of any portion of the\nLibrary, but is designed to work with the Library by being compiled or\nlinked with it, is called a \"work that uses the Library\". Such a\nwork, in isolation, is not a derivative work of the Library, and\ntherefore falls outside the scope of this License.\n\n However, linking a \"work that uses the Library\" with the Library\ncreates an executable that is a derivative of the Library (because it\ncontains portions of the Library), rather than a \"work that uses the\nlibrary\". The executable is therefore covered by this License.\nSection 6 states terms for distribution of such executables.\n\n When a \"work that uses the Library\" uses material from a header file\nthat is part of the Library, the object code for the work may be a\nderivative work of the Library even though the source code is not.\nWhether this is true is especially significant if the work can be\nlinked without the Library, or if the work is itself a library. 
The\nthreshold for this to be true is not precisely defined by law.\n\n If such an object file uses only numerical parameters, data\nstructure layouts and accessors, and small macros and small inline\nfunctions (ten lines or less in length), then the use of the object\nfile is unrestricted, regardless of whether it is legally a derivative\nwork. (Executables containing this object code plus portions of the\nLibrary will still fall under Section 6.)\n\n Otherwise, if the work is a derivative of the Library, you may\ndistribute the object code for the work under the terms of Section 6.\nAny executables containing that work also fall under Section 6,\nwhether or not they are linked directly with the Library itself.\n\n 6. As an exception to the Sections above, you may also combine or\nlink a \"work that uses the Library\" with the Library to produce a\nwork containing portions of the Library, and distribute that work\nunder terms of your choice, provided that the terms permit\nmodification of the work for the customer's own use and reverse\nengineering for debugging such modifications.\n\n You must give prominent notice with each copy of the work that the\nLibrary is used in it and that the Library and its use are covered by\nthis License. You must supply a copy of this License. If the work\nduring execution displays copyright notices, you must include the\ncopyright notice for the Library among them, as well as a reference\ndirecting the user to the copy of this License. 
Also, you must do one\nof these things:\n\n a) Accompany the work with the complete corresponding\n machine-readable source code for the Library including whatever\n changes were used in the work (which must be distributed under\n Sections 1 and 2 above); and, if the work is an executable linked\n with the Library, with the complete machine-readable \"work that\n uses the Library\", as object code and/or source code, so that the\n user can modify the Library and then relink to produce a modified\n executable containing the modified Library. (It is understood\n that the user who changes the contents of definitions files in the\n Library will not necessarily be able to recompile the application\n to use the modified definitions.)\n\n b) Use a suitable shared library mechanism for linking with the\n Library. A suitable mechanism is one that (1) uses at run time a\n copy of the library already present on the user's computer system,\n rather than copying library functions into the executable, and (2)\n will operate properly with a modified version of the library, if\n the user installs one, as long as the modified version is\n interface-compatible with the version that the work was made with.\n\n c) Accompany the work with a written offer, valid for at\n least three years, to give the same user the materials\n specified in Subsection 6a, above, for a charge no more\n than the cost of performing this distribution.\n\n d) If distribution of the work is made by offering access to copy\n from a designated place, offer equivalent access to copy the above\n specified materials from the same place.\n\n e) Verify that the user has already received a copy of these\n materials or that you have already sent this user a copy.\n\n For an executable, the required form of the \"work that uses the\nLibrary\" must include any data and utility programs needed for\nreproducing the executable from it. 
However, as a special exception,\nthe materials to be distributed need not include anything that is\nnormally distributed (in either source or binary form) with the major\ncomponents (compiler, kernel, and so on) of the operating system on\nwhich the executable runs, unless that component itself accompanies\nthe executable.\n\n It may happen that this requirement contradicts the license\nrestrictions of other proprietary libraries that do not normally\naccompany the operating system. Such a contradiction means you cannot\nuse both them and the Library together in an executable that you\ndistribute.\n\n 7. You may place library facilities that are a work based on the\nLibrary side-by-side in a single library together with other library\nfacilities not covered by this License, and distribute such a combined\nlibrary, provided that the separate distribution of the work based on\nthe Library and of the other library facilities is otherwise\npermitted, and provided that you do these two things:\n\n a) Accompany the combined library with a copy of the same work\n based on the Library, uncombined with any other library\n facilities. This must be distributed under the terms of the\n Sections above.\n\n b) Give prominent notice with the combined library of the fact\n that part of it is a work based on the Library, and explaining\n where to find the accompanying uncombined form of the same work.\n\n 8. You may not copy, modify, sublicense, link with, or distribute\nthe Library except as expressly provided under this License. Any\nattempt otherwise to copy, modify, sublicense, link with, or\ndistribute the Library is void, and will automatically terminate your\nrights under this License. However, parties who have received copies,\nor rights, from you under this License will not have their licenses\nterminated so long as such parties remain in full compliance.\n\n 9. You are not required to accept this License, since you have not\nsigned it. 
However, nothing else grants you permission to modify or\ndistribute the Library or its derivative works. These actions are\nprohibited by law if you do not accept this License. Therefore, by\nmodifying or distributing the Library (or any work based on the\nLibrary), you indicate your acceptance of this License to do so, and\nall its terms and conditions for copying, distributing or modifying\nthe Library or works based on it.\n\n 10. Each time you redistribute the Library (or any work based on the\nLibrary), the recipient automatically receives a license from the\noriginal licensor to copy, distribute, link with or modify the Library\nsubject to these terms and conditions. You may not impose any further\nrestrictions on the recipients' exercise of the rights granted herein.\nYou are not responsible for enforcing compliance by third parties with\nthis License.\n\n 11. If, as a consequence of a court judgment or allegation of patent\ninfringement or for any other reason (not limited to patent issues),\nconditions are imposed on you (whether by court order, agreement or\notherwise) that contradict the conditions of this License, they do not\nexcuse you from the conditions of this License. If you cannot\ndistribute so as to satisfy simultaneously your obligations under this\nLicense and any other pertinent obligations, then as a consequence you\nmay not distribute the Library at all. 
For example, if a patent\nlicense would not permit royalty-free redistribution of the Library by\nall those who receive copies directly or indirectly through you, then\nthe only way you could satisfy both it and this License would be to\nrefrain entirely from distribution of the Library.\n\nIf any portion of this section is held invalid or unenforceable under any\nparticular circumstance, the balance of the section is intended to apply,\nand the section as a whole is intended to apply in other circumstances.\n\nIt is not the purpose of this section to induce you to infringe any\npatents or other property right claims or to contest validity of any\nsuch claims; this section has the sole purpose of protecting the\nintegrity of the free software distribution system which is\nimplemented by public license practices. Many people have made\ngenerous contributions to the wide range of software distributed\nthrough that system in reliance on consistent application of that\nsystem; it is up to the author/donor to decide if he or she is willing\nto distribute software through any other system and a licensee cannot\nimpose that choice.\n\nThis section is intended to make thoroughly clear what is believed to\nbe a consequence of the rest of this License.\n\n 12. If the distribution and/or use of the Library is restricted in\ncertain countries either by patents or by copyrighted interfaces, the\noriginal copyright holder who places the Library under this License may add\nan explicit geographical distribution limitation excluding those countries,\nso that distribution is permitted only in or among countries not thus\nexcluded. In such case, this License incorporates the limitation as if\nwritten in the body of this License.\n\n 13. 
The Free Software Foundation may publish revised and/or new\nversions of the Lesser General Public License from time to time.\nSuch new versions will be similar in spirit to the present version,\nbut may differ in detail to address new problems or concerns.\n\nEach version is given a distinguishing version number. If the Library\nspecifies a version number of this License which applies to it and\n\"any later version\", you have the option of following the terms and\nconditions either of that version or of any later version published by\nthe Free Software Foundation. If the Library does not specify a\nlicense version number, you may choose any version ever published by\nthe Free Software Foundation.\n\n 14. If you wish to incorporate parts of the Library into other free\nprograms whose distribution conditions are incompatible with these,\nwrite to the author to ask for permission. For software which is\ncopyrighted by the Free Software Foundation, write to the Free\nSoftware Foundation; we sometimes make exceptions for this. Our\ndecision will be guided by the two goals of preserving the free status\nof all derivatives of our free software and of promoting the sharing\nand reuse of software generally.\n\n NO WARRANTY\n\n 15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO\nWARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW.\nEXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR\nOTHER PARTIES PROVIDE THE LIBRARY \"AS IS\" WITHOUT WARRANTY OF ANY\nKIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE\nIMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR\nPURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE\nLIBRARY IS WITH YOU. SHOULD THE LIBRARY PROVE DEFECTIVE, YOU ASSUME\nTHE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.\n\n 16. 
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN\nWRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY\nAND/OR REDISTRIBUTE THE LIBRARY AS PERMITTED ABOVE, BE LIABLE TO YOU\nFOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR\nCONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE\nLIBRARY (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING\nRENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A\nFAILURE OF THE LIBRARY TO OPERATE WITH ANY OTHER SOFTWARE), EVEN IF\nSUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH\nDAMAGES.\n\n END OF TERMS AND CONDITIONS\n\n How to Apply These Terms to Your New Libraries\n\n If you develop a new library, and you want it to be of the greatest\npossible use to the public, we recommend making it free software that\neveryone can redistribute and change. You can do so by permitting\nredistribution under these terms (or, alternatively, under the terms of the\nordinary General Public License).\n\n To apply these terms, attach the following notices to the library. It is\nsafest to attach them to the start of each source file to most effectively\nconvey the exclusion of warranty; and each file should have at least the\n\"copyright\" line and a pointer to where the full notice is found.\n\n <one line to give the library's name and a brief idea of what it does.>\n Copyright (C) <year> <name of author>\n\n This library is free software; you can redistribute it and/or\n modify it under the terms of the GNU Lesser General Public\n License as published by the Free Software Foundation; either\n version 2.1 of the License, or (at your option) any later version.\n\n This library is distributed in the hope that it will be useful,\n but WITHOUT ANY WARRANTY; without even the implied warranty of\n MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the GNU\n Lesser General Public License for more details.\n\n You should have received a copy of the GNU Lesser General Public\n License along with this library; if not, write to the Free Software\n Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301\n USA\n\nAlso add information on how to contact you by electronic and paper mail.\n\nYou should also get your employer (if you work as a programmer) or your\nschool, if any, to sign a \"copyright disclaimer\" for the library, if\nnecessary. Here is a sample; alter the names:\n\n Yoyodyne, Inc., hereby disclaims all copyright interest in the\n library `Frob' (a library for tweaking knobs) written by James Random\n Hacker.\n\n <signature of Ty Coon>, 1 April 1990\n Ty Coon, President of Vice\n\nThat's all there is to it!\n"
},
{
"Name": "pkginfo",
"Version": "1.5.0.1",
"Summary": "Query metadata from sdists / bdists / installed packages.",
"Home-page": "https://code.launchpad.net/~tseaver/pkginfo/trunk",
"Author": "Tres Seaver, Agendaless Consulting",
"License": "MIT"
},
{
"Name": "pluggy",
"Version": "0.13.1",
"Summary": "plugin and hook calling mechanisms for python",
"Home-page": "https://github.com/pytest-dev/pluggy",
"Author": "Holger Krekel",
"License": "MIT License",
"License URL": "https://api.github.com/repos/pytest-dev/pluggy/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2015 holger krekel (rather uses bitbucket/hpk42)\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "portalocker",
"Version": "1.2.1",
"Summary": "Wraps the portalocker recipe for easy usage",
"Home-page": "https://github.com/WoLpH/portalocker",
"Author": "Rick van Hattem",
"License": "Other",
"License URL": "https://api.github.com/repos/wolph/portalocker/license",
"License repo": "PYTHON SOFTWARE FOUNDATION LICENSE VERSION 2\n--------------------------------------------\n\n1. This LICENSE AGREEMENT is between the Python Software Foundation\n(\"PSF\"), and the Individual or Organization (\"Licensee\") accessing and\notherwise using this software (\"Python\") in source or binary form and\nits associated documentation.\n\n2. Subject to the terms and conditions of this License Agreement, PSF hereby\ngrants Licensee a nonexclusive, royalty-free, world-wide license to reproduce,\nanalyze, test, perform and/or display publicly, prepare derivative works,\ndistribute, and otherwise use Python alone or in any derivative version,\nprovided, however, that PSF's License Agreement and PSF's notice of copyright,\ni.e., \"Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010\nPython Software Foundation; All Rights Reserved\" are retained in Python alone or\nin any derivative version prepared by Licensee.\n\n3. In the event Licensee prepares a derivative work that is based on\nor incorporates Python or any part thereof, and wants to make\nthe derivative work available to others as provided herein, then\nLicensee hereby agrees to include in any such work a brief summary of\nthe changes made to Python.\n\n4. PSF is making Python available to Licensee on an \"AS IS\"\nbasis. PSF MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR\nIMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, PSF MAKES NO AND\nDISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS\nFOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON WILL NOT\nINFRINGE ANY THIRD PARTY RIGHTS.\n\n5. PSF SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON\nFOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS\nA RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON,\nOR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.\n\n6. 
This License Agreement will automatically terminate upon a material\nbreach of its terms and conditions.\n\n7. Nothing in this License Agreement shall be deemed to create any\nrelationship of agency, partnership, or joint venture between PSF and\nLicensee. This License Agreement does not grant permission to use PSF\ntrademarks or trade name in a trademark sense to endorse or promote\nproducts or services of Licensee, or any third party.\n\n8. By copying, installing or otherwise using Python, Licensee\nagrees to be bound by the terms and conditions of this License\nAgreement.\n\n"
},
{
"Name": "prompt-toolkit",
"Version": "1.0.18",
"Summary": "Library for building powerful interactive command lines in Python",
"Home-page": "https://github.com/jonathanslenders/python-prompt-toolkit",
"Author": "Jonathan Slenders",
"License": "BSD 3-Clause \"New\" or \"Revised\" License",
"License URL": "https://api.github.com/repos/jonathanslenders/python-prompt-toolkit/license",
"License repo": "Copyright (c) 2014, Jonathan Slenders\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without modification,\nare permitted provided that the following conditions are met:\n\n* Redistributions of source code must retain the above copyright notice, this\n list of conditions and the following disclaimer.\n\n* Redistributions in binary form must reproduce the above copyright notice, this\n list of conditions and the following disclaimer in the documentation and/or\n other materials provided with the distribution.\n\n* Neither the name of the {organization} nor the names of its\n contributors may be used to endorse or promote products derived from\n this software without specific prior written permission.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\" AND\nANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED\nWARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\nDISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR\nANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES\n(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;\nLOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON\nANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT\n(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS\nSOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n",
"License text": "BSD 3-Clause License\n\nCopyright (c) [year], [fullname]\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without\nmodification, are permitted provided that the following conditions are met:\n\n1. Redistributions of source code must retain the above copyright notice, this\n list of conditions and the following disclaimer.\n\n2. Redistributions in binary form must reproduce the above copyright notice,\n this list of conditions and the following disclaimer in the documentation\n and/or other materials provided with the distribution.\n\n3. Neither the name of the copyright holder nor the names of its\n contributors may be used to endorse or promote products derived from\n this software without specific prior written permission.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\nAND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\nIMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\nDISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE\nFOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\nDAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\nSERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\nCAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\nOR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\nOF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n"
},
{
"Name": "psutil",
"Version": "5.6.6",
"Summary": "Cross-platform lib for process and system monitoring in Python.",
"Home-page": "https://github.com/giampaolo/psutil",
"Author": "Giampaolo Rodola",
"License": "BSD 3-Clause \"New\" or \"Revised\" License",
"License URL": "https://api.github.com/repos/giampaolo/psutil/license",
"License repo": "BSD 3-Clause License\n\nCopyright (c) 2009, Jay Loden, Dave Daeschler, Giampaolo Rodola'\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without modification,\nare permitted provided that the following conditions are met:\n\n * Redistributions of source code must retain the above copyright notice, this\n list of conditions and the following disclaimer.\n\n * Redistributions in binary form must reproduce the above copyright notice,\n this list of conditions and the following disclaimer in the documentation\n and/or other materials provided with the distribution.\n\n * Neither the name of the psutil authors nor the names of its contributors\n may be used to endorse or promote products derived from this software without\n specific prior written permission.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\" AND\nANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED\nWARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\nDISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR\nANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES\n(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;\nLOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON\nANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT\n(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS\nSOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n",
"License text": "BSD 3-Clause License\n\nCopyright (c) [year], [fullname]\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without\nmodification, are permitted provided that the following conditions are met:\n\n1. Redistributions of source code must retain the above copyright notice, this\n list of conditions and the following disclaimer.\n\n2. Redistributions in binary form must reproduce the above copyright notice,\n this list of conditions and the following disclaimer in the documentation\n and/or other materials provided with the distribution.\n\n3. Neither the name of the copyright holder nor the names of its\n contributors may be used to endorse or promote products derived from\n this software without specific prior written permission.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\nAND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\nIMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\nDISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE\nFOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\nDAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\nSERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\nCAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\nOR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\nOF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n"
},
{
"Name": "py",
"Version": "1.8.1",
"Summary": "library with cross-python path, ini-parsing, io, code, log facilities",
"Home-page": "http://py.readthedocs.io/",
"Author": "holger krekel, Ronny Pfannschmidt, Benjamin Peterson and others",
"License": "MIT license"
},
{
"Name": "pycparser",
"Version": "2.19",
"Summary": "C parser in Python",
"Home-page": "https://github.com/eliben/pycparser",
"Author": "Eli Bendersky",
"License": "Other",
"License URL": "https://api.github.com/repos/eliben/pycparser/license",
"License repo": "pycparser -- A C parser in Python\n\nCopyright (c) 2008-2017, Eli Bendersky\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without modification,\nare permitted provided that the following conditions are met:\n\n* Redistributions of source code must retain the above copyright notice, this \n list of conditions and the following disclaimer.\n* Redistributions in binary form must reproduce the above copyright notice, \n this list of conditions and the following disclaimer in the documentation \n and/or other materials provided with the distribution.\n* Neither the name of Eli Bendersky nor the names of its contributors may \n be used to endorse or promote products derived from this software without \n specific prior written permission.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\" AND \nANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED \nWARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE \nDISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE \nLIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR \nCONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE \nGOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) \nHOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT \nLIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT \nOF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n"
},
{
"Name": "Pygments",
"Version": "2.4.2",
"Summary": "Pygments is a syntax highlighting package written in Python.",
"Home-page": "http://pygments.org/",
"Author": "Georg Brandl",
"License": "BSD License"
},
{
"Name": "PyJWT",
"Version": "1.7.1",
"Summary": "JSON Web Token implementation in Python",
"Home-page": "http://github.com/jpadilla/pyjwt",
"Author": "Jose Padilla",
"License": "MIT License",
"License URL": "https://api.github.com/repos/jpadilla/pyjwt/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2015 Jos\u00e9 Padilla\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "PyNaCl",
"Version": "1.3.0",
"Summary": "Python binding to the Networking and Cryptography (NaCl) library",
"Home-page": "https://github.com/pyca/pynacl/",
"Author": "The PyNaCl developers",
"License": "Apache License 2.0",
"License URL": "https://api.github.com/repos/pyca/pynacl/license",
"License repo": " Apache License\n Version 2.0, January 2004\n http://www.apache.org/licenses/\n\nTERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n1. Definitions.\n\n \"License\" shall mean the terms and conditions for use, reproduction,\n and distribution as defined by Sections 1 through 9 of this document.\n\n \"Licensor\" shall mean the copyright owner or entity authorized by\n the copyright owner that is granting the License.\n\n \"Legal Entity\" shall mean the union of the acting entity and all\n other entities that control, are controlled by, or are under common\n control with that entity. For the purposes of this definition,\n \"control\" means (i) the power, direct or indirect, to cause the\n direction or management of such entity, whether by contract or\n otherwise, or (ii) ownership of fifty percent (50%) or more of the\n outstanding shares, or (iii) beneficial ownership of such entity.\n\n \"You\" (or \"Your\") shall mean an individual or Legal Entity\n exercising permissions granted by this License.\n\n \"Source\" form shall mean the preferred form for making modifications,\n including but not limited to software source code, documentation\n source, and configuration files.\n\n \"Object\" form shall mean any form resulting from mechanical\n transformation or translation of a Source form, including but\n not limited to compiled object code, generated documentation,\n and conversions to other media types.\n\n \"Work\" shall mean the work of authorship, whether in Source or\n Object form, made available under the License, as indicated by a\n copyright notice that is included in or attached to the work\n (an example is provided in the Appendix below).\n\n \"Derivative Works\" shall mean any work, whether in Source or Object\n form, that is based on (or derived from) the Work and for which the\n editorial revisions, annotations, elaborations, or other modifications\n represent, as a whole, an original work of authorship. For the purposes\n of this License, Derivative Works shall not include works that remain\n separable from, or merely link (or bind by name) to the interfaces of,\n the Work and Derivative Works thereof.\n\n \"Contribution\" shall mean any work of authorship, including\n the original version of the Work and any modifications or additions\n to that Work or Derivative Works thereof, that is intentionally\n submitted to Licensor for inclusion in the Work by the copyright owner\n or by an individual or Legal Entity authorized to submit on behalf of\n the copyright owner. For the purposes of this definition, \"submitted\"\n means any form of electronic, verbal, or written communication sent\n to the Licensor or its representatives, including but not limited to\n communication on electronic mailing lists, source code control systems,\n and issue tracking systems that are managed by, or on behalf of, the\n Licensor for the purpose of discussing and improving the Work, but\n excluding communication that is conspicuously marked or otherwise\n designated in writing by the copyright owner as \"Not a Contribution.\"\n\n \"Contributor\" shall mean Licensor and any individual or Legal Entity\n on behalf of whom a Contribution has been received by Licensor and\n subsequently incorporated within the Work.\n\n2. Grant of Copyright License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n copyright license to reproduce, prepare Derivative Works of,\n publicly display, publicly perform, sublicense, and distribute the\n Work and such Derivative Works in Source or Object form.\n\n3. Grant of Patent License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n (except as stated in this section) patent license to make, have made,\n use, offer to sell, sell, import, and otherwise transfer the Work,\n where such license applies only to those patent claims licensable\n by such Contributor that are necessarily infringed by their\n Contribution(s) alone or by combination of their Contribution(s)\n with the Work to which such Contribution(s) was submitted. If You\n institute patent litigation against any entity (including a\n cross-claim or counterclaim in a lawsuit) alleging that the Work\n or a Contribution incorporated within the Work constitutes direct\n or contributory patent infringement, then any patent licenses\n granted to You under this License for that Work shall terminate\n as of the date such litigation is filed.\n\n4. Redistribution. You may reproduce and distribute copies of the\n Work or Derivative Works thereof in any medium, with or without\n modifications, and in Source or Object form, provided that You\n meet the following conditions:\n\n (a) You must give any other recipients of the Work or\n Derivative Works a copy of this License; and\n\n (b) You must cause any modified files to carry prominent notices\n stating that You changed the files; and\n\n (c) You must retain, in the Source form of any Derivative Works\n that You distribute, all copyright, patent, trademark, and\n attribution notices from the Source form of the Work,\n excluding those notices that do not pertain to any part of\n the Derivative Works; and\n\n (d) If the Work includes a \"NOTICE\" text file as part of its\n distribution, then any Derivative Works that You distribute must\n include a readable copy of the attribution notices contained\n within such NOTICE file, excluding those notices that do not\n pertain to any part of the Derivative Works, in at least one\n of the following places: within a NOTICE text file distributed\n as part of the Derivative Works; within the Source form or\n documentation, if provided along with the Derivative Works; or,\n within a display generated by the Derivative Works, if and\n wherever such third-party notices normally appear. The contents\n of the NOTICE file are for informational purposes only and\n do not modify the License. You may add Your own attribution\n notices within Derivative Works that You distribute, alongside\n or as an addendum to the NOTICE text from the Work, provided\n that such additional attribution notices cannot be construed\n as modifying the License.\n\n You may add Your own copyright statement to Your modifications and\n may provide additional or different license terms and conditions\n for use, reproduction, or distribution of Your modifications, or\n for any such Derivative Works as a whole, provided Your use,\n reproduction, and distribution of the Work otherwise complies with\n the conditions stated in this License.\n\n5. Submission of Contributions. Unless You explicitly state otherwise,\n any Contribution intentionally submitted for inclusion in the Work\n by You to the Licensor shall be under the terms and conditions of\n this License, without any additional terms or conditions.\n Notwithstanding the above, nothing herein shall supersede or modify\n the terms of any separate license agreement you may have executed\n with Licensor regarding such Contributions.\n\n6. Trademarks. This License does not grant permission to use the trade\n names, trademarks, service marks, or product names of the Licensor,\n except as required for reasonable and customary use in describing the\n origin of the Work and reproducing the content of the NOTICE file.\n\n7. Disclaimer of Warranty. Unless required by applicable law or\n agreed to in writing, Licensor provides the Work (and each\n Contributor provides its Contributions) on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n implied, including, without limitation, any warranties or conditions\n of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n PARTICULAR PURPOSE. You are solely responsible for determining the\n appropriateness of using or redistributing the Work and assume any\n risks associated with Your exercise of permissions under this License.\n\n8. Limitation of Liability. In no event and under no legal theory,\n whether in tort (including negligence), contract, or otherwise,\n unless required by applicable law (such as deliberate and grossly\n negligent acts) or agreed to in writing, shall any Contributor be\n liable to You for damages, including any direct, indirect, special,\n incidental, or consequential damages of any character arising as a\n result of this License or out of the use or inability to use the\n Work (including but not limited to damages for loss of goodwill,\n work stoppage, computer failure or malfunction, or any and all\n other commercial damages or losses), even if such Contributor\n has been advised of the possibility of such damages.\n\n9. Accepting Warranty or Additional Liability. While redistributing\n the Work or Derivative Works thereof, You may choose to offer,\n and charge a fee for, acceptance of support, warranty, indemnity,\n or other liability obligations and/or rights consistent with this\n License. However, in accepting such obligations, You may act only\n on Your own behalf and on Your sole responsibility, not on behalf\n of any other Contributor, and only if You agree to indemnify,\n defend, and hold each Contributor harmless for any liability\n incurred by, or claims asserted against, such Contributor by reason\n of your accepting any such warranty or additional liability.\n",
"License text": " Apache License\n Version 2.0, January 2004\n http://www.apache.org/licenses/\n\n TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n 1. Definitions.\n\n \"License\" shall mean the terms and conditions for use, reproduction,\n and distribution as defined by Sections 1 through 9 of this document.\n\n \"Licensor\" shall mean the copyright owner or entity authorized by\n the copyright owner that is granting the License.\n\n \"Legal Entity\" shall mean the union of the acting entity and all\n other entities that control, are controlled by, or are under common\n control with that entity. For the purposes of this definition,\n \"control\" means (i) the power, direct or indirect, to cause the\n direction or management of such entity, whether by contract or\n otherwise, or (ii) ownership of fifty percent (50%) or more of the\n outstanding shares, or (iii) beneficial ownership of such entity.\n\n \"You\" (or \"Your\") shall mean an individual or Legal Entity\n exercising permissions granted by this License.\n\n \"Source\" form shall mean the preferred form for making modifications,\n including but not limited to software source code, documentation\n source, and configuration files.\n\n \"Object\" form shall mean any form resulting from mechanical\n transformation or translation of a Source form, including but\n not limited to compiled object code, generated documentation,\n and conversions to other media types.\n\n \"Work\" shall mean the work of authorship, whether in Source or\n Object form, made available under the License, as indicated by a\n copyright notice that is included in or attached to the work\n (an example is provided in the Appendix below).\n\n \"Derivative Works\" shall mean any work, whether in Source or Object\n form, that is based on (or derived from) the Work and for which the\n editorial revisions, annotations, elaborations, or other modifications\n represent, as a whole, an original work of authorship. For the purposes\n of this License, Derivative Works shall not include works that remain\n separable from, or merely link (or bind by name) to the interfaces of,\n the Work and Derivative Works thereof.\n\n \"Contribution\" shall mean any work of authorship, including\n the original version of the Work and any modifications or additions\n to that Work or Derivative Works thereof, that is intentionally\n submitted to Licensor for inclusion in the Work by the copyright owner\n or by an individual or Legal Entity authorized to submit on behalf of\n the copyright owner. For the purposes of this definition, \"submitted\"\n means any form of electronic, verbal, or written communication sent\n to the Licensor or its representatives, including but not limited to\n communication on electronic mailing lists, source code control systems,\n and issue tracking systems that are managed by, or on behalf of, the\n Licensor for the purpose of discussing and improving the Work, but\n excluding communication that is conspicuously marked or otherwise\n designated in writing by the copyright owner as \"Not a Contribution.\"\n\n \"Contributor\" shall mean Licensor and any individual or Legal Entity\n on behalf of whom a Contribution has been received by Licensor and\n subsequently incorporated within the Work.\n\n 2. Grant of Copyright License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n copyright license to reproduce, prepare Derivative Works of,\n publicly display, publicly perform, sublicense, and distribute the\n Work and such Derivative Works in Source or Object form.\n\n 3. Grant of Patent License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n (except as stated in this section) patent license to make, have made,\n use, offer to sell, sell, import, and otherwise transfer the Work,\n where such license applies only to those patent claims licensable\n by such Contributor that are necessarily infringed by their\n Contribution(s) alone or by combination of their Contribution(s)\n with the Work to which such Contribution(s) was submitted. If You\n institute patent litigation against any entity (including a\n cross-claim or counterclaim in a lawsuit) alleging that the Work\n or a Contribution incorporated within the Work constitutes direct\n or contributory patent infringement, then any patent licenses\n granted to You under this License for that Work shall terminate\n as of the date such litigation is filed.\n\n 4. Redistribution. You may reproduce and distribute copies of the\n Work or Derivative Works thereof in any medium, with or without\n modifications, and in Source or Object form, provided that You\n meet the following conditions:\n\n (a) You must give any other recipients of the Work or\n Derivative Works a copy of this License; and\n\n (b) You must cause any modified files to carry prominent notices\n stating that You changed the files; and\n\n (c) You must retain, in the Source form of any Derivative Works\n that You distribute, all copyright, patent, trademark, and\n attribution notices from the Source form of the Work,\n excluding those notices that do not pertain to any part of\n the Derivative Works; and\n\n (d) If the Work includes a \"NOTICE\" text file as part of its\n distribution, then any Derivative Works that You distribute must\n include a readable copy of the attribution notices contained\n within such NOTICE file, excluding those notices that do not\n pertain to any part of the Derivative Works, in at least one\n of the following places: within a NOTICE text file distributed\n as part of the Derivative Works; within the Source form or\n documentation, if provided along with the Derivative Works; or,\n within a display generated by the Derivative Works, if and\n wherever such third-party notices normally appear. The contents\n of the NOTICE file are for informational purposes only and\n do not modify the License. You may add Your own attribution\n notices within Derivative Works that You distribute, alongside\n or as an addendum to the NOTICE text from the Work, provided\n that such additional attribution notices cannot be construed\n as modifying the License.\n\n You may add Your own copyright statement to Your modifications and\n may provide additional or different license terms and conditions\n for use, reproduction, or distribution of Your modifications, or\n for any such Derivative Works as a whole, provided Your use,\n reproduction, and distribution of the Work otherwise complies with\n the conditions stated in this License.\n\n 5. Submission of Contributions. Unless You explicitly state otherwise,\n any Contribution intentionally submitted for inclusion in the Work\n by You to the Licensor shall be under the terms and conditions of\n this License, without any additional terms or conditions.\n Notwithstanding the above, nothing herein shall supersede or modify\n the terms of any separate license agreement you may have executed\n with Licensor regarding such Contributions.\n\n 6. Trademarks. This License does not grant permission to use the trade\n names, trademarks, service marks, or product names of the Licensor,\n except as required for reasonable and customary use in describing the\n origin of the Work and reproducing the content of the NOTICE file.\n\n 7. Disclaimer of Warranty. Unless required by applicable law or\n agreed to in writing, Licensor provides the Work (and each\n Contributor provides its Contributions) on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n implied, including, without limitation, any warranties or conditions\n of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n PARTICULAR PURPOSE. You are solely responsible for determining the\n appropriateness of using or redistributing the Work and assume any\n risks associated with Your exercise of permissions under this License.\n\n 8. Limitation of Liability. In no event and under no legal theory,\n whether in tort (including negligence), contract, or otherwise,\n unless required by applicable law (such as deliberate and grossly\n negligent acts) or agreed to in writing, shall any Contributor be\n liable to You for damages, including any direct, indirect, special,\n incidental, or consequential damages of any character arising as a\n result of this License or out of the use or inability to use the\n Work (including but not limited to damages for loss of goodwill,\n work stoppage, computer failure or malfunction, or any and all\n other commercial damages or losses), even if such Contributor\n has been advised of the possibility of such damages.\n\n 9. Accepting Warranty or Additional Liability. While redistributing\n the Work or Derivative Works thereof, You may choose to offer,\n and charge a fee for, acceptance of support, warranty, indemnity,\n or other liability obligations and/or rights consistent with this\n License. However, in accepting such obligations, You may act only\n on Your own behalf and on Your sole responsibility, not on behalf\n of any other Contributor, and only if You agree to indemnify,\n defend, and hold each Contributor harmless for any liability\n incurred by, or claims asserted against, such Contributor by reason\n of your accepting any such warranty or additional liability.\n\n END OF TERMS AND CONDITIONS\n\n APPENDIX: How to apply the Apache License to your work.\n\n To apply the Apache License to your work, attach the following\n boilerplate notice, with the fields enclosed by brackets \"[]\"\n replaced with your own identifying information. (Don't include\n the brackets!) The text should be enclosed in the appropriate\n comment syntax for the file format. We also recommend that a\n file or class name and description of purpose be included on the\n same \"printed page\" as the copyright notice for easier\n identification within third-party archives.\n\n Copyright [yyyy] [name of copyright owner]\n\n Licensed under the Apache License, Version 2.0 (the \"License\");\n you may not use this file except in compliance with the License.\n You may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n See the License for the specific language governing permissions and\n limitations under the License.\n"
},
{
"Name": "pyOpenSSL",
"Version": "19.0.0",
"Summary": "Python wrapper module around the OpenSSL library",
"Home-page": "https://pyopenssl.org/",
"Author": "The pyOpenSSL developers",
"License": "Apache License, Version 2.0"
},
{
"Name": "pyparsing",
"Version": "2.4.6",
"Summary": "Python parsing module",
"Home-page": "https://github.com/pyparsing/pyparsing/",
"Author": "Paul McGuire",
"License": "MIT License",
"License URL": "https://api.github.com/repos/pyparsing/pyparsing/license",
"License repo": "Permission is hereby granted, free of charge, to any person obtaining\na copy of this software and associated documentation files (the\n\"Software\"), to deal in the Software without restriction, including\nwithout limitation the rights to use, copy, modify, merge, publish,\ndistribute, sublicense, and/or sell copies of the Software, and to\npermit persons to whom the Software is furnished to do so, subject to\nthe following conditions:\n\nThe above copyright notice and this permission notice shall be\nincluded in all copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND,\nEXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF\nMERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.\nIN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY\nCLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,\nTORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE\nSOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "pyperclip",
"Version": "1.7.0",
"Summary": "A cross-platform clipboard module for Python. (Only handles plain text for now.)",
"Home-page": "https://github.com/asweigart/pyperclip",
"Author": "Al Sweigart",
"License": "BSD 3-Clause \"New\" or \"Revised\" License",
"License URL": "https://api.github.com/repos/asweigart/pyperclip/license",
"License repo": "Copyright (c) 2014, Al Sweigart\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without\nmodification, are permitted provided that the following conditions are met:\n\n* Redistributions of source code must retain the above copyright notice, this\n list of conditions and the following disclaimer.\n\n* Redistributions in binary form must reproduce the above copyright notice,\n this list of conditions and the following disclaimer in the documentation\n and/or other materials provided with the distribution.\n\n* Neither the name of the {organization} nor the names of its\n contributors may be used to endorse or promote products derived from\n this software without specific prior written permission.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\nAND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\nIMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\nDISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE\nFOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\nDAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\nSERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\nCAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\nOR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\nOF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n",
"License text": "BSD 3-Clause License\n\nCopyright (c) [year], [fullname]\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without\nmodification, are permitted provided that the following conditions are met:\n\n1. Redistributions of source code must retain the above copyright notice, this\n list of conditions and the following disclaimer.\n\n2. Redistributions in binary form must reproduce the above copyright notice,\n this list of conditions and the following disclaimer in the documentation\n and/or other materials provided with the distribution.\n\n3. Neither the name of the copyright holder nor the names of its\n contributors may be used to endorse or promote products derived from\n this software without specific prior written permission.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\nAND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\nIMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\nDISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE\nFOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\nDAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\nSERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\nCAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\nOR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\nOF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n"
},
{
"Name": "pyrsistent",
"Version": "0.15.5",
"Summary": "Persistent/Functional/Immutable data structures",
"Home-page": "http://github.com/tobgu/pyrsistent/",
"Author": "Tobias Gustafsson",
"License": "MIT License",
"License URL": "https://api.github.com/repos/tobgu/pyrsistent/license",
"License repo": "Copyright (c) 2019 Tobias Gustafsson\n\nPermission is hereby granted, free of charge, to any person\nobtaining a copy of this software and associated documentation\nfiles (the \"Software\"), to deal in the Software without\nrestriction, including without limitation the rights to use,\ncopy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the\nSoftware is furnished to do so, subject to the following\nconditions:\n\nThe above copyright notice and this permission notice shall be\nincluded in all copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND,\nEXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES\nOF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND\nNONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT\nHOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,\nWHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING\nFROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR\nOTHER DEALINGS IN THE SOFTWARE.",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "python-dateutil",
"Version": "2.8.0",
"Summary": "Extensions to the standard Python datetime module",
"Home-page": "https://dateutil.readthedocs.io",
"Author": "Gustavo Niemeyer",
"License": "Dual License"
},
{
"Name": "python-json-logger",
"Version": "0.1.11",
"Summary": "A python library adding a json log formatter",
"Home-page": "http://github.com/madzak/python-json-logger",
"Author": "Zakaria Zajac",
"License": "BSD 2-Clause \"Simplified\" License",
"License URL": "https://api.github.com/repos/madzak/python-json-logger/license",
"License repo": "Copyright (c) 2011, Zakaria Zajac \nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:\n\n* Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.\n* Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n",
"License text": "BSD 2-Clause License\n\nCopyright (c) [year], [fullname]\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without\nmodification, are permitted provided that the following conditions are met:\n\n1. Redistributions of source code must retain the above copyright notice, this\n list of conditions and the following disclaimer.\n\n2. Redistributions in binary form must reproduce the above copyright notice,\n this list of conditions and the following disclaimer in the documentation\n and/or other materials provided with the distribution.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\nAND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\nIMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\nDISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE\nFOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\nDAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\nSERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\nCAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\nOR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\nOF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n"
},
{
"Name": "pytz",
"Version": "2019.3",
"Summary": "World timezone definitions, modern and historical",
"Home-page": "http://pythonhosted.org/pytz",
"Author": "Stuart Bishop",
"License": "MIT"
},
{
"Name": "PyYAML",
"Version": "5.3",
"Summary": "YAML parser and emitter for Python",
"Home-page": "https://github.com/yaml/pyyaml",
"Author": "Kirill Simonov",
"License": "MIT License",
"License URL": "https://api.github.com/repos/yaml/pyyaml/license",
"License repo": "Copyright (c) 2017-2020 Ingy d\u00f6t Net\nCopyright (c) 2006-2016 Kirill Simonov\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of\nthis software and associated documentation files (the \"Software\"), to deal in\nthe Software without restriction, including without limitation the rights to\nuse, copy, modify, merge, publish, distribute, sublicense, and/or sell copies\nof the Software, and to permit persons to whom the Software is furnished to do\nso, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "readme-renderer",
"Version": "24.0",
"Summary": "readme_renderer is a library for rendering \"readme\" descriptions for Warehouse",
"Home-page": "https://github.com/pypa/readme_renderer",
"Author": "The Python Packaging Authority",
"License": "Apache License 2.0",
"License URL": "https://api.github.com/repos/pypa/readme_renderer/license",
"License repo": " Apache License\n Version 2.0, January 2004\n http://www.apache.org/licenses/\n\nTERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n1. Definitions.\n\n \"License\" shall mean the terms and conditions for use, reproduction,\n and distribution as defined by Sections 1 through 9 of this document.\n\n \"Licensor\" shall mean the copyright owner or entity authorized by\n the copyright owner that is granting the License.\n\n \"Legal Entity\" shall mean the union of the acting entity and all\n other entities that control, are controlled by, or are under common\n control with that entity. For the purposes of this definition,\n \"control\" means (i) the power, direct or indirect, to cause the\n direction or management of such entity, whether by contract or\n otherwise, or (ii) ownership of fifty percent (50%) or more of the\n outstanding shares, or (iii) beneficial ownership of such entity.\n\n \"You\" (or \"Your\") shall mean an individual or Legal Entity\n exercising permissions granted by this License.\n\n \"Source\" form shall mean the preferred form for making modifications,\n including but not limited to software source code, documentation\n source, and configuration files.\n\n \"Object\" form shall mean any form resulting from mechanical\n transformation or translation of a Source form, including but\n not limited to compiled object code, generated documentation,\n and conversions to other media types.\n\n \"Work\" shall mean the work of authorship, whether in Source or\n Object form, made available under the License, as indicated by a\n copyright notice that is included in or attached to the work\n (an example is provided in the Appendix below).\n\n \"Derivative Works\" shall mean any work, whether in Source or Object\n form, that is based on (or derived from) the Work and for which the\n editorial revisions, annotations, elaborations, or other modifications\n represent, as a whole, an original work of authorship. For the purposes\n of this License, Derivative Works shall not include works that remain\n separable from, or merely link (or bind by name) to the interfaces of,\n the Work and Derivative Works thereof.\n\n \"Contribution\" shall mean any work of authorship, including\n the original version of the Work and any modifications or additions\n to that Work or Derivative Works thereof, that is intentionally\n submitted to Licensor for inclusion in the Work by the copyright owner\n or by an individual or Legal Entity authorized to submit on behalf of\n the copyright owner. For the purposes of this definition, \"submitted\"\n means any form of electronic, verbal, or written communication sent\n to the Licensor or its representatives, including but not limited to\n communication on electronic mailing lists, source code control systems,\n and issue tracking systems that are managed by, or on behalf of, the\n Licensor for the purpose of discussing and improving the Work, but\n excluding communication that is conspicuously marked or otherwise\n designated in writing by the copyright owner as \"Not a Contribution.\"\n\n \"Contributor\" shall mean Licensor and any individual or Legal Entity\n on behalf of whom a Contribution has been received by Licensor and\n subsequently incorporated within the Work.\n\n2. Grant of Copyright License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n copyright license to reproduce, prepare Derivative Works of,\n publicly display, publicly perform, sublicense, and distribute the\n Work and such Derivative Works in Source or Object form.\n\n3. Grant of Patent License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n (except as stated in this section) patent license to make, have made,\n use, offer to sell, sell, import, and otherwise transfer the Work,\n where such license applies only to those patent claims licensable\n by such Contributor that are necessarily infringed by their\n Contribution(s) alone or by combination of their Contribution(s)\n with the Work to which such Contribution(s) was submitted. If You\n institute patent litigation against any entity (including a\n cross-claim or counterclaim in a lawsuit) alleging that the Work\n or a Contribution incorporated within the Work constitutes direct\n or contributory patent infringement, then any patent licenses\n granted to You under this License for that Work shall terminate\n as of the date such litigation is filed.\n\n4. Redistribution. You may reproduce and distribute copies of the\n Work or Derivative Works thereof in any medium, with or without\n modifications, and in Source or Object form, provided that You\n meet the following conditions:\n\n (a) You must give any other recipients of the Work or\n Derivative Works a copy of this License; and\n\n (b) You must cause any modified files to carry prominent notices\n stating that You changed the files; and\n\n (c) You must retain, in the Source form of any Derivative Works\n that You distribute, all copyright, patent, trademark, and\n attribution notices from the Source form of the Work,\n excluding those notices that do not pertain to any part of\n the Derivative Works; and\n\n (d) If the Work includes a \"NOTICE\" text file as part of its\n distribution, then any Derivative Works that You distribute must\n include a readable copy of the attribution notices contained\n within such NOTICE file, excluding those notices that do not\n pertain to any part of the Derivative Works, in at least one\n of the following places: within a NOTICE text file distributed\n as part of the Derivative Works; within the Source form or\n documentation, if provided along with the Derivative Works; or,\n within a display generated by the Derivative Works, if and\n wherever such third-party notices normally appear. The contents\n of the NOTICE file are for informational purposes only and\n do not modify the License. You may add Your own attribution\n notices within Derivative Works that You distribute, alongside\n or as an addendum to the NOTICE text from the Work, provided\n that such additional attribution notices cannot be construed\n as modifying the License.\n\n You may add Your own copyright statement to Your modifications and\n may provide additional or different license terms and conditions\n for use, reproduction, or distribution of Your modifications, or\n for any such Derivative Works as a whole, provided Your use,\n reproduction, and distribution of the Work otherwise complies with\n the conditions stated in this License.\n\n5. Submission of Contributions. Unless You explicitly state otherwise,\n any Contribution intentionally submitted for inclusion in the Work\n by You to the Licensor shall be under the terms and conditions of\n this License, without any additional terms or conditions.\n Notwithstanding the above, nothing herein shall supersede or modify\n the terms of any separate license agreement you may have executed\n with Licensor regarding such Contributions.\n\n6. Trademarks. This License does not grant permission to use the trade\n names, trademarks, service marks, or product names of the Licensor,\n except as required for reasonable and customary use in describing the\n origin of the Work and reproducing the content of the NOTICE file.\n\n7. Disclaimer of Warranty. Unless required by applicable law or\n agreed to in writing, Licensor provides the Work (and each\n Contributor provides its Contributions) on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n implied, including, without limitation, any warranties or conditions\n of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n PARTICULAR PURPOSE. You are solely responsible for determining the\n appropriateness of using or redistributing the Work and assume any\n risks associated with Your exercise of permissions under this License.\n\n8. Limitation of Liability. In no event and under no legal theory,\n whether in tort (including negligence), contract, or otherwise,\n unless required by applicable law (such as deliberate and grossly\n negligent acts) or agreed to in writing, shall any Contributor be\n liable to You for damages, including any direct, indirect, special,\n incidental, or consequential damages of any character arising as a\n result of this License or out of the use or inability to use the\n Work (including but not limited to damages for loss of goodwill,\n work stoppage, computer failure or malfunction, or any and all\n other commercial damages or losses), even if such Contributor\n has been advised of the possibility of such damages.\n\n9. Accepting Warranty or Additional Liability. While redistributing\n the Work or Derivative Works thereof, You may choose to offer,\n and charge a fee for, acceptance of support, warranty, indemnity,\n or other liability obligations and/or rights consistent with this\n License. However, in accepting such obligations, You may act only\n on Your own behalf and on Your sole responsibility, not on behalf\n of any other Contributor, and only if You agree to indemnify,\n defend, and hold each Contributor harmless for any liability\n incurred by, or claims asserted against, such Contributor by reason\n of your accepting any such warranty or additional liability.\n",
"License text": " Apache License\n Version 2.0, January 2004\n http://www.apache.org/licenses/\n\n TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n 1. Definitions.\n\n \"License\" shall mean the terms and conditions for use, reproduction,\n and distribution as defined by Sections 1 through 9 of this document.\n\n \"Licensor\" shall mean the copyright owner or entity authorized by\n the copyright owner that is granting the License.\n\n \"Legal Entity\" shall mean the union of the acting entity and all\n other entities that control, are controlled by, or are under common\n control with that entity. For the purposes of this definition,\n \"control\" means (i) the power, direct or indirect, to cause the\n direction or management of such entity, whether by contract or\n otherwise, or (ii) ownership of fifty percent (50%) or more of the\n outstanding shares, or (iii) beneficial ownership of such entity.\n\n \"You\" (or \"Your\") shall mean an individual or Legal Entity\n exercising permissions granted by this License.\n\n \"Source\" form shall mean the preferred form for making modifications,\n including but not limited to software source code, documentation\n source, and configuration files.\n\n \"Object\" form shall mean any form resulting from mechanical\n transformation or translation of a Source form, including but\n not limited to compiled object code, generated documentation,\n and conversions to other media types.\n\n \"Work\" shall mean the work of authorship, whether in Source or\n Object form, made available under the License, as indicated by a\n copyright notice that is included in or attached to the work\n (an example is provided in the Appendix below).\n\n \"Derivative Works\" shall mean any work, whether in Source or Object\n form, that is based on (or derived from) the Work and for which the\n editorial revisions, annotations, elaborations, or other modifications\n represent, as a whole, an original work of authorship. For the purposes\n of this License, Derivative Works shall not include works that remain\n separable from, or merely link (or bind by name) to the interfaces of,\n the Work and Derivative Works thereof.\n\n \"Contribution\" shall mean any work of authorship, including\n the original version of the Work and any modifications or additions\n to that Work or Derivative Works thereof, that is intentionally\n submitted to Licensor for inclusion in the Work by the copyright owner\n or by an individual or Legal Entity authorized to submit on behalf of\n the copyright owner. For the purposes of this definition, \"submitted\"\n means any form of electronic, verbal, or written communication sent\n to the Licensor or its representatives, including but not limited to\n communication on electronic mailing lists, source code control systems,\n and issue tracking systems that are managed by, or on behalf of, the\n Licensor for the purpose of discussing and improving the Work, but\n excluding communication that is conspicuously marked or otherwise\n designated in writing by the copyright owner as \"Not a Contribution.\"\n\n \"Contributor\" shall mean Licensor and any individual or Legal Entity\n on behalf of whom a Contribution has been received by Licensor and\n subsequently incorporated within the Work.\n\n 2. Grant of Copyright License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n copyright license to reproduce, prepare Derivative Works of,\n publicly display, publicly perform, sublicense, and distribute the\n Work and such Derivative Works in Source or Object form.\n\n 3. Grant of Patent License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n (except as stated in this section) patent license to make, have made,\n use, offer to sell, sell, import, and otherwise transfer the Work,\n where such license applies only to those patent claims licensable\n by such Contributor that are necessarily infringed by their\n Contribution(s) alone or by combination of their Contribution(s)\n with the Work to which such Contribution(s) was submitted. If You\n institute patent litigation against any entity (including a\n cross-claim or counterclaim in a lawsuit) alleging that the Work\n or a Contribution incorporated within the Work constitutes direct\n or contributory patent infringement, then any patent licenses\n granted to You under this License for that Work shall terminate\n as of the date such litigation is filed.\n\n 4. Redistribution. You may reproduce and distribute copies of the\n Work or Derivative Works thereof in any medium, with or without\n modifications, and in Source or Object form, provided that You\n meet the following conditions:\n\n (a) You must give any other recipients of the Work or\n Derivative Works a copy of this License; and\n\n (b) You must cause any modified files to carry prominent notices\n stating that You changed the files; and\n\n (c) You must retain, in the Source form of any Derivative Works\n that You distribute, all copyright, patent, trademark, and\n attribution notices from the Source form of the Work,\n excluding those notices that do not pertain to any part of\n the Derivative Works; and\n\n (d) If the Work includes a \"NOTICE\" text file as part of its\n distribution, then any Derivative Works that You distribute must\n include a readable copy of the attribution notices contained\n within such NOTICE file, excluding those notices that do not\n pertain to any part of the Derivative Works, in at least one\n of the following places: within a NOTICE text file distributed\n as part of the Derivative Works; within the Source form or\n documentation, if provided along with the Derivative Works; or,\n within a display generated by the Derivative Works, if and\n wherever such third-party notices normally appear. The contents\n of the NOTICE file are for informational purposes only and\n do not modify the License. You may add Your own attribution\n notices within Derivative Works that You distribute, alongside\n or as an addendum to the NOTICE text from the Work, provided\n that such additional attribution notices cannot be construed\n as modifying the License.\n\n You may add Your own copyright statement to Your modifications and\n may provide additional or different license terms and conditions\n for use, reproduction, or distribution of Your modifications, or\n for any such Derivative Works as a whole, provided Your use,\n reproduction, and distribution of the Work otherwise complies with\n the conditions stated in this License.\n\n 5. Submission of Contributions. Unless You explicitly state otherwise,\n any Contribution intentionally submitted for inclusion in the Work\n by You to the Licensor shall be under the terms and conditions of\n this License, without any additional terms or conditions.\n Notwithstanding the above, nothing herein shall supersede or modify\n the terms of any separate license agreement you may have executed\n with Licensor regarding such Contributions.\n\n 6. Trademarks. This License does not grant permission to use the trade\n names, trademarks, service marks, or product names of the Licensor,\n except as required for reasonable and customary use in describing the\n origin of the Work and reproducing the content of the NOTICE file.\n\n 7. Disclaimer of Warranty. Unless required by applicable law or\n agreed to in writing, Licensor provides the Work (and each\n Contributor provides its Contributions) on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n implied, including, without limitation, any warranties or conditions\n of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n PARTICULAR PURPOSE. You are solely responsible for determining the\n appropriateness of using or redistributing the Work and assume any\n risks associated with Your exercise of permissions under this License.\n\n 8. Limitation of Liability. In no event and under no legal theory,\n whether in tort (including negligence), contract, or otherwise,\n unless required by applicable law (such as deliberate and grossly\n negligent acts) or agreed to in writing, shall any Contributor be\n liable to You for damages, including any direct, indirect, special,\n incidental, or consequential damages of any character arising as a\n result of this License or out of the use or inability to use the\n Work (including but not limited to damages for loss of goodwill,\n work stoppage, computer failure or malfunction, or any and all\n other commercial damages or losses), even if such Contributor\n has been advised of the possibility of such damages.\n\n 9. Accepting Warranty or Additional Liability. While redistributing\n the Work or Derivative Works thereof, You may choose to offer,\n and charge a fee for, acceptance of support, warranty, indemnity,\n or other liability obligations and/or rights consistent with this\n License. However, in accepting such obligations, You may act only\n on Your own behalf and on Your sole responsibility, not on behalf\n of any other Contributor, and only if You agree to indemnify,\n defend, and hold each Contributor harmless for any liability\n incurred by, or claims asserted against, such Contributor by reason\n of your accepting any such warranty or additional liability.\n\n END OF TERMS AND CONDITIONS\n\n APPENDIX: How to apply the Apache License to your work.\n\n To apply the Apache License to your work, attach the following\n boilerplate notice, with the fields enclosed by brackets \"[]\"\n replaced with your own identifying information. (Don't include\n the brackets!) The text should be enclosed in the appropriate\n comment syntax for the file format. We also recommend that a\n file or class name and description of purpose be included on the\n same \"printed page\" as the copyright notice for easier\n identification within third-party archives.\n\n Copyright [yyyy] [name of copyright owner]\n\n Licensed under the Apache License, Version 2.0 (the \"License\");\n you may not use this file except in compliance with the License.\n You may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n See the License for the specific language governing permissions and\n limitations under the License.\n"
},
{
"Name": "requests",
"Version": "2.22.0",
"Summary": "Python HTTP for Humans.",
"Home-page": "http://python-requests.org",
"Author": "Kenneth Reitz",
"License": "Apache 2.0"
},
{
"Name": "requests-oauthlib",
"Version": "1.2.0",
"Summary": "OAuthlib authentication support for Requests.",
"Home-page": "https://github.com/requests/requests-oauthlib",
"Author": "Kenneth Reitz",
"License": "ISC License",
"License URL": "https://api.github.com/repos/requests/requests-oauthlib/license",
"License repo": "ISC License\n\nCopyright (c) 2014 Kenneth Reitz.\n\nPermission to use, copy, modify, and/or distribute this software for any\npurpose with or without fee is hereby granted, provided that the above\ncopyright notice and this permission notice appear in all copies.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\" AND THE AUTHOR DISCLAIMS ALL WARRANTIES\nWITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF\nMERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR\nANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES\nWHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN\nACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF\nOR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.\n",
"License text": "ISC License\n\nCopyright (c) [year], [fullname]\n\nPermission to use, copy, modify, and/or distribute this software for any\npurpose with or without fee is hereby granted, provided that the above\ncopyright notice and this permission notice appear in all copies.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\" AND THE AUTHOR DISCLAIMS ALL WARRANTIES\nWITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF\nMERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR\nANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES\nWHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN\nACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF\nOR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.\n"
},
{
"Name": "requests-toolbelt",
"Version": "0.9.1",
"Summary": "A utility belt for advanced users of python-requests",
"Home-page": "https://toolbelt.readthedocs.org",
"Author": "Ian Cordasco, Cory Benfield",
"License": "Apache 2.0"
},
{
"Name": "ruamel.yaml",
"Version": "0.16.10",
"Summary": "ruamel.yaml is a YAML parser/emitter that supports roundtrip preservation of comments, seq/map flow style, and map key order",
"Home-page": "https://sourceforge.net/p/ruamel-yaml/code/ci/default/tree",
"Author": "Anthon van der Neut",
"License": "MIT license"
},
{
"Name": "ruamel.yaml.clib",
"Version": "0.2.0",
"Summary": "C version of reader, parser and emitter for ruamel.yaml derived from libyaml",
"Home-page": "https://bitbucket.org/ruamel/yaml.clib",
"Author": "Anthon van der Neut",
"License": "MIT"
},
{
"Name": "s3transfer",
"Version": "0.2.1",
"Summary": "An Amazon S3 Transfer Manager",
"Home-page": "https://github.com/boto/s3transfer",
"Author": "Amazon Web Services",
"License": "Apache License 2.0",
"License URL": "https://api.github.com/repos/boto/s3transfer/license",
"License repo": "\n Apache License\n Version 2.0, January 2004\n http://www.apache.org/licenses/\n\n TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n 1. Definitions.\n\n \"License\" shall mean the terms and conditions for use, reproduction,\n and distribution as defined by Sections 1 through 9 of this document.\n\n \"Licensor\" shall mean the copyright owner or entity authorized by\n the copyright owner that is granting the License.\n\n \"Legal Entity\" shall mean the union of the acting entity and all\n other entities that control, are controlled by, or are under common\n control with that entity. For the purposes of this definition,\n \"control\" means (i) the power, direct or indirect, to cause the\n direction or management of such entity, whether by contract or\n otherwise, or (ii) ownership of fifty percent (50%) or more of the\n outstanding shares, or (iii) beneficial ownership of such entity.\n\n \"You\" (or \"Your\") shall mean an individual or Legal Entity\n exercising permissions granted by this License.\n\n \"Source\" form shall mean the preferred form for making modifications,\n including but not limited to software source code, documentation\n source, and configuration files.\n\n \"Object\" form shall mean any form resulting from mechanical\n transformation or translation of a Source form, including but\n not limited to compiled object code, generated documentation,\n and conversions to other media types.\n\n \"Work\" shall mean the work of authorship, whether in Source or\n Object form, made available under the License, as indicated by a\n copyright notice that is included in or attached to the work\n (an example is provided in the Appendix below).\n\n \"Derivative Works\" shall mean any work, whether in Source or Object\n form, that is based on (or derived from) the Work and for which the\n editorial revisions, annotations, elaborations, or other modifications\n represent, as a whole, an original work of authorship. For the purposes\n of this License, Derivative Works shall not include works that remain\n separable from, or merely link (or bind by name) to the interfaces of,\n the Work and Derivative Works thereof.\n\n \"Contribution\" shall mean any work of authorship, including\n the original version of the Work and any modifications or additions\n to that Work or Derivative Works thereof, that is intentionally\n submitted to Licensor for inclusion in the Work by the copyright owner\n or by an individual or Legal Entity authorized to submit on behalf of\n the copyright owner. For the purposes of this definition, \"submitted\"\n means any form of electronic, verbal, or written communication sent\n to the Licensor or its representatives, including but not limited to\n communication on electronic mailing lists, source code control systems,\n and issue tracking systems that are managed by, or on behalf of, the\n Licensor for the purpose of discussing and improving the Work, but\n excluding communication that is conspicuously marked or otherwise\n designated in writing by the copyright owner as \"Not a Contribution.\"\n\n \"Contributor\" shall mean Licensor and any individual or Legal Entity\n on behalf of whom a Contribution has been received by Licensor and\n subsequently incorporated within the Work.\n\n 2. Grant of Copyright License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n copyright license to reproduce, prepare Derivative Works of,\n publicly display, publicly perform, sublicense, and distribute the\n Work and such Derivative Works in Source or Object form.\n\n 3. Grant of Patent License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n (except as stated in this section) patent license to make, have made,\n use, offer to sell, sell, import, and otherwise transfer the Work,\n where such license applies only to those patent claims licensable\n by such Contributor that are necessarily infringed by their\n Contribution(s) alone or by combination of their Contribution(s)\n with the Work to which such Contribution(s) was submitted. If You\n institute patent litigation against any entity (including a\n cross-claim or counterclaim in a lawsuit) alleging that the Work\n or a Contribution incorporated within the Work constitutes direct\n or contributory patent infringement, then any patent licenses\n granted to You under this License for that Work shall terminate\n as of the date such litigation is filed.\n\n 4. Redistribution. You may reproduce and distribute copies of the\n Work or Derivative Works thereof in any medium, with or without\n modifications, and in Source or Object form, provided that You\n meet the following conditions:\n\n (a) You must give any other recipients of the Work or\n Derivative Works a copy of this License; and\n\n (b) You must cause any modified files to carry prominent notices\n stating that You changed the files; and\n\n (c) You must retain, in the Source form of any Derivative Works\n that You distribute, all copyright, patent, trademark, and\n attribution notices from the Source form of the Work,\n excluding those notices that do not pertain to any part of\n the Derivative Works; and\n\n (d) If the Work includes a \"NOTICE\" text file as part of its\n distribution, then any Derivative Works that You distribute must\n include a readable copy of the attribution notices contained\n within such NOTICE file, excluding those notices that do not\n pertain to any part of the Derivative Works, in at least one\n of the following places: within a NOTICE text file distributed\n as part of the Derivative Works; within the Source form or\n documentation, if provided along with the Derivative Works; or,\n within a display generated by the Derivative Works, if and\n wherever such third-party notices normally appear. The contents\n of the NOTICE file are for informational purposes only and\n do not modify the License. You may add Your own attribution\n notices within Derivative Works that You distribute, alongside\n or as an addendum to the NOTICE text from the Work, provided\n that such additional attribution notices cannot be construed\n as modifying the License.\n\n You may add Your own copyright statement to Your modifications and\n may provide additional or different license terms and conditions\n for use, reproduction, or distribution of Your modifications, or\n for any such Derivative Works as a whole, provided Your use,\n reproduction, and distribution of the Work otherwise complies with\n the conditions stated in this License.\n\n 5. Submission of Contributions. Unless You explicitly state otherwise,\n any Contribution intentionally submitted for inclusion in the Work\n by You to the Licensor shall be under the terms and conditions of\n this License, without any additional terms or conditions.\n Notwithstanding the above, nothing herein shall supersede or modify\n the terms of any separate license agreement you may have executed\n with Licensor regarding such Contributions.\n\n 6. Trademarks. This License does not grant permission to use the trade\n names, trademarks, service marks, or product names of the Licensor,\n except as required for reasonable and customary use in describing the\n origin of the Work and reproducing the content of the NOTICE file.\n\n 7. Disclaimer of Warranty. Unless required by applicable law or\n agreed to in writing, Licensor provides the Work (and each\n Contributor provides its Contributions) on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n implied, including, without limitation, any warranties or conditions\n of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n PARTICULAR PURPOSE. You are solely responsible for determining the\n appropriateness of using or redistributing the Work and assume any\n risks associated with Your exercise of permissions under this License.\n\n 8. Limitation of Liability. In no event and under no legal theory,\n whether in tort (including negligence), contract, or otherwise,\n unless required by applicable law (such as deliberate and grossly\n negligent acts) or agreed to in writing, shall any Contributor be\n liable to You for damages, including any direct, indirect, special,\n incidental, or consequential damages of any character arising as a\n result of this License or out of the use or inability to use the\n Work (including but not limited to damages for loss of goodwill,\n work stoppage, computer failure or malfunction, or any and all\n other commercial damages or losses), even if such Contributor\n has been advised of the possibility of such damages.\n\n 9. Accepting Warranty or Additional Liability. While redistributing\n the Work or Derivative Works thereof, You may choose to offer,\n and charge a fee for, acceptance of support, warranty, indemnity,\n or other liability obligations and/or rights consistent with this\n License. 
However, in accepting such obligations, You may act only\n on Your own behalf and on Your sole responsibility, not on behalf\n of any other Contributor, and only if You agree to indemnify,\n defend, and hold each Contributor harmless for any liability\n incurred by, or claims asserted against, such Contributor by reason\n of your accepting any such warranty or additional liability.\n\n END OF TERMS AND CONDITIONS\n\n APPENDIX: How to apply the Apache License to your work.\n\n To apply the Apache License to your work, attach the following\n boilerplate notice, with the fields enclosed by brackets \"[]\"\n replaced with your own identifying information. (Don't include\n the brackets!) The text should be enclosed in the appropriate\n comment syntax for the file format. We also recommend that a\n file or class name and description of purpose be included on the\n same \"printed page\" as the copyright notice for easier\n identification within third-party archives.\n\n Copyright [yyyy] [name of copyright owner]\n\n Licensed under the Apache License, Version 2.0 (the \"License\");\n you may not use this file except in compliance with the License.\n You may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n See the License for the specific language governing permissions and\n limitations under the License.\n\n",
"License text": " Apache License\n Version 2.0, January 2004\n http://www.apache.org/licenses/\n\n TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n 1. Definitions.\n\n \"License\" shall mean the terms and conditions for use, reproduction,\n and distribution as defined by Sections 1 through 9 of this document.\n\n \"Licensor\" shall mean the copyright owner or entity authorized by\n the copyright owner that is granting the License.\n\n \"Legal Entity\" shall mean the union of the acting entity and all\n other entities that control, are controlled by, or are under common\n control with that entity. For the purposes of this definition,\n \"control\" means (i) the power, direct or indirect, to cause the\n direction or management of such entity, whether by contract or\n otherwise, or (ii) ownership of fifty percent (50%) or more of the\n outstanding shares, or (iii) beneficial ownership of such entity.\n\n \"You\" (or \"Your\") shall mean an individual or Legal Entity\n exercising permissions granted by this License.\n\n \"Source\" form shall mean the preferred form for making modifications,\n including but not limited to software source code, documentation\n source, and configuration files.\n\n \"Object\" form shall mean any form resulting from mechanical\n transformation or translation of a Source form, including but\n not limited to compiled object code, generated documentation,\n and conversions to other media types.\n\n \"Work\" shall mean the work of authorship, whether in Source or\n Object form, made available under the License, as indicated by a\n copyright notice that is included in or attached to the work\n (an example is provided in the Appendix below).\n\n \"Derivative Works\" shall mean any work, whether in Source or Object\n form, that is based on (or derived from) the Work and for which the\n editorial revisions, annotations, elaborations, or other modifications\n represent, as a whole, an original work of authorship. 
For the purposes\n of this License, Derivative Works shall not include works that remain\n separable from, or merely link (or bind by name) to the interfaces of,\n the Work and Derivative Works thereof.\n\n \"Contribution\" shall mean any work of authorship, including\n the original version of the Work and any modifications or additions\n to that Work or Derivative Works thereof, that is intentionally\n submitted to Licensor for inclusion in the Work by the copyright owner\n or by an individual or Legal Entity authorized to submit on behalf of\n the copyright owner. For the purposes of this definition, \"submitted\"\n means any form of electronic, verbal, or written communication sent\n to the Licensor or its representatives, including but not limited to\n communication on electronic mailing lists, source code control systems,\n and issue tracking systems that are managed by, or on behalf of, the\n Licensor for the purpose of discussing and improving the Work, but\n excluding communication that is conspicuously marked or otherwise\n designated in writing by the copyright owner as \"Not a Contribution.\"\n\n \"Contributor\" shall mean Licensor and any individual or Legal Entity\n on behalf of whom a Contribution has been received by Licensor and\n subsequently incorporated within the Work.\n\n 2. Grant of Copyright License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n copyright license to reproduce, prepare Derivative Works of,\n publicly display, publicly perform, sublicense, and distribute the\n Work and such Derivative Works in Source or Object form.\n\n 3. Grant of Patent License. 
Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n (except as stated in this section) patent license to make, have made,\n use, offer to sell, sell, import, and otherwise transfer the Work,\n where such license applies only to those patent claims licensable\n by such Contributor that are necessarily infringed by their\n Contribution(s) alone or by combination of their Contribution(s)\n with the Work to which such Contribution(s) was submitted. If You\n institute patent litigation against any entity (including a\n cross-claim or counterclaim in a lawsuit) alleging that the Work\n or a Contribution incorporated within the Work constitutes direct\n or contributory patent infringement, then any patent licenses\n granted to You under this License for that Work shall terminate\n as of the date such litigation is filed.\n\n 4. Redistribution. You may reproduce and distribute copies of the\n Work or Derivative Works thereof in any medium, with or without\n modifications, and in Source or Object form, provided that You\n meet the following conditions:\n\n (a) You must give any other recipients of the Work or\n Derivative Works a copy of this License; and\n\n (b) You must cause any modified files to carry prominent notices\n stating that You changed the files; and\n\n (c) You must retain, in the Source form of any Derivative Works\n that You distribute, all copyright, patent, trademark, and\n attribution notices from the Source form of the Work,\n excluding those notices that do not pertain to any part of\n the Derivative Works; and\n\n (d) If the Work includes a \"NOTICE\" text file as part of its\n distribution, then any Derivative Works that You distribute must\n include a readable copy of the attribution notices contained\n within such NOTICE file, excluding those notices that do not\n pertain to any part of the Derivative Works, in at least one\n of 
the following places: within a NOTICE text file distributed\n as part of the Derivative Works; within the Source form or\n documentation, if provided along with the Derivative Works; or,\n within a display generated by the Derivative Works, if and\n wherever such third-party notices normally appear. The contents\n of the NOTICE file are for informational purposes only and\n do not modify the License. You may add Your own attribution\n notices within Derivative Works that You distribute, alongside\n or as an addendum to the NOTICE text from the Work, provided\n that such additional attribution notices cannot be construed\n as modifying the License.\n\n You may add Your own copyright statement to Your modifications and\n may provide additional or different license terms and conditions\n for use, reproduction, or distribution of Your modifications, or\n for any such Derivative Works as a whole, provided Your use,\n reproduction, and distribution of the Work otherwise complies with\n the conditions stated in this License.\n\n 5. Submission of Contributions. Unless You explicitly state otherwise,\n any Contribution intentionally submitted for inclusion in the Work\n by You to the Licensor shall be under the terms and conditions of\n this License, without any additional terms or conditions.\n Notwithstanding the above, nothing herein shall supersede or modify\n the terms of any separate license agreement you may have executed\n with Licensor regarding such Contributions.\n\n 6. Trademarks. This License does not grant permission to use the trade\n names, trademarks, service marks, or product names of the Licensor,\n except as required for reasonable and customary use in describing the\n origin of the Work and reproducing the content of the NOTICE file.\n\n 7. Disclaimer of Warranty. 
Unless required by applicable law or\n agreed to in writing, Licensor provides the Work (and each\n Contributor provides its Contributions) on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n implied, including, without limitation, any warranties or conditions\n of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n PARTICULAR PURPOSE. You are solely responsible for determining the\n appropriateness of using or redistributing the Work and assume any\n risks associated with Your exercise of permissions under this License.\n\n 8. Limitation of Liability. In no event and under no legal theory,\n whether in tort (including negligence), contract, or otherwise,\n unless required by applicable law (such as deliberate and grossly\n negligent acts) or agreed to in writing, shall any Contributor be\n liable to You for damages, including any direct, indirect, special,\n incidental, or consequential damages of any character arising as a\n result of this License or out of the use or inability to use the\n Work (including but not limited to damages for loss of goodwill,\n work stoppage, computer failure or malfunction, or any and all\n other commercial damages or losses), even if such Contributor\n has been advised of the possibility of such damages.\n\n 9. Accepting Warranty or Additional Liability. While redistributing\n the Work or Derivative Works thereof, You may choose to offer,\n and charge a fee for, acceptance of support, warranty, indemnity,\n or other liability obligations and/or rights consistent with this\n License. 
However, in accepting such obligations, You may act only\n on Your own behalf and on Your sole responsibility, not on behalf\n of any other Contributor, and only if You agree to indemnify,\n defend, and hold each Contributor harmless for any liability\n incurred by, or claims asserted against, such Contributor by reason\n of your accepting any such warranty or additional liability.\n\n END OF TERMS AND CONDITIONS\n\n APPENDIX: How to apply the Apache License to your work.\n\n To apply the Apache License to your work, attach the following\n boilerplate notice, with the fields enclosed by brackets \"[]\"\n replaced with your own identifying information. (Don't include\n the brackets!) The text should be enclosed in the appropriate\n comment syntax for the file format. We also recommend that a\n file or class name and description of purpose be included on the\n same \"printed page\" as the copyright notice for easier\n identification within third-party archives.\n\n Copyright [yyyy] [name of copyright owner]\n\n Licensed under the Apache License, Version 2.0 (the \"License\");\n you may not use this file except in compliance with the License.\n You may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n See the License for the specific language governing permissions and\n limitations under the License.\n"
},
{
"Name": "scp",
"Version": "0.13.2",
"Summary": "scp module for paramiko",
"Home-page": "https://github.com/jbardin/scp.py",
"Author": "James Bardin",
"License": "Other",
"License URL": "https://api.github.com/repos/jbardin/scp.py/license",
"License repo": "# This library is free software; you can redistribute it and/or\n# modify it under the terms of the GNU Lesser General Public\n# License as published by the Free Software Foundation; either\n# version 2.1 of the License, or (at your option) any later version.\n#\n# This library is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU\n# Lesser General Public License for more details.\n#\n# You should have received a copy of the GNU Lesser General Public\n# License along with this library; if not, write to the Free Software\n# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA\n\n"
},
{
"Name": "SecretStorage",
"Version": "3.1.2",
"Summary": "Python bindings to FreeDesktop.org Secret Service API",
"Home-page": "https://github.com/mitya57/secretstorage",
"Author": "Dmitry Shachnev",
"License": "Other",
"License URL": "https://api.github.com/repos/mitya57/secretstorage/license",
"License repo": "Copyright 2012-2018 Dmitry Shachnev <mitya57@gmail.com>\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without\nmodification, are permitted provided that the following conditions are met:\n\n1. Redistributions of source code must retain the above copyright notice, this\n list of conditions and the following disclaimer.\n2. Redistributions in binary form must reproduce the above copyright notice,\n this list of conditions and the following disclaimer in the documentation\n and/or other materials provided with the distribution.\n3. Neither the name of the University nor the names of its contributors may be\n used to endorse or promote products derived from this software without\n specific prior written permission.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\" AND\nANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED\nWARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\nDISCLAIMED. IN NO EVENT SHALL THE REGENTS OR CONTRIBUTORS BE LIABLE FOR ANY\nDIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES\n(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;\nLOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON\nANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT\n(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS\nSOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n"
},
{
"Name": "six",
"Version": "1.12.0",
"Summary": "Python 2 and 3 compatibility utilities",
"Home-page": "https://github.com/benjaminp/six",
"Author": "Benjamin Peterson",
"License": "MIT License",
"License URL": "https://api.github.com/repos/benjaminp/six/license",
"License repo": "Copyright (c) 2010-2020 Benjamin Peterson\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of\nthis software and associated documentation files (the \"Software\"), to deal in\nthe Software without restriction, including without limitation the rights to\nuse, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of\nthe Software, and to permit persons to whom the Software is furnished to do so,\nsubject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS\nFOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR\nCOPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER\nIN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN\nCONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "skopeo-bin",
"Version": "1.0.3",
"Summary": "UNKNOWN",
"Home-page": "https://github.com/epiphany-platform/skopeo-bin",
"Author": "Epiphany Team",
"License": "Apache License 2.0",
"License URL": "https://api.github.com/repos/epiphany-platform/skopeo-bin/license",
"License repo": "\n Apache License\n Version 2.0, January 2004\n http://www.apache.org/licenses/\n\n TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n 1. Definitions.\n\n \"License\" shall mean the terms and conditions for use, reproduction,\n and distribution as defined by Sections 1 through 9 of this document.\n\n \"Licensor\" shall mean the copyright owner or entity authorized by\n the copyright owner that is granting the License.\n\n \"Legal Entity\" shall mean the union of the acting entity and all\n other entities that control, are controlled by, or are under common\n control with that entity. For the purposes of this definition,\n \"control\" means (i) the power, direct or indirect, to cause the\n direction or management of such entity, whether by contract or\n otherwise, or (ii) ownership of fifty percent (50%) or more of the\n outstanding shares, or (iii) beneficial ownership of such entity.\n\n \"You\" (or \"Your\") shall mean an individual or Legal Entity\n exercising permissions granted by this License.\n\n \"Source\" form shall mean the preferred form for making modifications,\n including but not limited to software source code, documentation\n source, and configuration files.\n\n \"Object\" form shall mean any form resulting from mechanical\n transformation or translation of a Source form, including but\n not limited to compiled object code, generated documentation,\n and conversions to other media types.\n\n \"Work\" shall mean the work of authorship, whether in Source or\n Object form, made available under the License, as indicated by a\n copyright notice that is included in or attached to the work\n (an example is provided in the Appendix below).\n\n \"Derivative Works\" shall mean any work, whether in Source or Object\n form, that is based on (or derived from) the Work and for which the\n editorial revisions, annotations, elaborations, or other modifications\n represent, as a whole, an original work of authorship. 
For the purposes\n of this License, Derivative Works shall not include works that remain\n separable from, or merely link (or bind by name) to the interfaces of,\n the Work and Derivative Works thereof.\n\n \"Contribution\" shall mean any work of authorship, including\n the original version of the Work and any modifications or additions\n to that Work or Derivative Works thereof, that is intentionally\n submitted to Licensor for inclusion in the Work by the copyright owner\n or by an individual or Legal Entity authorized to submit on behalf of\n the copyright owner. For the purposes of this definition, \"submitted\"\n means any form of electronic, verbal, or written communication sent\n to the Licensor or its representatives, including but not limited to\n communication on electronic mailing lists, source code control systems,\n and issue tracking systems that are managed by, or on behalf of, the\n Licensor for the purpose of discussing and improving the Work, but\n excluding communication that is conspicuously marked or otherwise\n designated in writing by the copyright owner as \"Not a Contribution.\"\n\n \"Contributor\" shall mean Licensor and any individual or Legal Entity\n on behalf of whom a Contribution has been received by Licensor and\n subsequently incorporated within the Work.\n\n 2. Grant of Copyright License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n copyright license to reproduce, prepare Derivative Works of,\n publicly display, publicly perform, sublicense, and distribute the\n Work and such Derivative Works in Source or Object form.\n\n 3. Grant of Patent License. 
Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n (except as stated in this section) patent license to make, have made,\n use, offer to sell, sell, import, and otherwise transfer the Work,\n where such license applies only to those patent claims licensable\n by such Contributor that are necessarily infringed by their\n Contribution(s) alone or by combination of their Contribution(s)\n with the Work to which such Contribution(s) was submitted. If You\n institute patent litigation against any entity (including a\n cross-claim or counterclaim in a lawsuit) alleging that the Work\n or a Contribution incorporated within the Work constitutes direct\n or contributory patent infringement, then any patent licenses\n granted to You under this License for that Work shall terminate\n as of the date such litigation is filed.\n\n 4. Redistribution. You may reproduce and distribute copies of the\n Work or Derivative Works thereof in any medium, with or without\n modifications, and in Source or Object form, provided that You\n meet the following conditions:\n\n (a) You must give any other recipients of the Work or\n Derivative Works a copy of this License; and\n\n (b) You must cause any modified files to carry prominent notices\n stating that You changed the files; and\n\n (c) You must retain, in the Source form of any Derivative Works\n that You distribute, all copyright, patent, trademark, and\n attribution notices from the Source form of the Work,\n excluding those notices that do not pertain to any part of\n the Derivative Works; and\n\n (d) If the Work includes a \"NOTICE\" text file as part of its\n distribution, then any Derivative Works that You distribute must\n include a readable copy of the attribution notices contained\n within such NOTICE file, excluding those notices that do not\n pertain to any part of the Derivative Works, in at least one\n of 
the following places: within a NOTICE text file distributed\n as part of the Derivative Works; within the Source form or\n documentation, if provided along with the Derivative Works; or,\n within a display generated by the Derivative Works, if and\n wherever such third-party notices normally appear. The contents\n of the NOTICE file are for informational purposes only and\n do not modify the License. You may add Your own attribution\n notices within Derivative Works that You distribute, alongside\n or as an addendum to the NOTICE text from the Work, provided\n that such additional attribution notices cannot be construed\n as modifying the License.\n\n You may add Your own copyright statement to Your modifications and\n may provide additional or different license terms and conditions\n for use, reproduction, or distribution of Your modifications, or\n for any such Derivative Works as a whole, provided Your use,\n reproduction, and distribution of the Work otherwise complies with\n the conditions stated in this License.\n\n 5. Submission of Contributions. Unless You explicitly state otherwise,\n any Contribution intentionally submitted for inclusion in the Work\n by You to the Licensor shall be under the terms and conditions of\n this License, without any additional terms or conditions.\n Notwithstanding the above, nothing herein shall supersede or modify\n the terms of any separate license agreement you may have executed\n with Licensor regarding such Contributions.\n\n 6. Trademarks. This License does not grant permission to use the trade\n names, trademarks, service marks, or product names of the Licensor,\n except as required for reasonable and customary use in describing the\n origin of the Work and reproducing the content of the NOTICE file.\n\n 7. Disclaimer of Warranty. 
Unless required by applicable law or\n agreed to in writing, Licensor provides the Work (and each\n Contributor provides its Contributions) on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n implied, including, without limitation, any warranties or conditions\n of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n PARTICULAR PURPOSE. You are solely responsible for determining the\n appropriateness of using or redistributing the Work and assume any\n risks associated with Your exercise of permissions under this License.\n\n 8. Limitation of Liability. In no event and under no legal theory,\n whether in tort (including negligence), contract, or otherwise,\n unless required by applicable law (such as deliberate and grossly\n negligent acts) or agreed to in writing, shall any Contributor be\n liable to You for damages, including any direct, indirect, special,\n incidental, or consequential damages of any character arising as a\n result of this License or out of the use or inability to use the\n Work (including but not limited to damages for loss of goodwill,\n work stoppage, computer failure or malfunction, or any and all\n other commercial damages or losses), even if such Contributor\n has been advised of the possibility of such damages.\n\n 9. Accepting Warranty or Additional Liability. While redistributing\n the Work or Derivative Works thereof, You may choose to offer,\n and charge a fee for, acceptance of support, warranty, indemnity,\n or other liability obligations and/or rights consistent with this\n License. 
However, in accepting such obligations, You may act only\n on Your own behalf and on Your sole responsibility, not on behalf\n of any other Contributor, and only if You agree to indemnify,\n defend, and hold each Contributor harmless for any liability\n incurred by, or claims asserted against, such Contributor by reason\n of your accepting any such warranty or additional liability.\n\n END OF TERMS AND CONDITIONS\n\n APPENDIX: How to apply the Apache License to your work.\n\n To apply the Apache License to your work, attach the following\n boilerplate notice, with the fields enclosed by brackets \"[]\"\n replaced with your own identifying information. (Don't include\n the brackets!) The text should be enclosed in the appropriate\n comment syntax for the file format. We also recommend that a\n file or class name and description of purpose be included on the\n same \"printed page\" as the copyright notice for easier\n identification within third-party archives.\n\n Copyright 2019 ABB. All rights reserved.\n\n Licensed under the Apache License, Version 2.0 (the \"License\");\n you may not use this file except in compliance with the License.\n You may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n See the License for the specific language governing permissions and\n limitations under the License.",
"License text": " Apache License\n Version 2.0, January 2004\n http://www.apache.org/licenses/\n\n TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n 1. Definitions.\n\n \"License\" shall mean the terms and conditions for use, reproduction,\n and distribution as defined by Sections 1 through 9 of this document.\n\n \"Licensor\" shall mean the copyright owner or entity authorized by\n the copyright owner that is granting the License.\n\n \"Legal Entity\" shall mean the union of the acting entity and all\n other entities that control, are controlled by, or are under common\n control with that entity. For the purposes of this definition,\n \"control\" means (i) the power, direct or indirect, to cause the\n direction or management of such entity, whether by contract or\n otherwise, or (ii) ownership of fifty percent (50%) or more of the\n outstanding shares, or (iii) beneficial ownership of such entity.\n\n \"You\" (or \"Your\") shall mean an individual or Legal Entity\n exercising permissions granted by this License.\n\n \"Source\" form shall mean the preferred form for making modifications,\n including but not limited to software source code, documentation\n source, and configuration files.\n\n \"Object\" form shall mean any form resulting from mechanical\n transformation or translation of a Source form, including but\n not limited to compiled object code, generated documentation,\n and conversions to other media types.\n\n \"Work\" shall mean the work of authorship, whether in Source or\n Object form, made available under the License, as indicated by a\n copyright notice that is included in or attached to the work\n (an example is provided in the Appendix below).\n\n \"Derivative Works\" shall mean any work, whether in Source or Object\n form, that is based on (or derived from) the Work and for which the\n editorial revisions, annotations, elaborations, or other modifications\n represent, as a whole, an original work of authorship. 
For the purposes\n of this License, Derivative Works shall not include works that remain\n separable from, or merely link (or bind by name) to the interfaces of,\n the Work and Derivative Works thereof.\n\n \"Contribution\" shall mean any work of authorship, including\n the original version of the Work and any modifications or additions\n to that Work or Derivative Works thereof, that is intentionally\n submitted to Licensor for inclusion in the Work by the copyright owner\n or by an individual or Legal Entity authorized to submit on behalf of\n the copyright owner. For the purposes of this definition, \"submitted\"\n means any form of electronic, verbal, or written communication sent\n to the Licensor or its representatives, including but not limited to\n communication on electronic mailing lists, source code control systems,\n and issue tracking systems that are managed by, or on behalf of, the\n Licensor for the purpose of discussing and improving the Work, but\n excluding communication that is conspicuously marked or otherwise\n designated in writing by the copyright owner as \"Not a Contribution.\"\n\n \"Contributor\" shall mean Licensor and any individual or Legal Entity\n on behalf of whom a Contribution has been received by Licensor and\n subsequently incorporated within the Work.\n\n 2. Grant of Copyright License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n copyright license to reproduce, prepare Derivative Works of,\n publicly display, publicly perform, sublicense, and distribute the\n Work and such Derivative Works in Source or Object form.\n\n 3. Grant of Patent License. 
Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n (except as stated in this section) patent license to make, have made,\n use, offer to sell, sell, import, and otherwise transfer the Work,\n where such license applies only to those patent claims licensable\n by such Contributor that are necessarily infringed by their\n Contribution(s) alone or by combination of their Contribution(s)\n with the Work to which such Contribution(s) was submitted. If You\n institute patent litigation against any entity (including a\n cross-claim or counterclaim in a lawsuit) alleging that the Work\n or a Contribution incorporated within the Work constitutes direct\n or contributory patent infringement, then any patent licenses\n granted to You under this License for that Work shall terminate\n as of the date such litigation is filed.\n\n 4. Redistribution. You may reproduce and distribute copies of the\n Work or Derivative Works thereof in any medium, with or without\n modifications, and in Source or Object form, provided that You\n meet the following conditions:\n\n (a) You must give any other recipients of the Work or\n Derivative Works a copy of this License; and\n\n (b) You must cause any modified files to carry prominent notices\n stating that You changed the files; and\n\n (c) You must retain, in the Source form of any Derivative Works\n that You distribute, all copyright, patent, trademark, and\n attribution notices from the Source form of the Work,\n excluding those notices that do not pertain to any part of\n the Derivative Works; and\n\n (d) If the Work includes a \"NOTICE\" text file as part of its\n distribution, then any Derivative Works that You distribute must\n include a readable copy of the attribution notices contained\n within such NOTICE file, excluding those notices that do not\n pertain to any part of the Derivative Works, in at least one\n of 
the following places: within a NOTICE text file distributed\n as part of the Derivative Works; within the Source form or\n documentation, if provided along with the Derivative Works; or,\n within a display generated by the Derivative Works, if and\n wherever such third-party notices normally appear. The contents\n of the NOTICE file are for informational purposes only and\n do not modify the License. You may add Your own attribution\n notices within Derivative Works that You distribute, alongside\n or as an addendum to the NOTICE text from the Work, provided\n that such additional attribution notices cannot be construed\n as modifying the License.\n\n You may add Your own copyright statement to Your modifications and\n may provide additional or different license terms and conditions\n for use, reproduction, or distribution of Your modifications, or\n for any such Derivative Works as a whole, provided Your use,\n reproduction, and distribution of the Work otherwise complies with\n the conditions stated in this License.\n\n 5. Submission of Contributions. Unless You explicitly state otherwise,\n any Contribution intentionally submitted for inclusion in the Work\n by You to the Licensor shall be under the terms and conditions of\n this License, without any additional terms or conditions.\n Notwithstanding the above, nothing herein shall supersede or modify\n the terms of any separate license agreement you may have executed\n with Licensor regarding such Contributions.\n\n 6. Trademarks. This License does not grant permission to use the trade\n names, trademarks, service marks, or product names of the Licensor,\n except as required for reasonable and customary use in describing the\n origin of the Work and reproducing the content of the NOTICE file.\n\n 7. Disclaimer of Warranty. 
Unless required by applicable law or\n agreed to in writing, Licensor provides the Work (and each\n Contributor provides its Contributions) on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n implied, including, without limitation, any warranties or conditions\n of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n PARTICULAR PURPOSE. You are solely responsible for determining the\n appropriateness of using or redistributing the Work and assume any\n risks associated with Your exercise of permissions under this License.\n\n 8. Limitation of Liability. In no event and under no legal theory,\n whether in tort (including negligence), contract, or otherwise,\n unless required by applicable law (such as deliberate and grossly\n negligent acts) or agreed to in writing, shall any Contributor be\n liable to You for damages, including any direct, indirect, special,\n incidental, or consequential damages of any character arising as a\n result of this License or out of the use or inability to use the\n Work (including but not limited to damages for loss of goodwill,\n work stoppage, computer failure or malfunction, or any and all\n other commercial damages or losses), even if such Contributor\n has been advised of the possibility of such damages.\n\n 9. Accepting Warranty or Additional Liability. While redistributing\n the Work or Derivative Works thereof, You may choose to offer,\n and charge a fee for, acceptance of support, warranty, indemnity,\n or other liability obligations and/or rights consistent with this\n License. 
However, in accepting such obligations, You may act only\n on Your own behalf and on Your sole responsibility, not on behalf\n of any other Contributor, and only if You agree to indemnify,\n defend, and hold each Contributor harmless for any liability\n incurred by, or claims asserted against, such Contributor by reason\n of your accepting any such warranty or additional liability.\n\n END OF TERMS AND CONDITIONS\n\n APPENDIX: How to apply the Apache License to your work.\n\n To apply the Apache License to your work, attach the following\n boilerplate notice, with the fields enclosed by brackets \"[]\"\n replaced with your own identifying information. (Don't include\n the brackets!) The text should be enclosed in the appropriate\n comment syntax for the file format. We also recommend that a\n file or class name and description of purpose be included on the\n same \"printed page\" as the copyright notice for easier\n identification within third-party archives.\n\n Copyright [yyyy] [name of copyright owner]\n\n Licensed under the Apache License, Version 2.0 (the \"License\");\n you may not use this file except in compliance with the License.\n You may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n See the License for the specific language governing permissions and\n limitations under the License.\n"
},
{
"Name": "sshtunnel",
"Version": "0.1.5",
"Summary": "Pure python SSH tunnels",
"Home-page": "https://github.com/pahaz/sshtunnel",
"Author": "Pahaz Blinov",
"License": "MIT License",
"License URL": "https://api.github.com/repos/pahaz/sshtunnel/license",
"License repo": "Copyright (c) 2014-2019 Pahaz Blinov\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in\nall copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN\nTHE SOFTWARE.\n\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "tabulate",
"Version": "0.8.5",
"Summary": "Pretty-print tabular data",
"Home-page": "https://github.com/astanin/python-tabulate",
"Author": "Sergey Astanin",
"License": "MIT License",
"License URL": "https://api.github.com/repos/astanin/python-tabulate/license",
"License repo": "Copyright (c) 2011-2020 Sergey Astanin and contributors\n\nPermission is hereby granted, free of charge, to any person obtaining\na copy of this software and associated documentation files (the\n\"Software\"), to deal in the Software without restriction, including\nwithout limitation the rights to use, copy, modify, merge, publish,\ndistribute, sublicense, and/or sell copies of the Software, and to\npermit persons to whom the Software is furnished to do so, subject to\nthe following conditions:\n\nThe above copyright notice and this permission notice shall be\nincluded in all copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND,\nEXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF\nMERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND\nNONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE\nLIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION\nOF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION\nWITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "terraform-bin",
"Version": "1.0.1",
"Summary": "UNKNOWN",
"Home-page": "https://github.com/epiphany-platform/terraform-bin",
"Author": "Epiphany Team",
"License": "Apache License 2.0",
"License URL": "https://api.github.com/repos/epiphany-platform/terraform-bin/license",
"License repo": "\n Apache License\n Version 2.0, January 2004\n http://www.apache.org/licenses/\n\n TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n 1. Definitions.\n\n \"License\" shall mean the terms and conditions for use, reproduction,\n and distribution as defined by Sections 1 through 9 of this document.\n\n \"Licensor\" shall mean the copyright owner or entity authorized by\n the copyright owner that is granting the License.\n\n \"Legal Entity\" shall mean the union of the acting entity and all\n other entities that control, are controlled by, or are under common\n control with that entity. For the purposes of this definition,\n \"control\" means (i) the power, direct or indirect, to cause the\n direction or management of such entity, whether by contract or\n otherwise, or (ii) ownership of fifty percent (50%) or more of the\n outstanding shares, or (iii) beneficial ownership of such entity.\n\n \"You\" (or \"Your\") shall mean an individual or Legal Entity\n exercising permissions granted by this License.\n\n \"Source\" form shall mean the preferred form for making modifications,\n including but not limited to software source code, documentation\n source, and configuration files.\n\n \"Object\" form shall mean any form resulting from mechanical\n transformation or translation of a Source form, including but\n not limited to compiled object code, generated documentation,\n and conversions to other media types.\n\n \"Work\" shall mean the work of authorship, whether in Source or\n Object form, made available under the License, as indicated by a\n copyright notice that is included in or attached to the work\n (an example is provided in the Appendix below).\n\n \"Derivative Works\" shall mean any work, whether in Source or Object\n form, that is based on (or derived from) the Work and for which the\n editorial revisions, annotations, elaborations, or other modifications\n represent, as a whole, an original work of authorship. 
For the purposes\n of this License, Derivative Works shall not include works that remain\n separable from, or merely link (or bind by name) to the interfaces of,\n the Work and Derivative Works thereof.\n\n \"Contribution\" shall mean any work of authorship, including\n the original version of the Work and any modifications or additions\n to that Work or Derivative Works thereof, that is intentionally\n submitted to Licensor for inclusion in the Work by the copyright owner\n or by an individual or Legal Entity authorized to submit on behalf of\n the copyright owner. For the purposes of this definition, \"submitted\"\n means any form of electronic, verbal, or written communication sent\n to the Licensor or its representatives, including but not limited to\n communication on electronic mailing lists, source code control systems,\n and issue tracking systems that are managed by, or on behalf of, the\n Licensor for the purpose of discussing and improving the Work, but\n excluding communication that is conspicuously marked or otherwise\n designated in writing by the copyright owner as \"Not a Contribution.\"\n\n \"Contributor\" shall mean Licensor and any individual or Legal Entity\n on behalf of whom a Contribution has been received by Licensor and\n subsequently incorporated within the Work.\n\n 2. Grant of Copyright License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n copyright license to reproduce, prepare Derivative Works of,\n publicly display, publicly perform, sublicense, and distribute the\n Work and such Derivative Works in Source or Object form.\n\n 3. Grant of Patent License. 
Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n (except as stated in this section) patent license to make, have made,\n use, offer to sell, sell, import, and otherwise transfer the Work,\n where such license applies only to those patent claims licensable\n by such Contributor that are necessarily infringed by their\n Contribution(s) alone or by combination of their Contribution(s)\n with the Work to which such Contribution(s) was submitted. If You\n institute patent litigation against any entity (including a\n cross-claim or counterclaim in a lawsuit) alleging that the Work\n or a Contribution incorporated within the Work constitutes direct\n or contributory patent infringement, then any patent licenses\n granted to You under this License for that Work shall terminate\n as of the date such litigation is filed.\n\n 4. Redistribution. You may reproduce and distribute copies of the\n Work or Derivative Works thereof in any medium, with or without\n modifications, and in Source or Object form, provided that You\n meet the following conditions:\n\n (a) You must give any other recipients of the Work or\n Derivative Works a copy of this License; and\n\n (b) You must cause any modified files to carry prominent notices\n stating that You changed the files; and\n\n (c) You must retain, in the Source form of any Derivative Works\n that You distribute, all copyright, patent, trademark, and\n attribution notices from the Source form of the Work,\n excluding those notices that do not pertain to any part of\n the Derivative Works; and\n\n (d) If the Work includes a \"NOTICE\" text file as part of its\n distribution, then any Derivative Works that You distribute must\n include a readable copy of the attribution notices contained\n within such NOTICE file, excluding those notices that do not\n pertain to any part of the Derivative Works, in at least one\n of 
the following places: within a NOTICE text file distributed\n as part of the Derivative Works; within the Source form or\n documentation, if provided along with the Derivative Works; or,\n within a display generated by the Derivative Works, if and\n wherever such third-party notices normally appear. The contents\n of the NOTICE file are for informational purposes only and\n do not modify the License. You may add Your own attribution\n notices within Derivative Works that You distribute, alongside\n or as an addendum to the NOTICE text from the Work, provided\n that such additional attribution notices cannot be construed\n as modifying the License.\n\n You may add Your own copyright statement to Your modifications and\n may provide additional or different license terms and conditions\n for use, reproduction, or distribution of Your modifications, or\n for any such Derivative Works as a whole, provided Your use,\n reproduction, and distribution of the Work otherwise complies with\n the conditions stated in this License.\n\n 5. Submission of Contributions. Unless You explicitly state otherwise,\n any Contribution intentionally submitted for inclusion in the Work\n by You to the Licensor shall be under the terms and conditions of\n this License, without any additional terms or conditions.\n Notwithstanding the above, nothing herein shall supersede or modify\n the terms of any separate license agreement you may have executed\n with Licensor regarding such Contributions.\n\n 6. Trademarks. This License does not grant permission to use the trade\n names, trademarks, service marks, or product names of the Licensor,\n except as required for reasonable and customary use in describing the\n origin of the Work and reproducing the content of the NOTICE file.\n\n 7. Disclaimer of Warranty. 
Unless required by applicable law or\n agreed to in writing, Licensor provides the Work (and each\n Contributor provides its Contributions) on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n implied, including, without limitation, any warranties or conditions\n of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n PARTICULAR PURPOSE. You are solely responsible for determining the\n appropriateness of using or redistributing the Work and assume any\n risks associated with Your exercise of permissions under this License.\n\n 8. Limitation of Liability. In no event and under no legal theory,\n whether in tort (including negligence), contract, or otherwise,\n unless required by applicable law (such as deliberate and grossly\n negligent acts) or agreed to in writing, shall any Contributor be\n liable to You for damages, including any direct, indirect, special,\n incidental, or consequential damages of any character arising as a\n result of this License or out of the use or inability to use the\n Work (including but not limited to damages for loss of goodwill,\n work stoppage, computer failure or malfunction, or any and all\n other commercial damages or losses), even if such Contributor\n has been advised of the possibility of such damages.\n\n 9. Accepting Warranty or Additional Liability. While redistributing\n the Work or Derivative Works thereof, You may choose to offer,\n and charge a fee for, acceptance of support, warranty, indemnity,\n or other liability obligations and/or rights consistent with this\n License. 
However, in accepting such obligations, You may act only\n on Your own behalf and on Your sole responsibility, not on behalf\n of any other Contributor, and only if You agree to indemnify,\n defend, and hold each Contributor harmless for any liability\n incurred by, or claims asserted against, such Contributor by reason\n of your accepting any such warranty or additional liability.\n\n END OF TERMS AND CONDITIONS\n\n APPENDIX: How to apply the Apache License to your work.\n\n To apply the Apache License to your work, attach the following\n boilerplate notice, with the fields enclosed by brackets \"[]\"\n replaced with your own identifying information. (Don't include\n the brackets!) The text should be enclosed in the appropriate\n comment syntax for the file format. We also recommend that a\n file or class name and description of purpose be included on the\n same \"printed page\" as the copyright notice for easier\n identification within third-party archives.\n\n Copyright 2019 ABB. All rights reserved.\n\n Licensed under the Apache License, Version 2.0 (the \"License\");\n you may not use this file except in compliance with the License.\n You may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n See the License for the specific language governing permissions and\n limitations under the License.",
"License text": " Apache License\n Version 2.0, January 2004\n http://www.apache.org/licenses/\n\n TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n 1. Definitions.\n\n \"License\" shall mean the terms and conditions for use, reproduction,\n and distribution as defined by Sections 1 through 9 of this document.\n\n \"Licensor\" shall mean the copyright owner or entity authorized by\n the copyright owner that is granting the License.\n\n \"Legal Entity\" shall mean the union of the acting entity and all\n other entities that control, are controlled by, or are under common\n control with that entity. For the purposes of this definition,\n \"control\" means (i) the power, direct or indirect, to cause the\n direction or management of such entity, whether by contract or\n otherwise, or (ii) ownership of fifty percent (50%) or more of the\n outstanding shares, or (iii) beneficial ownership of such entity.\n\n \"You\" (or \"Your\") shall mean an individual or Legal Entity\n exercising permissions granted by this License.\n\n \"Source\" form shall mean the preferred form for making modifications,\n including but not limited to software source code, documentation\n source, and configuration files.\n\n \"Object\" form shall mean any form resulting from mechanical\n transformation or translation of a Source form, including but\n not limited to compiled object code, generated documentation,\n and conversions to other media types.\n\n \"Work\" shall mean the work of authorship, whether in Source or\n Object form, made available under the License, as indicated by a\n copyright notice that is included in or attached to the work\n (an example is provided in the Appendix below).\n\n \"Derivative Works\" shall mean any work, whether in Source or Object\n form, that is based on (or derived from) the Work and for which the\n editorial revisions, annotations, elaborations, or other modifications\n represent, as a whole, an original work of authorship. 
For the purposes\n of this License, Derivative Works shall not include works that remain\n separable from, or merely link (or bind by name) to the interfaces of,\n the Work and Derivative Works thereof.\n\n \"Contribution\" shall mean any work of authorship, including\n the original version of the Work and any modifications or additions\n to that Work or Derivative Works thereof, that is intentionally\n submitted to Licensor for inclusion in the Work by the copyright owner\n or by an individual or Legal Entity authorized to submit on behalf of\n the copyright owner. For the purposes of this definition, \"submitted\"\n means any form of electronic, verbal, or written communication sent\n to the Licensor or its representatives, including but not limited to\n communication on electronic mailing lists, source code control systems,\n and issue tracking systems that are managed by, or on behalf of, the\n Licensor for the purpose of discussing and improving the Work, but\n excluding communication that is conspicuously marked or otherwise\n designated in writing by the copyright owner as \"Not a Contribution.\"\n\n \"Contributor\" shall mean Licensor and any individual or Legal Entity\n on behalf of whom a Contribution has been received by Licensor and\n subsequently incorporated within the Work.\n\n 2. Grant of Copyright License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n copyright license to reproduce, prepare Derivative Works of,\n publicly display, publicly perform, sublicense, and distribute the\n Work and such Derivative Works in Source or Object form.\n\n 3. Grant of Patent License. 
Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n (except as stated in this section) patent license to make, have made,\n use, offer to sell, sell, import, and otherwise transfer the Work,\n where such license applies only to those patent claims licensable\n by such Contributor that are necessarily infringed by their\n Contribution(s) alone or by combination of their Contribution(s)\n with the Work to which such Contribution(s) was submitted. If You\n institute patent litigation against any entity (including a\n cross-claim or counterclaim in a lawsuit) alleging that the Work\n or a Contribution incorporated within the Work constitutes direct\n or contributory patent infringement, then any patent licenses\n granted to You under this License for that Work shall terminate\n as of the date such litigation is filed.\n\n 4. Redistribution. You may reproduce and distribute copies of the\n Work or Derivative Works thereof in any medium, with or without\n modifications, and in Source or Object form, provided that You\n meet the following conditions:\n\n (a) You must give any other recipients of the Work or\n Derivative Works a copy of this License; and\n\n (b) You must cause any modified files to carry prominent notices\n stating that You changed the files; and\n\n (c) You must retain, in the Source form of any Derivative Works\n that You distribute, all copyright, patent, trademark, and\n attribution notices from the Source form of the Work,\n excluding those notices that do not pertain to any part of\n the Derivative Works; and\n\n (d) If the Work includes a \"NOTICE\" text file as part of its\n distribution, then any Derivative Works that You distribute must\n include a readable copy of the attribution notices contained\n within such NOTICE file, excluding those notices that do not\n pertain to any part of the Derivative Works, in at least one\n of 
the following places: within a NOTICE text file distributed\n as part of the Derivative Works; within the Source form or\n documentation, if provided along with the Derivative Works; or,\n within a display generated by the Derivative Works, if and\n wherever such third-party notices normally appear. The contents\n of the NOTICE file are for informational purposes only and\n do not modify the License. You may add Your own attribution\n notices within Derivative Works that You distribute, alongside\n or as an addendum to the NOTICE text from the Work, provided\n that such additional attribution notices cannot be construed\n as modifying the License.\n\n You may add Your own copyright statement to Your modifications and\n may provide additional or different license terms and conditions\n for use, reproduction, or distribution of Your modifications, or\n for any such Derivative Works as a whole, provided Your use,\n reproduction, and distribution of the Work otherwise complies with\n the conditions stated in this License.\n\n 5. Submission of Contributions. Unless You explicitly state otherwise,\n any Contribution intentionally submitted for inclusion in the Work\n by You to the Licensor shall be under the terms and conditions of\n this License, without any additional terms or conditions.\n Notwithstanding the above, nothing herein shall supersede or modify\n the terms of any separate license agreement you may have executed\n with Licensor regarding such Contributions.\n\n 6. Trademarks. This License does not grant permission to use the trade\n names, trademarks, service marks, or product names of the Licensor,\n except as required for reasonable and customary use in describing the\n origin of the Work and reproducing the content of the NOTICE file.\n\n 7. Disclaimer of Warranty. 
Unless required by applicable law or\n agreed to in writing, Licensor provides the Work (and each\n Contributor provides its Contributions) on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n implied, including, without limitation, any warranties or conditions\n of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n PARTICULAR PURPOSE. You are solely responsible for determining the\n appropriateness of using or redistributing the Work and assume any\n risks associated with Your exercise of permissions under this License.\n\n 8. Limitation of Liability. In no event and under no legal theory,\n whether in tort (including negligence), contract, or otherwise,\n unless required by applicable law (such as deliberate and grossly\n negligent acts) or agreed to in writing, shall any Contributor be\n liable to You for damages, including any direct, indirect, special,\n incidental, or consequential damages of any character arising as a\n result of this License or out of the use or inability to use the\n Work (including but not limited to damages for loss of goodwill,\n work stoppage, computer failure or malfunction, or any and all\n other commercial damages or losses), even if such Contributor\n has been advised of the possibility of such damages.\n\n 9. Accepting Warranty or Additional Liability. While redistributing\n the Work or Derivative Works thereof, You may choose to offer,\n and charge a fee for, acceptance of support, warranty, indemnity,\n or other liability obligations and/or rights consistent with this\n License. 
However, in accepting such obligations, You may act only\n on Your own behalf and on Your sole responsibility, not on behalf\n of any other Contributor, and only if You agree to indemnify,\n defend, and hold each Contributor harmless for any liability\n incurred by, or claims asserted against, such Contributor by reason\n of your accepting any such warranty or additional liability.\n\n END OF TERMS AND CONDITIONS\n\n APPENDIX: How to apply the Apache License to your work.\n\n To apply the Apache License to your work, attach the following\n boilerplate notice, with the fields enclosed by brackets \"[]\"\n replaced with your own identifying information. (Don't include\n the brackets!) The text should be enclosed in the appropriate\n comment syntax for the file format. We also recommend that a\n file or class name and description of purpose be included on the\n same \"printed page\" as the copyright notice for easier\n identification within third-party archives.\n\n Copyright [yyyy] [name of copyright owner]\n\n Licensed under the Apache License, Version 2.0 (the \"License\");\n you may not use this file except in compliance with the License.\n You may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n See the License for the specific language governing permissions and\n limitations under the License.\n"
},
{
"Name": "tqdm",
"Version": "4.43.0",
"Summary": "Fast, Extensible Progress Meter",
"Home-page": "https://github.com/tqdm/tqdm",
"License": "Other",
"License URL": "https://api.github.com/repos/tqdm/tqdm/license",
"License repo": "`tqdm` is a product of collaborative work.\nUnless otherwise stated, all authors (see commit logs) retain copyright\nfor their respective work, and release the work under the MIT licence\n(text below).\n\nExceptions or notable authors are listed below\nin reverse chronological order:\n\n* files: *\n MPLv2.0 2015-2020 (c) Casper da Costa-Luis\n [casperdcl](https://github.com/casperdcl).\n* files: tqdm/_tqdm.py\n MIT 2016 (c) [PR #96] on behalf of Google Inc.\n* files: tqdm/_tqdm.py setup.py README.rst MANIFEST.in .gitignore\n MIT 2013 (c) Noam Yorav-Raphael, original author.\n\n[PR #96]: https://github.com/tqdm/tqdm/pull/96\n\n\nMozilla Public Licence (MPL) v. 2.0 - Exhibit A\n-----------------------------------------------\n\nThis Source Code Form is subject to the terms of the\nMozilla Public License, v. 2.0.\nIf a copy of the MPL was not distributed with this file,\nYou can obtain one at https://mozilla.org/MPL/2.0/.\n\n\nMIT License (MIT)\n-----------------\n\nCopyright (c) 2013 noamraph\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of\nthis software and associated documentation files (the \"Software\"), to deal in\nthe Software without restriction, including without limitation the rights to\nuse, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of\nthe Software, and to permit persons to whom the Software is furnished to do so,\nsubject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS\nFOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR\nCOPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER\nIN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN\nCONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n"
},
{
"Name": "typed-ast",
"Version": "1.4.1",
"Summary": "a fork of Python 2 and 3 ast modules with type comment support",
"Home-page": "https://github.com/python/typed_ast",
"Author": "David Fisher",
"License": "Other",
"License URL": "https://api.github.com/repos/python/typed_ast/license",
"License repo": "Format: https://www.debian.org/doc/packaging-manuals/copyright-format/1.0/\nUpstream-Name: typed-ast\nSource: https://pypi.python.org/pypi/typed-ast\n\nFiles: *\nCopyright: \u00a9 2016 David Fisher <ddfisher@dropbox.com>\nLicense: Apache-2.0\n\nFiles: *\nCopyright: \u00a9 2016 David Fisher <ddfisher@dropbox.com>\n \u00a9 2008 Armin Ronacher\nComment: The original CPython source is licensed under the\n Python Software Foundation License Version 2\nLicense: Python\n\nFiles: ast27/Parser/spark.py\nCopyright: \u00a9 1998-2002 John Aycock\nLicense: Expat\n Permission is hereby granted, free of charge, to any person obtaining\n a copy of this software and associated documentation files (the\n \"Software\"), to deal in the Software without restriction, including\n without limitation the rights to use, copy, modify, merge, publish,\n distribute, sublicense, and/or sell copies of the Software, and to\n permit persons to whom the Software is furnished to do so, subject to\n the following conditions:\n\n The above copyright notice and this permission notice shall be\n included in all copies or substantial portions of the Software.\n\n THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND,\n EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF\n MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.\n IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY\n CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,\n TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE\n SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n\n\nLicense: Apache-2.0\n Apache License\n Version 2.0, January 2004\n http://www.apache.org/licenses/\n .\n TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n .\n 1. 
Definitions.\n .\n \"License\" shall mean the terms and conditions for use, reproduction,\n and distribution as defined by Sections 1 through 9 of this document.\n .\n \"Licensor\" shall mean the copyright owner or entity authorized by\n the copyright owner that is granting the License.\n .\n \"Legal Entity\" shall mean the union of the acting entity and all\n other entities that control, are controlled by, or are under common\n control with that entity. For the purposes of this definition,\n \"control\" means (i) the power, direct or indirect, to cause the\n direction or management of such entity, whether by contract or\n otherwise, or (ii) ownership of fifty percent (50%) or more of the\n outstanding shares, or (iii) beneficial ownership of such entity.\n .\n \"You\" (or \"Your\") shall mean an individual or Legal Entity\n exercising permissions granted by this License.\n .\n \"Source\" form shall mean the preferred form for making modifications,\n including but not limited to software source code, documentation\n source, and configuration files.\n .\n \"Object\" form shall mean any form resulting from mechanical\n transformation or translation of a Source form, including but\n not limited to compiled object code, generated documentation,\n and conversions to other media types.\n .\n \"Work\" shall mean the work of authorship, whether in Source or\n Object form, made available under the License, as indicated by a\n copyright notice that is included in or attached to the work\n (an example is provided in the Appendix below).\n .\n \"Derivative Works\" shall mean any work, whether in Source or Object\n form, that is based on (or derived from) the Work and for which the\n editorial revisions, annotations, elaborations, or other modifications\n represent, as a whole, an original work of authorship. 
For the purposes\n of this License, Derivative Works shall not include works that remain\n separable from, or merely link (or bind by name) to the interfaces of,\n the Work and Derivative Works thereof.\n .\n \"Contribution\" shall mean any work of authorship, including\n the original version of the Work and any modifications or additions\n to that Work or Derivative Works thereof, that is intentionally\n submitted to Licensor for inclusion in the Work by the copyright owner\n or by an individual or Legal Entity authorized to submit on behalf of\n the copyright owner. For the purposes of this definition, \"submitted\"\n means any form of electronic, verbal, or written communication sent\n to the Licensor or its representatives, including but not limited to\n communication on electronic mailing lists, source code control systems,\n and issue tracking systems that are managed by, or on behalf of, the\n Licensor for the purpose of discussing and improving the Work, but\n excluding communication that is conspicuously marked or otherwise\n designated in writing by the copyright owner as \"Not a Contribution.\"\n .\n \"Contributor\" shall mean Licensor and any individual or Legal Entity\n on behalf of whom a Contribution has been received by Licensor and\n subsequently incorporated within the Work.\n .\n 2. Grant of Copyright License. Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n copyright license to reproduce, prepare Derivative Works of,\n publicly display, publicly perform, sublicense, and distribute the\n Work and such Derivative Works in Source or Object form.\n .\n 3. Grant of Patent License. 
Subject to the terms and conditions of\n this License, each Contributor hereby grants to You a perpetual,\n worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n (except as stated in this section) patent license to make, have made,\n use, offer to sell, sell, import, and otherwise transfer the Work,\n where such license applies only to those patent claims licensable\n by such Contributor that are necessarily infringed by their\n Contribution(s) alone or by combination of their Contribution(s)\n with the Work to which such Contribution(s) was submitted. If You\n institute patent litigation against any entity (including a\n cross-claim or counterclaim in a lawsuit) alleging that the Work\n or a Contribution incorporated within the Work constitutes direct\n or contributory patent infringement, then any patent licenses\n granted to You under this License for that Work shall terminate\n as of the date such litigation is filed.\n .\n 4. Redistribution. You may reproduce and distribute copies of the\n Work or Derivative Works thereof in any medium, with or without\n modifications, and in Source or Object form, provided that You\n meet the following conditions:\n .\n (a) You must give any other recipients of the Work or\n Derivative Works a copy of this License; and\n .\n (b) You must cause any modified files to carry prominent notices\n stating that You changed the files; and\n .\n (c) You must retain, in the Source form of any Derivative Works\n that You distribute, all copyright, patent, trademark, and\n attribution notices from the Source form of the Work,\n excluding those notices that do not pertain to any part of\n the Derivative Works; and\n .\n (d) If the Work includes a \"NOTICE\" text file as part of its\n distribution, then any Derivative Works that You distribute must\n include a readable copy of the attribution notices contained\n within such NOTICE file, excluding those notices that do not\n pertain to any part of the Derivative Works, in at least 
one\n of the following places: within a NOTICE text file distributed\n as part of the Derivative Works; within the Source form or\n documentation, if provided along with the Derivative Works; or,\n within a display generated by the Derivative Works, if and\n wherever such third-party notices normally appear. The contents\n of the NOTICE file are for informational purposes only and\n do not modify the License. You may add Your own attribution\n notices within Derivative Works that You distribute, alongside\n or as an addendum to the NOTICE text from the Work, provided\n that such additional attribution notices cannot be construed\n as modifying the License.\n .\n You may add Your own copyright statement to Your modifications and\n may provide additional or different license terms and conditions\n for use, reproduction, or distribution of Your modifications, or\n for any such Derivative Works as a whole, provided Your use,\n reproduction, and distribution of the Work otherwise complies with\n the conditions stated in this License.\n .\n 5. Submission of Contributions. Unless You explicitly state otherwise,\n any Contribution intentionally submitted for inclusion in the Work\n by You to the Licensor shall be under the terms and conditions of\n this License, without any additional terms or conditions.\n Notwithstanding the above, nothing herein shall supersede or modify\n the terms of any separate license agreement you may have executed\n with Licensor regarding such Contributions.\n .\n 6. Trademarks. This License does not grant permission to use the trade\n names, trademarks, service marks, or product names of the Licensor,\n except as required for reasonable and customary use in describing the\n origin of the Work and reproducing the content of the NOTICE file.\n .\n 7. Disclaimer of Warranty. 
Unless required by applicable law or\n agreed to in writing, Licensor provides the Work (and each\n Contributor provides its Contributions) on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n implied, including, without limitation, any warranties or conditions\n of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n PARTICULAR PURPOSE. You are solely responsible for determining the\n appropriateness of using or redistributing the Work and assume any\n risks associated with Your exercise of permissions under this License.\n .\n 8. Limitation of Liability. In no event and under no legal theory,\n whether in tort (including negligence), contract, or otherwise,\n unless required by applicable law (such as deliberate and grossly\n negligent acts) or agreed to in writing, shall any Contributor be\n liable to You for damages, including any direct, indirect, special,\n incidental, or consequential damages of any character arising as a\n result of this License or out of the use or inability to use the\n Work (including but not limited to damages for loss of goodwill,\n work stoppage, computer failure or malfunction, or any and all\n other commercial damages or losses), even if such Contributor\n has been advised of the possibility of such damages.\n .\n 9. Accepting Warranty or Additional Liability. While redistributing\n the Work or Derivative Works thereof, You may choose to offer,\n and charge a fee for, acceptance of support, warranty, indemnity,\n or other liability obligations and/or rights consistent with this\n License. 
However, in accepting such obligations, You may act only\n on Your own behalf and on Your sole responsibility, not on behalf\n of any other Contributor, and only if You agree to indemnify,\n defend, and hold each Contributor harmless for any liability\n incurred by, or claims asserted against, such Contributor by reason\n of your accepting any such warranty or additional liability.\n .\n END OF TERMS AND CONDITIONS\n .\n APPENDIX: How to apply the Apache License to your work.\n .\n To apply the Apache License to your work, attach the following\n boilerplate notice, with the fields enclosed by brackets \"[]\"\n replaced with your own identifying information. (Don't include\n the brackets!) The text should be enclosed in the appropriate\n comment syntax for the file format. We also recommend that a\n file or class name and description of purpose be included on the\n same \"printed page\" as the copyright notice for easier\n identification within third-party archives.\n .\n Copyright 2016 Dropbox, Inc.\n .\n Licensed under the Apache License, Version 2.0 (the \"License\");\n you may not use this file except in compliance with the License.\n You may obtain a copy of the License at\n .\n http://www.apache.org/licenses/LICENSE-2.0\n .\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n See the License for the specific language governing permissions and\n limitations under the License.\n \nLicense: Python\n PYTHON SOFTWARE FOUNDATION LICENSE VERSION 2\n --------------------------------------------\n .\n 1. This LICENSE AGREEMENT is between the Python Software Foundation\n (\"PSF\"), and the Individual or Organization (\"Licensee\") accessing and\n otherwise using this software (\"Python\") in source or binary form and\n its associated documentation.\n .\n 2. 
Subject to the terms and conditions of this License Agreement, PSF hereby\n grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce,\n analyze, test, perform and/or display publicly, prepare derivative works,\n distribute, and otherwise use Python alone or in any derivative version,\n provided, however, that PSF's License Agreement and PSF's notice of copyright,\n i.e., \"Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010,\n 2011, 2012, 2013, 2014, 2015, 2016 Python Software Foundation; All Rights\n Reserved\" are retained in Python alone or in any derivative version prepared by\n Licensee.\n .\n 3. In the event Licensee prepares a derivative work that is based on\n or incorporates Python or any part thereof, and wants to make\n the derivative work available to others as provided herein, then\n Licensee hereby agrees to include in any such work a brief summary of\n the changes made to Python.\n .\n 4. PSF is making Python available to Licensee on an \"AS IS\"\n basis. PSF MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR\n IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, PSF MAKES NO AND\n DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS\n FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON WILL NOT\n INFRINGE ANY THIRD PARTY RIGHTS.\n .\n 5. PSF SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON\n FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS\n A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON,\n OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.\n .\n 6. This License Agreement will automatically terminate upon a material\n breach of its terms and conditions.\n .\n 7. Nothing in this License Agreement shall be deemed to create any\n relationship of agency, partnership, or joint venture between PSF and\n Licensee. 
This License Agreement does not grant permission to use PSF\n trademarks or trade name in a trademark sense to endorse or promote\n products or services of Licensee, or any third party.\n .\n 8. By copying, installing or otherwise using Python, Licensee\n agrees to be bound by the terms and conditions of this License\n Agreement.\n"
},
{
"Name": "urllib3",
"Version": "1.25.6",
"Summary": "HTTP library with thread-safe connection pooling, file post, and more.",
"Home-page": "https://urllib3.readthedocs.io/",
"Author": "Andrey Petrov",
"License": "MIT"
},
{
"Name": "vsts",
"Version": "0.1.25",
"Summary": "Python wrapper around the VSTS APIs",
"Home-page": "https://github.com/Microsoft/vsts-python-api",
"Author": "Microsoft Corporation",
"License": "MIT License",
"License URL": "https://api.github.com/repos/microsoft/vsts-python-api/license",
"License repo": " MIT License\n\n Copyright (c) Microsoft Corporation. All rights reserved.\n\n Permission is hereby granted, free of charge, to any person obtaining a copy\n of this software and associated documentation files (the \"Software\"), to deal\n in the Software without restriction, including without limitation the rights\n to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n copies of the Software, and to permit persons to whom the Software is\n furnished to do so, subject to the following conditions:\n\n The above copyright notice and this permission notice shall be included in all\n copies or substantial portions of the Software.\n\n THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n SOFTWARE\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "vsts-cd-manager",
"Version": "1.0.2",
"Summary": "Python wrapper around some of the VSTS APIs",
"Home-page": "https://github.com/microsoft/vsts-cd-manager",
"Author": "UNKNOWN",
"License": "MIT License",
"License URL": "https://api.github.com/repos/microsoft/vsts-cd-manager/license",
"License repo": " MIT License\r\n\r\n Copyright (c) Microsoft Corporation. All rights reserved.\r\n\r\n Permission is hereby granted, free of charge, to any person obtaining a copy\r\n of this software and associated documentation files (the \"Software\"), to deal\r\n in the Software without restriction, including without limitation the rights\r\n to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\r\n copies of the Software, and to permit persons to whom the Software is\r\n furnished to do so, subject to the following conditions:\r\n\r\n The above copyright notice and this permission notice shall be included in all\r\n copies or substantial portions of the Software.\r\n\r\n THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\r\n IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\r\n FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\r\n AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\r\n LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\r\n OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\r\n SOFTWARE\r\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "wcwidth",
"Version": "0.1.7",
"Summary": "Measures number of Terminal column cells of wide-character codes",
"Home-page": "https://github.com/jquast/wcwidth",
"Author": "Jeff Quast",
"License": "MIT License",
"License URL": "https://api.github.com/repos/jquast/wcwidth/license",
"License repo": "The MIT License (MIT)\n\nCopyright (c) 2014 Jeff Quast <contact@jeffquast.com>\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "webencodings",
"Version": "0.5.1",
"Summary": "Character encoding aliases for legacy web content",
"Home-page": "https://github.com/SimonSapin/python-webencodings",
"Author": "Geoffrey Sneddon",
"License": "Other",
"License URL": "https://api.github.com/repos/simonsapin/python-webencodings/license",
"License repo": "Copyright (c) 2012 by Simon Sapin.\n\nSome rights reserved.\n\nRedistribution and use in source and binary forms, with or without\nmodification, are permitted provided that the following conditions are\nmet:\n\n * Redistributions of source code must retain the above copyright\n notice, this list of conditions and the following disclaimer.\n\n * Redistributions in binary form must reproduce the above\n copyright notice, this list of conditions and the following\n disclaimer in the documentation and/or other materials provided\n with the distribution.\n\n * The names of the contributors may not be used to endorse or\n promote products derived from this software without specific\n prior written permission.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS\n\"AS IS\" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT\nLIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR\nA PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT\nOWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,\nSPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT\nLIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,\nDATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY\nTHEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT\n(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\nOF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n"
},
{
"Name": "websocket-client",
"Version": "0.56.0",
"Summary": "WebSocket client for Python. hybi13 is supported.",
"Home-page": "https://github.com/websocket-client/websocket-client.git",
"Author": "liris",
"License": "BSD"
},
{
"Name": "wrapt",
"Version": "1.11.2",
"Summary": "Module for decorators, wrappers and monkey patching.",
"Home-page": "https://github.com/GrahamDumpleton/wrapt",
"Author": "Graham Dumpleton",
"License": "BSD 2-Clause \"Simplified\" License",
"License URL": "https://api.github.com/repos/grahamdumpleton/wrapt/license",
"License repo": "Copyright (c) 2013-2019, Graham Dumpleton\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without\nmodification, are permitted provided that the following conditions are met:\n\n* Redistributions of source code must retain the above copyright notice, this\n list of conditions and the following disclaimer.\n\n* Redistributions in binary form must reproduce the above copyright notice,\n this list of conditions and the following disclaimer in the documentation\n and/or other materials provided with the distribution.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\nAND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\nIMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE\nARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE\nLIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR\nCONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF\nSUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS\nINTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN\nCONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)\nARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE\nPOSSIBILITY OF SUCH DAMAGE.\n",
"License text": "BSD 2-Clause License\n\nCopyright (c) [year], [fullname]\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without\nmodification, are permitted provided that the following conditions are met:\n\n1. Redistributions of source code must retain the above copyright notice, this\n list of conditions and the following disclaimer.\n\n2. Redistributions in binary form must reproduce the above copyright notice,\n this list of conditions and the following disclaimer in the documentation\n and/or other materials provided with the distribution.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\nAND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\nIMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\nDISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE\nFOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\nDAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\nSERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\nCAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\nOR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\nOF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n"
},
{
"Name": "xmltodict",
"Version": "0.12.0",
"Summary": "Makes working with XML feel like you are working with JSON",
"Home-page": "https://github.com/martinblech/xmltodict",
"Author": "Martin Blech",
"License": "MIT License",
"License URL": "https://api.github.com/repos/martinblech/xmltodict/license",
"License repo": "Copyright (C) 2012 Martin Blech and individual contributors.\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the \"Software\"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
},
{
"Name": "zipp",
"Version": "0.6.0",
"Summary": "Backport of pathlib-compatible object wrapper for zip files",
"Home-page": "https://github.com/jaraco/zipp",
"Author": "Jason R. Coombs",
"License": "MIT License",
"License URL": "https://api.github.com/repos/jaraco/zipp/license",
"License repo": "Copyright Jason R. Coombs\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to\ndeal in the Software without restriction, including without limitation the\nrights to use, copy, modify, merge, publish, distribute, sublicense, and/or\nsell copies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in\nall copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING\nFROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS\nIN THE SOFTWARE.\n",
"License text": "MIT License\n\nCopyright (c) [year] [fullname]\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
}
]
def bend(
        value=(0.0, ),
mirror=False,
proportional='DISABLED',
proportional_edit_falloff='SMOOTH',
proportional_size=1.0,
snap=False,
snap_target='CLOSEST',
snap_point=(0.0, 0.0, 0.0),
snap_align=False,
snap_normal=(0.0, 0.0, 0.0),
gpencil_strokes=False,
release_confirm=False):
'''Bend selected items between the 3D cursor and the mouse
:param value: Angle
:type value: float array of 1 items in [-inf, inf], (optional)
:param mirror: Mirror Editing
:type mirror: boolean, (optional)
    :param proportional: Proportional Editing. DISABLED Disable, Proportional Editing disabled. ENABLED Enable, Proportional Editing enabled. PROJECTED Projected (2D), Proportional Editing using screen space locations. CONNECTED Connected, Proportional Editing using connected geometry only.
    :type proportional: enum in ['DISABLED', 'ENABLED', 'PROJECTED', 'CONNECTED'], (optional)
    :param proportional_edit_falloff: Proportional Editing Falloff, Falloff type for proportional editing mode. SMOOTH Smooth, Smooth falloff. SPHERE Sphere, Spherical falloff. ROOT Root, Root falloff. INVERSE_SQUARE Inverse Square, Inverse Square falloff. SHARP Sharp, Sharp falloff. LINEAR Linear, Linear falloff. CONSTANT Constant, Constant falloff. RANDOM Random, Random falloff.
:type proportional_edit_falloff: enum in ['SMOOTH', 'SPHERE', 'ROOT', 'INVERSE_SQUARE', 'SHARP', 'LINEAR', 'CONSTANT', 'RANDOM'], (optional)
:param proportional_size: Proportional Size
:type proportional_size: float in [1e-06, inf], (optional)
:param snap: Use Snapping Options
:type snap: boolean, (optional)
    :param snap_target: Target. CLOSEST Closest, Snap closest point onto target. CENTER Center, Snap center onto target. MEDIAN Median, Snap median onto target. ACTIVE Active, Snap active onto target.
:type snap_target: enum in ['CLOSEST', 'CENTER', 'MEDIAN', 'ACTIVE'], (optional)
:param snap_point: Point
:type snap_point: float array of 3 items in [-inf, inf], (optional)
:param snap_align: Align with Point Normal
:type snap_align: boolean, (optional)
:param snap_normal: Normal
:type snap_normal: float array of 3 items in [-inf, inf], (optional)
:param gpencil_strokes: Edit Grease Pencil, Edit selected Grease Pencil strokes
:type gpencil_strokes: boolean, (optional)
:param release_confirm: Confirm on Release, Always confirm operation when releasing button
:type release_confirm: boolean, (optional)
'''
pass
def create_orientation(name="", use_view=False, use=False, overwrite=False):
'''Create transformation orientation from selection
:param name: Name, Name of the new custom orientation
:type name: string, (optional, never None)
:param use_view: Use View, Use the current view instead of the active object to create the new orientation
:type use_view: boolean, (optional)
:param use: Use after creation, Select orientation after its creation
:type use: boolean, (optional)
:param overwrite: Overwrite previous, Overwrite previously created orientation with same name
:type overwrite: boolean, (optional)
'''
pass
def delete_orientation():
'''Delete transformation orientation
'''
pass
def edge_bevelweight(value=0.0,
snap=False,
snap_target='CLOSEST',
snap_point=(0.0, 0.0, 0.0),
snap_align=False,
snap_normal=(0.0, 0.0, 0.0),
release_confirm=False):
'''Change the bevel weight of edges
:param value: Factor
:type value: float in [-1, 1], (optional)
:param snap: Use Snapping Options
:type snap: boolean, (optional)
    :param snap_target: Target. CLOSEST Closest, Snap closest point onto target. CENTER Center, Snap center onto target. MEDIAN Median, Snap median onto target. ACTIVE Active, Snap active onto target.
:type snap_target: enum in ['CLOSEST', 'CENTER', 'MEDIAN', 'ACTIVE'], (optional)
:param snap_point: Point
:type snap_point: float array of 3 items in [-inf, inf], (optional)
:param snap_align: Align with Point Normal
:type snap_align: boolean, (optional)
:param snap_normal: Normal
:type snap_normal: float array of 3 items in [-inf, inf], (optional)
:param release_confirm: Confirm on Release, Always confirm operation when releasing button
:type release_confirm: boolean, (optional)
'''
pass
def edge_crease(value=0.0,
snap=False,
snap_target='CLOSEST',
snap_point=(0.0, 0.0, 0.0),
snap_align=False,
snap_normal=(0.0, 0.0, 0.0),
release_confirm=False):
'''Change the crease of edges
:param value: Factor
:type value: float in [-1, 1], (optional)
:param snap: Use Snapping Options
:type snap: boolean, (optional)
    :param snap_target: Target. CLOSEST Closest, Snap closest point onto target. CENTER Center, Snap center onto target. MEDIAN Median, Snap median onto target. ACTIVE Active, Snap active onto target.
:type snap_target: enum in ['CLOSEST', 'CENTER', 'MEDIAN', 'ACTIVE'], (optional)
:param snap_point: Point
:type snap_point: float array of 3 items in [-inf, inf], (optional)
:param snap_align: Align with Point Normal
:type snap_align: boolean, (optional)
:param snap_normal: Normal
:type snap_normal: float array of 3 items in [-inf, inf], (optional)
:param release_confirm: Confirm on Release, Always confirm operation when releasing button
:type release_confirm: boolean, (optional)
'''
pass
def edge_slide(value=0.0,
single_side=False,
use_even=False,
flipped=False,
use_clamp=True,
mirror=False,
snap=False,
snap_target='CLOSEST',
snap_point=(0.0, 0.0, 0.0),
snap_align=False,
snap_normal=(0.0, 0.0, 0.0),
correct_uv=False,
release_confirm=False):
'''Slide an edge loop along a mesh
:param value: Factor
:type value: float in [-10, 10], (optional)
:param single_side: Single Side
:type single_side: boolean, (optional)
:param use_even: Even, Make the edge loop match the shape of the adjacent edge loop
:type use_even: boolean, (optional)
:param flipped: Flipped, When Even mode is active, flips between the two adjacent edge loops
:type flipped: boolean, (optional)
:param use_clamp: Clamp, Clamp within the edge extents
:type use_clamp: boolean, (optional)
:param mirror: Mirror Editing
:type mirror: boolean, (optional)
:param snap: Use Snapping Options
:type snap: boolean, (optional)
    :param snap_target: Target. CLOSEST Closest, Snap closest point onto target. CENTER Center, Snap center onto target. MEDIAN Median, Snap median onto target. ACTIVE Active, Snap active onto target.
:type snap_target: enum in ['CLOSEST', 'CENTER', 'MEDIAN', 'ACTIVE'], (optional)
:param snap_point: Point
:type snap_point: float array of 3 items in [-inf, inf], (optional)
:param snap_align: Align with Point Normal
:type snap_align: boolean, (optional)
:param snap_normal: Normal
:type snap_normal: float array of 3 items in [-inf, inf], (optional)
:param correct_uv: Correct UVs, Correct UV coordinates when transforming
:type correct_uv: boolean, (optional)
:param release_confirm: Confirm on Release, Always confirm operation when releasing button
:type release_confirm: boolean, (optional)
'''
pass
def mirror(
constraint_axis=(False, False, False),
constraint_orientation='GLOBAL',
proportional='DISABLED',
proportional_edit_falloff='SMOOTH',
proportional_size=1.0,
gpencil_strokes=False,
release_confirm=False):
'''Mirror selected items around one or more axes
:param constraint_axis: Constraint Axis
:type constraint_axis: boolean array of 3 items, (optional)
:param constraint_orientation: Orientation, Transformation orientation
:type constraint_orientation: enum in [], (optional)
    :param proportional: Proportional Editing. DISABLED Disable, Proportional Editing disabled. ENABLED Enable, Proportional Editing enabled. PROJECTED Projected (2D), Proportional Editing using screen space locations. CONNECTED Connected, Proportional Editing using connected geometry only.
    :type proportional: enum in ['DISABLED', 'ENABLED', 'PROJECTED', 'CONNECTED'], (optional)
    :param proportional_edit_falloff: Proportional Editing Falloff, Falloff type for proportional editing mode. SMOOTH Smooth, Smooth falloff. SPHERE Sphere, Spherical falloff. ROOT Root, Root falloff. INVERSE_SQUARE Inverse Square, Inverse Square falloff. SHARP Sharp, Sharp falloff. LINEAR Linear, Linear falloff. CONSTANT Constant, Constant falloff. RANDOM Random, Random falloff.
:type proportional_edit_falloff: enum in ['SMOOTH', 'SPHERE', 'ROOT', 'INVERSE_SQUARE', 'SHARP', 'LINEAR', 'CONSTANT', 'RANDOM'], (optional)
:param proportional_size: Proportional Size
:type proportional_size: float in [1e-06, inf], (optional)
:param gpencil_strokes: Edit Grease Pencil, Edit selected Grease Pencil strokes
:type gpencil_strokes: boolean, (optional)
:param release_confirm: Confirm on Release, Always confirm operation when releasing button
:type release_confirm: boolean, (optional)
'''
pass
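# Standalone sketch (not part of bpy): transform.mirror across the constrained
# axes is equivalent to scaling by -1 on each of those axes. `mirror_points`
# is a hypothetical helper added here for illustration only.

```python
def mirror_points(points, constraint_axis=(False, False, False)):
    """Reflect each 3D point across every axis flagged in constraint_axis."""
    signs = tuple(-1.0 if c else 1.0 for c in constraint_axis)
    return [tuple(p[i] * signs[i] for i in range(3)) for p in points]


if __name__ == "__main__":
    # Mirroring across X negates only the x coordinate.
    print(mirror_points([(1.0, 2.0, 3.0)], constraint_axis=(True, False, False)))
```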
def push_pull(value=0.0,
mirror=False,
proportional='DISABLED',
proportional_edit_falloff='SMOOTH',
proportional_size=1.0,
snap=False,
snap_target='CLOSEST',
snap_point=(0.0, 0.0, 0.0),
snap_align=False,
snap_normal=(0.0, 0.0, 0.0),
release_confirm=False):
'''Push/Pull selected items
:param value: Distance
:type value: float in [-inf, inf], (optional)
:param mirror: Mirror Editing
:type mirror: boolean, (optional)
    :param proportional: Proportional Editing. DISABLED Disable, Proportional Editing disabled. ENABLED Enable, Proportional Editing enabled. PROJECTED Projected (2D), Proportional Editing using screen space locations. CONNECTED Connected, Proportional Editing using connected geometry only.
    :type proportional: enum in ['DISABLED', 'ENABLED', 'PROJECTED', 'CONNECTED'], (optional)
    :param proportional_edit_falloff: Proportional Editing Falloff, Falloff type for proportional editing mode. SMOOTH Smooth, Smooth falloff. SPHERE Sphere, Spherical falloff. ROOT Root, Root falloff. INVERSE_SQUARE Inverse Square, Inverse Square falloff. SHARP Sharp, Sharp falloff. LINEAR Linear, Linear falloff. CONSTANT Constant, Constant falloff. RANDOM Random, Random falloff.
:type proportional_edit_falloff: enum in ['SMOOTH', 'SPHERE', 'ROOT', 'INVERSE_SQUARE', 'SHARP', 'LINEAR', 'CONSTANT', 'RANDOM'], (optional)
:param proportional_size: Proportional Size
:type proportional_size: float in [1e-06, inf], (optional)
:param snap: Use Snapping Options
:type snap: boolean, (optional)
    :param snap_target: Target. CLOSEST Closest, Snap closest point onto target. CENTER Center, Snap center onto target. MEDIAN Median, Snap median onto target. ACTIVE Active, Snap active onto target.
:type snap_target: enum in ['CLOSEST', 'CENTER', 'MEDIAN', 'ACTIVE'], (optional)
:param snap_point: Point
:type snap_point: float array of 3 items in [-inf, inf], (optional)
:param snap_align: Align with Point Normal
:type snap_align: boolean, (optional)
:param snap_normal: Normal
:type snap_normal: float array of 3 items in [-inf, inf], (optional)
:param release_confirm: Confirm on Release, Always confirm operation when releasing button
:type release_confirm: boolean, (optional)
'''
pass
def resize(
value=(1.0, 1.0, 1.0),
constraint_axis=(False, False, False),
constraint_orientation='GLOBAL',
mirror=False,
proportional='DISABLED',
proportional_edit_falloff='SMOOTH',
proportional_size=1.0,
snap=False,
snap_target='CLOSEST',
snap_point=(0.0, 0.0, 0.0),
snap_align=False,
snap_normal=(0.0, 0.0, 0.0),
gpencil_strokes=False,
texture_space=False,
remove_on_cancel=False,
release_confirm=False):
'''Scale (resize) selected items
:param value: Vector
:type value: float array of 3 items in [-inf, inf], (optional)
:param constraint_axis: Constraint Axis
:type constraint_axis: boolean array of 3 items, (optional)
:param constraint_orientation: Orientation, Transformation orientation
:type constraint_orientation: enum in [], (optional)
:param mirror: Mirror Editing
:type mirror: boolean, (optional)
    :param proportional: Proportional Editing. DISABLED Disable, Proportional Editing disabled. ENABLED Enable, Proportional Editing enabled. PROJECTED Projected (2D), Proportional Editing using screen space locations. CONNECTED Connected, Proportional Editing using connected geometry only.
    :type proportional: enum in ['DISABLED', 'ENABLED', 'PROJECTED', 'CONNECTED'], (optional)
    :param proportional_edit_falloff: Proportional Editing Falloff, Falloff type for proportional editing mode. SMOOTH Smooth, Smooth falloff. SPHERE Sphere, Spherical falloff. ROOT Root, Root falloff. INVERSE_SQUARE Inverse Square, Inverse Square falloff. SHARP Sharp, Sharp falloff. LINEAR Linear, Linear falloff. CONSTANT Constant, Constant falloff. RANDOM Random, Random falloff.
:type proportional_edit_falloff: enum in ['SMOOTH', 'SPHERE', 'ROOT', 'INVERSE_SQUARE', 'SHARP', 'LINEAR', 'CONSTANT', 'RANDOM'], (optional)
:param proportional_size: Proportional Size
:type proportional_size: float in [1e-06, inf], (optional)
:param snap: Use Snapping Options
:type snap: boolean, (optional)
    :param snap_target: Target. CLOSEST Closest, Snap closest point onto target. CENTER Center, Snap center onto target. MEDIAN Median, Snap median onto target. ACTIVE Active, Snap active onto target.
:type snap_target: enum in ['CLOSEST', 'CENTER', 'MEDIAN', 'ACTIVE'], (optional)
:param snap_point: Point
:type snap_point: float array of 3 items in [-inf, inf], (optional)
:param snap_align: Align with Point Normal
:type snap_align: boolean, (optional)
:param snap_normal: Normal
:type snap_normal: float array of 3 items in [-inf, inf], (optional)
:param gpencil_strokes: Edit Grease Pencil, Edit selected Grease Pencil strokes
:type gpencil_strokes: boolean, (optional)
:param texture_space: Edit Texture Space, Edit Object data texture space
:type texture_space: boolean, (optional)
:param remove_on_cancel: Remove on Cancel, Remove elements on cancel
:type remove_on_cancel: boolean, (optional)
:param release_confirm: Confirm on Release, Always confirm operation when releasing button
:type release_confirm: boolean, (optional)
'''
pass
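# Standalone sketch (not part of bpy): how transform.resize's `value` and
# `constraint_axis` parameters combine. When any axis is constrained, only the
# constrained axes receive their scale factor; the rest keep 1.0.
# `scale_points` is a hypothetical helper added here for illustration only.

```python
def scale_points(points, value=(1.0, 1.0, 1.0), constraint_axis=(False, False, False)):
    """Scale 3D points per axis, honouring an optional axis constraint."""
    use_constraint = any(constraint_axis)
    # Unconstrained axes are left at a factor of 1.0 when a constraint is active.
    factors = [
        v if (not use_constraint or c) else 1.0
        for v, c in zip(value, constraint_axis)
    ]
    return [tuple(p[i] * factors[i] for i in range(3)) for p in points]


if __name__ == "__main__":
    # Doubling along X only: y and z are untouched.
    print(scale_points([(1.0, 2.0, 3.0)], value=(2.0, 2.0, 2.0),
                       constraint_axis=(True, False, False)))
```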
def rotate(value=0.0,
axis=(0.0, 0.0, 0.0),
constraint_axis=(False, False, False),
constraint_orientation='GLOBAL',
mirror=False,
proportional='DISABLED',
proportional_edit_falloff='SMOOTH',
proportional_size=1.0,
snap=False,
snap_target='CLOSEST',
snap_point=(0.0, 0.0, 0.0),
snap_align=False,
snap_normal=(0.0, 0.0, 0.0),
gpencil_strokes=False,
release_confirm=False):
'''Rotate selected items
:param value: Angle
:type value: float in [-inf, inf], (optional)
:param axis: Axis, The axis around which the transformation occurs
:type axis: float array of 3 items in [-inf, inf], (optional)
:param constraint_axis: Constraint Axis
:type constraint_axis: boolean array of 3 items, (optional)
:param constraint_orientation: Orientation, Transformation orientation
:type constraint_orientation: enum in [], (optional)
:param mirror: Mirror Editing
:type mirror: boolean, (optional)
    :param proportional: Proportional Editing. DISABLED Disable, Proportional Editing disabled. ENABLED Enable, Proportional Editing enabled. PROJECTED Projected (2D), Proportional Editing using screen space locations. CONNECTED Connected, Proportional Editing using connected geometry only.
    :type proportional: enum in ['DISABLED', 'ENABLED', 'PROJECTED', 'CONNECTED'], (optional)
    :param proportional_edit_falloff: Proportional Editing Falloff, Falloff type for proportional editing mode. SMOOTH Smooth, Smooth falloff. SPHERE Sphere, Spherical falloff. ROOT Root, Root falloff. INVERSE_SQUARE Inverse Square, Inverse Square falloff. SHARP Sharp, Sharp falloff. LINEAR Linear, Linear falloff. CONSTANT Constant, Constant falloff. RANDOM Random, Random falloff.
:type proportional_edit_falloff: enum in ['SMOOTH', 'SPHERE', 'ROOT', 'INVERSE_SQUARE', 'SHARP', 'LINEAR', 'CONSTANT', 'RANDOM'], (optional)
:param proportional_size: Proportional Size
:type proportional_size: float in [1e-06, inf], (optional)
:param snap: Use Snapping Options
:type snap: boolean, (optional)
    :param snap_target: Target. CLOSEST Closest, Snap closest point onto target. CENTER Center, Snap center onto target. MEDIAN Median, Snap median onto target. ACTIVE Active, Snap active onto target.
:type snap_target: enum in ['CLOSEST', 'CENTER', 'MEDIAN', 'ACTIVE'], (optional)
:param snap_point: Point
:type snap_point: float array of 3 items in [-inf, inf], (optional)
:param snap_align: Align with Point Normal
:type snap_align: boolean, (optional)
:param snap_normal: Normal
:type snap_normal: float array of 3 items in [-inf, inf], (optional)
:param gpencil_strokes: Edit Grease Pencil, Edit selected Grease Pencil strokes
:type gpencil_strokes: boolean, (optional)
:param release_confirm: Confirm on Release, Always confirm operation when releasing button
:type release_confirm: boolean, (optional)
'''
pass
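# Standalone sketch (not part of bpy): the per-point rotation that
# transform.rotate applies, written out with Rodrigues' formula. `value` is
# the angle in radians and `axis` a unit vector, matching the operator's
# parameters; `rotate_point` itself is a hypothetical helper for illustration.

```python
import math


def rotate_point(p, value=0.0, axis=(0.0, 0.0, 1.0)):
    """Rotate point p by `value` radians about the unit vector `axis`."""
    x, y, z = p
    ux, uy, uz = axis
    c, s = math.cos(value), math.sin(value)
    dot = ux * x + uy * y + uz * z
    # Rodrigues: v*cos(t) + (u x v)*sin(t) + u*(u.v)*(1 - cos(t))
    return (
        x * c + (uy * z - uz * y) * s + ux * dot * (1 - c),
        y * c + (uz * x - ux * z) * s + uy * dot * (1 - c),
        z * c + (ux * y - uy * x) * s + uz * dot * (1 - c),
    )


if __name__ == "__main__":
    # A quarter turn about Z maps +X onto +Y (up to float rounding).
    print(rotate_point((1.0, 0.0, 0.0), value=math.pi / 2))
```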
def select_orientation(orientation='GLOBAL'):
'''Select transformation orientation
:param orientation: Orientation, Transformation orientation
:type orientation: enum in [], (optional)
'''
pass
def seq_slide(
value=(0.0, 0.0),
snap=False,
snap_target='CLOSEST',
snap_point=(0.0, 0.0, 0.0),
snap_align=False,
snap_normal=(0.0, 0.0, 0.0),
release_confirm=False):
'''Slide a sequence strip in time
:param value: Vector
:type value: float array of 2 items in [-inf, inf], (optional)
:param snap: Use Snapping Options
:type snap: boolean, (optional)
    :param snap_target: Target. CLOSEST Closest, Snap closest point onto target. CENTER Center, Snap center onto target. MEDIAN Median, Snap median onto target. ACTIVE Active, Snap active onto target.
:type snap_target: enum in ['CLOSEST', 'CENTER', 'MEDIAN', 'ACTIVE'], (optional)
:param snap_point: Point
:type snap_point: float array of 3 items in [-inf, inf], (optional)
:param snap_align: Align with Point Normal
:type snap_align: boolean, (optional)
:param snap_normal: Normal
:type snap_normal: float array of 3 items in [-inf, inf], (optional)
:param release_confirm: Confirm on Release, Always confirm operation when releasing button
:type release_confirm: boolean, (optional)
'''
pass
def shear(value=0.0,
mirror=False,
proportional='DISABLED',
proportional_edit_falloff='SMOOTH',
proportional_size=1.0,
snap=False,
snap_target='CLOSEST',
snap_point=(0.0, 0.0, 0.0),
snap_align=False,
snap_normal=(0.0, 0.0, 0.0),
gpencil_strokes=False,
release_confirm=False):
'''Shear selected items along the horizontal screen axis
:param value: Offset
:type value: float in [-inf, inf], (optional)
:param mirror: Mirror Editing
:type mirror: boolean, (optional)
    :param proportional: Proportional Editing. DISABLED Disable, Proportional Editing disabled. ENABLED Enable, Proportional Editing enabled. PROJECTED Projected (2D), Proportional Editing using screen space locations. CONNECTED Connected, Proportional Editing using connected geometry only.
    :type proportional: enum in ['DISABLED', 'ENABLED', 'PROJECTED', 'CONNECTED'], (optional)
    :param proportional_edit_falloff: Proportional Editing Falloff, Falloff type for proportional editing mode. SMOOTH Smooth, Smooth falloff. SPHERE Sphere, Spherical falloff. ROOT Root, Root falloff. INVERSE_SQUARE Inverse Square, Inverse Square falloff. SHARP Sharp, Sharp falloff. LINEAR Linear, Linear falloff. CONSTANT Constant, Constant falloff. RANDOM Random, Random falloff.
:type proportional_edit_falloff: enum in ['SMOOTH', 'SPHERE', 'ROOT', 'INVERSE_SQUARE', 'SHARP', 'LINEAR', 'CONSTANT', 'RANDOM'], (optional)
:param proportional_size: Proportional Size
:type proportional_size: float in [1e-06, inf], (optional)
:param snap: Use Snapping Options
:type snap: boolean, (optional)
    :param snap_target: Target. CLOSEST Closest, Snap closest point onto target. CENTER Center, Snap center onto target. MEDIAN Median, Snap median onto target. ACTIVE Active, Snap active onto target.
:type snap_target: enum in ['CLOSEST', 'CENTER', 'MEDIAN', 'ACTIVE'], (optional)
:param snap_point: Point
:type snap_point: float array of 3 items in [-inf, inf], (optional)
:param snap_align: Align with Point Normal
:type snap_align: boolean, (optional)
:param snap_normal: Normal
:type snap_normal: float array of 3 items in [-inf, inf], (optional)
:param gpencil_strokes: Edit Grease Pencil, Edit selected Grease Pencil strokes
:type gpencil_strokes: boolean, (optional)
:param release_confirm: Confirm on Release, Always confirm operation when releasing button
:type release_confirm: boolean, (optional)
'''
pass
def shrink_fatten(value=0.0,
use_even_offset=True,
mirror=False,
proportional='DISABLED',
proportional_edit_falloff='SMOOTH',
proportional_size=1.0,
snap=False,
snap_target='CLOSEST',
snap_point=(0.0, 0.0, 0.0),
snap_align=False,
snap_normal=(0.0, 0.0, 0.0),
release_confirm=False):
'''Shrink/fatten selected vertices along normals
:param value: Offset
:type value: float in [-inf, inf], (optional)
:param use_even_offset: Offset Even, Scale the offset to give more even thickness
:type use_even_offset: boolean, (optional)
:param mirror: Mirror Editing
:type mirror: boolean, (optional)
    :param proportional: Proportional Editing. DISABLED Disable, Proportional Editing disabled. ENABLED Enable, Proportional Editing enabled. PROJECTED Projected (2D), Proportional Editing using screen space locations. CONNECTED Connected, Proportional Editing using connected geometry only.
    :type proportional: enum in ['DISABLED', 'ENABLED', 'PROJECTED', 'CONNECTED'], (optional)
    :param proportional_edit_falloff: Proportional Editing Falloff, Falloff type for proportional editing mode. SMOOTH Smooth, Smooth falloff. SPHERE Sphere, Spherical falloff. ROOT Root, Root falloff. INVERSE_SQUARE Inverse Square, Inverse Square falloff. SHARP Sharp, Sharp falloff. LINEAR Linear, Linear falloff. CONSTANT Constant, Constant falloff. RANDOM Random, Random falloff.
:type proportional_edit_falloff: enum in ['SMOOTH', 'SPHERE', 'ROOT', 'INVERSE_SQUARE', 'SHARP', 'LINEAR', 'CONSTANT', 'RANDOM'], (optional)
:param proportional_size: Proportional Size
:type proportional_size: float in [1e-06, inf], (optional)
:param snap: Use Snapping Options
:type snap: boolean, (optional)
    :param snap_target: Target. CLOSEST Closest, Snap closest point onto target. CENTER Center, Snap center onto target. MEDIAN Median, Snap median onto target. ACTIVE Active, Snap active onto target.
:type snap_target: enum in ['CLOSEST', 'CENTER', 'MEDIAN', 'ACTIVE'], (optional)
:param snap_point: Point
:type snap_point: float array of 3 items in [-inf, inf], (optional)
:param snap_align: Align with Point Normal
:type snap_align: boolean, (optional)
:param snap_normal: Normal
:type snap_normal: float array of 3 items in [-inf, inf], (optional)
:param release_confirm: Confirm on Release, Always confirm operation when releasing button
:type release_confirm: boolean, (optional)
'''
pass
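# Standalone sketch (not part of bpy): per vertex, transform.shrink_fatten
# moves the position along its unit normal by `value` (positive fattens,
# negative shrinks). `offset_along_normals` is a hypothetical helper added
# here for illustration only.

```python
def offset_along_normals(verts, normals, value=0.0):
    """Displace each vertex along its paired unit normal by `value`."""
    return [
        tuple(v[i] + n[i] * value for i in range(3))
        for v, n in zip(verts, normals)
    ]


if __name__ == "__main__":
    # A vertex with normal +Z moves straight up by the offset.
    print(offset_along_normals([(0.0, 0.0, 0.0)], [(0.0, 0.0, 1.0)], value=0.5))
```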
def skin_resize(
value=(1.0, 1.0, 1.0),
constraint_axis=(False, False, False),
constraint_orientation='GLOBAL',
mirror=False,
proportional='DISABLED',
proportional_edit_falloff='SMOOTH',
proportional_size=1.0,
snap=False,
snap_target='CLOSEST',
snap_point=(0.0, 0.0, 0.0),
snap_align=False,
snap_normal=(0.0, 0.0, 0.0),
release_confirm=False):
    '''Scale selected vertices' skin radii
:param value: Vector
:type value: float array of 3 items in [-inf, inf], (optional)
:param constraint_axis: Constraint Axis
:type constraint_axis: boolean array of 3 items, (optional)
:param constraint_orientation: Orientation, Transformation orientation
:type constraint_orientation: enum in [], (optional)
:param mirror: Mirror Editing
:type mirror: boolean, (optional)
    :param proportional: Proportional Editing. DISABLED Disable, Proportional Editing disabled. ENABLED Enable, Proportional Editing enabled. PROJECTED Projected (2D), Proportional Editing using screen space locations. CONNECTED Connected, Proportional Editing using connected geometry only.
    :type proportional: enum in ['DISABLED', 'ENABLED', 'PROJECTED', 'CONNECTED'], (optional)
    :param proportional_edit_falloff: Proportional Editing Falloff, Falloff type for proportional editing mode. SMOOTH Smooth, Smooth falloff. SPHERE Sphere, Spherical falloff. ROOT Root, Root falloff. INVERSE_SQUARE Inverse Square, Inverse Square falloff. SHARP Sharp, Sharp falloff. LINEAR Linear, Linear falloff. CONSTANT Constant, Constant falloff. RANDOM Random, Random falloff.
:type proportional_edit_falloff: enum in ['SMOOTH', 'SPHERE', 'ROOT', 'INVERSE_SQUARE', 'SHARP', 'LINEAR', 'CONSTANT', 'RANDOM'], (optional)
:param proportional_size: Proportional Size
:type proportional_size: float in [1e-06, inf], (optional)
:param snap: Use Snapping Options
:type snap: boolean, (optional)
    :param snap_target: Target. CLOSEST Closest, Snap closest point onto target. CENTER Center, Snap center onto target. MEDIAN Median, Snap median onto target. ACTIVE Active, Snap active onto target.
:type snap_target: enum in ['CLOSEST', 'CENTER', 'MEDIAN', 'ACTIVE'], (optional)
:param snap_point: Point
:type snap_point: float array of 3 items in [-inf, inf], (optional)
:param snap_align: Align with Point Normal
:type snap_align: boolean, (optional)
:param snap_normal: Normal
:type snap_normal: float array of 3 items in [-inf, inf], (optional)
:param release_confirm: Confirm on Release, Always confirm operation when releasing button
:type release_confirm: boolean, (optional)
'''
pass
def tilt(value=0.0,
mirror=False,
proportional='DISABLED',
proportional_edit_falloff='SMOOTH',
proportional_size=1.0,
snap=False,
snap_target='CLOSEST',
snap_point=(0.0, 0.0, 0.0),
snap_align=False,
snap_normal=(0.0, 0.0, 0.0),
release_confirm=False):
'''Tilt selected control vertices of 3D curve
:param value: Angle
:type value: float in [-inf, inf], (optional)
:param mirror: Mirror Editing
:type mirror: boolean, (optional)
:param proportional: Proportional Editing. DISABLED Disable, Proportional Editing disabled. ENABLED Enable, Proportional Editing enabled. PROJECTED Projected (2D), Proportional Editing using screen space locations. CONNECTED Connected, Proportional Editing using connected geometry only.
:type proportional: enum in ['DISABLED', 'ENABLED', 'PROJECTED', 'CONNECTED'], (optional)
:param proportional_edit_falloff: Proportional Editing Falloff, Falloff type for proportional editing mode. SMOOTH Smooth, Smooth falloff. SPHERE Sphere, Spherical falloff. ROOT Root, Root falloff. INVERSE_SQUARE Inverse Square, Inverse Square falloff. SHARP Sharp, Sharp falloff. LINEAR Linear, Linear falloff. CONSTANT Constant, Constant falloff. RANDOM Random, Random falloff.
:type proportional_edit_falloff: enum in ['SMOOTH', 'SPHERE', 'ROOT', 'INVERSE_SQUARE', 'SHARP', 'LINEAR', 'CONSTANT', 'RANDOM'], (optional)
:param proportional_size: Proportional Size
:type proportional_size: float in [1e-06, inf], (optional)
:param snap: Use Snapping Options
:type snap: boolean, (optional)
:param snap_target: Target. CLOSEST Closest, Snap closest point onto target. CENTER Center, Snap center onto target. MEDIAN Median, Snap median onto target. ACTIVE Active, Snap active onto target.
:type snap_target: enum in ['CLOSEST', 'CENTER', 'MEDIAN', 'ACTIVE'], (optional)
:param snap_point: Point
:type snap_point: float array of 3 items in [-inf, inf], (optional)
:param snap_align: Align with Point Normal
:type snap_align: boolean, (optional)
:param snap_normal: Normal
:type snap_normal: float array of 3 items in [-inf, inf], (optional)
:param release_confirm: Confirm on Release, Always confirm operation when releasing button
:type release_confirm: boolean, (optional)
'''
pass
def tosphere(value=0.0,
mirror=False,
proportional='DISABLED',
proportional_edit_falloff='SMOOTH',
proportional_size=1.0,
snap=False,
snap_target='CLOSEST',
snap_point=(0.0, 0.0, 0.0),
snap_align=False,
snap_normal=(0.0, 0.0, 0.0),
gpencil_strokes=False,
release_confirm=False):
'''Move selected vertices outward in a spherical shape around mesh center
:param value: Factor
:type value: float in [0, 1], (optional)
:param mirror: Mirror Editing
:type mirror: boolean, (optional)
:param proportional: Proportional Editing. DISABLED Disable, Proportional Editing disabled. ENABLED Enable, Proportional Editing enabled. PROJECTED Projected (2D), Proportional Editing using screen space locations. CONNECTED Connected, Proportional Editing using connected geometry only.
:type proportional: enum in ['DISABLED', 'ENABLED', 'PROJECTED', 'CONNECTED'], (optional)
:param proportional_edit_falloff: Proportional Editing Falloff, Falloff type for proportional editing mode. SMOOTH Smooth, Smooth falloff. SPHERE Sphere, Spherical falloff. ROOT Root, Root falloff. INVERSE_SQUARE Inverse Square, Inverse Square falloff. SHARP Sharp, Sharp falloff. LINEAR Linear, Linear falloff. CONSTANT Constant, Constant falloff. RANDOM Random, Random falloff.
:type proportional_edit_falloff: enum in ['SMOOTH', 'SPHERE', 'ROOT', 'INVERSE_SQUARE', 'SHARP', 'LINEAR', 'CONSTANT', 'RANDOM'], (optional)
:param proportional_size: Proportional Size
:type proportional_size: float in [1e-06, inf], (optional)
:param snap: Use Snapping Options
:type snap: boolean, (optional)
:param snap_target: Target. CLOSEST Closest, Snap closest point onto target. CENTER Center, Snap center onto target. MEDIAN Median, Snap median onto target. ACTIVE Active, Snap active onto target.
:type snap_target: enum in ['CLOSEST', 'CENTER', 'MEDIAN', 'ACTIVE'], (optional)
:param snap_point: Point
:type snap_point: float array of 3 items in [-inf, inf], (optional)
:param snap_align: Align with Point Normal
:type snap_align: boolean, (optional)
:param snap_normal: Normal
:type snap_normal: float array of 3 items in [-inf, inf], (optional)
:param gpencil_strokes: Edit Grease Pencil, Edit selected Grease Pencil strokes
:type gpencil_strokes: boolean, (optional)
:param release_confirm: Confirm on Release, Always confirm operation when releasing button
:type release_confirm: boolean, (optional)
'''
pass
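As a rough illustration of the Factor semantics above, the "to sphere" blend can be sketched in plain Python (a standalone sketch, not Blender's implementation; the mean center and mean-distance radius are assumptions of this sketch):

```python
import math

def to_sphere(points, factor):
    # Blend each point toward a sphere centered at the points' mean,
    # with radius equal to the mean distance to that center; factor in
    # [0, 1] plays the role of transform.tosphere's "Factor" (sketch only).
    n = len(points)
    center = tuple(sum(p[i] for p in points) / n for i in range(3))
    dists = [math.dist(p, center) for p in points]
    radius = sum(dists) / n
    result = []
    for p, d in zip(points, dists):
        if d == 0.0:
            result.append(p)  # point already at the center: leave it
            continue
        on_sphere = tuple(center[i] + (p[i] - center[i]) * radius / d
                          for i in range(3))
        result.append(tuple(p[i] + (on_sphere[i] - p[i]) * factor
                            for i in range(3)))
    return result
```

With factor=1.0 every point lands exactly on the sphere; factor=0.0 leaves the points unchanged.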
def trackball(
value=(0.0, 0.0),
mirror=False,
proportional='DISABLED',
proportional_edit_falloff='SMOOTH',
proportional_size=1.0,
snap=False,
snap_target='CLOSEST',
snap_point=(0.0, 0.0, 0.0),
snap_align=False,
snap_normal=(0.0, 0.0, 0.0),
gpencil_strokes=False,
release_confirm=False):
'''Trackball style rotation of selected items
:param value: Angle
:type value: float array of 2 items in [-inf, inf], (optional)
:param mirror: Mirror Editing
:type mirror: boolean, (optional)
:param proportional: Proportional Editing. DISABLED Disable, Proportional Editing disabled. ENABLED Enable, Proportional Editing enabled. PROJECTED Projected (2D), Proportional Editing using screen space locations. CONNECTED Connected, Proportional Editing using connected geometry only.
:type proportional: enum in ['DISABLED', 'ENABLED', 'PROJECTED', 'CONNECTED'], (optional)
:param proportional_edit_falloff: Proportional Editing Falloff, Falloff type for proportional editing mode. SMOOTH Smooth, Smooth falloff. SPHERE Sphere, Spherical falloff. ROOT Root, Root falloff. INVERSE_SQUARE Inverse Square, Inverse Square falloff. SHARP Sharp, Sharp falloff. LINEAR Linear, Linear falloff. CONSTANT Constant, Constant falloff. RANDOM Random, Random falloff.
:type proportional_edit_falloff: enum in ['SMOOTH', 'SPHERE', 'ROOT', 'INVERSE_SQUARE', 'SHARP', 'LINEAR', 'CONSTANT', 'RANDOM'], (optional)
:param proportional_size: Proportional Size
:type proportional_size: float in [1e-06, inf], (optional)
:param snap: Use Snapping Options
:type snap: boolean, (optional)
:param snap_target: Target. CLOSEST Closest, Snap closest point onto target. CENTER Center, Snap center onto target. MEDIAN Median, Snap median onto target. ACTIVE Active, Snap active onto target.
:type snap_target: enum in ['CLOSEST', 'CENTER', 'MEDIAN', 'ACTIVE'], (optional)
:param snap_point: Point
:type snap_point: float array of 3 items in [-inf, inf], (optional)
:param snap_align: Align with Point Normal
:type snap_align: boolean, (optional)
:param snap_normal: Normal
:type snap_normal: float array of 3 items in [-inf, inf], (optional)
:param gpencil_strokes: Edit Grease Pencil, Edit selected Grease Pencil strokes
:type gpencil_strokes: boolean, (optional)
:param release_confirm: Confirm on Release, Always confirm operation when releasing button
:type release_confirm: boolean, (optional)
'''
pass
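The two-component Angle above can be pictured as successive rotations about the viewport's horizontal and vertical axes. A simplified sketch (assuming the view axes coincide with world X and Y, which real invocations do not require; the function name is made up for this sketch):

```python
import math

def trackball_rotate(point, value, center=(0.0, 0.0, 0.0)):
    # value = (angle about view X, angle about view Y), applied in order;
    # simplified: the view axes are taken to be world X and Y (sketch only).
    ax, ay = value
    x = point[0] - center[0]
    y = point[1] - center[1]
    z = point[2] - center[2]
    # rotate about X by ax
    y, z = (y * math.cos(ax) - z * math.sin(ax),
            y * math.sin(ax) + z * math.cos(ax))
    # rotate about Y by ay
    x, z = (x * math.cos(ay) + z * math.sin(ay),
            -x * math.sin(ay) + z * math.cos(ay))
    return (x + center[0], y + center[1], z + center[2])
```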
def transform(mode='TRANSLATION',
value=(0.0, 0.0, 0.0, 0.0),
axis=(0.0, 0.0, 0.0),
constraint_axis=(False, False, False),
constraint_orientation='GLOBAL',
mirror=False,
proportional='DISABLED',
proportional_edit_falloff='SMOOTH',
proportional_size=1.0,
snap=False,
snap_target='CLOSEST',
snap_point=(0.0, 0.0, 0.0),
snap_align=False,
snap_normal=(0.0, 0.0, 0.0),
gpencil_strokes=False,
release_confirm=False):
'''Transform selected items by mode type
:param mode: Mode
:type mode: enum in ['INIT', 'DUMMY', 'TRANSLATION', 'ROTATION', 'RESIZE', 'SKIN_RESIZE', 'TOSPHERE', 'SHEAR', 'BEND', 'SHRINKFATTEN', 'TILT', 'TRACKBALL', 'PUSHPULL', 'CREASE', 'MIRROR', 'BONE_SIZE', 'BONE_ENVELOPE', 'BONE_ENVELOPE_DIST', 'CURVE_SHRINKFATTEN', 'MASK_SHRINKFATTEN', 'GPENCIL_SHRINKFATTEN', 'BONE_ROLL', 'TIME_TRANSLATE', 'TIME_SLIDE', 'TIME_SCALE', 'TIME_EXTEND', 'BAKE_TIME', 'BWEIGHT', 'ALIGN', 'EDGESLIDE', 'SEQSLIDE'], (optional)
:param value: Values
:type value: float array of 4 items in [-inf, inf], (optional)
:param axis: Axis, The axis around which the transformation occurs
:type axis: float array of 3 items in [-inf, inf], (optional)
:param constraint_axis: Constraint Axis
:type constraint_axis: boolean array of 3 items, (optional)
:param constraint_orientation: Orientation, Transformation orientation
:type constraint_orientation: enum in [], (optional)
:param mirror: Mirror Editing
:type mirror: boolean, (optional)
:param proportional: Proportional Editing. DISABLED Disable, Proportional Editing disabled. ENABLED Enable, Proportional Editing enabled. PROJECTED Projected (2D), Proportional Editing using screen space locations. CONNECTED Connected, Proportional Editing using connected geometry only.
:type proportional: enum in ['DISABLED', 'ENABLED', 'PROJECTED', 'CONNECTED'], (optional)
:param proportional_edit_falloff: Proportional Editing Falloff, Falloff type for proportional editing mode. SMOOTH Smooth, Smooth falloff. SPHERE Sphere, Spherical falloff. ROOT Root, Root falloff. INVERSE_SQUARE Inverse Square, Inverse Square falloff. SHARP Sharp, Sharp falloff. LINEAR Linear, Linear falloff. CONSTANT Constant, Constant falloff. RANDOM Random, Random falloff.
:type proportional_edit_falloff: enum in ['SMOOTH', 'SPHERE', 'ROOT', 'INVERSE_SQUARE', 'SHARP', 'LINEAR', 'CONSTANT', 'RANDOM'], (optional)
:param proportional_size: Proportional Size
:type proportional_size: float in [1e-06, inf], (optional)
:param snap: Use Snapping Options
:type snap: boolean, (optional)
:param snap_target: Target. CLOSEST Closest, Snap closest point onto target. CENTER Center, Snap center onto target. MEDIAN Median, Snap median onto target. ACTIVE Active, Snap active onto target.
:type snap_target: enum in ['CLOSEST', 'CENTER', 'MEDIAN', 'ACTIVE'], (optional)
:param snap_point: Point
:type snap_point: float array of 3 items in [-inf, inf], (optional)
:param snap_align: Align with Point Normal
:type snap_align: boolean, (optional)
:param snap_normal: Normal
:type snap_normal: float array of 3 items in [-inf, inf], (optional)
:param gpencil_strokes: Edit Grease Pencil, Edit selected Grease Pencil strokes
:type gpencil_strokes: boolean, (optional)
:param release_confirm: Confirm on Release, Always confirm operation when releasing button
:type release_confirm: boolean, (optional)
'''
pass
def translate(
value=(0.0, 0.0, 0.0),
constraint_axis=(False, False, False),
constraint_orientation='GLOBAL',
mirror=False,
proportional='DISABLED',
proportional_edit_falloff='SMOOTH',
proportional_size=1.0,
snap=False,
snap_target='CLOSEST',
snap_point=(0.0, 0.0, 0.0),
snap_align=False,
snap_normal=(0.0, 0.0, 0.0),
gpencil_strokes=False,
texture_space=False,
remove_on_cancel=False,
release_confirm=False):
'''Translate (move) selected items
:param value: Vector
:type value: float array of 3 items in [-inf, inf], (optional)
:param constraint_axis: Constraint Axis
:type constraint_axis: boolean array of 3 items, (optional)
:param constraint_orientation: Orientation, Transformation orientation
:type constraint_orientation: enum in [], (optional)
:param mirror: Mirror Editing
:type mirror: boolean, (optional)
:param proportional: Proportional Editing. DISABLED Disable, Proportional Editing disabled. ENABLED Enable, Proportional Editing enabled. PROJECTED Projected (2D), Proportional Editing using screen space locations. CONNECTED Connected, Proportional Editing using connected geometry only.
:type proportional: enum in ['DISABLED', 'ENABLED', 'PROJECTED', 'CONNECTED'], (optional)
:param proportional_edit_falloff: Proportional Editing Falloff, Falloff type for proportional editing mode. SMOOTH Smooth, Smooth falloff. SPHERE Sphere, Spherical falloff. ROOT Root, Root falloff. INVERSE_SQUARE Inverse Square, Inverse Square falloff. SHARP Sharp, Sharp falloff. LINEAR Linear, Linear falloff. CONSTANT Constant, Constant falloff. RANDOM Random, Random falloff.
:type proportional_edit_falloff: enum in ['SMOOTH', 'SPHERE', 'ROOT', 'INVERSE_SQUARE', 'SHARP', 'LINEAR', 'CONSTANT', 'RANDOM'], (optional)
:param proportional_size: Proportional Size
:type proportional_size: float in [1e-06, inf], (optional)
:param snap: Use Snapping Options
:type snap: boolean, (optional)
:param snap_target: Target. CLOSEST Closest, Snap closest point onto target. CENTER Center, Snap center onto target. MEDIAN Median, Snap median onto target. ACTIVE Active, Snap active onto target.
:type snap_target: enum in ['CLOSEST', 'CENTER', 'MEDIAN', 'ACTIVE'], (optional)
:param snap_point: Point
:type snap_point: float array of 3 items in [-inf, inf], (optional)
:param snap_align: Align with Point Normal
:type snap_align: boolean, (optional)
:param snap_normal: Normal
:type snap_normal: float array of 3 items in [-inf, inf], (optional)
:param gpencil_strokes: Edit Grease Pencil, Edit selected Grease Pencil strokes
:type gpencil_strokes: boolean, (optional)
:param texture_space: Edit Texture Space, Edit Object data texture space
:type texture_space: boolean, (optional)
:param remove_on_cancel: Remove on Cancel, Remove elements on cancel
:type remove_on_cancel: boolean, (optional)
:param release_confirm: Confirm on Release, Always confirm operation when releasing button
:type release_confirm: boolean, (optional)
'''
pass
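How `value` and `constraint_axis` interact can be sketched in plain Python, with all-False meaning unconstrained and GLOBAL orientation assumed (the helper name is made up for this sketch):

```python
def constrained_translate(point, vector, constraint_axis=(False, False, False)):
    # Keep only the vector components whose axis is enabled; an all-False
    # constraint means the translation is unconstrained (sketch only).
    mask = constraint_axis if any(constraint_axis) else (True, True, True)
    return tuple(p + (v if m else 0.0)
                 for p, v, m in zip(point, vector, mask))
```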
def vert_slide(value=0.0,
use_even=False,
flipped=False,
use_clamp=True,
mirror=False,
snap=False,
snap_target='CLOSEST',
snap_point=(0.0, 0.0, 0.0),
snap_align=False,
snap_normal=(0.0, 0.0, 0.0),
correct_uv=False,
release_confirm=False):
'''Slide a vertex along a mesh
:param value: Factor
:type value: float in [-10, 10], (optional)
:param use_even: Even, Make the edge loop match the shape of the adjacent edge loop
:type use_even: boolean, (optional)
:param flipped: Flipped, When Even mode is active, flips between the two adjacent edge loops
:type flipped: boolean, (optional)
:param use_clamp: Clamp, Clamp within the edge extents
:type use_clamp: boolean, (optional)
:param mirror: Mirror Editing
:type mirror: boolean, (optional)
:param snap: Use Snapping Options
:type snap: boolean, (optional)
:param snap_target: Target. CLOSEST Closest, Snap closest point onto target. CENTER Center, Snap center onto target. MEDIAN Median, Snap median onto target. ACTIVE Active, Snap active onto target.
:type snap_target: enum in ['CLOSEST', 'CENTER', 'MEDIAN', 'ACTIVE'], (optional)
:param snap_point: Point
:type snap_point: float array of 3 items in [-inf, inf], (optional)
:param snap_align: Align with Point Normal
:type snap_align: boolean, (optional)
:param snap_normal: Normal
:type snap_normal: float array of 3 items in [-inf, inf], (optional)
:param correct_uv: Correct UVs, Correct UV coordinates when transforming
:type correct_uv: boolean, (optional)
:param release_confirm: Confirm on Release, Always confirm operation when releasing button
:type release_confirm: boolean, (optional)
'''
pass
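Factor and Clamp above amount to a (possibly clamped) interpolation along the chosen edge. A minimal sketch (the edge endpoints and the helper name are assumptions of this sketch):

```python
def slide_vertex(v, other_end, value, use_clamp=True):
    # Interpolate v toward the other end of the edge by `value`;
    # use_clamp limits the factor to the edge extents [0, 1] (sketch only).
    t = max(0.0, min(1.0, value)) if use_clamp else value
    return tuple(a + (b - a) * t for a, b in zip(v, other_end))
```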
def vertex_random(offset=0.1, uniform=0.0, normal=0.0, seed=0):
'''Randomize vertices
:param offset: Amount, Distance to offset
:type offset: float in [-inf, inf], (optional)
:param uniform: Uniform, Increase for uniform offset distance
:type uniform: float in [0, 1], (optional)
:param normal: Normal, Align offset direction to normals
:type normal: float in [0, 1], (optional)
:param seed: Random Seed, Seed for the random number generator
:type seed: int in [0, 10000], (optional)
'''
pass
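The interplay of offset, uniform, normal and seed can be sketched for a single vertex, seeded with `random.Random` (the blend and scaling below are illustrative, not Blender's exact noise):

```python
import math
import random

def random_offset(vertex, vertex_normal, offset=0.1, uniform=0.0,
                  normal=0.0, seed=0):
    # A random unit direction, blended toward the vertex normal by `normal`,
    # scaled by `offset` with up to `uniform` random shrinkage (sketch only).
    rng = random.Random(seed)
    direction = [rng.uniform(-1.0, 1.0) for _ in range(3)]
    direction = [d * (1.0 - normal) + n * normal
                 for d, n in zip(direction, vertex_normal)]
    length = math.sqrt(sum(d * d for d in direction)) or 1.0
    distance = offset * (1.0 - uniform * rng.random())
    return tuple(v + d / length * distance
                 for v, d in zip(vertex, direction))
```

A fixed seed makes the offset reproducible, matching the role of the operator's Random Seed.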
def vertex_warp(warp_angle=6.28319,
offset_angle=0.0,
min=-1.0,
max=1.0,
viewmat=(0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.0, 0.0, 0.0),
center=(0.0, 0.0, 0.0)):
'''Warp vertices around the cursor
:param warp_angle: Warp Angle, Amount to warp about the cursor
:type warp_angle: float in [-inf, inf], (optional)
:param offset_angle: Offset Angle, Angle to use as the basis for warping
:type offset_angle: float in [-inf, inf], (optional)
:param min: Min
:type min: float in [-inf, inf], (optional)
:param max: Max
:type max: float in [-inf, inf], (optional)
:param viewmat: Matrix
:type viewmat: float array of 16 items in [-inf, inf], (optional)
:param center: Center
:type center: float array of 3 items in [-inf, inf], (optional)
'''
pass
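Ignoring the view matrix, the warp maps a vertex's horizontal position within [min, max] to an angle inside warp_angle and bends it around the center. A 2D sketch of that remapping (the names and the XY-plane choice are assumptions of this sketch):

```python
import math

def warp_vertex(vertex, center, warp_angle=2.0 * math.pi,
                offset_angle=0.0, lo=-1.0, hi=1.0):
    # Remap x in [lo, hi] to an angle within warp_angle (starting at
    # offset_angle) and bend the vertex around `center` in the XY plane,
    # using its y-distance from the center as the bend radius (sketch only).
    x, y, z = vertex
    t = (x - lo) / (hi - lo)
    angle = offset_angle + t * warp_angle
    radius = y - center[1]
    return (center[0] + radius * math.sin(angle),
            center[1] + radius * math.cos(angle),
            z)
```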
# sdk/python/pulumi_alicloud/dts/synchronization_job.py
# repo: pulumi/pulumi-alicloud (licenses: ECL-2.0, Apache-2.0)
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = ['SynchronizationJobArgs', 'SynchronizationJob']
@pulumi.input_type
class SynchronizationJobArgs:
def __init__(__self__, *,
data_initialization: pulumi.Input[bool],
data_synchronization: pulumi.Input[bool],
db_list: pulumi.Input[str],
destination_endpoint_engine_name: pulumi.Input[str],
destination_endpoint_instance_type: pulumi.Input[str],
dts_instance_id: pulumi.Input[str],
source_endpoint_engine_name: pulumi.Input[str],
source_endpoint_instance_type: pulumi.Input[str],
structure_initialization: pulumi.Input[bool],
checkpoint: Optional[pulumi.Input[str]] = None,
delay_notice: Optional[pulumi.Input[bool]] = None,
delay_phone: Optional[pulumi.Input[str]] = None,
delay_rule_time: Optional[pulumi.Input[str]] = None,
destination_endpoint_database_name: Optional[pulumi.Input[str]] = None,
destination_endpoint_instance_id: Optional[pulumi.Input[str]] = None,
destination_endpoint_ip: Optional[pulumi.Input[str]] = None,
destination_endpoint_oracle_sid: Optional[pulumi.Input[str]] = None,
destination_endpoint_password: Optional[pulumi.Input[str]] = None,
destination_endpoint_port: Optional[pulumi.Input[str]] = None,
destination_endpoint_region: Optional[pulumi.Input[str]] = None,
destination_endpoint_user_name: Optional[pulumi.Input[str]] = None,
dts_job_name: Optional[pulumi.Input[str]] = None,
error_notice: Optional[pulumi.Input[bool]] = None,
error_phone: Optional[pulumi.Input[str]] = None,
instance_class: Optional[pulumi.Input[str]] = None,
reserve: Optional[pulumi.Input[str]] = None,
source_endpoint_database_name: Optional[pulumi.Input[str]] = None,
source_endpoint_instance_id: Optional[pulumi.Input[str]] = None,
source_endpoint_ip: Optional[pulumi.Input[str]] = None,
source_endpoint_oracle_sid: Optional[pulumi.Input[str]] = None,
source_endpoint_owner_id: Optional[pulumi.Input[str]] = None,
source_endpoint_password: Optional[pulumi.Input[str]] = None,
source_endpoint_port: Optional[pulumi.Input[str]] = None,
source_endpoint_region: Optional[pulumi.Input[str]] = None,
source_endpoint_role: Optional[pulumi.Input[str]] = None,
source_endpoint_user_name: Optional[pulumi.Input[str]] = None,
status: Optional[pulumi.Input[str]] = None,
synchronization_direction: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a SynchronizationJob resource.
:param pulumi.Input[bool] data_initialization: Whether to perform full data migration or full-data initialization. Valid values: `true`, `false`.
:param pulumi.Input[bool] data_synchronization: Whether to perform incremental data migration or synchronization. Valid values: `true`, `false`.
:param pulumi.Input[str] db_list: Migration object, in the format of JSON strings. For detailed definition instructions, please refer to [the description of migration, synchronization or subscription objects](https://help.aliyun.com/document_detail/209545.html).
:param pulumi.Input[str] destination_endpoint_engine_name: The type of destination database. Valid values: `ADB20`, `ADB30`, `AS400`, `DATAHUB`, `DB2`, `GREENPLUM`, `KAFKA`, `MONGODB`, `MSSQL`, `MySQL`, `ORACLE`, `PolarDB`, `POLARDBX20`, `POLARDB_O`, `PostgreSQL`.
:param pulumi.Input[str] destination_endpoint_instance_type: The type of destination instance. Valid values: `ads`, `CEN`, `DATAHUB`, `DG`, `ECS`, `EXPRESS`, `GREENPLUM`, `MONGODB`, `OTHER`, `PolarDB`, `POLARDBX20`, `RDS`.
:param pulumi.Input[str] dts_instance_id: Synchronizing instance ID. The ID of `dts.SynchronizationInstance`.
:param pulumi.Input[str] source_endpoint_engine_name: The type of source database. Valid values: `AS400`, `DB2`, `DMSPOLARDB`, `HBASE`, `MONGODB`, `MSSQL`, `MySQL`, `ORACLE`, `PolarDB`, `POLARDBX20`, `POLARDB_O`, `POSTGRESQL`, `TERADATA`.
:param pulumi.Input[str] source_endpoint_instance_type: The type of source instance. Valid values: `CEN`, `DG`, `DISTRIBUTED_DMSLOGICDB`, `ECS`, `EXPRESS`, `MONGODB`, `OTHER`, `PolarDB`, `POLARDBX20`, `RDS`.
:param pulumi.Input[bool] structure_initialization: Whether to migrate or initialize the database table structure. Valid values: `true`, `false`.
:param pulumi.Input[str] checkpoint: Start time in Unix timestamp format.
:param pulumi.Input[bool] delay_notice: The delay notice. Valid values: `true`, `false`.
:param pulumi.Input[str] delay_phone: The delay phone. The mobile phone number of the contact who delayed the alarm. Multiple mobile phone numbers separated by English commas `,`. This parameter currently only supports China stations, and only supports mainland mobile phone numbers, and up to 10 mobile phone numbers can be passed in.
:param pulumi.Input[str] delay_rule_time: The delay rule time. When `delay_notice` is set to `true`, this parameter must be passed in. The threshold for triggering the delay alarm. The unit is second and needs to be an integer. The threshold can be set according to business needs. It is recommended to set it above 10 seconds to avoid delay fluctuations caused by network and database load.
:param pulumi.Input[str] destination_endpoint_database_name: The name of the database to migrate to.
:param pulumi.Input[str] destination_endpoint_instance_id: The ID of destination instance.
:param pulumi.Input[str] destination_endpoint_ip: The IP of the destination endpoint.
:param pulumi.Input[str] destination_endpoint_oracle_sid: The SID of Oracle database.
:param pulumi.Input[str] destination_endpoint_password: The password of database account.
:param pulumi.Input[str] destination_endpoint_port: The port of the destination endpoint.
:param pulumi.Input[str] destination_endpoint_region: The region of destination instance.
:param pulumi.Input[str] destination_endpoint_user_name: The username of database account.
:param pulumi.Input[str] dts_job_name: The name of synchronization job.
:param pulumi.Input[bool] error_notice: The error notice. Valid values: `true`, `false`.
:param pulumi.Input[str] error_phone: The error phone. The mobile phone number of the contact who error the alarm. Multiple mobile phone numbers separated by English commas `,`. This parameter currently only supports China stations, and only supports mainland mobile phone numbers, and up to 10 mobile phone numbers can be passed in.
:param pulumi.Input[str] instance_class: The instance class. Valid values: `large`, `medium`, `micro`, `small`, `xlarge`, `xxlarge`. You can only upgrade the configuration, not downgrade the configuration. If you downgrade the instance, you need to [submit a ticket](https://selfservice.console.aliyun.com/ticket/category/dts/today).
:param pulumi.Input[str] reserve: DTS reserves parameters, the format is a JSON string, you can pass in this parameter to complete the source and target database information (such as the data storage format of the target Kafka database, the instance ID of the cloud enterprise network CEN). For more information, please refer to the parameter [description of the Reserve parameter](https://help.aliyun.com/document_detail/273111.html).
:param pulumi.Input[str] source_endpoint_database_name: The name of the database to migrate.
:param pulumi.Input[str] source_endpoint_instance_id: The ID of source instance.
:param pulumi.Input[str] source_endpoint_ip: The IP of the source endpoint.
:param pulumi.Input[str] source_endpoint_oracle_sid: The SID of Oracle database.
:param pulumi.Input[str] source_endpoint_owner_id: The Alibaba Cloud account ID to which the source instance belongs.
:param pulumi.Input[str] source_endpoint_password: The password of database account.
:param pulumi.Input[str] source_endpoint_port: The port of source endpoint.
:param pulumi.Input[str] source_endpoint_region: The region of source instance.
:param pulumi.Input[str] source_endpoint_role: The name of the role configured for the cloud account to which the source instance belongs.
:param pulumi.Input[str] source_endpoint_user_name: The username of database account.
:param pulumi.Input[str] status: The status of the resource. Valid values: `Synchronizing`, `Suspending`. You can stop the task by specifying `Suspending` and start the task by specifying `Synchronizing`.
:param pulumi.Input[str] synchronization_direction: Synchronization direction. Valid values: `Forward`, `Reverse`. Only when the property `sync_architecture` of the `dts.SynchronizationInstance` was `bidirectional` this parameter should be passed, otherwise this parameter should not be specified.
"""
pulumi.set(__self__, "data_initialization", data_initialization)
pulumi.set(__self__, "data_synchronization", data_synchronization)
pulumi.set(__self__, "db_list", db_list)
pulumi.set(__self__, "destination_endpoint_engine_name", destination_endpoint_engine_name)
pulumi.set(__self__, "destination_endpoint_instance_type", destination_endpoint_instance_type)
pulumi.set(__self__, "dts_instance_id", dts_instance_id)
pulumi.set(__self__, "source_endpoint_engine_name", source_endpoint_engine_name)
pulumi.set(__self__, "source_endpoint_instance_type", source_endpoint_instance_type)
pulumi.set(__self__, "structure_initialization", structure_initialization)
if checkpoint is not None:
pulumi.set(__self__, "checkpoint", checkpoint)
if delay_notice is not None:
pulumi.set(__self__, "delay_notice", delay_notice)
if delay_phone is not None:
pulumi.set(__self__, "delay_phone", delay_phone)
if delay_rule_time is not None:
pulumi.set(__self__, "delay_rule_time", delay_rule_time)
if destination_endpoint_database_name is not None:
pulumi.set(__self__, "destination_endpoint_database_name", destination_endpoint_database_name)
if destination_endpoint_instance_id is not None:
pulumi.set(__self__, "destination_endpoint_instance_id", destination_endpoint_instance_id)
if destination_endpoint_ip is not None:
pulumi.set(__self__, "destination_endpoint_ip", destination_endpoint_ip)
if destination_endpoint_oracle_sid is not None:
pulumi.set(__self__, "destination_endpoint_oracle_sid", destination_endpoint_oracle_sid)
if destination_endpoint_password is not None:
pulumi.set(__self__, "destination_endpoint_password", destination_endpoint_password)
if destination_endpoint_port is not None:
pulumi.set(__self__, "destination_endpoint_port", destination_endpoint_port)
if destination_endpoint_region is not None:
pulumi.set(__self__, "destination_endpoint_region", destination_endpoint_region)
if destination_endpoint_user_name is not None:
pulumi.set(__self__, "destination_endpoint_user_name", destination_endpoint_user_name)
if dts_job_name is not None:
pulumi.set(__self__, "dts_job_name", dts_job_name)
if error_notice is not None:
pulumi.set(__self__, "error_notice", error_notice)
if error_phone is not None:
pulumi.set(__self__, "error_phone", error_phone)
if instance_class is not None:
pulumi.set(__self__, "instance_class", instance_class)
if reserve is not None:
pulumi.set(__self__, "reserve", reserve)
if source_endpoint_database_name is not None:
pulumi.set(__self__, "source_endpoint_database_name", source_endpoint_database_name)
if source_endpoint_instance_id is not None:
pulumi.set(__self__, "source_endpoint_instance_id", source_endpoint_instance_id)
if source_endpoint_ip is not None:
pulumi.set(__self__, "source_endpoint_ip", source_endpoint_ip)
if source_endpoint_oracle_sid is not None:
pulumi.set(__self__, "source_endpoint_oracle_sid", source_endpoint_oracle_sid)
if source_endpoint_owner_id is not None:
pulumi.set(__self__, "source_endpoint_owner_id", source_endpoint_owner_id)
if source_endpoint_password is not None:
pulumi.set(__self__, "source_endpoint_password", source_endpoint_password)
if source_endpoint_port is not None:
pulumi.set(__self__, "source_endpoint_port", source_endpoint_port)
if source_endpoint_region is not None:
pulumi.set(__self__, "source_endpoint_region", source_endpoint_region)
if source_endpoint_role is not None:
pulumi.set(__self__, "source_endpoint_role", source_endpoint_role)
if source_endpoint_user_name is not None:
pulumi.set(__self__, "source_endpoint_user_name", source_endpoint_user_name)
if status is not None:
pulumi.set(__self__, "status", status)
if synchronization_direction is not None:
pulumi.set(__self__, "synchronization_direction", synchronization_direction)
@property
@pulumi.getter(name="dataInitialization")
def data_initialization(self) -> pulumi.Input[bool]:
"""
Whether to perform full data migration or full-data initialization. Valid values: `true`, `false`.
"""
return pulumi.get(self, "data_initialization")
@data_initialization.setter
def data_initialization(self, value: pulumi.Input[bool]):
pulumi.set(self, "data_initialization", value)
@property
@pulumi.getter(name="dataSynchronization")
def data_synchronization(self) -> pulumi.Input[bool]:
"""
Whether to perform incremental data migration or synchronization. Valid values: `true`, `false`.
"""
return pulumi.get(self, "data_synchronization")
@data_synchronization.setter
def data_synchronization(self, value: pulumi.Input[bool]):
pulumi.set(self, "data_synchronization", value)
@property
@pulumi.getter(name="dbList")
def db_list(self) -> pulumi.Input[str]:
"""
Migration object, in the format of JSON strings. For detailed definition instructions, please refer to [the description of migration, synchronization or subscription objects](https://help.aliyun.com/document_detail/209545.html).
"""
return pulumi.get(self, "db_list")
@db_list.setter
def db_list(self, value: pulumi.Input[str]):
pulumi.set(self, "db_list", value)
@property
@pulumi.getter(name="destinationEndpointEngineName")
def destination_endpoint_engine_name(self) -> pulumi.Input[str]:
"""
The type of destination database. Valid values: `ADB20`, `ADB30`, `AS400`, `DATAHUB`, `DB2`, `GREENPLUM`, `KAFKA`, `MONGODB`, `MSSQL`, `MySQL`, `ORACLE`, `PolarDB`, `POLARDBX20`, `POLARDB_O`, `PostgreSQL`.
"""
return pulumi.get(self, "destination_endpoint_engine_name")
@destination_endpoint_engine_name.setter
def destination_endpoint_engine_name(self, value: pulumi.Input[str]):
pulumi.set(self, "destination_endpoint_engine_name", value)
@property
@pulumi.getter(name="destinationEndpointInstanceType")
def destination_endpoint_instance_type(self) -> pulumi.Input[str]:
"""
The type of destination instance. Valid values: `ads`, `CEN`, `DATAHUB`, `DG`, `ECS`, `EXPRESS`, `GREENPLUM`, `MONGODB`, `OTHER`, `PolarDB`, `POLARDBX20`, `RDS`.
"""
return pulumi.get(self, "destination_endpoint_instance_type")
@destination_endpoint_instance_type.setter
def destination_endpoint_instance_type(self, value: pulumi.Input[str]):
pulumi.set(self, "destination_endpoint_instance_type", value)
@property
@pulumi.getter(name="dtsInstanceId")
def dts_instance_id(self) -> pulumi.Input[str]:
"""
The ID of the synchronization instance, i.e. the ID of the `dts.SynchronizationInstance` resource.
"""
return pulumi.get(self, "dts_instance_id")
@dts_instance_id.setter
def dts_instance_id(self, value: pulumi.Input[str]):
pulumi.set(self, "dts_instance_id", value)
@property
@pulumi.getter(name="sourceEndpointEngineName")
def source_endpoint_engine_name(self) -> pulumi.Input[str]:
"""
The type of source database. Valid values: `AS400`, `DB2`, `DMSPOLARDB`, `HBASE`, `MONGODB`, `MSSQL`, `MySQL`, `ORACLE`, `PolarDB`, `POLARDBX20`, `POLARDB_O`, `POSTGRESQL`, `TERADATA`.
"""
return pulumi.get(self, "source_endpoint_engine_name")
@source_endpoint_engine_name.setter
def source_endpoint_engine_name(self, value: pulumi.Input[str]):
pulumi.set(self, "source_endpoint_engine_name", value)
@property
@pulumi.getter(name="sourceEndpointInstanceType")
def source_endpoint_instance_type(self) -> pulumi.Input[str]:
"""
The type of source instance. Valid values: `CEN`, `DG`, `DISTRIBUTED_DMSLOGICDB`, `ECS`, `EXPRESS`, `MONGODB`, `OTHER`, `PolarDB`, `POLARDBX20`, `RDS`.
"""
return pulumi.get(self, "source_endpoint_instance_type")
@source_endpoint_instance_type.setter
def source_endpoint_instance_type(self, value: pulumi.Input[str]):
pulumi.set(self, "source_endpoint_instance_type", value)
@property
@pulumi.getter(name="structureInitialization")
def structure_initialization(self) -> pulumi.Input[bool]:
"""
Whether to migrate or initialize the database table structure. Valid values: `true`, `false`.
"""
return pulumi.get(self, "structure_initialization")
@structure_initialization.setter
def structure_initialization(self, value: pulumi.Input[bool]):
pulumi.set(self, "structure_initialization", value)
@property
@pulumi.getter
def checkpoint(self) -> Optional[pulumi.Input[str]]:
"""
Start time in Unix timestamp format.
"""
return pulumi.get(self, "checkpoint")
@checkpoint.setter
def checkpoint(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "checkpoint", value)
@property
@pulumi.getter(name="delayNotice")
def delay_notice(self) -> Optional[pulumi.Input[bool]]:
"""
The delay notice. Valid values: `true`, `false`.
"""
return pulumi.get(self, "delay_notice")
@delay_notice.setter
def delay_notice(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "delay_notice", value)
@property
@pulumi.getter(name="delayPhone")
def delay_phone(self) -> Optional[pulumi.Input[str]]:
"""
The mobile phone numbers of the contacts who receive delay alerts. Separate multiple numbers with commas (`,`). This parameter is currently supported only on the China site and only for mainland China mobile numbers, and up to 10 numbers can be passed in.
"""
return pulumi.get(self, "delay_phone")
@delay_phone.setter
def delay_phone(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "delay_phone", value)
@property
@pulumi.getter(name="delayRuleTime")
def delay_rule_time(self) -> Optional[pulumi.Input[str]]:
"""
The threshold for triggering delay alerts, in seconds; the value must be an integer. This parameter is required when `delay_notice` is set to `true`. Set the threshold according to business needs; a value above 10 seconds is recommended, to avoid alerts triggered by delay fluctuations from network and database load.
"""
return pulumi.get(self, "delay_rule_time")
@delay_rule_time.setter
def delay_rule_time(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "delay_rule_time", value)
@property
@pulumi.getter(name="destinationEndpointDatabaseName")
def destination_endpoint_database_name(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "destination_endpoint_database_name")
@destination_endpoint_database_name.setter
def destination_endpoint_database_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_endpoint_database_name", value)
@property
@pulumi.getter(name="destinationEndpointInstanceId")
def destination_endpoint_instance_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the destination instance.
"""
return pulumi.get(self, "destination_endpoint_instance_id")
@destination_endpoint_instance_id.setter
def destination_endpoint_instance_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_endpoint_instance_id", value)
@property
@pulumi.getter(name="destinationEndpointIp")
def destination_endpoint_ip(self) -> Optional[pulumi.Input[str]]:
"""
The IP address of the destination endpoint.
"""
return pulumi.get(self, "destination_endpoint_ip")
@destination_endpoint_ip.setter
def destination_endpoint_ip(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_endpoint_ip", value)
@property
@pulumi.getter(name="destinationEndpointOracleSid")
def destination_endpoint_oracle_sid(self) -> Optional[pulumi.Input[str]]:
"""
The SID of the Oracle database.
"""
return pulumi.get(self, "destination_endpoint_oracle_sid")
@destination_endpoint_oracle_sid.setter
def destination_endpoint_oracle_sid(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_endpoint_oracle_sid", value)
@property
@pulumi.getter(name="destinationEndpointPassword")
def destination_endpoint_password(self) -> Optional[pulumi.Input[str]]:
"""
The password of the database account.
"""
return pulumi.get(self, "destination_endpoint_password")
@destination_endpoint_password.setter
def destination_endpoint_password(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_endpoint_password", value)
@property
@pulumi.getter(name="destinationEndpointPort")
def destination_endpoint_port(self) -> Optional[pulumi.Input[str]]:
"""
The port of the destination endpoint.
"""
return pulumi.get(self, "destination_endpoint_port")
@destination_endpoint_port.setter
def destination_endpoint_port(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_endpoint_port", value)
@property
@pulumi.getter(name="destinationEndpointRegion")
def destination_endpoint_region(self) -> Optional[pulumi.Input[str]]:
"""
The region of the destination instance.
"""
return pulumi.get(self, "destination_endpoint_region")
@destination_endpoint_region.setter
def destination_endpoint_region(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_endpoint_region", value)
@property
@pulumi.getter(name="destinationEndpointUserName")
def destination_endpoint_user_name(self) -> Optional[pulumi.Input[str]]:
"""
The username of the database account.
"""
return pulumi.get(self, "destination_endpoint_user_name")
@destination_endpoint_user_name.setter
def destination_endpoint_user_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_endpoint_user_name", value)
@property
@pulumi.getter(name="dtsJobName")
def dts_job_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the synchronization job.
"""
return pulumi.get(self, "dts_job_name")
@dts_job_name.setter
def dts_job_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "dts_job_name", value)
@property
@pulumi.getter(name="errorNotice")
def error_notice(self) -> Optional[pulumi.Input[bool]]:
"""
The error notice. Valid values: `true`, `false`.
"""
return pulumi.get(self, "error_notice")
@error_notice.setter
def error_notice(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "error_notice", value)
@property
@pulumi.getter(name="errorPhone")
def error_phone(self) -> Optional[pulumi.Input[str]]:
"""
The mobile phone numbers of the contacts who receive error alerts. Separate multiple numbers with commas (`,`). This parameter is currently supported only on the China site and only for mainland China mobile numbers, and up to 10 numbers can be passed in.
"""
return pulumi.get(self, "error_phone")
@error_phone.setter
def error_phone(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "error_phone", value)
@property
@pulumi.getter(name="instanceClass")
def instance_class(self) -> Optional[pulumi.Input[str]]:
"""
The instance class. Valid values: `large`, `medium`, `micro`, `small`, `xlarge`, `xxlarge`. You can only upgrade the instance class, not downgrade it. To downgrade the instance, you need to [submit a ticket](https://selfservice.console.aliyun.com/ticket/category/dts/today).
"""
return pulumi.get(self, "instance_class")
@instance_class.setter
def instance_class(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "instance_class", value)
@property
@pulumi.getter
def reserve(self) -> Optional[pulumi.Input[str]]:
"""
DTS reserved parameters, in the format of a JSON string. You can pass in this parameter to supplement source and destination database information (such as the data storage format of a destination Kafka database, or the instance ID of a Cloud Enterprise Network (CEN) instance). For more information, see the [description of the Reserve parameter](https://help.aliyun.com/document_detail/273111.html).
"""
return pulumi.get(self, "reserve")
@reserve.setter
def reserve(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "reserve", value)
@property
@pulumi.getter(name="sourceEndpointDatabaseName")
def source_endpoint_database_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the database to migrate.
"""
return pulumi.get(self, "source_endpoint_database_name")
@source_endpoint_database_name.setter
def source_endpoint_database_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_database_name", value)
@property
@pulumi.getter(name="sourceEndpointInstanceId")
def source_endpoint_instance_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the source instance.
"""
return pulumi.get(self, "source_endpoint_instance_id")
@source_endpoint_instance_id.setter
def source_endpoint_instance_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_instance_id", value)
@property
@pulumi.getter(name="sourceEndpointIp")
def source_endpoint_ip(self) -> Optional[pulumi.Input[str]]:
"""
The IP address of the source endpoint.
"""
return pulumi.get(self, "source_endpoint_ip")
@source_endpoint_ip.setter
def source_endpoint_ip(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_ip", value)
@property
@pulumi.getter(name="sourceEndpointOracleSid")
def source_endpoint_oracle_sid(self) -> Optional[pulumi.Input[str]]:
"""
The SID of the Oracle database.
"""
return pulumi.get(self, "source_endpoint_oracle_sid")
@source_endpoint_oracle_sid.setter
def source_endpoint_oracle_sid(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_oracle_sid", value)
@property
@pulumi.getter(name="sourceEndpointOwnerId")
def source_endpoint_owner_id(self) -> Optional[pulumi.Input[str]]:
"""
The Alibaba Cloud account ID to which the source instance belongs.
"""
return pulumi.get(self, "source_endpoint_owner_id")
@source_endpoint_owner_id.setter
def source_endpoint_owner_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_owner_id", value)
@property
@pulumi.getter(name="sourceEndpointPassword")
def source_endpoint_password(self) -> Optional[pulumi.Input[str]]:
"""
The password of the database account.
"""
return pulumi.get(self, "source_endpoint_password")
@source_endpoint_password.setter
def source_endpoint_password(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_password", value)
@property
@pulumi.getter(name="sourceEndpointPort")
def source_endpoint_port(self) -> Optional[pulumi.Input[str]]:
"""
The port of the source endpoint.
"""
return pulumi.get(self, "source_endpoint_port")
@source_endpoint_port.setter
def source_endpoint_port(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_port", value)
@property
@pulumi.getter(name="sourceEndpointRegion")
def source_endpoint_region(self) -> Optional[pulumi.Input[str]]:
"""
The region of the source instance.
"""
return pulumi.get(self, "source_endpoint_region")
@source_endpoint_region.setter
def source_endpoint_region(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_region", value)
@property
@pulumi.getter(name="sourceEndpointRole")
def source_endpoint_role(self) -> Optional[pulumi.Input[str]]:
"""
The name of the role configured for the cloud account to which the source instance belongs.
"""
return pulumi.get(self, "source_endpoint_role")
@source_endpoint_role.setter
def source_endpoint_role(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_role", value)
@property
@pulumi.getter(name="sourceEndpointUserName")
def source_endpoint_user_name(self) -> Optional[pulumi.Input[str]]:
"""
The username of the database account.
"""
return pulumi.get(self, "source_endpoint_user_name")
@source_endpoint_user_name.setter
def source_endpoint_user_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_user_name", value)
@property
@pulumi.getter
def status(self) -> Optional[pulumi.Input[str]]:
"""
The status of the resource. Valid values: `Synchronizing`, `Suspending`. Specify `Suspending` to stop the task and `Synchronizing` to start it.
"""
return pulumi.get(self, "status")
@status.setter
def status(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "status", value)
@property
@pulumi.getter(name="synchronizationDirection")
def synchronization_direction(self) -> Optional[pulumi.Input[str]]:
"""
The synchronization direction. Valid values: `Forward`, `Reverse`. Pass this parameter only when the `sync_architecture` property of the `dts.SynchronizationInstance` is `bidirectional`; otherwise, do not specify it.
"""
return pulumi.get(self, "synchronization_direction")
@synchronization_direction.setter
def synchronization_direction(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "synchronization_direction", value)
@pulumi.input_type
class _SynchronizationJobState:
def __init__(__self__, *,
checkpoint: Optional[pulumi.Input[str]] = None,
data_initialization: Optional[pulumi.Input[bool]] = None,
data_synchronization: Optional[pulumi.Input[bool]] = None,
db_list: Optional[pulumi.Input[str]] = None,
delay_notice: Optional[pulumi.Input[bool]] = None,
delay_phone: Optional[pulumi.Input[str]] = None,
delay_rule_time: Optional[pulumi.Input[str]] = None,
destination_endpoint_database_name: Optional[pulumi.Input[str]] = None,
destination_endpoint_engine_name: Optional[pulumi.Input[str]] = None,
destination_endpoint_instance_id: Optional[pulumi.Input[str]] = None,
destination_endpoint_instance_type: Optional[pulumi.Input[str]] = None,
destination_endpoint_ip: Optional[pulumi.Input[str]] = None,
destination_endpoint_oracle_sid: Optional[pulumi.Input[str]] = None,
destination_endpoint_password: Optional[pulumi.Input[str]] = None,
destination_endpoint_port: Optional[pulumi.Input[str]] = None,
destination_endpoint_region: Optional[pulumi.Input[str]] = None,
destination_endpoint_user_name: Optional[pulumi.Input[str]] = None,
dts_instance_id: Optional[pulumi.Input[str]] = None,
dts_job_name: Optional[pulumi.Input[str]] = None,
error_notice: Optional[pulumi.Input[bool]] = None,
error_phone: Optional[pulumi.Input[str]] = None,
instance_class: Optional[pulumi.Input[str]] = None,
reserve: Optional[pulumi.Input[str]] = None,
source_endpoint_database_name: Optional[pulumi.Input[str]] = None,
source_endpoint_engine_name: Optional[pulumi.Input[str]] = None,
source_endpoint_instance_id: Optional[pulumi.Input[str]] = None,
source_endpoint_instance_type: Optional[pulumi.Input[str]] = None,
source_endpoint_ip: Optional[pulumi.Input[str]] = None,
source_endpoint_oracle_sid: Optional[pulumi.Input[str]] = None,
source_endpoint_owner_id: Optional[pulumi.Input[str]] = None,
source_endpoint_password: Optional[pulumi.Input[str]] = None,
source_endpoint_port: Optional[pulumi.Input[str]] = None,
source_endpoint_region: Optional[pulumi.Input[str]] = None,
source_endpoint_role: Optional[pulumi.Input[str]] = None,
source_endpoint_user_name: Optional[pulumi.Input[str]] = None,
status: Optional[pulumi.Input[str]] = None,
structure_initialization: Optional[pulumi.Input[bool]] = None,
synchronization_direction: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering SynchronizationJob resources.
:param pulumi.Input[str] checkpoint: Start time in Unix timestamp format.
:param pulumi.Input[bool] data_initialization: Whether to perform schema migration, full data migration, or full data initialization. Valid values: `true`, `false`.
:param pulumi.Input[bool] data_synchronization: Whether to perform incremental data migration or synchronization. Valid values: `true`, `false`.
:param pulumi.Input[str] db_list: The migration objects, in the format of a JSON string. For detailed definitions, see [the description of migration, synchronization or subscription objects](https://help.aliyun.com/document_detail/209545.html).
:param pulumi.Input[bool] delay_notice: The delay notice. Valid values: `true`, `false`.
:param pulumi.Input[str] delay_phone: The mobile phone numbers of the contacts who receive delay alerts. Separate multiple numbers with commas (`,`). This parameter is currently supported only on the China site and only for mainland China mobile numbers, and up to 10 numbers can be passed in.
:param pulumi.Input[str] delay_rule_time: The threshold for triggering delay alerts, in seconds; the value must be an integer. This parameter is required when `delay_notice` is set to `true`. Set the threshold according to business needs; a value above 10 seconds is recommended, to avoid alerts triggered by delay fluctuations from network and database load.
:param pulumi.Input[str] destination_endpoint_engine_name: The type of destination database. Valid values: `ADB20`, `ADB30`, `AS400`, `DATAHUB`, `DB2`, `GREENPLUM`, `KAFKA`, `MONGODB`, `MSSQL`, `MySQL`, `ORACLE`, `PolarDB`, `POLARDBX20`, `POLARDB_O`, `PostgreSQL`.
:param pulumi.Input[str] destination_endpoint_instance_id: The ID of the destination instance.
:param pulumi.Input[str] destination_endpoint_instance_type: The type of destination instance. Valid values: `ads`, `CEN`, `DATAHUB`, `DG`, `ECS`, `EXPRESS`, `GREENPLUM`, `MONGODB`, `OTHER`, `PolarDB`, `POLARDBX20`, `RDS`.
:param pulumi.Input[str] destination_endpoint_ip: The IP address of the destination endpoint.
:param pulumi.Input[str] destination_endpoint_oracle_sid: The SID of the Oracle database.
:param pulumi.Input[str] destination_endpoint_password: The password of the database account.
:param pulumi.Input[str] destination_endpoint_port: The port of the destination endpoint.
:param pulumi.Input[str] destination_endpoint_region: The region of the destination instance.
:param pulumi.Input[str] destination_endpoint_user_name: The username of the database account.
:param pulumi.Input[str] dts_instance_id: The ID of the synchronization instance, i.e. the ID of the `dts.SynchronizationInstance` resource.
:param pulumi.Input[str] dts_job_name: The name of the synchronization job.
:param pulumi.Input[bool] error_notice: The error notice. Valid values: `true`, `false`.
:param pulumi.Input[str] error_phone: The mobile phone numbers of the contacts who receive error alerts. Separate multiple numbers with commas (`,`). This parameter is currently supported only on the China site and only for mainland China mobile numbers, and up to 10 numbers can be passed in.
:param pulumi.Input[str] instance_class: The instance class. Valid values: `large`, `medium`, `micro`, `small`, `xlarge`, `xxlarge`. You can only upgrade the instance class, not downgrade it. To downgrade the instance, you need to [submit a ticket](https://selfservice.console.aliyun.com/ticket/category/dts/today).
:param pulumi.Input[str] reserve: DTS reserved parameters, in the format of a JSON string. You can pass in this parameter to supplement source and destination database information (such as the data storage format of a destination Kafka database, or the instance ID of a Cloud Enterprise Network (CEN) instance). For more information, see the [description of the Reserve parameter](https://help.aliyun.com/document_detail/273111.html).
:param pulumi.Input[str] source_endpoint_database_name: The name of the database to migrate.
:param pulumi.Input[str] source_endpoint_engine_name: The type of source database. Valid values: `AS400`, `DB2`, `DMSPOLARDB`, `HBASE`, `MONGODB`, `MSSQL`, `MySQL`, `ORACLE`, `PolarDB`, `POLARDBX20`, `POLARDB_O`, `POSTGRESQL`, `TERADATA`.
:param pulumi.Input[str] source_endpoint_instance_id: The ID of the source instance.
:param pulumi.Input[str] source_endpoint_instance_type: The type of source instance. Valid values: `CEN`, `DG`, `DISTRIBUTED_DMSLOGICDB`, `ECS`, `EXPRESS`, `MONGODB`, `OTHER`, `PolarDB`, `POLARDBX20`, `RDS`.
:param pulumi.Input[str] source_endpoint_ip: The IP address of the source endpoint.
:param pulumi.Input[str] source_endpoint_oracle_sid: The SID of the Oracle database.
:param pulumi.Input[str] source_endpoint_owner_id: The Alibaba Cloud account ID to which the source instance belongs.
:param pulumi.Input[str] source_endpoint_password: The password of the database account.
:param pulumi.Input[str] source_endpoint_port: The port of the source endpoint.
:param pulumi.Input[str] source_endpoint_region: The region of the source instance.
:param pulumi.Input[str] source_endpoint_role: The name of the role configured for the cloud account to which the source instance belongs.
:param pulumi.Input[str] source_endpoint_user_name: The username of the database account.
:param pulumi.Input[str] status: The status of the resource. Valid values: `Synchronizing`, `Suspending`. Specify `Suspending` to stop the task and `Synchronizing` to start it.
:param pulumi.Input[bool] structure_initialization: Whether to migrate or initialize the database table structure. Valid values: `true`, `false`.
:param pulumi.Input[str] synchronization_direction: The synchronization direction. Valid values: `Forward`, `Reverse`. Pass this parameter only when the `sync_architecture` property of the `dts.SynchronizationInstance` is `bidirectional`; otherwise, do not specify it.
"""
if checkpoint is not None:
pulumi.set(__self__, "checkpoint", checkpoint)
if data_initialization is not None:
pulumi.set(__self__, "data_initialization", data_initialization)
if data_synchronization is not None:
pulumi.set(__self__, "data_synchronization", data_synchronization)
if db_list is not None:
pulumi.set(__self__, "db_list", db_list)
if delay_notice is not None:
pulumi.set(__self__, "delay_notice", delay_notice)
if delay_phone is not None:
pulumi.set(__self__, "delay_phone", delay_phone)
if delay_rule_time is not None:
pulumi.set(__self__, "delay_rule_time", delay_rule_time)
if destination_endpoint_database_name is not None:
pulumi.set(__self__, "destination_endpoint_database_name", destination_endpoint_database_name)
if destination_endpoint_engine_name is not None:
pulumi.set(__self__, "destination_endpoint_engine_name", destination_endpoint_engine_name)
if destination_endpoint_instance_id is not None:
pulumi.set(__self__, "destination_endpoint_instance_id", destination_endpoint_instance_id)
if destination_endpoint_instance_type is not None:
pulumi.set(__self__, "destination_endpoint_instance_type", destination_endpoint_instance_type)
if destination_endpoint_ip is not None:
pulumi.set(__self__, "destination_endpoint_ip", destination_endpoint_ip)
if destination_endpoint_oracle_sid is not None:
pulumi.set(__self__, "destination_endpoint_oracle_sid", destination_endpoint_oracle_sid)
if destination_endpoint_password is not None:
pulumi.set(__self__, "destination_endpoint_password", destination_endpoint_password)
if destination_endpoint_port is not None:
pulumi.set(__self__, "destination_endpoint_port", destination_endpoint_port)
if destination_endpoint_region is not None:
pulumi.set(__self__, "destination_endpoint_region", destination_endpoint_region)
if destination_endpoint_user_name is not None:
pulumi.set(__self__, "destination_endpoint_user_name", destination_endpoint_user_name)
if dts_instance_id is not None:
pulumi.set(__self__, "dts_instance_id", dts_instance_id)
if dts_job_name is not None:
pulumi.set(__self__, "dts_job_name", dts_job_name)
if error_notice is not None:
pulumi.set(__self__, "error_notice", error_notice)
if error_phone is not None:
pulumi.set(__self__, "error_phone", error_phone)
if instance_class is not None:
pulumi.set(__self__, "instance_class", instance_class)
if reserve is not None:
pulumi.set(__self__, "reserve", reserve)
if source_endpoint_database_name is not None:
pulumi.set(__self__, "source_endpoint_database_name", source_endpoint_database_name)
if source_endpoint_engine_name is not None:
pulumi.set(__self__, "source_endpoint_engine_name", source_endpoint_engine_name)
if source_endpoint_instance_id is not None:
pulumi.set(__self__, "source_endpoint_instance_id", source_endpoint_instance_id)
if source_endpoint_instance_type is not None:
pulumi.set(__self__, "source_endpoint_instance_type", source_endpoint_instance_type)
if source_endpoint_ip is not None:
pulumi.set(__self__, "source_endpoint_ip", source_endpoint_ip)
if source_endpoint_oracle_sid is not None:
pulumi.set(__self__, "source_endpoint_oracle_sid", source_endpoint_oracle_sid)
if source_endpoint_owner_id is not None:
pulumi.set(__self__, "source_endpoint_owner_id", source_endpoint_owner_id)
if source_endpoint_password is not None:
pulumi.set(__self__, "source_endpoint_password", source_endpoint_password)
if source_endpoint_port is not None:
pulumi.set(__self__, "source_endpoint_port", source_endpoint_port)
if source_endpoint_region is not None:
pulumi.set(__self__, "source_endpoint_region", source_endpoint_region)
if source_endpoint_role is not None:
pulumi.set(__self__, "source_endpoint_role", source_endpoint_role)
if source_endpoint_user_name is not None:
pulumi.set(__self__, "source_endpoint_user_name", source_endpoint_user_name)
if status is not None:
pulumi.set(__self__, "status", status)
if structure_initialization is not None:
pulumi.set(__self__, "structure_initialization", structure_initialization)
if synchronization_direction is not None:
pulumi.set(__self__, "synchronization_direction", synchronization_direction)
@property
@pulumi.getter
def checkpoint(self) -> Optional[pulumi.Input[str]]:
"""
Start time in Unix timestamp format.
"""
return pulumi.get(self, "checkpoint")
@checkpoint.setter
def checkpoint(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "checkpoint", value)
@property
@pulumi.getter(name="dataInitialization")
def data_initialization(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to perform schema migration, full data migration, or full data initialization. Valid values: `true`, `false`.
"""
return pulumi.get(self, "data_initialization")
@data_initialization.setter
def data_initialization(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "data_initialization", value)
@property
@pulumi.getter(name="dataSynchronization")
def data_synchronization(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to perform incremental data migration or synchronization. Valid values: `true`, `false`.
"""
return pulumi.get(self, "data_synchronization")
@data_synchronization.setter
def data_synchronization(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "data_synchronization", value)
@property
@pulumi.getter(name="dbList")
def db_list(self) -> Optional[pulumi.Input[str]]:
"""
The migration objects, in the format of a JSON string. For detailed definitions, see [the description of migration, synchronization or subscription objects](https://help.aliyun.com/document_detail/209545.html).
"""
return pulumi.get(self, "db_list")
@db_list.setter
def db_list(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "db_list", value)
@property
@pulumi.getter(name="delayNotice")
def delay_notice(self) -> Optional[pulumi.Input[bool]]:
"""
The delay notice. Valid values: `true`, `false`.
"""
return pulumi.get(self, "delay_notice")
@delay_notice.setter
def delay_notice(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "delay_notice", value)
@property
@pulumi.getter(name="delayPhone")
def delay_phone(self) -> Optional[pulumi.Input[str]]:
"""
The mobile phone numbers of the contacts who receive delay alerts. Separate multiple numbers with commas (`,`). This parameter is currently supported only on the China site and only for mainland China mobile numbers, and up to 10 numbers can be passed in.
"""
return pulumi.get(self, "delay_phone")
@delay_phone.setter
def delay_phone(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "delay_phone", value)
@property
@pulumi.getter(name="delayRuleTime")
def delay_rule_time(self) -> Optional[pulumi.Input[str]]:
"""
The threshold for triggering delay alerts, in seconds; the value must be an integer. This parameter is required when `delay_notice` is set to `true`. Set the threshold according to business needs; a value above 10 seconds is recommended, to avoid alerts triggered by delay fluctuations from network and database load.
"""
return pulumi.get(self, "delay_rule_time")
@delay_rule_time.setter
def delay_rule_time(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "delay_rule_time", value)
@property
@pulumi.getter(name="destinationEndpointDatabaseName")
def destination_endpoint_database_name(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "destination_endpoint_database_name")
@destination_endpoint_database_name.setter
def destination_endpoint_database_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_endpoint_database_name", value)
@property
@pulumi.getter(name="destinationEndpointEngineName")
def destination_endpoint_engine_name(self) -> Optional[pulumi.Input[str]]:
"""
The type of destination database. Valid values: `ADB20`, `ADB30`, `AS400`, `DATAHUB`, `DB2`, `GREENPLUM`, `KAFKA`, `MONGODB`, `MSSQL`, `MySQL`, `ORACLE`, `PolarDB`, `POLARDBX20`, `POLARDB_O`, `PostgreSQL`.
"""
return pulumi.get(self, "destination_endpoint_engine_name")
@destination_endpoint_engine_name.setter
def destination_endpoint_engine_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_endpoint_engine_name", value)
@property
@pulumi.getter(name="destinationEndpointInstanceId")
def destination_endpoint_instance_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the destination instance.
"""
return pulumi.get(self, "destination_endpoint_instance_id")
@destination_endpoint_instance_id.setter
def destination_endpoint_instance_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_endpoint_instance_id", value)
@property
@pulumi.getter(name="destinationEndpointInstanceType")
def destination_endpoint_instance_type(self) -> Optional[pulumi.Input[str]]:
"""
The type of destination instance. Valid values: `ads`, `CEN`, `DATAHUB`, `DG`, `ECS`, `EXPRESS`, `GREENPLUM`, `MONGODB`, `OTHER`, `PolarDB`, `POLARDBX20`, `RDS`.
"""
return pulumi.get(self, "destination_endpoint_instance_type")
@destination_endpoint_instance_type.setter
def destination_endpoint_instance_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_endpoint_instance_type", value)
@property
@pulumi.getter(name="destinationEndpointIp")
def destination_endpoint_ip(self) -> Optional[pulumi.Input[str]]:
"""
The IP address of the destination endpoint.
"""
return pulumi.get(self, "destination_endpoint_ip")
@destination_endpoint_ip.setter
def destination_endpoint_ip(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_endpoint_ip", value)
@property
@pulumi.getter(name="destinationEndpointOracleSid")
def destination_endpoint_oracle_sid(self) -> Optional[pulumi.Input[str]]:
"""
The SID of the Oracle database.
"""
return pulumi.get(self, "destination_endpoint_oracle_sid")
@destination_endpoint_oracle_sid.setter
def destination_endpoint_oracle_sid(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_endpoint_oracle_sid", value)
@property
@pulumi.getter(name="destinationEndpointPassword")
def destination_endpoint_password(self) -> Optional[pulumi.Input[str]]:
"""
The password of database account.
"""
return pulumi.get(self, "destination_endpoint_password")
@destination_endpoint_password.setter
def destination_endpoint_password(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_endpoint_password", value)
@property
@pulumi.getter(name="destinationEndpointPort")
def destination_endpoint_port(self) -> Optional[pulumi.Input[str]]:
"""
The port of the destination endpoint.
"""
return pulumi.get(self, "destination_endpoint_port")
@destination_endpoint_port.setter
def destination_endpoint_port(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_endpoint_port", value)
@property
@pulumi.getter(name="destinationEndpointRegion")
def destination_endpoint_region(self) -> Optional[pulumi.Input[str]]:
"""
The region of destination instance.
"""
return pulumi.get(self, "destination_endpoint_region")
@destination_endpoint_region.setter
def destination_endpoint_region(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_endpoint_region", value)
@property
@pulumi.getter(name="destinationEndpointUserName")
def destination_endpoint_user_name(self) -> Optional[pulumi.Input[str]]:
"""
The username of database account.
"""
return pulumi.get(self, "destination_endpoint_user_name")
@destination_endpoint_user_name.setter
def destination_endpoint_user_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "destination_endpoint_user_name", value)
@property
@pulumi.getter(name="dtsInstanceId")
def dts_instance_id(self) -> Optional[pulumi.Input[str]]:
"""
The synchronization instance ID, i.e. the ID of `dts.SynchronizationInstance`.
"""
return pulumi.get(self, "dts_instance_id")
@dts_instance_id.setter
def dts_instance_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "dts_instance_id", value)
@property
@pulumi.getter(name="dtsJobName")
def dts_job_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the synchronization job.
"""
return pulumi.get(self, "dts_job_name")
@dts_job_name.setter
def dts_job_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "dts_job_name", value)
@property
@pulumi.getter(name="errorNotice")
def error_notice(self) -> Optional[pulumi.Input[bool]]:
"""
The error notice. Valid values: `true`, `false`.
"""
return pulumi.get(self, "error_notice")
@error_notice.setter
def error_notice(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "error_notice", value)
@property
@pulumi.getter(name="errorPhone")
def error_phone(self) -> Optional[pulumi.Input[str]]:
"""
The error phone. The mobile phone number of the contact to notify when an error alarm is triggered. Separate multiple numbers with commas (`,`). This parameter is currently supported only on the China site, accepts only mainland China mobile numbers, and allows up to 10 numbers.
"""
return pulumi.get(self, "error_phone")
@error_phone.setter
def error_phone(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "error_phone", value)
@property
@pulumi.getter(name="instanceClass")
def instance_class(self) -> Optional[pulumi.Input[str]]:
"""
The instance class. Valid values: `large`, `medium`, `micro`, `small`, `xlarge`, `xxlarge`. You can only upgrade the configuration, not downgrade the configuration. If you downgrade the instance, you need to [submit a ticket](https://selfservice.console.aliyun.com/ticket/category/dts/today).
"""
return pulumi.get(self, "instance_class")
@instance_class.setter
def instance_class(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "instance_class", value)
@property
@pulumi.getter
def reserve(self) -> Optional[pulumi.Input[str]]:
"""
The reserved parameters of DTS, in the format of a JSON string. You can pass this parameter to supplement source and destination database information, such as the data storage format of a destination Kafka database or the instance ID of a Cloud Enterprise Network (CEN) instance. For more information, see the [description of the Reserve parameter](https://help.aliyun.com/document_detail/273111.html).
"""
return pulumi.get(self, "reserve")
@reserve.setter
def reserve(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "reserve", value)
@property
@pulumi.getter(name="sourceEndpointDatabaseName")
def source_endpoint_database_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the database to migrate.
"""
return pulumi.get(self, "source_endpoint_database_name")
@source_endpoint_database_name.setter
def source_endpoint_database_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_database_name", value)
@property
@pulumi.getter(name="sourceEndpointEngineName")
def source_endpoint_engine_name(self) -> Optional[pulumi.Input[str]]:
"""
The type of source database. Valid values: `AS400`, `DB2`, `DMSPOLARDB`, `HBASE`, `MONGODB`, `MSSQL`, `MySQL`, `ORACLE`, `PolarDB`, `POLARDBX20`, `POLARDB_O`, `POSTGRESQL`, `TERADATA`.
"""
return pulumi.get(self, "source_endpoint_engine_name")
@source_endpoint_engine_name.setter
def source_endpoint_engine_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_engine_name", value)
@property
@pulumi.getter(name="sourceEndpointInstanceId")
def source_endpoint_instance_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of source instance.
"""
return pulumi.get(self, "source_endpoint_instance_id")
@source_endpoint_instance_id.setter
def source_endpoint_instance_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_instance_id", value)
@property
@pulumi.getter(name="sourceEndpointInstanceType")
def source_endpoint_instance_type(self) -> Optional[pulumi.Input[str]]:
"""
The type of source instance. Valid values: `CEN`, `DG`, `DISTRIBUTED_DMSLOGICDB`, `ECS`, `EXPRESS`, `MONGODB`, `OTHER`, `PolarDB`, `POLARDBX20`, `RDS`.
"""
return pulumi.get(self, "source_endpoint_instance_type")
@source_endpoint_instance_type.setter
def source_endpoint_instance_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_instance_type", value)
@property
@pulumi.getter(name="sourceEndpointIp")
def source_endpoint_ip(self) -> Optional[pulumi.Input[str]]:
"""
The IP of the source endpoint.
"""
return pulumi.get(self, "source_endpoint_ip")
@source_endpoint_ip.setter
def source_endpoint_ip(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_ip", value)
@property
@pulumi.getter(name="sourceEndpointOracleSid")
def source_endpoint_oracle_sid(self) -> Optional[pulumi.Input[str]]:
"""
The SID of Oracle database.
"""
return pulumi.get(self, "source_endpoint_oracle_sid")
@source_endpoint_oracle_sid.setter
def source_endpoint_oracle_sid(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_oracle_sid", value)
@property
@pulumi.getter(name="sourceEndpointOwnerId")
def source_endpoint_owner_id(self) -> Optional[pulumi.Input[str]]:
"""
The Alibaba Cloud account ID to which the source instance belongs.
"""
return pulumi.get(self, "source_endpoint_owner_id")
@source_endpoint_owner_id.setter
def source_endpoint_owner_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_owner_id", value)
@property
@pulumi.getter(name="sourceEndpointPassword")
def source_endpoint_password(self) -> Optional[pulumi.Input[str]]:
"""
The password of database account.
"""
return pulumi.get(self, "source_endpoint_password")
@source_endpoint_password.setter
def source_endpoint_password(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_password", value)
@property
@pulumi.getter(name="sourceEndpointPort")
def source_endpoint_port(self) -> Optional[pulumi.Input[str]]:
"""
The port of the source endpoint.
"""
return pulumi.get(self, "source_endpoint_port")
@source_endpoint_port.setter
def source_endpoint_port(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_port", value)
@property
@pulumi.getter(name="sourceEndpointRegion")
def source_endpoint_region(self) -> Optional[pulumi.Input[str]]:
"""
The region of source instance.
"""
return pulumi.get(self, "source_endpoint_region")
@source_endpoint_region.setter
def source_endpoint_region(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_region", value)
@property
@pulumi.getter(name="sourceEndpointRole")
def source_endpoint_role(self) -> Optional[pulumi.Input[str]]:
"""
The name of the role configured for the cloud account to which the source instance belongs.
"""
return pulumi.get(self, "source_endpoint_role")
@source_endpoint_role.setter
def source_endpoint_role(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_role", value)
@property
@pulumi.getter(name="sourceEndpointUserName")
def source_endpoint_user_name(self) -> Optional[pulumi.Input[str]]:
"""
The username of database account.
"""
return pulumi.get(self, "source_endpoint_user_name")
@source_endpoint_user_name.setter
def source_endpoint_user_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "source_endpoint_user_name", value)
@property
@pulumi.getter
def status(self) -> Optional[pulumi.Input[str]]:
"""
The status of the resource. Valid values: `Synchronizing`, `Suspending`. You can stop the task by specifying `Suspending` and start the task by specifying `Synchronizing`.
"""
return pulumi.get(self, "status")
@status.setter
def status(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "status", value)
@property
@pulumi.getter(name="structureInitialization")
def structure_initialization(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to migrate or initialize the database table structure. Valid values: `true`, `false`.
"""
return pulumi.get(self, "structure_initialization")
@structure_initialization.setter
def structure_initialization(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "structure_initialization", value)
@property
@pulumi.getter(name="synchronizationDirection")
def synchronization_direction(self) -> Optional[pulumi.Input[str]]:
"""
Synchronization direction. Valid values: `Forward`, `Reverse`. Pass this parameter only when the `sync_architecture` property of the `dts.SynchronizationInstance` is `bidirectional`; otherwise, do not specify it.
"""
return pulumi.get(self, "synchronization_direction")
@synchronization_direction.setter
def synchronization_direction(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "synchronization_direction", value)
class SynchronizationJob(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
checkpoint: Optional[pulumi.Input[str]] = None,
data_initialization: Optional[pulumi.Input[bool]] = None,
data_synchronization: Optional[pulumi.Input[bool]] = None,
db_list: Optional[pulumi.Input[str]] = None,
delay_notice: Optional[pulumi.Input[bool]] = None,
delay_phone: Optional[pulumi.Input[str]] = None,
delay_rule_time: Optional[pulumi.Input[str]] = None,
destination_endpoint_database_name: Optional[pulumi.Input[str]] = None,
destination_endpoint_engine_name: Optional[pulumi.Input[str]] = None,
destination_endpoint_instance_id: Optional[pulumi.Input[str]] = None,
destination_endpoint_instance_type: Optional[pulumi.Input[str]] = None,
destination_endpoint_ip: Optional[pulumi.Input[str]] = None,
destination_endpoint_oracle_sid: Optional[pulumi.Input[str]] = None,
destination_endpoint_password: Optional[pulumi.Input[str]] = None,
destination_endpoint_port: Optional[pulumi.Input[str]] = None,
destination_endpoint_region: Optional[pulumi.Input[str]] = None,
destination_endpoint_user_name: Optional[pulumi.Input[str]] = None,
dts_instance_id: Optional[pulumi.Input[str]] = None,
dts_job_name: Optional[pulumi.Input[str]] = None,
error_notice: Optional[pulumi.Input[bool]] = None,
error_phone: Optional[pulumi.Input[str]] = None,
instance_class: Optional[pulumi.Input[str]] = None,
reserve: Optional[pulumi.Input[str]] = None,
source_endpoint_database_name: Optional[pulumi.Input[str]] = None,
source_endpoint_engine_name: Optional[pulumi.Input[str]] = None,
source_endpoint_instance_id: Optional[pulumi.Input[str]] = None,
source_endpoint_instance_type: Optional[pulumi.Input[str]] = None,
source_endpoint_ip: Optional[pulumi.Input[str]] = None,
source_endpoint_oracle_sid: Optional[pulumi.Input[str]] = None,
source_endpoint_owner_id: Optional[pulumi.Input[str]] = None,
source_endpoint_password: Optional[pulumi.Input[str]] = None,
source_endpoint_port: Optional[pulumi.Input[str]] = None,
source_endpoint_region: Optional[pulumi.Input[str]] = None,
source_endpoint_role: Optional[pulumi.Input[str]] = None,
source_endpoint_user_name: Optional[pulumi.Input[str]] = None,
status: Optional[pulumi.Input[str]] = None,
structure_initialization: Optional[pulumi.Input[bool]] = None,
synchronization_direction: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Provides a DTS Synchronization Job resource.
For information about DTS Synchronization Job and how to use it, see [What is Synchronization Job](https://www.alibabacloud.com/product/data-transmission-service).
> **NOTE:** Available in v1.138.0+.
## Example Usage
Basic Usage
```python
import pulumi
import pulumi_alicloud as alicloud
default_synchronization_instance = alicloud.dts.SynchronizationInstance("defaultSynchronizationInstance",
payment_type="PostPaid",
source_endpoint_engine_name="PolarDB",
source_endpoint_region="cn-hangzhou",
destination_endpoint_engine_name="ADB30",
destination_endpoint_region="cn-hangzhou",
instance_class="small",
sync_architecture="oneway")
default_synchronization_job = alicloud.dts.SynchronizationJob("defaultSynchronizationJob",
dts_instance_id=default_synchronization_instance.id,
dts_job_name="tf-testAccCase1",
source_endpoint_instance_type="PolarDB",
source_endpoint_instance_id="pc-xxxxxxxx",
source_endpoint_engine_name="PolarDB",
source_endpoint_region="cn-hangzhou",
source_endpoint_database_name="tf-testacc",
source_endpoint_user_name="root",
source_endpoint_password="password",
destination_endpoint_instance_type="ads",
destination_endpoint_instance_id="am-xxxxxxxx",
destination_endpoint_engine_name="ADB30",
destination_endpoint_region="cn-hangzhou",
destination_endpoint_database_name="tf-testacc",
destination_endpoint_user_name="root",
destination_endpoint_password="password",
db_list="{\"tf-testacc\":{\"name\":\"tf-test\",\"all\":true,\"state\":\"normal\"}}",
structure_initialization=True,
data_initialization=True,
data_synchronization=True,
status="Synchronizing")
```
## Notice
1. The expiration time cannot be changed after an annual or monthly subscription job has been suspended.
2. After a pay-as-you-go job is suspended, its configuration fee is still charged.
3. If a task is suspended for more than 6 hours, it will not start successfully.
4. Suspending a task only stops writing to the target database; DTS still fetches the incremental logs of the source so the task can resume quickly after the suspension is cancelled. Some source resources, such as bandwidth, therefore remain occupied during this period.
5. Charges continue to accrue while a task is suspended. To stop being charged, release the instance.
6. When a DTS instance has been suspended for more than 7 days, it cannot be resumed, and its status changes from suspended to failed.
## Import
DTS Synchronization Job can be imported using the id, e.g.
```sh
$ pulumi import alicloud:dts/synchronizationJob:SynchronizationJob example <id>
```
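The `db_list` value shown above is a JSON string; rather than hand-escaping quotes as in the example, it can be built with `json.dumps` (a minimal sketch; the database and table names here are hypothetical):

```python
import json

# Hypothetical database/table names; "all": True selects every table in the database.
db_list = json.dumps({
    "tf-testacc": {"name": "tf-test", "all": True, "state": "normal"},
})
```

The resulting string can then be passed directly as the `db_list` argument.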
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] checkpoint: Start time in Unix timestamp format.
:param pulumi.Input[bool] data_initialization: Whether DTS performs schema migration, full data migration, or full data initialization. Valid values: `true`, `false`.
:param pulumi.Input[bool] data_synchronization: Whether to perform incremental data migration or synchronization. Valid values: `true`, `false`.
:param pulumi.Input[str] db_list: Migration object, in the format of JSON strings. For detailed definition instructions, please refer to [the description of migration, synchronization or subscription objects](https://help.aliyun.com/document_detail/209545.html).
:param pulumi.Input[bool] delay_notice: The delay notice. Valid values: `true`, `false`.
:param pulumi.Input[str] delay_phone: The delay phone. The mobile phone number of the contact to notify when a delay alarm is triggered. Separate multiple numbers with commas (`,`). This parameter is currently supported only on the China site, accepts only mainland China mobile numbers, and allows up to 10 numbers.
:param pulumi.Input[str] delay_rule_time: The delay rule time. This parameter is required when `delay_notice` is set to `true`. It is the threshold, in seconds (an integer), for triggering a delay alarm. Set it according to business needs; a value above 10 seconds is recommended to avoid alarms caused by network and database load fluctuations.
:param pulumi.Input[str] destination_endpoint_engine_name: The type of destination database. Valid values: `ADB20`, `ADB30`, `AS400`, `DATAHUB`, `DB2`, `GREENPLUM`, `KAFKA`, `MONGODB`, `MSSQL`, `MySQL`, `ORACLE`, `PolarDB`, `POLARDBX20`, `POLARDB_O`, `PostgreSQL`.
:param pulumi.Input[str] destination_endpoint_instance_id: The ID of destination instance.
:param pulumi.Input[str] destination_endpoint_instance_type: The type of destination instance. Valid values: `ads`, `CEN`, `DATAHUB`, `DG`, `ECS`, `EXPRESS`, `GREENPLUM`, `MONGODB`, `OTHER`, `PolarDB`, `POLARDBX20`, `RDS`.
:param pulumi.Input[str] destination_endpoint_ip: The IP of the destination endpoint.
:param pulumi.Input[str] destination_endpoint_oracle_sid: The SID of Oracle database.
:param pulumi.Input[str] destination_endpoint_password: The password of database account.
:param pulumi.Input[str] destination_endpoint_port: The port of the destination endpoint.
:param pulumi.Input[str] destination_endpoint_region: The region of destination instance.
:param pulumi.Input[str] destination_endpoint_user_name: The username of database account.
:param pulumi.Input[str] dts_instance_id: The synchronization instance ID, i.e. the ID of `dts.SynchronizationInstance`.
:param pulumi.Input[str] dts_job_name: The name of the synchronization job.
:param pulumi.Input[bool] error_notice: The error notice. Valid values: `true`, `false`.
:param pulumi.Input[str] error_phone: The error phone. The mobile phone number of the contact to notify when an error alarm is triggered. Separate multiple numbers with commas (`,`). This parameter is currently supported only on the China site, accepts only mainland China mobile numbers, and allows up to 10 numbers.
:param pulumi.Input[str] instance_class: The instance class. Valid values: `large`, `medium`, `micro`, `small`, `xlarge`, `xxlarge`. You can only upgrade the configuration, not downgrade the configuration. If you downgrade the instance, you need to [submit a ticket](https://selfservice.console.aliyun.com/ticket/category/dts/today).
:param pulumi.Input[str] reserve: The reserved parameters of DTS, in the format of a JSON string. You can pass this parameter to supplement source and destination database information, such as the data storage format of a destination Kafka database or the instance ID of a Cloud Enterprise Network (CEN) instance. For more information, see the [description of the Reserve parameter](https://help.aliyun.com/document_detail/273111.html).
:param pulumi.Input[str] source_endpoint_database_name: The name of the database to migrate.
:param pulumi.Input[str] source_endpoint_engine_name: The type of source database. Valid values: `AS400`, `DB2`, `DMSPOLARDB`, `HBASE`, `MONGODB`, `MSSQL`, `MySQL`, `ORACLE`, `PolarDB`, `POLARDBX20`, `POLARDB_O`, `POSTGRESQL`, `TERADATA`.
:param pulumi.Input[str] source_endpoint_instance_id: The ID of source instance.
:param pulumi.Input[str] source_endpoint_instance_type: The type of source instance. Valid values: `CEN`, `DG`, `DISTRIBUTED_DMSLOGICDB`, `ECS`, `EXPRESS`, `MONGODB`, `OTHER`, `PolarDB`, `POLARDBX20`, `RDS`.
:param pulumi.Input[str] source_endpoint_ip: The IP of the source endpoint.
:param pulumi.Input[str] source_endpoint_oracle_sid: The SID of Oracle database.
:param pulumi.Input[str] source_endpoint_owner_id: The Alibaba Cloud account ID to which the source instance belongs.
:param pulumi.Input[str] source_endpoint_password: The password of database account.
:param pulumi.Input[str] source_endpoint_port: The port of the source endpoint.
:param pulumi.Input[str] source_endpoint_region: The region of source instance.
:param pulumi.Input[str] source_endpoint_role: The name of the role configured for the cloud account to which the source instance belongs.
:param pulumi.Input[str] source_endpoint_user_name: The username of database account.
:param pulumi.Input[str] status: The status of the resource. Valid values: `Synchronizing`, `Suspending`. You can stop the task by specifying `Suspending` and start the task by specifying `Synchronizing`.
:param pulumi.Input[bool] structure_initialization: Whether to migrate or initialize the database table structure. Valid values: `true`, `false`.
:param pulumi.Input[str] synchronization_direction: Synchronization direction. Valid values: `Forward`, `Reverse`. Pass this parameter only when the `sync_architecture` property of the `dts.SynchronizationInstance` is `bidirectional`; otherwise, do not specify it.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: SynchronizationJobArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Provides a DTS Synchronization Job resource.
For information about DTS Synchronization Job and how to use it, see [What is Synchronization Job](https://www.alibabacloud.com/product/data-transmission-service).
> **NOTE:** Available in v1.138.0+.
## Example Usage
Basic Usage
```python
import pulumi
import pulumi_alicloud as alicloud
default_synchronization_instance = alicloud.dts.SynchronizationInstance("defaultSynchronizationInstance",
payment_type="PostPaid",
source_endpoint_engine_name="PolarDB",
source_endpoint_region="cn-hangzhou",
destination_endpoint_engine_name="ADB30",
destination_endpoint_region="cn-hangzhou",
instance_class="small",
sync_architecture="oneway")
default_synchronization_job = alicloud.dts.SynchronizationJob("defaultSynchronizationJob",
dts_instance_id=default_synchronization_instance.id,
dts_job_name="tf-testAccCase1",
source_endpoint_instance_type="PolarDB",
source_endpoint_instance_id="pc-xxxxxxxx",
source_endpoint_engine_name="PolarDB",
source_endpoint_region="cn-hangzhou",
source_endpoint_database_name="tf-testacc",
source_endpoint_user_name="root",
source_endpoint_password="password",
destination_endpoint_instance_type="ads",
destination_endpoint_instance_id="am-xxxxxxxx",
destination_endpoint_engine_name="ADB30",
destination_endpoint_region="cn-hangzhou",
destination_endpoint_database_name="tf-testacc",
destination_endpoint_user_name="root",
destination_endpoint_password="password",
db_list="{\"tf-testacc\":{\"name\":\"tf-test\",\"all\":true,\"state\":\"normal\"}}",
structure_initialization=True,
data_initialization=True,
data_synchronization=True,
status="Synchronizing")
```
## Notice
1. The expiration time cannot be changed after an annual or monthly subscription job has been suspended.
2. After a pay-as-you-go job is suspended, its configuration fee is still charged.
3. If a task is suspended for more than 6 hours, it will not start successfully.
4. Suspending a task only stops writing to the target database; DTS still fetches the incremental logs of the source so the task can resume quickly after the suspension is cancelled. Some source resources, such as bandwidth, therefore remain occupied during this period.
5. Charges continue to accrue while a task is suspended. To stop being charged, release the instance.
6. When a DTS instance has been suspended for more than 7 days, it cannot be resumed, and its status changes from suspended to failed.
## Import
DTS Synchronization Job can be imported using the id, e.g.
```sh
$ pulumi import alicloud:dts/synchronizationJob:SynchronizationJob example <id>
```
:param str resource_name: The name of the resource.
:param SynchronizationJobArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(SynchronizationJobArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
checkpoint: Optional[pulumi.Input[str]] = None,
data_initialization: Optional[pulumi.Input[bool]] = None,
data_synchronization: Optional[pulumi.Input[bool]] = None,
db_list: Optional[pulumi.Input[str]] = None,
delay_notice: Optional[pulumi.Input[bool]] = None,
delay_phone: Optional[pulumi.Input[str]] = None,
delay_rule_time: Optional[pulumi.Input[str]] = None,
destination_endpoint_database_name: Optional[pulumi.Input[str]] = None,
destination_endpoint_engine_name: Optional[pulumi.Input[str]] = None,
destination_endpoint_instance_id: Optional[pulumi.Input[str]] = None,
destination_endpoint_instance_type: Optional[pulumi.Input[str]] = None,
destination_endpoint_ip: Optional[pulumi.Input[str]] = None,
destination_endpoint_oracle_sid: Optional[pulumi.Input[str]] = None,
destination_endpoint_password: Optional[pulumi.Input[str]] = None,
destination_endpoint_port: Optional[pulumi.Input[str]] = None,
destination_endpoint_region: Optional[pulumi.Input[str]] = None,
destination_endpoint_user_name: Optional[pulumi.Input[str]] = None,
dts_instance_id: Optional[pulumi.Input[str]] = None,
dts_job_name: Optional[pulumi.Input[str]] = None,
error_notice: Optional[pulumi.Input[bool]] = None,
error_phone: Optional[pulumi.Input[str]] = None,
instance_class: Optional[pulumi.Input[str]] = None,
reserve: Optional[pulumi.Input[str]] = None,
source_endpoint_database_name: Optional[pulumi.Input[str]] = None,
source_endpoint_engine_name: Optional[pulumi.Input[str]] = None,
source_endpoint_instance_id: Optional[pulumi.Input[str]] = None,
source_endpoint_instance_type: Optional[pulumi.Input[str]] = None,
source_endpoint_ip: Optional[pulumi.Input[str]] = None,
source_endpoint_oracle_sid: Optional[pulumi.Input[str]] = None,
source_endpoint_owner_id: Optional[pulumi.Input[str]] = None,
source_endpoint_password: Optional[pulumi.Input[str]] = None,
source_endpoint_port: Optional[pulumi.Input[str]] = None,
source_endpoint_region: Optional[pulumi.Input[str]] = None,
source_endpoint_role: Optional[pulumi.Input[str]] = None,
source_endpoint_user_name: Optional[pulumi.Input[str]] = None,
status: Optional[pulumi.Input[str]] = None,
structure_initialization: Optional[pulumi.Input[bool]] = None,
synchronization_direction: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = SynchronizationJobArgs.__new__(SynchronizationJobArgs)
__props__.__dict__["checkpoint"] = checkpoint
if data_initialization is None and not opts.urn:
raise TypeError("Missing required property 'data_initialization'")
__props__.__dict__["data_initialization"] = data_initialization
if data_synchronization is None and not opts.urn:
raise TypeError("Missing required property 'data_synchronization'")
__props__.__dict__["data_synchronization"] = data_synchronization
if db_list is None and not opts.urn:
raise TypeError("Missing required property 'db_list'")
__props__.__dict__["db_list"] = db_list
__props__.__dict__["delay_notice"] = delay_notice
__props__.__dict__["delay_phone"] = delay_phone
__props__.__dict__["delay_rule_time"] = delay_rule_time
__props__.__dict__["destination_endpoint_database_name"] = destination_endpoint_database_name
if destination_endpoint_engine_name is None and not opts.urn:
raise TypeError("Missing required property 'destination_endpoint_engine_name'")
__props__.__dict__["destination_endpoint_engine_name"] = destination_endpoint_engine_name
__props__.__dict__["destination_endpoint_instance_id"] = destination_endpoint_instance_id
if destination_endpoint_instance_type is None and not opts.urn:
raise TypeError("Missing required property 'destination_endpoint_instance_type'")
__props__.__dict__["destination_endpoint_instance_type"] = destination_endpoint_instance_type
__props__.__dict__["destination_endpoint_ip"] = destination_endpoint_ip
__props__.__dict__["destination_endpoint_oracle_sid"] = destination_endpoint_oracle_sid
__props__.__dict__["destination_endpoint_password"] = destination_endpoint_password
__props__.__dict__["destination_endpoint_port"] = destination_endpoint_port
__props__.__dict__["destination_endpoint_region"] = destination_endpoint_region
__props__.__dict__["destination_endpoint_user_name"] = destination_endpoint_user_name
if dts_instance_id is None and not opts.urn:
raise TypeError("Missing required property 'dts_instance_id'")
__props__.__dict__["dts_instance_id"] = dts_instance_id
__props__.__dict__["dts_job_name"] = dts_job_name
__props__.__dict__["error_notice"] = error_notice
__props__.__dict__["error_phone"] = error_phone
__props__.__dict__["instance_class"] = instance_class
__props__.__dict__["reserve"] = reserve
__props__.__dict__["source_endpoint_database_name"] = source_endpoint_database_name
if source_endpoint_engine_name is None and not opts.urn:
raise TypeError("Missing required property 'source_endpoint_engine_name'")
__props__.__dict__["source_endpoint_engine_name"] = source_endpoint_engine_name
__props__.__dict__["source_endpoint_instance_id"] = source_endpoint_instance_id
if source_endpoint_instance_type is None and not opts.urn:
raise TypeError("Missing required property 'source_endpoint_instance_type'")
__props__.__dict__["source_endpoint_instance_type"] = source_endpoint_instance_type
__props__.__dict__["source_endpoint_ip"] = source_endpoint_ip
__props__.__dict__["source_endpoint_oracle_sid"] = source_endpoint_oracle_sid
__props__.__dict__["source_endpoint_owner_id"] = source_endpoint_owner_id
__props__.__dict__["source_endpoint_password"] = source_endpoint_password
__props__.__dict__["source_endpoint_port"] = source_endpoint_port
__props__.__dict__["source_endpoint_region"] = source_endpoint_region
__props__.__dict__["source_endpoint_role"] = source_endpoint_role
__props__.__dict__["source_endpoint_user_name"] = source_endpoint_user_name
__props__.__dict__["status"] = status
if structure_initialization is None and not opts.urn:
raise TypeError("Missing required property 'structure_initialization'")
__props__.__dict__["structure_initialization"] = structure_initialization
__props__.__dict__["synchronization_direction"] = synchronization_direction
super(SynchronizationJob, __self__).__init__(
'alicloud:dts/synchronizationJob:SynchronizationJob',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
checkpoint: Optional[pulumi.Input[str]] = None,
data_initialization: Optional[pulumi.Input[bool]] = None,
data_synchronization: Optional[pulumi.Input[bool]] = None,
db_list: Optional[pulumi.Input[str]] = None,
delay_notice: Optional[pulumi.Input[bool]] = None,
delay_phone: Optional[pulumi.Input[str]] = None,
delay_rule_time: Optional[pulumi.Input[str]] = None,
destination_endpoint_database_name: Optional[pulumi.Input[str]] = None,
destination_endpoint_engine_name: Optional[pulumi.Input[str]] = None,
destination_endpoint_instance_id: Optional[pulumi.Input[str]] = None,
destination_endpoint_instance_type: Optional[pulumi.Input[str]] = None,
destination_endpoint_ip: Optional[pulumi.Input[str]] = None,
destination_endpoint_oracle_sid: Optional[pulumi.Input[str]] = None,
destination_endpoint_password: Optional[pulumi.Input[str]] = None,
destination_endpoint_port: Optional[pulumi.Input[str]] = None,
destination_endpoint_region: Optional[pulumi.Input[str]] = None,
destination_endpoint_user_name: Optional[pulumi.Input[str]] = None,
dts_instance_id: Optional[pulumi.Input[str]] = None,
dts_job_name: Optional[pulumi.Input[str]] = None,
error_notice: Optional[pulumi.Input[bool]] = None,
error_phone: Optional[pulumi.Input[str]] = None,
instance_class: Optional[pulumi.Input[str]] = None,
reserve: Optional[pulumi.Input[str]] = None,
source_endpoint_database_name: Optional[pulumi.Input[str]] = None,
source_endpoint_engine_name: Optional[pulumi.Input[str]] = None,
source_endpoint_instance_id: Optional[pulumi.Input[str]] = None,
source_endpoint_instance_type: Optional[pulumi.Input[str]] = None,
source_endpoint_ip: Optional[pulumi.Input[str]] = None,
source_endpoint_oracle_sid: Optional[pulumi.Input[str]] = None,
source_endpoint_owner_id: Optional[pulumi.Input[str]] = None,
source_endpoint_password: Optional[pulumi.Input[str]] = None,
source_endpoint_port: Optional[pulumi.Input[str]] = None,
source_endpoint_region: Optional[pulumi.Input[str]] = None,
source_endpoint_role: Optional[pulumi.Input[str]] = None,
source_endpoint_user_name: Optional[pulumi.Input[str]] = None,
status: Optional[pulumi.Input[str]] = None,
structure_initialization: Optional[pulumi.Input[bool]] = None,
synchronization_direction: Optional[pulumi.Input[str]] = None) -> 'SynchronizationJob':
"""
Get an existing SynchronizationJob resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] checkpoint: Start time in Unix timestamp format.
        :param pulumi.Input[bool] data_initialization: Whether to perform full data initialization before synchronization. Valid values: `true`, `false`.
        :param pulumi.Input[bool] data_synchronization: Whether to perform incremental data synchronization. Valid values: `true`, `false`.
:param pulumi.Input[str] db_list: Migration object, in the format of JSON strings. For detailed definition instructions, please refer to [the description of migration, synchronization or subscription objects](https://help.aliyun.com/document_detail/209545.html).
:param pulumi.Input[bool] delay_notice: The delay notice. Valid values: `true`, `false`.
        :param pulumi.Input[str] delay_phone: The mobile phone number(s) of the contact(s) to notify when the delay alarm is triggered. Separate multiple numbers with commas (`,`). Currently only mainland China mobile phone numbers are supported, and at most 10 numbers can be passed in.
        :param pulumi.Input[str] delay_rule_time: The threshold, in seconds (an integer), for triggering the delay alarm. This parameter is required when `delay_notice` is set to `true`. Set the threshold according to business needs; a value above 10 seconds is recommended to avoid false alarms caused by network and database load fluctuations.
        :param pulumi.Input[str] destination_endpoint_database_name: The name of the destination database.
:param pulumi.Input[str] destination_endpoint_engine_name: The type of destination database. Valid values: `ADB20`, `ADB30`, `AS400`, `DATAHUB`, `DB2`, `GREENPLUM`, `KAFKA`, `MONGODB`, `MSSQL`, `MySQL`, `ORACLE`, `PolarDB`, `POLARDBX20`, `POLARDB_O`, `PostgreSQL`.
:param pulumi.Input[str] destination_endpoint_instance_id: The ID of destination instance.
:param pulumi.Input[str] destination_endpoint_instance_type: The type of destination instance. Valid values: `ads`, `CEN`, `DATAHUB`, `DG`, `ECS`, `EXPRESS`, `GREENPLUM`, `MONGODB`, `OTHER`, `PolarDB`, `POLARDBX20`, `RDS`.
        :param pulumi.Input[str] destination_endpoint_ip: The IP address of the destination endpoint.
:param pulumi.Input[str] destination_endpoint_oracle_sid: The SID of Oracle database.
:param pulumi.Input[str] destination_endpoint_password: The password of database account.
        :param pulumi.Input[str] destination_endpoint_port: The port of the destination endpoint.
:param pulumi.Input[str] destination_endpoint_region: The region of destination instance.
:param pulumi.Input[str] destination_endpoint_user_name: The username of database account.
:param pulumi.Input[str] dts_instance_id: Synchronizing instance ID. The ID of `dts.SynchronizationInstance`.
:param pulumi.Input[str] dts_job_name: The name of synchronization job.
:param pulumi.Input[bool] error_notice: The error notice. Valid values: `true`, `false`.
        :param pulumi.Input[str] error_phone: The mobile phone number(s) of the contact(s) to notify when an error alarm is triggered. Separate multiple numbers with commas (`,`). Currently only mainland China mobile phone numbers are supported, and at most 10 numbers can be passed in.
        :param pulumi.Input[str] instance_class: The instance class. Valid values: `large`, `medium`, `micro`, `small`, `xlarge`, `xxlarge`. The configuration can only be upgraded through this resource, not downgraded; to downgrade the instance, [submit a ticket](https://selfservice.console.aliyun.com/ticket/category/dts/today).
        :param pulumi.Input[str] reserve: Reserved DTS parameters, formatted as a JSON string. Use this parameter to supply additional source and target database information, such as the data storage format of a target Kafka database or the instance ID of a Cloud Enterprise Network (CEN) instance. For more information, see the [description of the Reserve parameter](https://help.aliyun.com/document_detail/273111.html).
        :param pulumi.Input[str] source_endpoint_database_name: The name of the database to migrate.
:param pulumi.Input[str] source_endpoint_engine_name: The type of source database. Valid values: `AS400`, `DB2`, `DMSPOLARDB`, `HBASE`, `MONGODB`, `MSSQL`, `MySQL`, `ORACLE`, `PolarDB`, `POLARDBX20`, `POLARDB_O`, `POSTGRESQL`, `TERADATA`.
:param pulumi.Input[str] source_endpoint_instance_id: The ID of source instance.
:param pulumi.Input[str] source_endpoint_instance_type: The type of source instance. Valid values: `CEN`, `DG`, `DISTRIBUTED_DMSLOGICDB`, `ECS`, `EXPRESS`, `MONGODB`, `OTHER`, `PolarDB`, `POLARDBX20`, `RDS`.
:param pulumi.Input[str] source_endpoint_ip: The ip of source endpoint.
:param pulumi.Input[str] source_endpoint_oracle_sid: The SID of Oracle database.
:param pulumi.Input[str] source_endpoint_owner_id: The Alibaba Cloud account ID to which the source instance belongs.
:param pulumi.Input[str] source_endpoint_password: The password of database account.
:param pulumi.Input[str] source_endpoint_port: The port of source endpoint.
:param pulumi.Input[str] source_endpoint_region: The region of source instance.
:param pulumi.Input[str] source_endpoint_role: The name of the role configured for the cloud account to which the source instance belongs.
:param pulumi.Input[str] source_endpoint_user_name: The username of database account.
:param pulumi.Input[str] status: The status of the resource. Valid values: `Synchronizing`, `Suspending`. You can stop the task by specifying `Suspending` and start the task by specifying `Synchronizing`.
        :param pulumi.Input[bool] structure_initialization: Whether to perform structure (schema) initialization before synchronization. Valid values: `true`, `false`.
        :param pulumi.Input[str] synchronization_direction: The synchronization direction. Valid values: `Forward`, `Reverse`. Pass this parameter only when the `sync_architecture` property of the `dts.SynchronizationInstance` is `bidirectional`; otherwise, do not specify it.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _SynchronizationJobState.__new__(_SynchronizationJobState)
__props__.__dict__["checkpoint"] = checkpoint
__props__.__dict__["data_initialization"] = data_initialization
__props__.__dict__["data_synchronization"] = data_synchronization
__props__.__dict__["db_list"] = db_list
__props__.__dict__["delay_notice"] = delay_notice
__props__.__dict__["delay_phone"] = delay_phone
__props__.__dict__["delay_rule_time"] = delay_rule_time
__props__.__dict__["destination_endpoint_database_name"] = destination_endpoint_database_name
__props__.__dict__["destination_endpoint_engine_name"] = destination_endpoint_engine_name
__props__.__dict__["destination_endpoint_instance_id"] = destination_endpoint_instance_id
__props__.__dict__["destination_endpoint_instance_type"] = destination_endpoint_instance_type
__props__.__dict__["destination_endpoint_ip"] = destination_endpoint_ip
__props__.__dict__["destination_endpoint_oracle_sid"] = destination_endpoint_oracle_sid
__props__.__dict__["destination_endpoint_password"] = destination_endpoint_password
__props__.__dict__["destination_endpoint_port"] = destination_endpoint_port
__props__.__dict__["destination_endpoint_region"] = destination_endpoint_region
__props__.__dict__["destination_endpoint_user_name"] = destination_endpoint_user_name
__props__.__dict__["dts_instance_id"] = dts_instance_id
__props__.__dict__["dts_job_name"] = dts_job_name
__props__.__dict__["error_notice"] = error_notice
__props__.__dict__["error_phone"] = error_phone
__props__.__dict__["instance_class"] = instance_class
__props__.__dict__["reserve"] = reserve
__props__.__dict__["source_endpoint_database_name"] = source_endpoint_database_name
__props__.__dict__["source_endpoint_engine_name"] = source_endpoint_engine_name
__props__.__dict__["source_endpoint_instance_id"] = source_endpoint_instance_id
__props__.__dict__["source_endpoint_instance_type"] = source_endpoint_instance_type
__props__.__dict__["source_endpoint_ip"] = source_endpoint_ip
__props__.__dict__["source_endpoint_oracle_sid"] = source_endpoint_oracle_sid
__props__.__dict__["source_endpoint_owner_id"] = source_endpoint_owner_id
__props__.__dict__["source_endpoint_password"] = source_endpoint_password
__props__.__dict__["source_endpoint_port"] = source_endpoint_port
__props__.__dict__["source_endpoint_region"] = source_endpoint_region
__props__.__dict__["source_endpoint_role"] = source_endpoint_role
__props__.__dict__["source_endpoint_user_name"] = source_endpoint_user_name
__props__.__dict__["status"] = status
__props__.__dict__["structure_initialization"] = structure_initialization
__props__.__dict__["synchronization_direction"] = synchronization_direction
return SynchronizationJob(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter
def checkpoint(self) -> pulumi.Output[str]:
"""
Start time in Unix timestamp format.
"""
return pulumi.get(self, "checkpoint")
@property
@pulumi.getter(name="dataInitialization")
def data_initialization(self) -> pulumi.Output[bool]:
"""
        Whether to perform full data initialization before synchronization. Valid values: `true`, `false`.
"""
return pulumi.get(self, "data_initialization")
@property
@pulumi.getter(name="dataSynchronization")
def data_synchronization(self) -> pulumi.Output[bool]:
"""
        Whether to perform incremental data synchronization. Valid values: `true`, `false`.
"""
return pulumi.get(self, "data_synchronization")
@property
@pulumi.getter(name="dbList")
def db_list(self) -> pulumi.Output[str]:
"""
Migration object, in the format of JSON strings. For detailed definition instructions, please refer to [the description of migration, synchronization or subscription objects](https://help.aliyun.com/document_detail/209545.html).
"""
return pulumi.get(self, "db_list")
@property
@pulumi.getter(name="delayNotice")
def delay_notice(self) -> pulumi.Output[Optional[bool]]:
"""
The delay notice. Valid values: `true`, `false`.
"""
return pulumi.get(self, "delay_notice")
@property
@pulumi.getter(name="delayPhone")
def delay_phone(self) -> pulumi.Output[Optional[str]]:
"""
        The mobile phone number(s) of the contact(s) to notify when the delay alarm is triggered. Separate multiple numbers with commas (`,`). Currently only mainland China mobile phone numbers are supported, and at most 10 numbers can be passed in.
"""
return pulumi.get(self, "delay_phone")
@property
@pulumi.getter(name="delayRuleTime")
def delay_rule_time(self) -> pulumi.Output[Optional[str]]:
"""
        The threshold, in seconds (an integer), for triggering the delay alarm. This parameter is required when `delay_notice` is set to `true`. Set the threshold according to business needs; a value above 10 seconds is recommended to avoid false alarms caused by network and database load fluctuations.
"""
return pulumi.get(self, "delay_rule_time")
@property
@pulumi.getter(name="destinationEndpointDatabaseName")
    def destination_endpoint_database_name(self) -> pulumi.Output[Optional[str]]:
        """
        The name of the destination database.
        """
        return pulumi.get(self, "destination_endpoint_database_name")
@property
@pulumi.getter(name="destinationEndpointEngineName")
def destination_endpoint_engine_name(self) -> pulumi.Output[str]:
"""
The type of destination database. Valid values: `ADB20`, `ADB30`, `AS400`, `DATAHUB`, `DB2`, `GREENPLUM`, `KAFKA`, `MONGODB`, `MSSQL`, `MySQL`, `ORACLE`, `PolarDB`, `POLARDBX20`, `POLARDB_O`, `PostgreSQL`.
"""
return pulumi.get(self, "destination_endpoint_engine_name")
@property
@pulumi.getter(name="destinationEndpointInstanceId")
def destination_endpoint_instance_id(self) -> pulumi.Output[Optional[str]]:
"""
The ID of destination instance.
"""
return pulumi.get(self, "destination_endpoint_instance_id")
@property
@pulumi.getter(name="destinationEndpointInstanceType")
def destination_endpoint_instance_type(self) -> pulumi.Output[str]:
"""
The type of destination instance. Valid values: `ads`, `CEN`, `DATAHUB`, `DG`, `ECS`, `EXPRESS`, `GREENPLUM`, `MONGODB`, `OTHER`, `PolarDB`, `POLARDBX20`, `RDS`.
"""
return pulumi.get(self, "destination_endpoint_instance_type")
@property
@pulumi.getter(name="destinationEndpointIp")
def destination_endpoint_ip(self) -> pulumi.Output[Optional[str]]:
"""
        The IP address of the destination endpoint.
"""
return pulumi.get(self, "destination_endpoint_ip")
@property
@pulumi.getter(name="destinationEndpointOracleSid")
def destination_endpoint_oracle_sid(self) -> pulumi.Output[Optional[str]]:
"""
The SID of Oracle database.
"""
return pulumi.get(self, "destination_endpoint_oracle_sid")
@property
@pulumi.getter(name="destinationEndpointPassword")
def destination_endpoint_password(self) -> pulumi.Output[Optional[str]]:
"""
The password of database account.
"""
return pulumi.get(self, "destination_endpoint_password")
@property
@pulumi.getter(name="destinationEndpointPort")
def destination_endpoint_port(self) -> pulumi.Output[Optional[str]]:
"""
        The port of the destination endpoint.
"""
return pulumi.get(self, "destination_endpoint_port")
@property
@pulumi.getter(name="destinationEndpointRegion")
def destination_endpoint_region(self) -> pulumi.Output[Optional[str]]:
"""
The region of destination instance.
"""
return pulumi.get(self, "destination_endpoint_region")
@property
@pulumi.getter(name="destinationEndpointUserName")
def destination_endpoint_user_name(self) -> pulumi.Output[Optional[str]]:
"""
The username of database account.
"""
return pulumi.get(self, "destination_endpoint_user_name")
@property
@pulumi.getter(name="dtsInstanceId")
def dts_instance_id(self) -> pulumi.Output[str]:
"""
Synchronizing instance ID. The ID of `dts.SynchronizationInstance`.
"""
return pulumi.get(self, "dts_instance_id")
@property
@pulumi.getter(name="dtsJobName")
def dts_job_name(self) -> pulumi.Output[str]:
"""
The name of synchronization job.
"""
return pulumi.get(self, "dts_job_name")
@property
@pulumi.getter(name="errorNotice")
def error_notice(self) -> pulumi.Output[Optional[bool]]:
"""
The error notice. Valid values: `true`, `false`.
"""
return pulumi.get(self, "error_notice")
@property
@pulumi.getter(name="errorPhone")
def error_phone(self) -> pulumi.Output[Optional[str]]:
"""
        The mobile phone number(s) of the contact(s) to notify when an error alarm is triggered. Separate multiple numbers with commas (`,`). Currently only mainland China mobile phone numbers are supported, and at most 10 numbers can be passed in.
"""
return pulumi.get(self, "error_phone")
@property
@pulumi.getter(name="instanceClass")
def instance_class(self) -> pulumi.Output[str]:
"""
        The instance class. Valid values: `large`, `medium`, `micro`, `small`, `xlarge`, `xxlarge`. The configuration can only be upgraded through this resource, not downgraded; to downgrade the instance, [submit a ticket](https://selfservice.console.aliyun.com/ticket/category/dts/today).
"""
return pulumi.get(self, "instance_class")
@property
@pulumi.getter
def reserve(self) -> pulumi.Output[Optional[str]]:
"""
        Reserved DTS parameters, formatted as a JSON string. Use this parameter to supply additional source and target database information, such as the data storage format of a target Kafka database or the instance ID of a Cloud Enterprise Network (CEN) instance. For more information, see the [description of the Reserve parameter](https://help.aliyun.com/document_detail/273111.html).
"""
return pulumi.get(self, "reserve")
@property
@pulumi.getter(name="sourceEndpointDatabaseName")
def source_endpoint_database_name(self) -> pulumi.Output[Optional[str]]:
"""
        The name of the database to migrate.
"""
return pulumi.get(self, "source_endpoint_database_name")
@property
@pulumi.getter(name="sourceEndpointEngineName")
def source_endpoint_engine_name(self) -> pulumi.Output[str]:
"""
The type of source database. Valid values: `AS400`, `DB2`, `DMSPOLARDB`, `HBASE`, `MONGODB`, `MSSQL`, `MySQL`, `ORACLE`, `PolarDB`, `POLARDBX20`, `POLARDB_O`, `POSTGRESQL`, `TERADATA`.
"""
return pulumi.get(self, "source_endpoint_engine_name")
@property
@pulumi.getter(name="sourceEndpointInstanceId")
def source_endpoint_instance_id(self) -> pulumi.Output[Optional[str]]:
"""
The ID of source instance.
"""
return pulumi.get(self, "source_endpoint_instance_id")
@property
@pulumi.getter(name="sourceEndpointInstanceType")
def source_endpoint_instance_type(self) -> pulumi.Output[str]:
"""
The type of source instance. Valid values: `CEN`, `DG`, `DISTRIBUTED_DMSLOGICDB`, `ECS`, `EXPRESS`, `MONGODB`, `OTHER`, `PolarDB`, `POLARDBX20`, `RDS`.
"""
return pulumi.get(self, "source_endpoint_instance_type")
@property
@pulumi.getter(name="sourceEndpointIp")
def source_endpoint_ip(self) -> pulumi.Output[Optional[str]]:
"""
The ip of source endpoint.
"""
return pulumi.get(self, "source_endpoint_ip")
@property
@pulumi.getter(name="sourceEndpointOracleSid")
def source_endpoint_oracle_sid(self) -> pulumi.Output[Optional[str]]:
"""
The SID of Oracle database.
"""
return pulumi.get(self, "source_endpoint_oracle_sid")
@property
@pulumi.getter(name="sourceEndpointOwnerId")
def source_endpoint_owner_id(self) -> pulumi.Output[Optional[str]]:
"""
The Alibaba Cloud account ID to which the source instance belongs.
"""
return pulumi.get(self, "source_endpoint_owner_id")
@property
@pulumi.getter(name="sourceEndpointPassword")
def source_endpoint_password(self) -> pulumi.Output[Optional[str]]:
"""
The password of database account.
"""
return pulumi.get(self, "source_endpoint_password")
@property
@pulumi.getter(name="sourceEndpointPort")
def source_endpoint_port(self) -> pulumi.Output[Optional[str]]:
"""
The port of source endpoint.
"""
return pulumi.get(self, "source_endpoint_port")
@property
@pulumi.getter(name="sourceEndpointRegion")
def source_endpoint_region(self) -> pulumi.Output[Optional[str]]:
"""
The region of source instance.
"""
return pulumi.get(self, "source_endpoint_region")
@property
@pulumi.getter(name="sourceEndpointRole")
def source_endpoint_role(self) -> pulumi.Output[Optional[str]]:
"""
The name of the role configured for the cloud account to which the source instance belongs.
"""
return pulumi.get(self, "source_endpoint_role")
@property
@pulumi.getter(name="sourceEndpointUserName")
def source_endpoint_user_name(self) -> pulumi.Output[Optional[str]]:
"""
The username of database account.
"""
return pulumi.get(self, "source_endpoint_user_name")
@property
@pulumi.getter
def status(self) -> pulumi.Output[str]:
"""
The status of the resource. Valid values: `Synchronizing`, `Suspending`. You can stop the task by specifying `Suspending` and start the task by specifying `Synchronizing`.
"""
return pulumi.get(self, "status")
@property
@pulumi.getter(name="structureInitialization")
def structure_initialization(self) -> pulumi.Output[bool]:
"""
        Whether to perform structure (schema) initialization before synchronization. Valid values: `true`, `false`.
"""
return pulumi.get(self, "structure_initialization")
@property
@pulumi.getter(name="synchronizationDirection")
def synchronization_direction(self) -> pulumi.Output[str]:
"""
        The synchronization direction. Valid values: `Forward`, `Reverse`. Pass this parameter only when the `sync_architecture` property of the `dts.SynchronizationInstance` is `bidirectional`; otherwise, do not specify it.
"""
return pulumi.get(self, "synchronization_direction")
# zoomus/components/webinar.py (adh-wonolo/zoomus, Apache-2.0)
"""Zoom.us REST API Python Client -- Webinar component"""
from __future__ import absolute_import
from zoomus import util
from zoomus.components import base
class WebinarComponent(base.BaseComponent):
"""Component dealing with all webinar related matters"""
def list(self, **kwargs):
util.require_keys(kwargs, "host_id")
if kwargs.get("start_time"):
kwargs["start_time"] = util.date_to_str(kwargs["start_time"])
return self.post_request("/webinar/list", params=kwargs)
def upcoming(self, **kwargs):
util.require_keys(kwargs, "host_id")
if kwargs.get("start_time"):
kwargs["start_time"] = util.date_to_str(kwargs["start_time"])
return self.post_request("/webinar/list/registration", params=kwargs)
def create(self, **kwargs):
util.require_keys(kwargs, ["host_id", "topic"])
if kwargs.get("start_time"):
kwargs["start_time"] = util.date_to_str(kwargs["start_time"])
return self.post_request("/webinar/create", params=kwargs)
def update(self, **kwargs):
util.require_keys(kwargs, ["id", "host_id"])
if kwargs.get("start_time"):
kwargs["start_time"] = util.date_to_str(kwargs["start_time"])
return self.post_request("/webinar/update", params=kwargs)
def delete(self, **kwargs):
util.require_keys(kwargs, ["id", "host_id"])
return self.post_request("/webinar/delete", params=kwargs)
def end(self, **kwargs):
util.require_keys(kwargs, ["id", "host_id"])
return self.post_request("/webinar/end", params=kwargs)
def get(self, **kwargs):
util.require_keys(kwargs, ["id", "host_id"])
return self.post_request("/webinar/get", params=kwargs)
def register(self, **kwargs):
util.require_keys(kwargs, ["id", "email", "first_name", "last_name"])
if kwargs.get("start_time"):
kwargs["start_time"] = util.date_to_str(kwargs["start_time"])
return self.post_request("/webinar/register", params=kwargs)
class WebinarComponentV2(base.BaseComponent):
"""Component dealing with all webinar related matters"""
def list(self, **kwargs):
util.require_keys(kwargs, "user_id")
return self.get_request(
"/users/{}/webinars".format(kwargs.get("user_id")), params=kwargs
)
def create(self, **kwargs):
util.require_keys(kwargs, "user_id")
return self.post_request(
"/users/{}/webinars".format(kwargs.get("user_id")), params=kwargs
)
def update(self, **kwargs):
util.require_keys(kwargs, "id")
return self.patch_request(
"/webinars/{}".format(kwargs.get("id")), params=kwargs
)
def delete(self, **kwargs):
util.require_keys(kwargs, "id")
return self.delete_request(
"/webinars/{}".format(kwargs.get("id")), params=kwargs
)
def end(self, **kwargs):
util.require_keys(kwargs, "id")
return self.put_request(
"/webinars/{}/status".format(kwargs.get("id")), params={"status": "end"}
)
def get(self, **kwargs):
util.require_keys(kwargs, "id")
return self.get_request("/webinars/{}".format(kwargs.get("id")), params=kwargs)
def register(self, **kwargs):
util.require_keys(kwargs, ["id", "email", "first_name", "last_name"])
return self.post_request(
"/webinars/{}/registrants".format(kwargs.get("id")), params=kwargs
)
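Every component method above starts by calling `util.require_keys` before building the request. A sketch of what that helper is expected to do; the real zoomus implementation may differ in details such as the exact exception type and message:

```python
# Hedged re-implementation of the require_keys contract used above:
# accept a single key name or a list of names, and fail fast if any is
# missing from the kwargs dict.

def require_keys(d, keys):
    """Raise KeyError if any of `keys` is missing from dict `d`.

    `keys` may be a single key name or a list of key names.
    """
    if isinstance(keys, str):
        keys = [keys]
    for key in keys:
        if key not in d:
            raise KeyError("'{}' must be set".format(key))
    return True
```

Validating up front like this keeps the URL-formatting calls (e.g. `"/webinars/{}".format(kwargs.get("id"))`) from silently producing `/webinars/None`.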

# RNN/basicRNN.py (Coffeexiudou/NLPLearning, MIT)
# -*- coding: utf-8 -*-
"""
Author : kouyafei
date: 2018/4/20
"""
def tanh():
pass
def rnn_cell_forward():
pass
def rnn_cell_backward():
pass
def loss():
    pass
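The stub names above follow the standard single-step RNN decomposition. A NumPy sketch of what `rnn_cell_forward` typically computes; the weight names (`Wax`, `Waa`, `Wya`, `ba`, `by`) and the column-per-example shapes are assumptions, since the stubs fix no interface:

```python
import numpy as np

# Assumed shapes: x is (n_x, m), a_prev is (n_a, m), with one example per
# column; parameters follow the common Wax/Waa/Wya naming convention.
def rnn_cell_forward(x, a_prev, parameters):
    """One forward step of a vanilla RNN cell:
    a_next = tanh(Wax @ x + Waa @ a_prev + ba)
    y_pred = softmax(Wya @ a_next + by)
    """
    Wax, Waa, Wya = parameters["Wax"], parameters["Waa"], parameters["Wya"]
    ba, by = parameters["ba"], parameters["by"]
    a_next = np.tanh(Wax @ x + Waa @ a_prev + ba)
    z = Wya @ a_next + by
    # Numerically stable softmax over the output dimension.
    e = np.exp(z - z.max(axis=0))
    y_pred = e / e.sum(axis=0)
    return a_next, y_pred
```

The backward pass and loss would differentiate through the same expressions, which is what the remaining stubs leave open.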

# rdmo/accounts/tests/test_views.py (berkerY/rdmo, Apache-2.0)
import re
from django.conf import settings
from django.core import mail
from django.urls import reverse
from ..models import User
users = (
('editor', 'editor'),
('reviewer', 'reviewer'),
('user', 'user'),
('api', 'api'),
('anonymous', None),
)
def test_get_profile_update(db, client):
"""
An authorized GET request to the profile update form returns the form.
"""
client.login(username='user', password='user')
url = reverse('profile_update')
response = client.get(url)
assert response.status_code == 200
def test_get_profile_update_redirect(db, client):
"""
An unauthorized GET request to the profile update form gets
redirected to login.
"""
url = reverse('profile_update')
response = client.get(url)
assert response.status_code == 302
assert response.url == reverse('account_login') + '?next=' + url
def test_post_profile_update(db, client):
"""
An authorized POST request to the profile update form updates the
user and redirects to home.
"""
client.login(username='user', password='user')
url = reverse('profile_update')
data = {
'email': 'test@example.com',
'first_name': 'Albert',
'last_name': 'Admin',
'text': 'text',
'textarea': 'textarea'
}
response = client.post(url, data)
if settings.PROFILE_UPDATE:
assert response.status_code == 302
assert response.url == reverse('home')
else:
assert response.status_code == 200
def test_post_profile_update_cancel(db, client):
"""
An authorized POST request to the profile update form updates with
cancel redirects to home.
"""
client.login(username='user', password='user')
url = reverse('profile_update')
data = {
'email': 'test@example.com',
'first_name': 'Albert',
'last_name': 'Admin',
        'cancel': 'cancel'
    }
    response = client.post(url, data)
    if settings.PROFILE_UPDATE:
        assert response.status_code == 302
        assert response.url == reverse('home')
    else:
        assert response.status_code == 200


def test_post_profile_update_cancel2(db, client):
    """
    An authorized POST request to the profile update form updates with
    cancel and the next field redirects to the given url.
    """
    client.login(username='user', password='user')
    url = reverse('profile_update')
    data = {
        'email': 'test@example.com',
        'first_name': 'Albert',
        'last_name': 'Admin',
        'cancel': 'cancel',
        'next': reverse('projects')
    }
    response = client.post(url, data)
    if settings.PROFILE_UPDATE:
        assert response.status_code == 302
        assert response.url == reverse('projects')
    else:
        assert response.status_code == 200


def test_post_profile_update_next(db, client):
    """
    An authorized POST request to the profile update form with next field
    updates the user and redirects to the given url.
    """
    client.login(username='user', password='user')
    url = reverse('profile_update')
    data = {
        'email': 'test@example.com',
        'first_name': 'Albert',
        'last_name': 'Admin',
        'text': 'text',
        'textarea': 'textarea',
        'next': reverse('projects')
    }
    response = client.post(url, data)
    if settings.PROFILE_UPDATE:
        assert response.status_code == 302
        assert response.url == reverse('projects')
    else:
        assert response.status_code == 200


def test_post_profile_update_next2(db, client):
    """
    An authorized POST request to the profile update form with next
    field set to profile_update updates the user and redirects to home.
    """
    client.login(username='user', password='user')
    url = reverse('profile_update')
    data = {
        'email': 'test@example.com',
        'first_name': 'Albert',
        'last_name': 'Admin',
        'text': 'text',
        'textarea': 'textarea',
        'next': reverse('profile_update')
    }
    response = client.post(url, data)
    if settings.PROFILE_UPDATE:
        assert response.status_code == 302
        assert response.url == reverse('home')
    else:
        assert response.status_code == 200


def test_password_change_get(db, client):
    """
    An authorized GET request to the password change form returns the form.
    """
    if settings.ACCOUNT:
        client.login(username='user', password='user')
        url = reverse('account_change_password')
        response = client.get(url)
        assert response.status_code == 200


def test_password_change_post(db, client):
    """
    An authorized POST request to the password change form updates the
    password.
    """
    if settings.ACCOUNT:
        client.login(username='user', password='user')
        url = reverse('account_change_password')
        data = {
            'old_password': 'user',
            'new_password1': 'resu',
            'new_password2': 'resu',
        }
        response = client.post(url, data)
        assert response.status_code == 200


def test_password_reset_get(db, client):
    """
    A GET request to the password reset form returns the form.
    """
    if settings.ACCOUNT:
        url = reverse('account_reset_password')
        response = client.get(url)
        assert response.status_code == 200


def test_password_reset_post_invalid(db, client):
    """
    A POST request to the password reset form with an invalid mail address
    sends no mail.
    """
    if settings.ACCOUNT:
        url = reverse('account_reset_password')
        data = {'email': 'wrong@example.com'}
        response = client.post(url, data)
        assert response.status_code == 200
        assert len(mail.outbox) == 0
def test_password_reset_post_valid(db, client):
    """
    A POST request to the password reset form with a valid mail address
    sends a mail with a correct link.
    """
    if settings.ACCOUNT:
        url = reverse('account_reset_password')
        data = {'email': 'user@example.com'}
        response = client.post(url, data)
        assert response.status_code == 302
        assert response.url == reverse('account_reset_password_done')
        assert len(mail.outbox) == 1

        # get the link from the mail
        urls = re.findall(r'http[s]?://(?:[a-zA-Z]|[0-9]|[$-_@.&+]|[!*\(\),]|(?:%[0-9a-fA-F][0-9a-fA-F]))+', mail.outbox[0].body)
        assert len(urls) == 1

        # get the password_reset page
        response = client.get(urls[0])
        assert response.status_code == 302
        assert response.url == reverse('account_reset_password_from_key', args=['4', 'set-password'])
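The URL-scraping step in the test above is a plain `re.findall` with that character-class pattern. A standalone sketch outside the Django test run (the mail body and reset path below are made up for illustration):

```python
import re

# Same pattern the test uses; a raw string avoids invalid-escape warnings.
URL_RE = r'http[s]?://(?:[a-zA-Z]|[0-9]|[$-_@.&+]|[!*\(\),]|(?:%[0-9a-fA-F][0-9a-fA-F]))+'

body = ("Hello,\n"
        "Use this link to reset your password:\n"
        "https://example.com/accounts/password/reset/key/4-xyz/\n")

# Only non-capturing groups are used, so findall returns the whole matches.
urls = re.findall(URL_RE, body)
print(urls)  # ['https://example.com/accounts/password/reset/key/4-xyz/']
```

Because the `$-_` range happens to cover `/`, `:`, `.` and the digits, the match runs to the end of the URL and stops at the newline.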
def test_remove_user_get(db, client):
    if settings.PROFILE_DELETE:
        client.login(username='user', password='user')
        url = reverse('profile_remove')
        response = client.get(url)
        assert response.status_code == 200


def test_remove_user_post(db, client):
    if settings.PROFILE_DELETE:
        client.login(username='user', password='user')
        url = reverse('profile_remove')
        data = {
            'email': 'user@example.com',
            'password': 'user',
            'consent': True
        }
        response = client.post(url, data)
        assert response.status_code == 200
        assert not User.objects.filter(username='user').exists()


def test_remove_user_post_invalid_email(db, client):
    if settings.PROFILE_DELETE:
        client.login(username='user', password='user')
        url = reverse('profile_remove')
        data = {
            'email': 'invalid',
            'password': 'user',
            'consent': True
        }
        response = client.post(url, data)
        assert response.status_code == 200
        assert User.objects.filter(username='user').exists()


def test_remove_user_post_invalid_password(db, client):
    if settings.PROFILE_DELETE:
        client.login(username='user', password='user')
        url = reverse('profile_remove')
        data = {
            'email': 'user@example.com',
            'password': 'invalid',
            'consent': True
        }
        response = client.post(url, data)
        assert response.status_code == 200
        assert User.objects.filter(username='user').exists()


def test_remove_user_post_invalid_consent(db, client):
    if settings.PROFILE_DELETE:
        client.login(username='user', password='user')
        url = reverse('profile_remove')
        data = {
            'email': 'user@example.com',
            'password': 'user',
            'consent': False
        }
        response = client.post(url, data)
        assert response.status_code == 200
        assert User.objects.filter(username='user').exists()


def test_signup(db, client):
    url = reverse('account_signup')
    response = client.post(url, {
        'email': 'test@example.com',
        'username': 'test',
        'first_name': 'test',
        'last_name': 'test',
        'password1': 'test',
        'password2': 'test',
    })
    assert response.status_code == 302
    assert response.url == '/'


def test_signup_next(db, client):
    url = reverse('account_signup') + '?next=/about/'
    response = client.post(url, {
        'email': 'test@example.com',
        'username': 'test',
        'first_name': 'test',
        'last_name': 'test',
        'password1': 'test',
        'password2': 'test',
    })
    assert response.status_code == 302
    assert response.url == '/about/'
"""This code was modified from an OpenAI baselines code - baselines0/baselines0/deepq/models.py
"""
import numpy as np
import tensorflow as tf
import tensorflow.contrib.layers as layers


def wrap_atari_dqn(env):
    from baselines0.common.atari_wrappers import wrap_deepmind
    return wrap_deepmind(env, frame_stack=True, scale=True)
def _mlp(hiddens, inpt, num_actions, scope, reuse=False, layer_norm=False, init_mean=1.0, init_sd=20.0):
    with tf.variable_scope(scope, reuse=reuse):
        out = inpt
        for hidden in hiddens:
            out = layers.fully_connected(out, num_outputs=hidden, activation_fn=None)
            if layer_norm:
                out = layers.layer_norm(out, center=True, scale=True)
            out = tf.nn.relu(out)
        # First half of the output head is initialized to init_mean,
        # the second half to -log(init_sd).
        bias_init = [init_mean for _ in range(int(num_actions / 2))]
        bias_init.extend([-np.log(init_sd) for _ in range(int(num_actions / 2))])
        q_out = layers.fully_connected(out,
                                       num_outputs=num_actions,
                                       activation_fn=None,
                                       weights_initializer=tf.zeros_initializer(),
                                       biases_initializer=tf.constant_initializer(bias_init))
        return q_out
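The `bias_init` construction above sets the first half of the outputs to `init_mean` and the second half to `-log(init_sd)`, which suggests the head packs a mean and a log-scale term per action; that pairing is an inference from the initializer, not a statement in this file. The arithmetic in isolation (pure Python, no TensorFlow; `make_bias_init` is a name made up for this demo):

```python
import math

def make_bias_init(num_actions, init_mean=1.0, init_sd=20.0):
    # First half of the output head: initialized to the prior mean.
    bias = [init_mean for _ in range(num_actions // 2)]
    # Second half: -log(sd), the paired "spread" outputs.
    bias.extend([-math.log(init_sd) for _ in range(num_actions // 2)])
    return bias

print(make_bias_init(4))  # [1.0, 1.0, -2.995..., -2.995...]
```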
def mlp(hiddens=[], layer_norm=False, init_mean=1.0, init_sd=20.0):
    """This model takes as input an observation and returns values of all actions.

    Parameters
    ----------
    hiddens: [int]
        list of sizes of hidden layers

    Returns
    -------
    q_func: function
        q_function for DQN algorithm.
    """
    return lambda *args, **kwargs: _mlp(hiddens, layer_norm=layer_norm, init_mean=init_mean, init_sd=init_sd, *args, **kwargs)
def _cnn_to_mlp(convs, hiddens, dueling, inpt, num_actions, scope, reuse=False, layer_norm=False, init_mean=1.0, init_sd=20.0):
    with tf.variable_scope(scope, reuse=reuse):
        out = inpt
        with tf.variable_scope("convnet"):
            for num_outputs, kernel_size, stride in convs:
                out = layers.convolution2d(out,
                                           num_outputs=num_outputs,  # number of output filters
                                           kernel_size=kernel_size,  # filter spatial dimension
                                           stride=stride,
                                           activation_fn=tf.nn.relu)
        conv_out = layers.flatten(out)

        with tf.variable_scope("action_value"):
            action_out = conv_out
            for hidden in hiddens:
                action_out = layers.fully_connected(action_out, num_outputs=hidden, activation_fn=None)
                if layer_norm:
                    action_out = layers.layer_norm(action_out, center=True, scale=True)
                action_out = tf.nn.relu(action_out)
            # action_scores = layers.fully_connected(action_out, num_outputs=num_actions, activation_fn=None)
            bias_init = [init_mean for _ in range(int(num_actions / 2))]
            bias_init.extend([-np.log(init_sd) for _ in range(int(num_actions / 2))])
            action_scores = layers.fully_connected(action_out,
                                                   num_outputs=num_actions,
                                                   activation_fn=None,
                                                   weights_initializer=tf.zeros_initializer(),
                                                   biases_initializer=tf.constant_initializer(bias_init))

        if dueling:
            with tf.variable_scope("state_value"):
                state_out = conv_out
                for hidden in hiddens:
                    state_out = layers.fully_connected(state_out, num_outputs=hidden, activation_fn=None)
                    if layer_norm:
                        state_out = layers.layer_norm(state_out, center=True, scale=True)
                    state_out = tf.nn.relu(state_out)
                state_score = layers.fully_connected(state_out, num_outputs=1, activation_fn=None)
            action_scores_mean = tf.reduce_mean(action_scores, 1)
            action_scores_centered = action_scores - tf.expand_dims(action_scores_mean, 1)
            q_out = state_score + action_scores_centered
        else:
            q_out = action_scores
        return q_out
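The dueling branch combines the two streams as Q(s, a) = V(s) + (A(s, a) - mean_a A(s, a)); subtracting the per-state mean keeps V and A identifiable. The same arithmetic on plain lists (illustrative numbers, no TensorFlow; `dueling_q` is a name made up for this demo):

```python
def dueling_q(state_value, action_scores):
    # Q(s, a) = V(s) + (A(s, a) - mean_a A(s, a))
    mean_a = sum(action_scores) / len(action_scores)
    return [state_value + (a - mean_a) for a in action_scores]

print(dueling_q(2.0, [1.0, 3.0, 5.0]))  # [0.0, 2.0, 4.0]
```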
def cnn_to_mlp(convs, hiddens, dueling=False, layer_norm=False, init_mean=1.0, init_sd=20.0):
    """This model takes as input an observation and returns values of all actions.

    Parameters
    ----------
    convs: [(int, int, int)]
        list of convolutional layers in form of
        (num_outputs, kernel_size, stride)
    hiddens: [int]
        list of sizes of hidden layers
    dueling: bool
        if true double the output MLP to compute a baseline
        for action scores

    Returns
    -------
    q_func: function
        q_function for DQN algorithm.
    """
    return lambda *args, **kwargs: _cnn_to_mlp(convs, hiddens, dueling, layer_norm=layer_norm, init_mean=init_mean, init_sd=init_sd, *args, **kwargs)
def _cnn_plus_mlp(convs, hiddens, dueling, inpt, num_actions, scope, reuse=False, layer_norm=False, init_mean=1.0, init_sd=20.0):
    """
    inpt: vectorized image input + continuous input
    """
    with tf.variable_scope(scope, reuse=reuse):
        im_size = 50  # side length of the square image part of the input
        out = tf.reshape(tf.slice(inpt, [0, 0], [-1, im_size * im_size]), [-1, im_size, im_size, 1])
        mlp_inpt = tf.slice(inpt, [0, im_size * im_size], [-1, int(inpt.shape[1]) - im_size * im_size])
        with tf.variable_scope("convnet"):
            for num_outputs, kernel_size, stride in convs:
                out = layers.convolution2d(out,
                                           num_outputs=num_outputs,
                                           kernel_size=kernel_size,
                                           stride=stride,
                                           activation_fn=tf.nn.relu)
                out = layers.avg_pool2d(out,
                                        kernel_size=2,
                                        stride=2,
                                        padding='VALID')
        conv_out = layers.flatten(out)

        with tf.variable_scope("action_value"):
            action_out = tf.concat([conv_out, mlp_inpt], 1)
            for hidden in hiddens:
                action_out = layers.fully_connected(action_out, num_outputs=hidden, activation_fn=None)
                if layer_norm:
                    action_out = layers.layer_norm(action_out, center=True, scale=True)
                action_out = tf.nn.relu(action_out)
            # action_scores = layers.fully_connected(action_out, num_outputs=num_actions, activation_fn=None)
            bias_init = [init_mean for _ in range(int(num_actions / 2))]
            bias_init.extend([-np.log(init_sd) for _ in range(int(num_actions / 2))])
            action_scores = layers.fully_connected(action_out,
                                                   num_outputs=num_actions,
                                                   activation_fn=None,
                                                   weights_initializer=tf.zeros_initializer(),
                                                   biases_initializer=tf.constant_initializer(bias_init))

        if dueling:
            with tf.variable_scope("state_value"):
                state_out = tf.concat([conv_out, mlp_inpt], 1)
                for hidden in hiddens:
                    state_out = layers.fully_connected(state_out, num_outputs=hidden, activation_fn=None)
                    if layer_norm:
                        state_out = layers.layer_norm(state_out, center=True, scale=True)
                    state_out = tf.nn.relu(state_out)
                state_score = layers.fully_connected(state_out, num_outputs=1, activation_fn=None)
            action_scores_mean = tf.reduce_mean(action_scores, 1)
            action_scores_centered = action_scores - tf.expand_dims(action_scores_mean, 1)
            q_out = state_score + action_scores_centered
        else:
            q_out = action_scores
        return q_out
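`_cnn_plus_mlp` assumes the observation vector is a flattened 50x50 image followed by the continuous features; the `tf.slice`/`tf.reshape` pair simply splits the vector and reshapes the image part. The same split in plain Python, with the image side shrunk to 2 so the example stays readable (`split_obs` is a name made up for this demo):

```python
def split_obs(obs, im_size):
    # First im_size*im_size entries are the flattened image; the rest are MLP features.
    n = im_size * im_size
    image = [obs[i * im_size:(i + 1) * im_size] for i in range(im_size)]
    features = obs[n:]
    return image, features

obs = [1, 2, 3, 4, 9, 8]        # 2x2 image followed by two extra features
image, features = split_obs(obs, 2)
print(image, features)          # [[1, 2], [3, 4]] [9, 8]
```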
def cnn_plus_mlp(convs, hiddens, dueling=False, layer_norm=False, init_mean=1.0, init_sd=20.0):
    """This model takes an image input and a 1D vector input
    and returns values of all actions.

    Parameters
    ----------
    convs: [(int, int, int)]
        list of convolutional layers in form of
        (num_outputs, kernel_size, stride)
    hiddens: [int]
        list of sizes of hidden layers
    dueling: bool
        if true double the output MLP to compute a baseline
        for action scores

    Returns
    -------
    q_func: function
        q_function for DQN algorithm.
    """
    return lambda *args, **kwargs: _cnn_plus_mlp(convs, hiddens, dueling, layer_norm=layer_norm, init_mean=init_mean, init_sd=init_sd, *args, **kwargs)
#-------------------------------------------------------------------------------
# Name: Join_Mergo_Export_SSURGO
# Purpose: Join the soil properties table, merge them to SSURGO and/ or STATSGO
# Export the data as Raster (to TIF)
#
# Author: Prasanna Dahal
#
# Created: 05/08/2016
# Copyright: (c) Prasanna Dahal 2016
# ArcGIS: 10.3
# Python: 2.7
#-------------------------------------------------------------------------------
import os
import arcpy
from arcpy import env
path2_ssurgo = arcpy.GetParameterAsText(0)
path2statsgo = arcpy.GetParameterAsText(1) # make it optional
outDir = arcpy.GetParameterAsText(2)
MaskRaster = arcpy.GetParameterAsText(3)
def STEP5_Join_Merge_Export(path2_ssurgo, path2statsgo, outDir, MaskRaster):
    arcpy.AddMessage("*** This script joins the soil values to the mapunit shapefile and exports them as rasters ***")

    if not os.path.exists(outDir + "/TEMP"):
        os.mkdir(outDir + "/TEMP")
    TEMP = os.path.join(outDir, "TEMP")

    # soilProperties = [[name from csv, new name for raster, field name defaulted by ArcGIS], ...]
    soilProperties = [
        ["Ks_WtAvg", "KSAT", "MUKEY_Vs_3"],
        ["ResidualWaterContent_WtAvg", "RSM", "MUKEY_Vs_6"],
        ["Porosity_WtAvg", "POR", "MUKEY_Vs_7"],
        ["EffectivePorosity_WtAvg", "EFPO", "MUKEY_Vs_8"],
        ["BubblingPressure_Geometric_WtAvg", "BBL", "MUKEY_Vs_9"],
        ["PoreSizeDistribution_geometric_WtAvg_y", "PSD", "MUKEY_V_10"],
        ["HydroGrp", "HSG", "MUKEY_V_12"]
    ]

    mxd = arcpy.mapping.MapDocument("CURRENT")      # get the map document
    df = arcpy.mapping.ListDataFrames(mxd, "*")[0]  # first dataframe in the document

    # this file is used to clip the soil features
    arcpy.RasterToPolygon_conversion(in_raster=MaskRaster,
                                     out_polygon_features=os.path.join(TEMP, "mask_polygon"),
                                     simplify="NO_SIMPLIFY", raster_field="Value")
def join_field( path2statsgo_or_ssurgo, MaskRaster, dataType ):
"""
:param path2_soilFolderFolders: The path to a folder containing the collection of SSURGO (or Statsgo) folders
:param MaskRaster: DEM or any raster whose extent, coordinate system are considered while creating SSURGO rasters
:param dataType: string, "statsgo" or "ssurgo"
:return:
"""
mxd = arcpy.mapping.MapDocument("CURRENT") # get the map document
df = arcpy.mapping.ListDataFrames(mxd,"*")[0] #first dataframe in the document
# create a list of folders containing SSURGO folders only
folderList = []
[folderList.append(folders) for folders in os.listdir(path2statsgo_or_ssurgo)
if os.path.isdir(os.path.join(path2statsgo_or_ssurgo, folders))]
# One ssurgo or statsgo folder, one at a time
for folder in folderList:
arcpy.AddMessage(folder)
path2_soilFolder= os.path.join(path2statsgo_or_ssurgo, folder)
path2tabular = os.path.join(path2_soilFolder, "tabular")
path2Spatial= os.path.join(path2_soilFolder,"spatial")
# arcpy.env.workspace = path2_soilFolder # arcpy.env.scratchWorkspace =
muShapefile = os.listdir(path2Spatial)[1].split('.')[0] # muShapefile = 'soilmu_a_ut612'
# project the shapefile in ssurgo table, FILE SELECTION
if dataType.lower() == "statsgo":
new_mu = muShapefile +"_statsgo_prj"
if dataType.lower() == "ssurgo" :
new_mu = muShapefile +"_ssurgo_prj"
try:
arcpy.Project_management(in_dataset=path2_soilFolder+"/spatial/" + muShapefile +".shp",
out_dataset=TEMP + "/"+ new_mu,
out_coor_system= MaskRaster, transform_method="", max_deviation="")
arcpy.AddMessage("SUCCESS: Shapefile projection")
except Exception, e:
print e
arcpy.AddMessage("FAILED: Shapefile projection")
# to add the projected shapefile from ssurgo, as a layer to the map at the bottom of the TOC in data frame 0
muShapefileAsLayer = new_mu
layer1 = arcpy.mapping.Layer(TEMP + "/"+ muShapefileAsLayer+ ".shp" ) # create a new layer
arcpy.mapping.AddLayer(df, layer1,"TOP") #added to layer because this will be used code below
try:
# join the table that had mUKEY mapped to all soil properties
arcpy.AddJoin_management(muShapefileAsLayer, "MUKEY", path2_soilFolder+"/MUKEY-Vs-Values.csv", "MUKEY")
arcpy.AddMessage("SUCCESS: Field Addition")
except Exception, e:
arcpy.AddMessage("FAILED: Field Addition")
return
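The `folderList` comprehension in `join_field` just collects the immediate subdirectories (each one a SSURGO/STATSGO download); using a list comprehension for its side effect is unusual. The same scan as a plain function, runnable without ArcGIS (the `UT612`/`UT613` names and `list_subdirs` helper are made up for this demo; `os.listdir` order is not guaranteed, so results are sorted):

```python
import os
import tempfile

def list_subdirs(root):
    # Keep only entries that are directories, skipping loose files.
    return sorted(d for d in os.listdir(root)
                  if os.path.isdir(os.path.join(root, d)))

root = tempfile.mkdtemp()
os.mkdir(os.path.join(root, "UT612"))
os.mkdir(os.path.join(root, "UT613"))
open(os.path.join(root, "readme.txt"), "w").close()  # ignored: not a directory

print(list_subdirs(root))  # ['UT612', 'UT613']
```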
    def merge_and_clip(TEMP):
        layers = arcpy.mapping.ListLayers(mxd, "", df)
        statsgo_layers = [lyr.name for lyr in layers if lyr.name.endswith("_statsgo_prj")]
        ssurgo_layers = [lyr.name for lyr in layers if lyr.name.endswith("_ssurgo_prj")]

        arcpy.AddMessage("MERGING")
        arcpy.Merge_management(inputs=";".join(ssurgo_layers),
                               output=os.path.join(TEMP, "ssurgo_merged"),
field_mappings="""AREASYMBOL "AREASYMBOL" true true false 20 Text 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,gsmsoilmu_a_id_statsgo_prj.AREASYMBOL,-1,-1;SPATIALVER "SPATIALVER" true true false 10 Long 0 10 ,First,#,gsmsoilmu_a_id_statsgo_prj,gsmsoilmu_a_id_statsgo_prj.SPATIALVER,-1,-1;MUSYM "MUSYM" true true false 6 Text 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,gsmsoilmu_a_id_statsgo_prj.MUSYM,-1,-1;MUKEY "MUKEY" true true false 30 Text 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,gsmsoilmu_a_id_statsgo_prj.MUKEY,-1,-1;MUKEY_Vs_Values_csv_PoreSizeDistribution_geometric_WtAvg_x "MUKEY_Vs_Values_csv_PoreSizeDistribution_geometric_WtAvg_x" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.PoreSizeDistribution_geometric_WtAvg_x,-1,-1;MUKEY_Vs_Values_csv_MUKEY "MUKEY_Vs_Values_csv_MUKEY" true true false 8000 Text 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.MUKEY,-1,-1;MUKEY_Vs_Values_csv_ksat_r_WtAvg "MUKEY_Vs_Values_csv_ksat_r_WtAvg" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.ksat_r_WtAvg,-1,-1;MUKEY_Vs_Values_csv_Ks_WtAvg "MUKEY_Vs_Values_csv_Ks_WtAvg" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.Ks_WtAvg,-1,-1;MUKEY_Vs_Values_csv_dbthirdbar_r_WtAvg "MUKEY_Vs_Values_csv_dbthirdbar_r_WtAvg" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.dbthirdbar_r_WtAvg,-1,-1;MUKEY_Vs_Values_csv_dbfifteenbar_r_WtAvg "MUKEY_Vs_Values_csv_dbfifteenbar_r_WtAvg" true true false 8000 Text 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.dbfifteenbar_r_WtAvg,-1,-1;MUKEY_Vs_Values_csv_ResidualWaterContent_WtAvg "MUKEY_Vs_Values_csv_ResidualWaterContent_WtAvg" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.ResidualWaterContent_WtAvg,-1,-1;MUKEY_Vs_Values_csv_Porosity_WtAvg "MUKEY_Vs_Values_csv_Porosity_WtAvg" true true false 8 Double 0 0 
,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.Porosity_WtAvg,-1,-1;MUKEY_Vs_Values_csv_EffectivePorosity_WtAvg "MUKEY_Vs_Values_csv_EffectivePorosity_WtAvg" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.EffectivePorosity_WtAvg,-1,-1;MUKEY_Vs_Values_csv_BubblingPressure_Geometric_WtAvg "MUKEY_Vs_Values_csv_BubblingPressure_Geometric_WtAvg" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.BubblingPressure_Geometric_WtAvg,-1,-1;MUKEY_Vs_Values_csv_PoreSizeDistribution_geometric_WtAvg_y "MUKEY_Vs_Values_csv_PoreSizeDistribution_geometric_WtAvg_y" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.PoreSizeDistribution_geometric_WtAvg_y,-1,-1;AREASYMBOL_1 "AREASYMBOL_1" true true false 20 Text 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,gsmsoilmu_a_ut_statsgo_prj.AREASYMBOL,-1,-1;SPATIALVER_1 "SPATIALVER_1" true true false 10 Long 0 10 ,First,#,gsmsoilmu_a_ut_statsgo_prj,gsmsoilmu_a_ut_statsgo_prj.SPATIALVER,-1,-1;MUSYM_1 "MUSYM_1" true true false 6 Text 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,gsmsoilmu_a_ut_statsgo_prj.MUSYM,-1,-1;MUKEY_1 "MUKEY_1" true true false 30 Text 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,gsmsoilmu_a_ut_statsgo_prj.MUKEY,-1,-1;MUKEY_Vs_Values_csv_PoreSizeDistribution_geometric_WtAvg_x_1 "MUKEY_Vs_Values_csv_PoreSizeDistribution_geometric_WtAvg_x_1" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.PoreSizeDistribution_geometric_WtAvg_x,-1,-1;MUKEY_Vs_Values_csv_MUKEY_1 "MUKEY_Vs_Values_csv_MUKEY_1" true true false 8000 Text 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.MUKEY,-1,-1;MUKEY_Vs_Values_csv_ksat_r_WtAvg_1 "MUKEY_Vs_Values_csv_ksat_r_WtAvg_1" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.ksat_r_WtAvg,-1,-1;MUKEY_Vs_Values_csv_Ks_WtAvg_1 "MUKEY_Vs_Values_csv_Ks_WtAvg_1" true true false 8 Double 0 0 
,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.Ks_WtAvg,-1,-1;MUKEY_Vs_Values_csv_dbthirdbar_r_WtAvg_1 "MUKEY_Vs_Values_csv_dbthirdbar_r_WtAvg_1" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.dbthirdbar_r_WtAvg,-1,-1;MUKEY_Vs_Values_csv_dbfifteenbar_r_WtAvg_1 "MUKEY_Vs_Values_csv_dbfifteenbar_r_WtAvg_1" true true false 8000 Text 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.dbfifteenbar_r_WtAvg,-1,-1;MUKEY_Vs_Values_csv_ResidualWaterContent_WtAvg_1 "MUKEY_Vs_Values_csv_ResidualWaterContent_WtAvg_1" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.ResidualWaterContent_WtAvg,-1,-1;MUKEY_Vs_Values_csv_Porosity_WtAvg_1 "MUKEY_Vs_Values_csv_Porosity_WtAvg_1" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.Porosity_WtAvg,-1,-1;MUKEY_Vs_Values_csv_EffectivePorosity_WtAvg_1 "MUKEY_Vs_Values_csv_EffectivePorosity_WtAvg_1" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.EffectivePorosity_WtAvg,-1,-1;MUKEY_Vs_Values_csv_BubblingPressure_Geometric_WtAvg_1 "MUKEY_Vs_Values_csv_BubblingPressure_Geometric_WtAvg_1" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.BubblingPressure_Geometric_WtAvg,-1,-1;MUKEY_Vs_Values_csv_PoreSizeDistribution_geometric_WtAvg_y_1 "MUKEY_Vs_Values_csv_PoreSizeDistribution_geometric_WtAvg_y_1" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.PoreSizeDistribution_geometric_WtAvg_y,-1,-1;MUKEY_Vs_Values_csv_AvaWaterCon "MUKEY_Vs_Values_csv_AvaWaterCon" true true false 4 Long 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.AvaWaterCon,-1,-1;MUKEY_Vs_Values_csv_HydroGrp "MUKEY_Vs_Values_csv_HydroGrp" true true false 8000 Text 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.HydroGrp,-1,-1""")
        arcpy.Merge_management(inputs=";".join(statsgo_layers),
                               output=os.path.join(TEMP, "statgo_merged"),
field_mappings="""AREASYMBOL "AREASYMBOL" true true false 20 Text 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,gsmsoilmu_a_id_statsgo_prj.AREASYMBOL,-1,-1;SPATIALVER "SPATIALVER" true true false 10 Long 0 10 ,First,#,gsmsoilmu_a_id_statsgo_prj,gsmsoilmu_a_id_statsgo_prj.SPATIALVER,-1,-1;MUSYM "MUSYM" true true false 6 Text 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,gsmsoilmu_a_id_statsgo_prj.MUSYM,-1,-1;MUKEY "MUKEY" true true false 30 Text 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,gsmsoilmu_a_id_statsgo_prj.MUKEY,-1,-1;MUKEY_Vs_Values_csv_PoreSizeDistribution_geometric_WtAvg_x "MUKEY_Vs_Values_csv_PoreSizeDistribution_geometric_WtAvg_x" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.PoreSizeDistribution_geometric_WtAvg_x,-1,-1;MUKEY_Vs_Values_csv_MUKEY "MUKEY_Vs_Values_csv_MUKEY" true true false 8000 Text 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.MUKEY,-1,-1;MUKEY_Vs_Values_csv_ksat_r_WtAvg "MUKEY_Vs_Values_csv_ksat_r_WtAvg" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.ksat_r_WtAvg,-1,-1;MUKEY_Vs_Values_csv_Ks_WtAvg "MUKEY_Vs_Values_csv_Ks_WtAvg" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.Ks_WtAvg,-1,-1;MUKEY_Vs_Values_csv_dbthirdbar_r_WtAvg "MUKEY_Vs_Values_csv_dbthirdbar_r_WtAvg" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.dbthirdbar_r_WtAvg,-1,-1;MUKEY_Vs_Values_csv_dbfifteenbar_r_WtAvg "MUKEY_Vs_Values_csv_dbfifteenbar_r_WtAvg" true true false 8000 Text 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.dbfifteenbar_r_WtAvg,-1,-1;MUKEY_Vs_Values_csv_ResidualWaterContent_WtAvg "MUKEY_Vs_Values_csv_ResidualWaterContent_WtAvg" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.ResidualWaterContent_WtAvg,-1,-1;MUKEY_Vs_Values_csv_Porosity_WtAvg "MUKEY_Vs_Values_csv_Porosity_WtAvg" true true false 8 Double 0 0 
,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.Porosity_WtAvg,-1,-1;MUKEY_Vs_Values_csv_EffectivePorosity_WtAvg "MUKEY_Vs_Values_csv_EffectivePorosity_WtAvg" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.EffectivePorosity_WtAvg,-1,-1;MUKEY_Vs_Values_csv_BubblingPressure_Geometric_WtAvg "MUKEY_Vs_Values_csv_BubblingPressure_Geometric_WtAvg" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.BubblingPressure_Geometric_WtAvg,-1,-1;MUKEY_Vs_Values_csv_PoreSizeDistribution_geometric_WtAvg_y "MUKEY_Vs_Values_csv_PoreSizeDistribution_geometric_WtAvg_y" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_id_statsgo_prj,MUKEY-Vs-Values.csv.PoreSizeDistribution_geometric_WtAvg_y,-1,-1;AREASYMBOL_1 "AREASYMBOL_1" true true false 20 Text 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,gsmsoilmu_a_ut_statsgo_prj.AREASYMBOL,-1,-1;SPATIALVER_1 "SPATIALVER_1" true true false 10 Long 0 10 ,First,#,gsmsoilmu_a_ut_statsgo_prj,gsmsoilmu_a_ut_statsgo_prj.SPATIALVER,-1,-1;MUSYM_1 "MUSYM_1" true true false 6 Text 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,gsmsoilmu_a_ut_statsgo_prj.MUSYM,-1,-1;MUKEY_1 "MUKEY_1" true true false 30 Text 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,gsmsoilmu_a_ut_statsgo_prj.MUKEY,-1,-1;MUKEY_Vs_Values_csv_PoreSizeDistribution_geometric_WtAvg_x_1 "MUKEY_Vs_Values_csv_PoreSizeDistribution_geometric_WtAvg_x_1" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.PoreSizeDistribution_geometric_WtAvg_x,-1,-1;MUKEY_Vs_Values_csv_MUKEY_1 "MUKEY_Vs_Values_csv_MUKEY_1" true true false 8000 Text 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.MUKEY,-1,-1;MUKEY_Vs_Values_csv_ksat_r_WtAvg_1 "MUKEY_Vs_Values_csv_ksat_r_WtAvg_1" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.ksat_r_WtAvg,-1,-1;MUKEY_Vs_Values_csv_Ks_WtAvg_1 "MUKEY_Vs_Values_csv_Ks_WtAvg_1" true true false 8 Double 0 0 
,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.Ks_WtAvg,-1,-1;MUKEY_Vs_Values_csv_dbthirdbar_r_WtAvg_1 "MUKEY_Vs_Values_csv_dbthirdbar_r_WtAvg_1" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.dbthirdbar_r_WtAvg,-1,-1;MUKEY_Vs_Values_csv_dbfifteenbar_r_WtAvg_1 "MUKEY_Vs_Values_csv_dbfifteenbar_r_WtAvg_1" true true false 8000 Text 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.dbfifteenbar_r_WtAvg,-1,-1;MUKEY_Vs_Values_csv_ResidualWaterContent_WtAvg_1 "MUKEY_Vs_Values_csv_ResidualWaterContent_WtAvg_1" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.ResidualWaterContent_WtAvg,-1,-1;MUKEY_Vs_Values_csv_Porosity_WtAvg_1 "MUKEY_Vs_Values_csv_Porosity_WtAvg_1" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.Porosity_WtAvg,-1,-1;MUKEY_Vs_Values_csv_EffectivePorosity_WtAvg_1 "MUKEY_Vs_Values_csv_EffectivePorosity_WtAvg_1" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.EffectivePorosity_WtAvg,-1,-1;MUKEY_Vs_Values_csv_BubblingPressure_Geometric_WtAvg_1 "MUKEY_Vs_Values_csv_BubblingPressure_Geometric_WtAvg_1" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.BubblingPressure_Geometric_WtAvg,-1,-1;MUKEY_Vs_Values_csv_PoreSizeDistribution_geometric_WtAvg_y_1 "MUKEY_Vs_Values_csv_PoreSizeDistribution_geometric_WtAvg_y_1" true true false 8 Double 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.PoreSizeDistribution_geometric_WtAvg_y,-1,-1;MUKEY_Vs_Values_csv_AvaWaterCon "MUKEY_Vs_Values_csv_AvaWaterCon" true true false 4 Long 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.AvaWaterCon,-1,-1;MUKEY_Vs_Values_csv_HydroGrp "MUKEY_Vs_Values_csv_HydroGrp" true true false 8000 Text 0 0 ,First,#,gsmsoilmu_a_ut_statsgo_prj,MUKEY-Vs-Values.csv.HydroGrp,-1,-1""")
        arcpy.Clip_analysis(in_features=os.path.join(TEMP, "ssurgo_merged.shp"),
                            clip_features=os.path.join(TEMP, "mask_polygon.shp"),
                            out_feature_class=os.path.join(TEMP, "Project_ssurgo_merged"), cluster_tolerance="")
        arcpy.Clip_analysis(in_features=os.path.join(TEMP, "statgo_merged.shp"),
                            clip_features=os.path.join(TEMP, "mask_polygon.shp"),
                            out_feature_class=os.path.join(TEMP, "Project_statsgo_merged"), cluster_tolerance="")
        arcpy.AddMessage("SUCCESS: SSURGO and STATSGO for the area merged")
        return
    def erase_statsgo_and_merge():
        # select the NULL values in SSURGO, and delete them
        try:
            arcpy.MakeFeatureLayer_management(os.path.join(TEMP, "Project_ssurgo_merged.shp"), "Project_ssurgo_merged")
            arcpy.SelectLayerByAttribute_management(in_layer_or_view="Project_ssurgo_merged",
                                                    selection_type="NEW_SELECTION",
                                                    where_clause=""""MUKEY_Vs_9" = 0""")
            arcpy.DeleteFeatures_management(in_features="Project_ssurgo_merged")
            arcpy.AddMessage("SUCCESS: NULL records from SSURGO deleted")

            arcpy.Erase_analysis(in_features=os.path.join(TEMP, "Project_statsgo_merged.shp"),
                                 erase_features=os.path.join(TEMP, "Project_ssurgo_merged.shp"),
                                 out_feature_class=os.path.join(TEMP, "useful_stasgo"), cluster_tolerance="")
            arcpy.Merge_management(inputs="%suseful_stasgo.shp;%sProject_ssurgo_merged.shp" % (TEMP + "/", TEMP + "/"),
                                   output=os.path.join(TEMP, "Project_ssurgo_statsgo_merge"),
field_mappings="""AREASYMBOL "AREASYMBOL" true true false 20 Text 0 0 ,First,#,Project_statsgo_merged_Erase,AREASYMBOL,-1,-1,Project_ssurgo_merged,AREASYMBOL,-1,-1;SPATIALVER "SPATIALVER" true true false 4 Long 0 0 ,First,#,Project_statsgo_merged_Erase,SPATIALVER,-1,-1,Project_ssurgo_merged,SPATIALVER,-1,-1;MUSYM "MUSYM" true true false 6 Text 0 0 ,First,#,Project_statsgo_merged_Erase,MUSYM,-1,-1,Project_ssurgo_merged,MUSYM,-1,-1;MUKEY "MUKEY" true true false 30 Text 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY,-1,-1,Project_ssurgo_merged,MUKEY,-1,-1;MUKEY_Vs_V "MUKEY_Vs_V" true true false 8 Double 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_Vs_V,-1,-1,Project_ssurgo_merged,MUKEY_Vs_V,-1,-1;MUKEY_Vs_1 "MUKEY_Vs_1" true true false 254 Text 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_Vs_1,-1,-1,Project_ssurgo_merged,MUKEY_Vs_1,-1,-1;MUKEY_Vs_2 "MUKEY_Vs_2" true true false 8 Double 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_Vs_2,-1,-1,Project_ssurgo_merged,MUKEY_Vs_2,-1,-1;MUKEY_Vs_3 "MUKEY_Vs_3" true true false 8 Double 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_Vs_3,-1,-1,Project_ssurgo_merged,MUKEY_Vs_3,-1,-1;MUKEY_Vs_4 "MUKEY_Vs_4" true true false 8 Double 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_Vs_4,-1,-1,Project_ssurgo_merged,MUKEY_Vs_4,-1,-1;MUKEY_Vs_5 "MUKEY_Vs_5" true true false 254 Text 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_Vs_5,-1,-1,Project_ssurgo_merged,MUKEY_Vs_5,-1,-1;MUKEY_Vs_6 "MUKEY_Vs_6" true true false 8 Double 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_Vs_6,-1,-1,Project_ssurgo_merged,MUKEY_Vs_6,-1,-1;MUKEY_Vs_7 "MUKEY_Vs_7" true true false 8 Double 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_Vs_7,-1,-1,Project_ssurgo_merged,MUKEY_Vs_7,-1,-1;MUKEY_Vs_8 "MUKEY_Vs_8" true true false 8 Double 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_Vs_8,-1,-1,Project_ssurgo_merged,MUKEY_Vs_8,-1,-1;MUKEY_Vs_9 "MUKEY_Vs_9" true true false 8 Double 0 0 
,First,#,Project_statsgo_merged_Erase,MUKEY_Vs_9,-1,-1,Project_ssurgo_merged,MUKEY_Vs_9,-1,-1;MUKEY_V_10 "MUKEY_V_10" true true false 8 Double 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_V_10,-1,-1,Project_ssurgo_merged,MUKEY_V_10,-1,-1;AREASYMB_1 "AREASYMB_1" true true false 20 Text 0 0 ,First,#,Project_statsgo_merged_Erase,AREASYMB_1,-1,-1;SPATIALV_1 "SPATIALV_1" true true false 4 Long 0 0 ,First,#,Project_statsgo_merged_Erase,SPATIALV_1,-1,-1;MUSYM_1 "MUSYM_1" true true false 6 Text 0 0 ,First,#,Project_statsgo_merged_Erase,MUSYM_1,-1,-1;MUKEY_1 "MUKEY_1" true true false 30 Text 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_1,-1,-1;MUKEY_V_11 "MUKEY_V_11" true true false 8 Double 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_V_11,-1,-1,Project_ssurgo_merged,MUKEY_V_11,-1,-1;MUKEY_V_12 "MUKEY_V_12" true true false 254 Text 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_V_12,-1,-1,Project_ssurgo_merged,MUKEY_V_12,-1,-1;MUKEY_V_13 "MUKEY_V_13" true true false 8 Double 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_V_13,-1,-1;MUKEY_V_14 "MUKEY_V_14" true true false 8 Double 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_V_14,-1,-1;MUKEY_V_15 "MUKEY_V_15" true true false 8 Double 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_V_15,-1,-1;MUKEY_V_16 "MUKEY_V_16" true true false 254 Text 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_V_16,-1,-1;MUKEY_V_17 "MUKEY_V_17" true true false 8 Double 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_V_17,-1,-1;MUKEY_V_18 "MUKEY_V_18" true true false 8 Double 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_V_18,-1,-1;MUKEY_V_19 "MUKEY_V_19" true true false 8 Double 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_V_19,-1,-1;MUKEY_V_20 "MUKEY_V_20" true true false 8 Double 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_V_20,-1,-1;MUKEY_V_21 "MUKEY_V_21" true true false 8 Double 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_V_21,-1,-1;MUKEY_V_22 "MUKEY_V_22" true true false 4 Long 0 0 
,First,#,Project_statsgo_merged_Erase,MUKEY_V_22,-1,-1;MUKEY_V_23 "MUKEY_V_23" true true false 254 Text 0 0 ,First,#,Project_statsgo_merged_Erase,MUKEY_V_23,-1,-1;Shape_Length "Shape_Length" false true true 8 Double 0 0 ,First,#,Project_statsgo_merged_Erase,Shape_Length,-1,-1;Shape_Area "Shape_Area" false true true 8 Double 0 0 ,First,#,Project_statsgo_merged_Erase,Shape_Area,-1,-1""")
arcpy.AddMessage("SUCCESS: Erasing NULL SSURGO records and Merging STATSGO to SSURGO")
merged_layer = arcpy.mapping.Layer(os.path.join(TEMP, "Project_ssurgo_statsgo_merge.shp"))  # create a new layer
arcpy.mapping.AddLayer(df, merged_layer, "TOP")
except Exception,e:
arcpy.AddMessage("FAILED: Erasing NULL SSURGO records and Merging STATSGO to SSURGO")
return os.path.join(TEMP,"Project_ssurgo_statsgo_merge")
def export(project_ssurgo_statsgo, soilProperties, MaskRaster, TEMP=TEMP, outDir=outDir):
arcpy.env.snapRaster = MaskRaster # Set Snap Raster environment
project_ssurgo_statsgo = project_ssurgo_statsgo+".shp"
# soilProperties = [[ "ksat_r_WtAvg", "Ksat-s_UT612", ... ], ["Ks_WtAvg", "Ksat-t_ut612", ... ], .... ]
# NOTE: each entry needs a third item -- the attribute field name read via a_soil_property[2] below
for a_soil_property in soilProperties:
soil_property_name = a_soil_property[1] #e.g. Ksat-s, Bblpr-t, PoreSz-t etc.
field_name = a_soil_property[2]
outRaster = os.path.join(TEMP, soil_property_name)
arcpy.FeatureToRaster_conversion(in_features=project_ssurgo_statsgo,
field= field_name ,
out_raster=outRaster, cell_size= MaskRaster )
arcpy.gp.ExtractByMask_sa(outRaster, MaskRaster, outRaster+"X")  # "X" = extracted by mask; the "c" (clipped) suffix is added below
# clip the rasters to a consistent extent so that their (nrows x ncols) dimensions match
arcpy.Clip_management(in_raster=outRaster+"X",
out_raster= outRaster+"c" , in_template_dataset=MaskRaster, nodata_value="-9999",
clipping_geometry="NONE", maintain_clipping_extent="MAINTAIN_EXTENT")
arcpy.RasterToOtherFormat_conversion(Input_Rasters="'%s'"%(outRaster+"c"), Output_Workspace=outDir, Raster_Format="TIFF")
arcpy.AddMessage("SUCCESS: TIF representing %s values saved in %s"%(soil_property_name, outDir))
# load the TIF to arcmap
# tif_layer = arcpy.mapping.Layer(os.path.join(outDir, soil_property_name+"c.tif") ) # create a new layer
# arcpy.mapping.AddLayer(df, tif_layer ,"TOP")
# __main__
join_field(path2_ssurgo, MaskRaster, "ssurgo")
if path2statsgo != "":
join_field(path2statsgo, MaskRaster, "statsgo")
merge_and_clip(TEMP)
merged = erase_statsgo_and_merge()  # run the erase/merge once and reuse the returned path
export(merged, soilProperties, MaskRaster)
if __name__ == "__main__":
STEP5_Join_Merge_Export(path2_ssurgo, path2statsgo, outDir, MaskRaster)
| 132.383838 | 5,836 | 0.751946 | 4,080 | 26,212 | 4.453676 | 0.079412 | 0.073579 | 0.103737 | 0.127676 | 0.73155 | 0.717242 | 0.703924 | 0.675032 | 0.659237 | 0.638325 | 0 | 0.038465 | 0.14604 | 26,212 | 197 | 5,837 | 133.055838 | 0.77332 | 0.068785 | 0 | 0.131148 | 0 | 0.02459 | 0.716877 | 0.526811 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.02459 | null | null | 0.008197 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
b7328062fd60601451dd51b64577f7f9d0c854dd | 289 | py | Python | pywren_ibm_cloud/runtime/__init__.py | erezh16/pywren-ibm-cloud | 54d0d5346f15ae86ff95b5502da2fc062014adb3 | [
"Apache-2.0"
] | null | null | null | pywren_ibm_cloud/runtime/__init__.py | erezh16/pywren-ibm-cloud | 54d0d5346f15ae86ff95b5502da2fc062014adb3 | [
"Apache-2.0"
] | null | null | null | pywren_ibm_cloud/runtime/__init__.py | erezh16/pywren-ibm-cloud | 54d0d5346f15ae86ff95b5502da2fc062014adb3 | [
"Apache-2.0"
] | null | null | null | from pywren_ibm_cloud.runtime.utils import create_runtime
from pywren_ibm_cloud.runtime.utils import build_runtime
from pywren_ibm_cloud.runtime.utils import update_runtime
from pywren_ibm_cloud.runtime.utils import delete_runtime
from pywren_ibm_cloud.runtime.utils import clean_runtimes
| 48.166667 | 57 | 0.896194 | 45 | 289 | 5.422222 | 0.288889 | 0.204918 | 0.266393 | 0.368852 | 0.852459 | 0.852459 | 0.852459 | 0.704918 | 0 | 0 | 0 | 0 | 0.069204 | 289 | 5 | 58 | 57.8 | 0.907063 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 11 |
b7423dd12154cd073eb63400a43b03041685f0d0 | 118 | py | Python | src/seqflow/__init__.py | iBiology/SeqFlow | a4114ffb37aa421946f1811b79b78896ababd8d1 | [
"MIT"
] | null | null | null | src/seqflow/__init__.py | iBiology/SeqFlow | a4114ffb37aa421946f1811b79b78896ababd8d1 | [
"MIT"
] | null | null | null | src/seqflow/__init__.py | iBiology/SeqFlow | a4114ffb37aa421946f1811b79b78896ababd8d1 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
from .flow import task
from .flow import Flow
from .flow import logger
| 16.857143 | 24 | 0.686441 | 19 | 118 | 4.263158 | 0.631579 | 0.296296 | 0.518519 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010204 | 0.169492 | 118 | 6 | 25 | 19.666667 | 0.816327 | 0.355932 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
3f987e7d4dd85ab3ed9ee481ac4bdfe9dc04fd48 | 57 | py | Python | python/protocol/__init__.py | Gronis/gloomhaven-helper-rfid | f04947c94e05e8fa3f2d17ded4dacb807bcf8677 | [
"MIT"
] | 11 | 2019-08-23T11:42:04.000Z | 2022-03-18T22:59:38.000Z | python/protocol/__init__.py | Gronis/gloomhaven-helper-rfid | f04947c94e05e8fa3f2d17ded4dacb807bcf8677 | [
"MIT"
] | 4 | 2020-02-05T08:28:40.000Z | 2020-08-02T14:20:34.000Z | python/protocol/__init__.py | Gronis/gloomhaven-helper-rfid | f04947c94e05e8fa3f2d17ded4dacb807bcf8677 | [
"MIT"
] | 3 | 2020-02-05T08:26:18.000Z | 2020-12-12T23:34:51.000Z | from . import v7_6
from . import v8_0
from . import v8_4
| 14.25 | 18 | 0.736842 | 12 | 57 | 3.25 | 0.583333 | 0.769231 | 0.615385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 0.210526 | 57 | 3 | 19 | 19 | 0.733333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
3f9b64d6f55682c14e2305b231911d6c7c0852b9 | 240 | py | Python | viktor/etl/__init__.py | barretobrock/viktor | 81b9fb60853682493618fd82dea804828b247530 | [
"MIT"
] | null | null | null | viktor/etl/__init__.py | barretobrock/viktor | 81b9fb60853682493618fd82dea804828b247530 | [
"MIT"
] | 2 | 2021-03-31T19:57:45.000Z | 2021-06-15T23:10:09.000Z | viktor/etl/__init__.py | barretobrock/viktor | 81b9fb60853682493618fd82dea804828b247530 | [
"MIT"
] | null | null | null | from .etl_acronyms import acronym_tables
from .etl_emojis import emoji_tables
from .etl_okr_perks import okr_tables
from .etl_okr_users import user_tables
from .etl_okr_quotes import quotes_tables
from .etl_responses import response_tables
| 34.285714 | 42 | 0.875 | 39 | 240 | 5 | 0.384615 | 0.215385 | 0.333333 | 0.246154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1 | 240 | 6 | 43 | 40 | 0.902778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
3fce467dd738f78d699a47631f841df02c7556ea | 81,685 | py | Python | dask-fargate/.env/lib/python3.6/site-packages/aws_cdk/aws_secretsmanager/__init__.py | chriscoombs/amazon-sagemaker-cdk-examples | ba848218dab59abb03f68dc92bcad7929841fcc9 | [
"Apache-2.0"
] | 41 | 2019-08-22T13:03:42.000Z | 2022-02-24T05:07:32.000Z | dask-fargate/.env/lib/python3.6/site-packages/aws_cdk/aws_secretsmanager/__init__.py | chriscoombs/amazon-sagemaker-cdk-examples | ba848218dab59abb03f68dc92bcad7929841fcc9 | [
"Apache-2.0"
] | 1 | 2020-06-17T17:44:28.000Z | 2021-02-12T22:40:01.000Z | dask-fargate/.env/lib/python3.6/site-packages/aws_cdk/aws_secretsmanager/__init__.py | chriscoombs/amazon-sagemaker-cdk-examples | ba848218dab59abb03f68dc92bcad7929841fcc9 | [
"Apache-2.0"
] | 31 | 2019-08-23T17:33:41.000Z | 2022-03-28T09:20:07.000Z | """
## AWS Secrets Manager Construct Library
<!--BEGIN STABILITY BANNER-->---

---
<!--END STABILITY BANNER-->
```python
# Example automatically generated. See https://github.com/aws/jsii/issues/826
import aws_cdk.aws_secretsmanager as secretsmanager
```
### Create a new Secret in a Stack
In order to have SecretsManager generate a new secret value automatically,
you can get started with the following:
```python
# Example automatically generated. See https://github.com/aws/jsii/issues/826
# Default secret
secret = secretsmanager.Secret(self, "Secret")
secret.grant_read(role)
iam.User(self, "User",
password=secret.secret_value
)
# Templated secret
templated_secret = secretsmanager.Secret(self, "TemplatedSecret",
generate_secret_string={
"secret_string_template": JSON.stringify(username="user"),
"generate_string_key": "password"
}
)
iam.User(self, "OtherUser",
user_name=templated_secret.secret_value_from_json("username").to_string(),
password=templated_secret.secret_value_from_json("password")
)
```
The `Secret` construct does not allow specifying the `SecretString` property
of the `AWS::SecretsManager::Secret` resource (as this will almost always
lead to the secret being surfaced in plain text and possibly committed to
your source control).
If you need to use a pre-existing secret, the recommended way is to manually
provision the secret in *AWS SecretsManager* and use the `Secret.fromSecretArn`
or `Secret.fromSecretAttributes` method to make it available in your CDK Application:
```python
# Example automatically generated. See https://github.com/aws/jsii/issues/826
secret = secretsmanager.Secret.from_secret_attributes(scope, "ImportedSecret",
secret_arn="arn:aws:secretsmanager:<region>:<account-id-number>:secret:<secret-name>-<random-6-characters>",
# If the secret is encrypted using a KMS-hosted CMK, either import or reference that key:
encryption_key=encryption_key
)
```
SecretsManager secret values can only be used in a select set of properties. For the
list of properties, see [the CloudFormation Dynamic References documentation](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/dynamic-references.html).
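Under the hood, such a property reference is rendered as a CloudFormation *dynamic reference*. A minimal sketch of that token shape (the helper below is hypothetical and not part of the CDK API):

```python
# Hypothetical helper illustrating the CloudFormation dynamic-reference
# token that a Secrets Manager value resolves to at deploy time.
def dynamic_reference(secret_id, json_key):
    # Trailing fields (version-stage, version-id) are omitted here,
    # which defaults the reference to the AWSCURRENT version.
    return "{{resolve:secretsmanager:%s:SecretString:%s}}" % (secret_id, json_key)

print(dynamic_reference("MySecret", "password"))
# -> {{resolve:secretsmanager:MySecret:SecretString:password}}
```

CloudFormation substitutes the secret value only in the properties that support dynamic references, which is why the list linked above matters.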
### Rotating a Secret
A rotation schedule can be added to a Secret:
```python
# Example automatically generated. See https://github.com/aws/jsii/issues/826
fn = lambda_.Function(...)  # ``lambda`` is a Python keyword; aws_cdk.aws_lambda is conventionally imported as ``lambda_``
secret = secretsmanager.Secret(self, "Secret")
secret.add_rotation_schedule("RotationSchedule",
rotation_lambda=fn,
automatically_after=Duration.days(15)
)
```
See [Overview of the Lambda Rotation Function](https://docs.aws.amazon.com/secretsmanager/latest/userguide/rotating-secrets-lambda-function-overview.html) on how to implement a Lambda Rotation Function.
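The rotation function is invoked once per rotation step, with the step name carried in the event. A dependency-free sketch of that dispatch contract (the handler and the injected `actions` mapping are hypothetical; a real function would call Secrets Manager via boto3 and your database's own APIs):

```python
# Minimal sketch of the four-step rotation contract the linked overview
# describes. ``actions`` is a hypothetical mapping of step implementations,
# injected here so the sketch stays self-contained.
def handle_rotation(event, actions):
    # Secrets Manager invokes the function once per step, in this order:
    steps = {
        "createSecret": actions["create"],   # generate + stage the AWSPENDING value
        "setSecret": actions["set"],         # push the pending value to the service
        "testSecret": actions["test"],       # verify the pending value works
        "finishSecret": actions["finish"],   # promote the new version to AWSCURRENT
    }
    step = event["Step"]
    if step not in steps:
        raise ValueError("Unknown rotation step: %s" % step)
    return steps[step](event["SecretId"], event["ClientRequestToken"])
```

Each step receives the secret's id and the client request token that identifies the in-flight version, so the four calls can be correlated into one rotation.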
For RDS credentials rotation, see [aws-rds](https://github.com/aws/aws-cdk/blob/master/packages/%40aws-cdk/aws-rds/README.md).
"""
import abc
import datetime
import enum
import typing
import jsii
import jsii.compat
import publication
from jsii.python import classproperty
import aws_cdk.aws_ec2
import aws_cdk.aws_iam
import aws_cdk.aws_kms
import aws_cdk.aws_lambda
import aws_cdk.core
__jsii_assembly__ = jsii.JSIIAssembly.load("@aws-cdk/aws-secretsmanager", "1.18.0", __name__, "aws-secretsmanager@1.18.0.jsii.tgz")
@jsii.data_type(jsii_type="@aws-cdk/aws-secretsmanager.AttachedSecretOptions", jsii_struct_bases=[], name_mapping={'target': 'target'})
class AttachedSecretOptions():
def __init__(self, *, target: "ISecretAttachmentTarget"):
"""Options to add a secret attachment to a secret.
:param target: The target to attach the secret to.
"""
self._values = {
'target': target,
}
@property
def target(self) -> "ISecretAttachmentTarget":
"""The target to attach the secret to."""
return self._values.get('target')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'AttachedSecretOptions(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
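The jsii-generated props classes in this module (here and below) all share the same value-object boilerplate: a `_values` dict plus structural `__eq__`/`__ne__`/`__repr__`. A stripped-down, framework-free mirror of that pattern, for illustration only:

```python
# Illustrative mirror of the value-object pattern used by
# AttachedSecretOptions and the generated *Props classes above.
class ValueStruct(object):
    def __init__(self, **values):
        self._values = values

    def __eq__(self, rhs):
        # Structural equality: same class and same field values.
        return isinstance(rhs, self.__class__) and rhs._values == self._values

    def __ne__(self, rhs):
        return not (rhs == self)

    def __repr__(self):
        return 'ValueStruct(%s)' % ', '.join(
            k + '=' + repr(v) for k, v in self._values.items())
```

Two instances compare equal whenever they hold the same field values, which is what lets the generated `*Props` structs behave as plain data carriers.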
@jsii.enum(jsii_type="@aws-cdk/aws-secretsmanager.AttachmentTargetType")
class AttachmentTargetType(enum.Enum):
"""The type of service or database that's being associated with the secret."""
INSTANCE = "INSTANCE"
"""A database instance."""
CLUSTER = "CLUSTER"
"""A database cluster."""
@jsii.implements(aws_cdk.core.IInspectable)
class CfnResourcePolicy(aws_cdk.core.CfnResource, metaclass=jsii.JSIIMeta, jsii_type="@aws-cdk/aws-secretsmanager.CfnResourcePolicy"):
"""A CloudFormation ``AWS::SecretsManager::ResourcePolicy``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-resourcepolicy.html
cloudformationResource:
:cloudformationResource:: AWS::SecretsManager::ResourcePolicy
"""
def __init__(self, scope: aws_cdk.core.Construct, id: str, *, resource_policy: typing.Any, secret_id: str) -> None:
"""Create a new ``AWS::SecretsManager::ResourcePolicy``.
:param scope: - scope in which this resource is defined.
:param id: - scoped id of the resource.
:param props: - resource properties.
:param resource_policy: ``AWS::SecretsManager::ResourcePolicy.ResourcePolicy``.
:param secret_id: ``AWS::SecretsManager::ResourcePolicy.SecretId``.
"""
props = CfnResourcePolicyProps(resource_policy=resource_policy, secret_id=secret_id)
jsii.create(CfnResourcePolicy, self, [scope, id, props])
@jsii.member(jsii_name="inspect")
def inspect(self, inspector: aws_cdk.core.TreeInspector) -> None:
"""Examines the CloudFormation resource and discloses attributes.
:param inspector: - tree inspector to collect and process attributes.
stability
:stability: experimental
"""
return jsii.invoke(self, "inspect", [inspector])
@jsii.member(jsii_name="renderProperties")
def _render_properties(self, props: typing.Mapping[str,typing.Any]) -> typing.Mapping[str,typing.Any]:
"""
:param props: -
"""
return jsii.invoke(self, "renderProperties", [props])
@classproperty
@jsii.member(jsii_name="CFN_RESOURCE_TYPE_NAME")
def CFN_RESOURCE_TYPE_NAME(cls) -> str:
"""The CloudFormation resource type name for this resource class."""
return jsii.sget(cls, "CFN_RESOURCE_TYPE_NAME")
@property
@jsii.member(jsii_name="cfnProperties")
def _cfn_properties(self) -> typing.Mapping[str,typing.Any]:
return jsii.get(self, "cfnProperties")
@property
@jsii.member(jsii_name="resourcePolicy")
def resource_policy(self) -> typing.Any:
"""``AWS::SecretsManager::ResourcePolicy.ResourcePolicy``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-resourcepolicy.html#cfn-secretsmanager-resourcepolicy-resourcepolicy
"""
return jsii.get(self, "resourcePolicy")
@resource_policy.setter
def resource_policy(self, value: typing.Any):
return jsii.set(self, "resourcePolicy", value)
@property
@jsii.member(jsii_name="secretId")
def secret_id(self) -> str:
"""``AWS::SecretsManager::ResourcePolicy.SecretId``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-resourcepolicy.html#cfn-secretsmanager-resourcepolicy-secretid
"""
return jsii.get(self, "secretId")
@secret_id.setter
def secret_id(self, value: str):
return jsii.set(self, "secretId", value)
@jsii.data_type(jsii_type="@aws-cdk/aws-secretsmanager.CfnResourcePolicyProps", jsii_struct_bases=[], name_mapping={'resource_policy': 'resourcePolicy', 'secret_id': 'secretId'})
class CfnResourcePolicyProps():
def __init__(self, *, resource_policy: typing.Any, secret_id: str):
"""Properties for defining a ``AWS::SecretsManager::ResourcePolicy``.
:param resource_policy: ``AWS::SecretsManager::ResourcePolicy.ResourcePolicy``.
:param secret_id: ``AWS::SecretsManager::ResourcePolicy.SecretId``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-resourcepolicy.html
"""
self._values = {
'resource_policy': resource_policy,
'secret_id': secret_id,
}
@property
def resource_policy(self) -> typing.Any:
"""``AWS::SecretsManager::ResourcePolicy.ResourcePolicy``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-resourcepolicy.html#cfn-secretsmanager-resourcepolicy-resourcepolicy
"""
return self._values.get('resource_policy')
@property
def secret_id(self) -> str:
"""``AWS::SecretsManager::ResourcePolicy.SecretId``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-resourcepolicy.html#cfn-secretsmanager-resourcepolicy-secretid
"""
return self._values.get('secret_id')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'CfnResourcePolicyProps(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
@jsii.implements(aws_cdk.core.IInspectable)
class CfnRotationSchedule(aws_cdk.core.CfnResource, metaclass=jsii.JSIIMeta, jsii_type="@aws-cdk/aws-secretsmanager.CfnRotationSchedule"):
"""A CloudFormation ``AWS::SecretsManager::RotationSchedule``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-rotationschedule.html
cloudformationResource:
:cloudformationResource:: AWS::SecretsManager::RotationSchedule
"""
def __init__(self, scope: aws_cdk.core.Construct, id: str, *, secret_id: str, rotation_lambda_arn: typing.Optional[str]=None, rotation_rules: typing.Optional[typing.Union[typing.Optional[aws_cdk.core.IResolvable], typing.Optional["RotationRulesProperty"]]]=None) -> None:
"""Create a new ``AWS::SecretsManager::RotationSchedule``.
:param scope: - scope in which this resource is defined.
:param id: - scoped id of the resource.
:param props: - resource properties.
:param secret_id: ``AWS::SecretsManager::RotationSchedule.SecretId``.
:param rotation_lambda_arn: ``AWS::SecretsManager::RotationSchedule.RotationLambdaARN``.
:param rotation_rules: ``AWS::SecretsManager::RotationSchedule.RotationRules``.
"""
props = CfnRotationScheduleProps(secret_id=secret_id, rotation_lambda_arn=rotation_lambda_arn, rotation_rules=rotation_rules)
jsii.create(CfnRotationSchedule, self, [scope, id, props])
@jsii.member(jsii_name="inspect")
def inspect(self, inspector: aws_cdk.core.TreeInspector) -> None:
"""Examines the CloudFormation resource and discloses attributes.
:param inspector: - tree inspector to collect and process attributes.
stability
:stability: experimental
"""
return jsii.invoke(self, "inspect", [inspector])
@jsii.member(jsii_name="renderProperties")
def _render_properties(self, props: typing.Mapping[str,typing.Any]) -> typing.Mapping[str,typing.Any]:
"""
:param props: -
"""
return jsii.invoke(self, "renderProperties", [props])
@classproperty
@jsii.member(jsii_name="CFN_RESOURCE_TYPE_NAME")
def CFN_RESOURCE_TYPE_NAME(cls) -> str:
"""The CloudFormation resource type name for this resource class."""
return jsii.sget(cls, "CFN_RESOURCE_TYPE_NAME")
@property
@jsii.member(jsii_name="cfnProperties")
def _cfn_properties(self) -> typing.Mapping[str,typing.Any]:
return jsii.get(self, "cfnProperties")
@property
@jsii.member(jsii_name="secretId")
def secret_id(self) -> str:
"""``AWS::SecretsManager::RotationSchedule.SecretId``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-rotationschedule.html#cfn-secretsmanager-rotationschedule-secretid
"""
return jsii.get(self, "secretId")
@secret_id.setter
def secret_id(self, value: str):
return jsii.set(self, "secretId", value)
@property
@jsii.member(jsii_name="rotationLambdaArn")
def rotation_lambda_arn(self) -> typing.Optional[str]:
"""``AWS::SecretsManager::RotationSchedule.RotationLambdaARN``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-rotationschedule.html#cfn-secretsmanager-rotationschedule-rotationlambdaarn
"""
return jsii.get(self, "rotationLambdaArn")
@rotation_lambda_arn.setter
def rotation_lambda_arn(self, value: typing.Optional[str]):
return jsii.set(self, "rotationLambdaArn", value)
@property
@jsii.member(jsii_name="rotationRules")
def rotation_rules(self) -> typing.Optional[typing.Union[typing.Optional[aws_cdk.core.IResolvable], typing.Optional["RotationRulesProperty"]]]:
"""``AWS::SecretsManager::RotationSchedule.RotationRules``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-rotationschedule.html#cfn-secretsmanager-rotationschedule-rotationrules
"""
return jsii.get(self, "rotationRules")
@rotation_rules.setter
def rotation_rules(self, value: typing.Optional[typing.Union[typing.Optional[aws_cdk.core.IResolvable], typing.Optional["RotationRulesProperty"]]]):
return jsii.set(self, "rotationRules", value)
@jsii.data_type(jsii_type="@aws-cdk/aws-secretsmanager.CfnRotationSchedule.RotationRulesProperty", jsii_struct_bases=[], name_mapping={'automatically_after_days': 'automaticallyAfterDays'})
class RotationRulesProperty():
def __init__(self, *, automatically_after_days: typing.Optional[jsii.Number]=None):
"""
:param automatically_after_days: ``CfnRotationSchedule.RotationRulesProperty.AutomaticallyAfterDays``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-secretsmanager-rotationschedule-rotationrules.html
"""
self._values = {
}
if automatically_after_days is not None: self._values["automatically_after_days"] = automatically_after_days
@property
def automatically_after_days(self) -> typing.Optional[jsii.Number]:
"""``CfnRotationSchedule.RotationRulesProperty.AutomaticallyAfterDays``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-secretsmanager-rotationschedule-rotationrules.html#cfn-secretsmanager-rotationschedule-rotationrules-automaticallyafterdays
"""
return self._values.get('automatically_after_days')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'RotationRulesProperty(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
@jsii.data_type(jsii_type="@aws-cdk/aws-secretsmanager.CfnRotationScheduleProps", jsii_struct_bases=[], name_mapping={'secret_id': 'secretId', 'rotation_lambda_arn': 'rotationLambdaArn', 'rotation_rules': 'rotationRules'})
class CfnRotationScheduleProps():
def __init__(self, *, secret_id: str, rotation_lambda_arn: typing.Optional[str]=None, rotation_rules: typing.Optional[typing.Union[typing.Optional[aws_cdk.core.IResolvable], typing.Optional["CfnRotationSchedule.RotationRulesProperty"]]]=None):
"""Properties for defining a ``AWS::SecretsManager::RotationSchedule``.
:param secret_id: ``AWS::SecretsManager::RotationSchedule.SecretId``.
:param rotation_lambda_arn: ``AWS::SecretsManager::RotationSchedule.RotationLambdaARN``.
:param rotation_rules: ``AWS::SecretsManager::RotationSchedule.RotationRules``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-rotationschedule.html
"""
self._values = {
'secret_id': secret_id,
}
if rotation_lambda_arn is not None: self._values["rotation_lambda_arn"] = rotation_lambda_arn
if rotation_rules is not None: self._values["rotation_rules"] = rotation_rules
@property
def secret_id(self) -> str:
"""``AWS::SecretsManager::RotationSchedule.SecretId``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-rotationschedule.html#cfn-secretsmanager-rotationschedule-secretid
"""
return self._values.get('secret_id')
@property
def rotation_lambda_arn(self) -> typing.Optional[str]:
"""``AWS::SecretsManager::RotationSchedule.RotationLambdaARN``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-rotationschedule.html#cfn-secretsmanager-rotationschedule-rotationlambdaarn
"""
return self._values.get('rotation_lambda_arn')
@property
def rotation_rules(self) -> typing.Optional[typing.Union[typing.Optional[aws_cdk.core.IResolvable], typing.Optional["CfnRotationSchedule.RotationRulesProperty"]]]:
"""``AWS::SecretsManager::RotationSchedule.RotationRules``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-rotationschedule.html#cfn-secretsmanager-rotationschedule-rotationrules
"""
return self._values.get('rotation_rules')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'CfnRotationScheduleProps(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
@jsii.implements(aws_cdk.core.IInspectable)
class CfnSecret(aws_cdk.core.CfnResource, metaclass=jsii.JSIIMeta, jsii_type="@aws-cdk/aws-secretsmanager.CfnSecret"):
"""A CloudFormation ``AWS::SecretsManager::Secret``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secret.html
cloudformationResource:
:cloudformationResource:: AWS::SecretsManager::Secret
"""
def __init__(self, scope: aws_cdk.core.Construct, id: str, *, description: typing.Optional[str]=None, generate_secret_string: typing.Optional[typing.Union[typing.Optional[aws_cdk.core.IResolvable], typing.Optional["GenerateSecretStringProperty"]]]=None, kms_key_id: typing.Optional[str]=None, name: typing.Optional[str]=None, secret_string: typing.Optional[str]=None, tags: typing.Optional[typing.List[aws_cdk.core.CfnTag]]=None) -> None:
"""Create a new ``AWS::SecretsManager::Secret``.
:param scope: - scope in which this resource is defined.
:param id: - scoped id of the resource.
:param props: - resource properties.
:param description: ``AWS::SecretsManager::Secret.Description``.
:param generate_secret_string: ``AWS::SecretsManager::Secret.GenerateSecretString``.
:param kms_key_id: ``AWS::SecretsManager::Secret.KmsKeyId``.
:param name: ``AWS::SecretsManager::Secret.Name``.
:param secret_string: ``AWS::SecretsManager::Secret.SecretString``.
:param tags: ``AWS::SecretsManager::Secret.Tags``.
"""
props = CfnSecretProps(description=description, generate_secret_string=generate_secret_string, kms_key_id=kms_key_id, name=name, secret_string=secret_string, tags=tags)
jsii.create(CfnSecret, self, [scope, id, props])
@jsii.member(jsii_name="inspect")
def inspect(self, inspector: aws_cdk.core.TreeInspector) -> None:
"""Examines the CloudFormation resource and discloses attributes.
:param inspector: - tree inspector to collect and process attributes.
stability
:stability: experimental
"""
return jsii.invoke(self, "inspect", [inspector])
@jsii.member(jsii_name="renderProperties")
def _render_properties(self, props: typing.Mapping[str,typing.Any]) -> typing.Mapping[str,typing.Any]:
"""
:param props: -
"""
return jsii.invoke(self, "renderProperties", [props])
@classproperty
@jsii.member(jsii_name="CFN_RESOURCE_TYPE_NAME")
def CFN_RESOURCE_TYPE_NAME(cls) -> str:
"""The CloudFormation resource type name for this resource class."""
return jsii.sget(cls, "CFN_RESOURCE_TYPE_NAME")
@property
@jsii.member(jsii_name="cfnProperties")
def _cfn_properties(self) -> typing.Mapping[str,typing.Any]:
return jsii.get(self, "cfnProperties")
@property
@jsii.member(jsii_name="tags")
def tags(self) -> aws_cdk.core.TagManager:
"""``AWS::SecretsManager::Secret.Tags``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secret.html#cfn-secretsmanager-secret-tags
"""
return jsii.get(self, "tags")
@property
@jsii.member(jsii_name="description")
def description(self) -> typing.Optional[str]:
"""``AWS::SecretsManager::Secret.Description``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secret.html#cfn-secretsmanager-secret-description
"""
return jsii.get(self, "description")
@description.setter
def description(self, value: typing.Optional[str]):
return jsii.set(self, "description", value)
@property
@jsii.member(jsii_name="generateSecretString")
def generate_secret_string(self) -> typing.Optional[typing.Union[typing.Optional[aws_cdk.core.IResolvable], typing.Optional["GenerateSecretStringProperty"]]]:
"""``AWS::SecretsManager::Secret.GenerateSecretString``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secret.html#cfn-secretsmanager-secret-generatesecretstring
"""
return jsii.get(self, "generateSecretString")
@generate_secret_string.setter
def generate_secret_string(self, value: typing.Optional[typing.Union[typing.Optional[aws_cdk.core.IResolvable], typing.Optional["GenerateSecretStringProperty"]]]):
return jsii.set(self, "generateSecretString", value)
@property
@jsii.member(jsii_name="kmsKeyId")
def kms_key_id(self) -> typing.Optional[str]:
"""``AWS::SecretsManager::Secret.KmsKeyId``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secret.html#cfn-secretsmanager-secret-kmskeyid
"""
return jsii.get(self, "kmsKeyId")
@kms_key_id.setter
def kms_key_id(self, value: typing.Optional[str]):
return jsii.set(self, "kmsKeyId", value)
@property
@jsii.member(jsii_name="name")
def name(self) -> typing.Optional[str]:
"""``AWS::SecretsManager::Secret.Name``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secret.html#cfn-secretsmanager-secret-name
"""
return jsii.get(self, "name")
@name.setter
def name(self, value: typing.Optional[str]):
return jsii.set(self, "name", value)
@property
@jsii.member(jsii_name="secretString")
def secret_string(self) -> typing.Optional[str]:
"""``AWS::SecretsManager::Secret.SecretString``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secret.html#cfn-secretsmanager-secret-secretstring
"""
return jsii.get(self, "secretString")
@secret_string.setter
def secret_string(self, value: typing.Optional[str]):
return jsii.set(self, "secretString", value)
@jsii.data_type(jsii_type="@aws-cdk/aws-secretsmanager.CfnSecret.GenerateSecretStringProperty", jsii_struct_bases=[], name_mapping={'exclude_characters': 'excludeCharacters', 'exclude_lowercase': 'excludeLowercase', 'exclude_numbers': 'excludeNumbers', 'exclude_punctuation': 'excludePunctuation', 'exclude_uppercase': 'excludeUppercase', 'generate_string_key': 'generateStringKey', 'include_space': 'includeSpace', 'password_length': 'passwordLength', 'require_each_included_type': 'requireEachIncludedType', 'secret_string_template': 'secretStringTemplate'})
class GenerateSecretStringProperty():
def __init__(self, *, exclude_characters: typing.Optional[str]=None, exclude_lowercase: typing.Optional[typing.Union[typing.Optional[bool], typing.Optional[aws_cdk.core.IResolvable]]]=None, exclude_numbers: typing.Optional[typing.Union[typing.Optional[bool], typing.Optional[aws_cdk.core.IResolvable]]]=None, exclude_punctuation: typing.Optional[typing.Union[typing.Optional[bool], typing.Optional[aws_cdk.core.IResolvable]]]=None, exclude_uppercase: typing.Optional[typing.Union[typing.Optional[bool], typing.Optional[aws_cdk.core.IResolvable]]]=None, generate_string_key: typing.Optional[str]=None, include_space: typing.Optional[typing.Union[typing.Optional[bool], typing.Optional[aws_cdk.core.IResolvable]]]=None, password_length: typing.Optional[jsii.Number]=None, require_each_included_type: typing.Optional[typing.Union[typing.Optional[bool], typing.Optional[aws_cdk.core.IResolvable]]]=None, secret_string_template: typing.Optional[str]=None):
"""
:param exclude_characters: ``CfnSecret.GenerateSecretStringProperty.ExcludeCharacters``.
:param exclude_lowercase: ``CfnSecret.GenerateSecretStringProperty.ExcludeLowercase``.
:param exclude_numbers: ``CfnSecret.GenerateSecretStringProperty.ExcludeNumbers``.
:param exclude_punctuation: ``CfnSecret.GenerateSecretStringProperty.ExcludePunctuation``.
:param exclude_uppercase: ``CfnSecret.GenerateSecretStringProperty.ExcludeUppercase``.
:param generate_string_key: ``CfnSecret.GenerateSecretStringProperty.GenerateStringKey``.
:param include_space: ``CfnSecret.GenerateSecretStringProperty.IncludeSpace``.
:param password_length: ``CfnSecret.GenerateSecretStringProperty.PasswordLength``.
:param require_each_included_type: ``CfnSecret.GenerateSecretStringProperty.RequireEachIncludedType``.
:param secret_string_template: ``CfnSecret.GenerateSecretStringProperty.SecretStringTemplate``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-secretsmanager-secret-generatesecretstring.html
"""
self._values = {
}
if exclude_characters is not None: self._values["exclude_characters"] = exclude_characters
if exclude_lowercase is not None: self._values["exclude_lowercase"] = exclude_lowercase
if exclude_numbers is not None: self._values["exclude_numbers"] = exclude_numbers
if exclude_punctuation is not None: self._values["exclude_punctuation"] = exclude_punctuation
if exclude_uppercase is not None: self._values["exclude_uppercase"] = exclude_uppercase
if generate_string_key is not None: self._values["generate_string_key"] = generate_string_key
if include_space is not None: self._values["include_space"] = include_space
if password_length is not None: self._values["password_length"] = password_length
if require_each_included_type is not None: self._values["require_each_included_type"] = require_each_included_type
if secret_string_template is not None: self._values["secret_string_template"] = secret_string_template
@property
def exclude_characters(self) -> typing.Optional[str]:
"""``CfnSecret.GenerateSecretStringProperty.ExcludeCharacters``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-secretsmanager-secret-generatesecretstring.html#cfn-secretsmanager-secret-generatesecretstring-excludecharacters
"""
return self._values.get('exclude_characters')
@property
def exclude_lowercase(self) -> typing.Optional[typing.Union[typing.Optional[bool], typing.Optional[aws_cdk.core.IResolvable]]]:
"""``CfnSecret.GenerateSecretStringProperty.ExcludeLowercase``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-secretsmanager-secret-generatesecretstring.html#cfn-secretsmanager-secret-generatesecretstring-excludelowercase
"""
return self._values.get('exclude_lowercase')
@property
def exclude_numbers(self) -> typing.Optional[typing.Union[typing.Optional[bool], typing.Optional[aws_cdk.core.IResolvable]]]:
"""``CfnSecret.GenerateSecretStringProperty.ExcludeNumbers``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-secretsmanager-secret-generatesecretstring.html#cfn-secretsmanager-secret-generatesecretstring-excludenumbers
"""
return self._values.get('exclude_numbers')
@property
def exclude_punctuation(self) -> typing.Optional[typing.Union[typing.Optional[bool], typing.Optional[aws_cdk.core.IResolvable]]]:
"""``CfnSecret.GenerateSecretStringProperty.ExcludePunctuation``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-secretsmanager-secret-generatesecretstring.html#cfn-secretsmanager-secret-generatesecretstring-excludepunctuation
"""
return self._values.get('exclude_punctuation')
@property
def exclude_uppercase(self) -> typing.Optional[typing.Union[typing.Optional[bool], typing.Optional[aws_cdk.core.IResolvable]]]:
"""``CfnSecret.GenerateSecretStringProperty.ExcludeUppercase``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-secretsmanager-secret-generatesecretstring.html#cfn-secretsmanager-secret-generatesecretstring-excludeuppercase
"""
return self._values.get('exclude_uppercase')
@property
def generate_string_key(self) -> typing.Optional[str]:
"""``CfnSecret.GenerateSecretStringProperty.GenerateStringKey``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-secretsmanager-secret-generatesecretstring.html#cfn-secretsmanager-secret-generatesecretstring-generatestringkey
"""
return self._values.get('generate_string_key')
@property
def include_space(self) -> typing.Optional[typing.Union[typing.Optional[bool], typing.Optional[aws_cdk.core.IResolvable]]]:
"""``CfnSecret.GenerateSecretStringProperty.IncludeSpace``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-secretsmanager-secret-generatesecretstring.html#cfn-secretsmanager-secret-generatesecretstring-includespace
"""
return self._values.get('include_space')
@property
def password_length(self) -> typing.Optional[jsii.Number]:
"""``CfnSecret.GenerateSecretStringProperty.PasswordLength``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-secretsmanager-secret-generatesecretstring.html#cfn-secretsmanager-secret-generatesecretstring-passwordlength
"""
return self._values.get('password_length')
@property
def require_each_included_type(self) -> typing.Optional[typing.Union[typing.Optional[bool], typing.Optional[aws_cdk.core.IResolvable]]]:
"""``CfnSecret.GenerateSecretStringProperty.RequireEachIncludedType``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-secretsmanager-secret-generatesecretstring.html#cfn-secretsmanager-secret-generatesecretstring-requireeachincludedtype
"""
return self._values.get('require_each_included_type')
@property
def secret_string_template(self) -> typing.Optional[str]:
"""``CfnSecret.GenerateSecretStringProperty.SecretStringTemplate``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-secretsmanager-secret-generatesecretstring.html#cfn-secretsmanager-secret-generatesecretstring-secretstringtemplate
"""
return self._values.get('secret_string_template')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'GenerateSecretStringProperty(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
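# Illustrative sketch (not part of the generated bindings): configuring a
# ``GenerateSecretStringProperty`` so that Secrets Manager generates the
# password field of a JSON secret. All values shown are hypothetical:
#
#     generate = CfnSecret.GenerateSecretStringProperty(
#         secret_string_template='{"username": "admin"}',
#         generate_string_key="password",
#         exclude_punctuation=True,
#         password_length=32,
#     )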
@jsii.data_type(jsii_type="@aws-cdk/aws-secretsmanager.CfnSecretProps", jsii_struct_bases=[], name_mapping={'description': 'description', 'generate_secret_string': 'generateSecretString', 'kms_key_id': 'kmsKeyId', 'name': 'name', 'secret_string': 'secretString', 'tags': 'tags'})
class CfnSecretProps():
def __init__(self, *, description: typing.Optional[str]=None, generate_secret_string: typing.Optional[typing.Union[typing.Optional[aws_cdk.core.IResolvable], typing.Optional["CfnSecret.GenerateSecretStringProperty"]]]=None, kms_key_id: typing.Optional[str]=None, name: typing.Optional[str]=None, secret_string: typing.Optional[str]=None, tags: typing.Optional[typing.List[aws_cdk.core.CfnTag]]=None):
"""Properties for defining a ``AWS::SecretsManager::Secret``.
:param description: ``AWS::SecretsManager::Secret.Description``.
:param generate_secret_string: ``AWS::SecretsManager::Secret.GenerateSecretString``.
:param kms_key_id: ``AWS::SecretsManager::Secret.KmsKeyId``.
:param name: ``AWS::SecretsManager::Secret.Name``.
:param secret_string: ``AWS::SecretsManager::Secret.SecretString``.
:param tags: ``AWS::SecretsManager::Secret.Tags``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secret.html
"""
self._values = {
}
if description is not None: self._values["description"] = description
if generate_secret_string is not None: self._values["generate_secret_string"] = generate_secret_string
if kms_key_id is not None: self._values["kms_key_id"] = kms_key_id
if name is not None: self._values["name"] = name
if secret_string is not None: self._values["secret_string"] = secret_string
if tags is not None: self._values["tags"] = tags
@property
def description(self) -> typing.Optional[str]:
"""``AWS::SecretsManager::Secret.Description``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secret.html#cfn-secretsmanager-secret-description
"""
return self._values.get('description')
@property
def generate_secret_string(self) -> typing.Optional[typing.Union[typing.Optional[aws_cdk.core.IResolvable], typing.Optional["CfnSecret.GenerateSecretStringProperty"]]]:
"""``AWS::SecretsManager::Secret.GenerateSecretString``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secret.html#cfn-secretsmanager-secret-generatesecretstring
"""
return self._values.get('generate_secret_string')
@property
def kms_key_id(self) -> typing.Optional[str]:
"""``AWS::SecretsManager::Secret.KmsKeyId``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secret.html#cfn-secretsmanager-secret-kmskeyid
"""
return self._values.get('kms_key_id')
@property
def name(self) -> typing.Optional[str]:
"""``AWS::SecretsManager::Secret.Name``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secret.html#cfn-secretsmanager-secret-name
"""
return self._values.get('name')
@property
def secret_string(self) -> typing.Optional[str]:
"""``AWS::SecretsManager::Secret.SecretString``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secret.html#cfn-secretsmanager-secret-secretstring
"""
return self._values.get('secret_string')
@property
def tags(self) -> typing.Optional[typing.List[aws_cdk.core.CfnTag]]:
"""``AWS::SecretsManager::Secret.Tags``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secret.html#cfn-secretsmanager-secret-tags
"""
return self._values.get('tags')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'CfnSecretProps(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
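# Illustrative sketch (not part of the generated bindings): instantiating the
# L1 ``CfnSecret`` construct directly. It assumes a ``stack`` object (an
# ``aws_cdk.core.Stack``) is defined elsewhere:
#
#     CfnSecret(stack, "MyCfnSecret",
#         description="Database credentials",
#         generate_secret_string=CfnSecret.GenerateSecretStringProperty(
#             secret_string_template='{"username": "admin"}',
#             generate_string_key="password",
#         ),
#     )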
@jsii.implements(aws_cdk.core.IInspectable)
class CfnSecretTargetAttachment(aws_cdk.core.CfnResource, metaclass=jsii.JSIIMeta, jsii_type="@aws-cdk/aws-secretsmanager.CfnSecretTargetAttachment"):
"""A CloudFormation ``AWS::SecretsManager::SecretTargetAttachment``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secrettargetattachment.html
cloudformationResource:
:cloudformationResource:: AWS::SecretsManager::SecretTargetAttachment
"""
def __init__(self, scope: aws_cdk.core.Construct, id: str, *, secret_id: str, target_id: str, target_type: str) -> None:
"""Create a new ``AWS::SecretsManager::SecretTargetAttachment``.
:param scope: - scope in which this resource is defined.
:param id: - scoped id of the resource.
:param props: - resource properties.
:param secret_id: ``AWS::SecretsManager::SecretTargetAttachment.SecretId``.
:param target_id: ``AWS::SecretsManager::SecretTargetAttachment.TargetId``.
:param target_type: ``AWS::SecretsManager::SecretTargetAttachment.TargetType``.
"""
props = CfnSecretTargetAttachmentProps(secret_id=secret_id, target_id=target_id, target_type=target_type)
jsii.create(CfnSecretTargetAttachment, self, [scope, id, props])
@jsii.member(jsii_name="inspect")
def inspect(self, inspector: aws_cdk.core.TreeInspector) -> None:
"""Examines the CloudFormation resource and discloses attributes.
:param inspector: - tree inspector to collect and process attributes.
stability
:stability: experimental
"""
return jsii.invoke(self, "inspect", [inspector])
@jsii.member(jsii_name="renderProperties")
def _render_properties(self, props: typing.Mapping[str,typing.Any]) -> typing.Mapping[str,typing.Any]:
"""
:param props: -
"""
return jsii.invoke(self, "renderProperties", [props])
@classproperty
@jsii.member(jsii_name="CFN_RESOURCE_TYPE_NAME")
def CFN_RESOURCE_TYPE_NAME(cls) -> str:
"""The CloudFormation resource type name for this resource class."""
return jsii.sget(cls, "CFN_RESOURCE_TYPE_NAME")
@property
@jsii.member(jsii_name="cfnProperties")
def _cfn_properties(self) -> typing.Mapping[str,typing.Any]:
return jsii.get(self, "cfnProperties")
@property
@jsii.member(jsii_name="secretId")
def secret_id(self) -> str:
"""``AWS::SecretsManager::SecretTargetAttachment.SecretId``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secrettargetattachment.html#cfn-secretsmanager-secrettargetattachment-secretid
"""
return jsii.get(self, "secretId")
@secret_id.setter
def secret_id(self, value: str):
return jsii.set(self, "secretId", value)
@property
@jsii.member(jsii_name="targetId")
def target_id(self) -> str:
"""``AWS::SecretsManager::SecretTargetAttachment.TargetId``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secrettargetattachment.html#cfn-secretsmanager-secrettargetattachment-targetid
"""
return jsii.get(self, "targetId")
@target_id.setter
def target_id(self, value: str):
return jsii.set(self, "targetId", value)
@property
@jsii.member(jsii_name="targetType")
def target_type(self) -> str:
"""``AWS::SecretsManager::SecretTargetAttachment.TargetType``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secrettargetattachment.html#cfn-secretsmanager-secrettargetattachment-targettype
"""
return jsii.get(self, "targetType")
@target_type.setter
def target_type(self, value: str):
return jsii.set(self, "targetType", value)
@jsii.data_type(jsii_type="@aws-cdk/aws-secretsmanager.CfnSecretTargetAttachmentProps", jsii_struct_bases=[], name_mapping={'secret_id': 'secretId', 'target_id': 'targetId', 'target_type': 'targetType'})
class CfnSecretTargetAttachmentProps():
def __init__(self, *, secret_id: str, target_id: str, target_type: str):
"""Properties for defining a ``AWS::SecretsManager::SecretTargetAttachment``.
:param secret_id: ``AWS::SecretsManager::SecretTargetAttachment.SecretId``.
:param target_id: ``AWS::SecretsManager::SecretTargetAttachment.TargetId``.
:param target_type: ``AWS::SecretsManager::SecretTargetAttachment.TargetType``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secrettargetattachment.html
"""
self._values = {
'secret_id': secret_id,
'target_id': target_id,
'target_type': target_type,
}
@property
def secret_id(self) -> str:
"""``AWS::SecretsManager::SecretTargetAttachment.SecretId``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secrettargetattachment.html#cfn-secretsmanager-secrettargetattachment-secretid
"""
return self._values.get('secret_id')
@property
def target_id(self) -> str:
"""``AWS::SecretsManager::SecretTargetAttachment.TargetId``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secrettargetattachment.html#cfn-secretsmanager-secrettargetattachment-targetid
"""
return self._values.get('target_id')
@property
def target_type(self) -> str:
"""``AWS::SecretsManager::SecretTargetAttachment.TargetType``.
see
:see: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-secretsmanager-secrettargetattachment.html#cfn-secretsmanager-secrettargetattachment-targettype
"""
return self._values.get('target_type')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'CfnSecretTargetAttachmentProps(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
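# Illustrative sketch (not part of the generated bindings): attaching a secret
# to an RDS instance at the L1 level. ``stack``, ``secret`` (a ``CfnSecret``)
# and ``db_instance`` (a ``CfnDBInstance``) are assumed to exist elsewhere:
#
#     CfnSecretTargetAttachment(stack, "Attachment",
#         secret_id=secret.ref,
#         target_id=db_instance.ref,
#         target_type="AWS::RDS::DBInstance",
#     )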
@jsii.interface(jsii_type="@aws-cdk/aws-secretsmanager.ISecret")
class ISecret(aws_cdk.core.IResource, jsii.compat.Protocol):
"""A secret in AWS Secrets Manager."""
@staticmethod
def __jsii_proxy_class__():
return _ISecretProxy
@property
@jsii.member(jsii_name="secretArn")
def secret_arn(self) -> str:
"""The ARN of the secret in AWS Secrets Manager.
attribute:
:attribute:: true
"""
...
@property
@jsii.member(jsii_name="secretValue")
def secret_value(self) -> aws_cdk.core.SecretValue:
"""Retrieve the value of the stored secret as a ``SecretValue``.
attribute:
:attribute:: true
"""
...
@property
@jsii.member(jsii_name="encryptionKey")
def encryption_key(self) -> typing.Optional[aws_cdk.aws_kms.IKey]:
"""The customer-managed encryption key that is used to encrypt this secret, if any.
When not specified, the default KMS key for the account and region is used.
"""
...
@jsii.member(jsii_name="addRotationSchedule")
def add_rotation_schedule(self, id: str, *, rotation_lambda: aws_cdk.aws_lambda.IFunction, automatically_after: typing.Optional[aws_cdk.core.Duration]=None) -> "RotationSchedule":
"""Adds a rotation schedule to the secret.
:param id: -
:param options: -
:param rotation_lambda: The Lambda function that can rotate the secret.
:param automatically_after: Specifies the number of days after the previous rotation before Secrets Manager triggers the next automatic rotation. Default: Duration.days(30)
"""
...
@jsii.member(jsii_name="grantRead")
def grant_read(self, grantee: aws_cdk.aws_iam.IGrantable, version_stages: typing.Optional[typing.List[str]]=None) -> aws_cdk.aws_iam.Grant:
"""Grants reading the secret value to some role.
:param grantee: the principal being granted permission.
:param version_stages: the version stages the grant is limited to. If not specified, no restriction on the version stages is applied.
"""
...
@jsii.member(jsii_name="secretValueFromJson")
def secret_value_from_json(self, key: str) -> aws_cdk.core.SecretValue:
"""Interpret the secret as a JSON object and return a field's value from it as a ``SecretValue``.
:param key: -
"""
...
class _ISecretProxy(jsii.proxy_for(aws_cdk.core.IResource)):
"""A secret in AWS Secrets Manager."""
__jsii_type__ = "@aws-cdk/aws-secretsmanager.ISecret"
@property
@jsii.member(jsii_name="secretArn")
def secret_arn(self) -> str:
"""The ARN of the secret in AWS Secrets Manager.
attribute:
:attribute:: true
"""
return jsii.get(self, "secretArn")
@property
@jsii.member(jsii_name="secretValue")
def secret_value(self) -> aws_cdk.core.SecretValue:
"""Retrieve the value of the stored secret as a ``SecretValue``.
attribute:
:attribute:: true
"""
return jsii.get(self, "secretValue")
@property
@jsii.member(jsii_name="encryptionKey")
def encryption_key(self) -> typing.Optional[aws_cdk.aws_kms.IKey]:
"""The customer-managed encryption key that is used to encrypt this secret, if any.
When not specified, the default KMS key for the account and region is used.
"""
return jsii.get(self, "encryptionKey")
@jsii.member(jsii_name="addRotationSchedule")
def add_rotation_schedule(self, id: str, *, rotation_lambda: aws_cdk.aws_lambda.IFunction, automatically_after: typing.Optional[aws_cdk.core.Duration]=None) -> "RotationSchedule":
"""Adds a rotation schedule to the secret.
:param id: -
:param options: -
:param rotation_lambda: The Lambda function that can rotate the secret.
:param automatically_after: Specifies the number of days after the previous rotation before Secrets Manager triggers the next automatic rotation. Default: Duration.days(30)
"""
options = RotationScheduleOptions(rotation_lambda=rotation_lambda, automatically_after=automatically_after)
return jsii.invoke(self, "addRotationSchedule", [id, options])
@jsii.member(jsii_name="grantRead")
def grant_read(self, grantee: aws_cdk.aws_iam.IGrantable, version_stages: typing.Optional[typing.List[str]]=None) -> aws_cdk.aws_iam.Grant:
"""Grants reading the secret value to some role.
:param grantee: the principal being granted permission.
:param version_stages: the version stages the grant is limited to. If not specified, no restriction on the version stages is applied.
"""
return jsii.invoke(self, "grantRead", [grantee, version_stages])
@jsii.member(jsii_name="secretValueFromJson")
def secret_value_from_json(self, key: str) -> aws_cdk.core.SecretValue:
"""Interpret the secret as a JSON object and return a field's value from it as a ``SecretValue``.
:param key: -
"""
return jsii.invoke(self, "secretValueFromJson", [key])
@jsii.interface(jsii_type="@aws-cdk/aws-secretsmanager.ISecretAttachmentTarget")
class ISecretAttachmentTarget(jsii.compat.Protocol):
"""A secret attachment target."""
@staticmethod
def __jsii_proxy_class__():
return _ISecretAttachmentTargetProxy
@jsii.member(jsii_name="asSecretAttachmentTarget")
def as_secret_attachment_target(self) -> "SecretAttachmentTargetProps":
"""Renders the target specifications."""
...
class _ISecretAttachmentTargetProxy():
"""A secret attachment target."""
__jsii_type__ = "@aws-cdk/aws-secretsmanager.ISecretAttachmentTarget"
@jsii.member(jsii_name="asSecretAttachmentTarget")
def as_secret_attachment_target(self) -> "SecretAttachmentTargetProps":
"""Renders the target specifications."""
return jsii.invoke(self, "asSecretAttachmentTarget", [])
@jsii.interface(jsii_type="@aws-cdk/aws-secretsmanager.ISecretTargetAttachment")
class ISecretTargetAttachment(ISecret, jsii.compat.Protocol):
@staticmethod
def __jsii_proxy_class__():
return _ISecretTargetAttachmentProxy
@property
@jsii.member(jsii_name="secretTargetAttachmentSecretArn")
def secret_target_attachment_secret_arn(self) -> str:
"""Same as ``secretArn``.
attribute:
:attribute:: true
"""
...
class _ISecretTargetAttachmentProxy(jsii.proxy_for(ISecret)):
__jsii_type__ = "@aws-cdk/aws-secretsmanager.ISecretTargetAttachment"
@property
@jsii.member(jsii_name="secretTargetAttachmentSecretArn")
def secret_target_attachment_secret_arn(self) -> str:
"""Same as ``secretArn``.
attribute:
:attribute:: true
"""
return jsii.get(self, "secretTargetAttachmentSecretArn")
class RotationSchedule(aws_cdk.core.Resource, metaclass=jsii.JSIIMeta, jsii_type="@aws-cdk/aws-secretsmanager.RotationSchedule"):
"""A rotation schedule."""
def __init__(self, scope: aws_cdk.core.Construct, id: str, *, secret: "ISecret", rotation_lambda: aws_cdk.aws_lambda.IFunction, automatically_after: typing.Optional[aws_cdk.core.Duration]=None) -> None:
"""
:param scope: -
:param id: -
:param props: -
:param secret: The secret to rotate.
:param rotation_lambda: The Lambda function that can rotate the secret.
:param automatically_after: Specifies the number of days after the previous rotation before Secrets Manager triggers the next automatic rotation. Default: Duration.days(30)
"""
props = RotationScheduleProps(secret=secret, rotation_lambda=rotation_lambda, automatically_after=automatically_after)
jsii.create(RotationSchedule, self, [scope, id, props])
@jsii.data_type(jsii_type="@aws-cdk/aws-secretsmanager.RotationScheduleOptions", jsii_struct_bases=[], name_mapping={'rotation_lambda': 'rotationLambda', 'automatically_after': 'automaticallyAfter'})
class RotationScheduleOptions():
def __init__(self, *, rotation_lambda: aws_cdk.aws_lambda.IFunction, automatically_after: typing.Optional[aws_cdk.core.Duration]=None):
"""Options to add a rotation schedule to a secret.
:param rotation_lambda: The Lambda function that can rotate the secret.
:param automatically_after: Specifies the number of days after the previous rotation before Secrets Manager triggers the next automatic rotation. Default: Duration.days(30)
"""
self._values = {
'rotation_lambda': rotation_lambda,
}
if automatically_after is not None: self._values["automatically_after"] = automatically_after
@property
def rotation_lambda(self) -> aws_cdk.aws_lambda.IFunction:
"""THe Lambda function that can rotate the secret."""
return self._values.get('rotation_lambda')
@property
def automatically_after(self) -> typing.Optional[aws_cdk.core.Duration]:
"""Specifies the number of days after the previous rotation before Secrets Manager triggers the next automatic rotation.
default
:default: Duration.days(30)
"""
return self._values.get('automatically_after')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'RotationScheduleOptions(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
@jsii.data_type(jsii_type="@aws-cdk/aws-secretsmanager.RotationScheduleProps", jsii_struct_bases=[RotationScheduleOptions], name_mapping={'rotation_lambda': 'rotationLambda', 'automatically_after': 'automaticallyAfter', 'secret': 'secret'})
class RotationScheduleProps(RotationScheduleOptions):
def __init__(self, *, rotation_lambda: aws_cdk.aws_lambda.IFunction, automatically_after: typing.Optional[aws_cdk.core.Duration]=None, secret: "ISecret"):
"""Construction properties for a RotationSchedule.
:param rotation_lambda: The Lambda function that can rotate the secret.
:param automatically_after: Specifies the number of days after the previous rotation before Secrets Manager triggers the next automatic rotation. Default: Duration.days(30)
:param secret: The secret to rotate.
"""
self._values = {
'rotation_lambda': rotation_lambda,
'secret': secret,
}
if automatically_after is not None: self._values["automatically_after"] = automatically_after
@property
def rotation_lambda(self) -> aws_cdk.aws_lambda.IFunction:
"""THe Lambda function that can rotate the secret."""
return self._values.get('rotation_lambda')
@property
def automatically_after(self) -> typing.Optional[aws_cdk.core.Duration]:
"""Specifies the number of days after the previous rotation before Secrets Manager triggers the next automatic rotation.
default
:default: Duration.days(30)
"""
return self._values.get('automatically_after')
@property
def secret(self) -> "ISecret":
"""The secret to rotate."""
return self._values.get('secret')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'RotationScheduleProps(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
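# Illustrative sketch (not part of the generated bindings): wiring a rotation
# Lambda to a secret via ``RotationSchedule``. ``stack``, ``secret`` (an
# ``ISecret``) and ``rotation_fn`` (a Lambda ``IFunction``) are assumed:
#
#     RotationSchedule(stack, "Rotation",
#         secret=secret,
#         rotation_lambda=rotation_fn,
#         automatically_after=aws_cdk.core.Duration.days(15),
#     )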
@jsii.implements(ISecret)
class Secret(aws_cdk.core.Resource, metaclass=jsii.JSIIMeta, jsii_type="@aws-cdk/aws-secretsmanager.Secret"):
"""Creates a new secret in AWS SecretsManager."""
def __init__(self, scope: aws_cdk.core.Construct, id: str, *, description: typing.Optional[str]=None, encryption_key: typing.Optional[aws_cdk.aws_kms.IKey]=None, generate_secret_string: typing.Optional["SecretStringGenerator"]=None, secret_name: typing.Optional[str]=None) -> None:
"""
:param scope: -
:param id: -
:param props: -
:param description: An optional, human-friendly description of the secret. Default: - No description.
:param encryption_key: The customer-managed encryption key to use for encrypting the secret value. Default: - A default KMS key for the account and region is used.
:param generate_secret_string: Configuration for how to generate a secret value. Default: - 32 characters with upper-case letters, lower-case letters, punctuation and numbers (at least one from each category), per the default values of ``SecretStringGenerator``.
:param secret_name: A name for the secret. Note that deleting secrets from SecretsManager does not happen immediately, but only after a blackout period of 7 to 30 days. During that period, it is not possible to create another secret that shares the same name. Default: - A name is generated by CloudFormation.
"""
props = SecretProps(description=description, encryption_key=encryption_key, generate_secret_string=generate_secret_string, secret_name=secret_name)
jsii.create(Secret, self, [scope, id, props])
@jsii.member(jsii_name="fromSecretArn")
@classmethod
def from_secret_arn(cls, scope: aws_cdk.core.Construct, id: str, secret_arn: str) -> "ISecret":
"""
:param scope: -
:param id: -
:param secret_arn: -
"""
return jsii.sinvoke(cls, "fromSecretArn", [scope, id, secret_arn])
@jsii.member(jsii_name="fromSecretAttributes")
@classmethod
def from_secret_attributes(cls, scope: aws_cdk.core.Construct, id: str, *, secret_arn: str, encryption_key: typing.Optional[aws_cdk.aws_kms.IKey]=None) -> "ISecret":
"""Import an existing secret into the Stack.
:param scope: the scope of the import.
:param id: the ID of the imported Secret in the construct tree.
:param attrs: the attributes of the imported secret.
:param secret_arn: The ARN of the secret in SecretsManager.
:param encryption_key: The encryption key that is used to encrypt the secret, unless the default SecretsManager key is used.
"""
attrs = SecretAttributes(secret_arn=secret_arn, encryption_key=encryption_key)
return jsii.sinvoke(cls, "fromSecretAttributes", [scope, id, attrs])
@jsii.member(jsii_name="addRotationSchedule")
def add_rotation_schedule(self, id: str, *, rotation_lambda: aws_cdk.aws_lambda.IFunction, automatically_after: typing.Optional[aws_cdk.core.Duration]=None) -> "RotationSchedule":
"""Adds a rotation schedule to the secret.
:param id: -
:param options: -
:param rotation_lambda: The Lambda function that can rotate the secret.
:param automatically_after: Specifies the number of days after the previous rotation before Secrets Manager triggers the next automatic rotation. Default: Duration.days(30)
"""
options = RotationScheduleOptions(rotation_lambda=rotation_lambda, automatically_after=automatically_after)
return jsii.invoke(self, "addRotationSchedule", [id, options])
@jsii.member(jsii_name="addTargetAttachment")
def add_target_attachment(self, id: str, *, target: "ISecretAttachmentTarget") -> "SecretTargetAttachment":
"""Adds a target attachment to the secret.
:param id: -
:param options: -
:param target: The target to attach the secret to.
return
:return: an AttachedSecret
"""
options = AttachedSecretOptions(target=target)
return jsii.invoke(self, "addTargetAttachment", [id, options])
@jsii.member(jsii_name="grantRead")
def grant_read(self, grantee: aws_cdk.aws_iam.IGrantable, version_stages: typing.Optional[typing.List[str]]=None) -> aws_cdk.aws_iam.Grant:
"""Grants reading the secret value to some role.
:param grantee: -
:param version_stages: -
"""
return jsii.invoke(self, "grantRead", [grantee, version_stages])
@jsii.member(jsii_name="secretValueFromJson")
def secret_value_from_json(self, json_field: str) -> aws_cdk.core.SecretValue:
"""Interpret the secret as a JSON object and return a field's value from it as a ``SecretValue``.
:param json_field: -
"""
return jsii.invoke(self, "secretValueFromJson", [json_field])
@property
@jsii.member(jsii_name="secretArn")
def secret_arn(self) -> str:
"""The ARN of the secret in AWS Secrets Manager."""
return jsii.get(self, "secretArn")
@property
@jsii.member(jsii_name="secretValue")
def secret_value(self) -> aws_cdk.core.SecretValue:
"""Retrieve the value of the stored secret as a ``SecretValue``."""
return jsii.get(self, "secretValue")
@property
@jsii.member(jsii_name="encryptionKey")
def encryption_key(self) -> typing.Optional[aws_cdk.aws_kms.IKey]:
"""The customer-managed encryption key that is used to encrypt this secret, if any.
When not specified, the default
KMS key for the account and region is used.
"""
return jsii.get(self, "encryptionKey")
@jsii.data_type(jsii_type="@aws-cdk/aws-secretsmanager.SecretAttachmentTargetProps", jsii_struct_bases=[], name_mapping={'target_id': 'targetId', 'target_type': 'targetType'})
class SecretAttachmentTargetProps():
def __init__(self, *, target_id: str, target_type: "AttachmentTargetType"):
"""Attachment target specifications.
:param target_id: The id of the target to attach the secret to.
:param target_type: The type of the target to attach the secret to.
"""
self._values = {
'target_id': target_id,
'target_type': target_type,
}
@property
def target_id(self) -> str:
"""The id of the target to attach the secret to."""
return self._values.get('target_id')
@property
def target_type(self) -> "AttachmentTargetType":
"""The type of the target to attach the secret to."""
return self._values.get('target_type')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'SecretAttachmentTargetProps(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
@jsii.data_type(jsii_type="@aws-cdk/aws-secretsmanager.SecretAttributes", jsii_struct_bases=[], name_mapping={'secret_arn': 'secretArn', 'encryption_key': 'encryptionKey'})
class SecretAttributes():
def __init__(self, *, secret_arn: str, encryption_key: typing.Optional[aws_cdk.aws_kms.IKey]=None):
"""Attributes required to import an existing secret into the Stack.
:param secret_arn: The ARN of the secret in SecretsManager.
:param encryption_key: The encryption key that is used to encrypt the secret, unless the default SecretsManager key is used.
"""
self._values = {
'secret_arn': secret_arn,
}
if encryption_key is not None: self._values["encryption_key"] = encryption_key
@property
def secret_arn(self) -> str:
"""The ARN of the secret in SecretsManager."""
return self._values.get('secret_arn')
@property
def encryption_key(self) -> typing.Optional[aws_cdk.aws_kms.IKey]:
"""The encryption key that is used to encrypt the secret, unless the default SecretsManager key is used."""
return self._values.get('encryption_key')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'SecretAttributes(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
@jsii.data_type(jsii_type="@aws-cdk/aws-secretsmanager.SecretProps", jsii_struct_bases=[], name_mapping={'description': 'description', 'encryption_key': 'encryptionKey', 'generate_secret_string': 'generateSecretString', 'secret_name': 'secretName'})
class SecretProps():
def __init__(self, *, description: typing.Optional[str]=None, encryption_key: typing.Optional[aws_cdk.aws_kms.IKey]=None, generate_secret_string: typing.Optional["SecretStringGenerator"]=None, secret_name: typing.Optional[str]=None):
"""The properties required to create a new secret in AWS Secrets Manager.
:param description: An optional, human-friendly description of the secret. Default: - No description.
:param encryption_key: The customer-managed encryption key to use for encrypting the secret value. Default: - A default KMS key for the account and region is used.
:param generate_secret_string: Configuration for how to generate a secret value. Default: - 32 characters with upper-case letters, lower-case letters, punctuation and numbers (at least one from each category), per the default values of ``SecretStringGenerator``.
:param secret_name: A name for the secret. Note that deleting secrets from SecretsManager does not happen immediately, but only after a blackout period of 7 to 30 days. During that period, it is not possible to create another secret that shares the same name. Default: - A name is generated by CloudFormation.
"""
if isinstance(generate_secret_string, dict): generate_secret_string = SecretStringGenerator(**generate_secret_string)
self._values = {
}
if description is not None: self._values["description"] = description
if encryption_key is not None: self._values["encryption_key"] = encryption_key
if generate_secret_string is not None: self._values["generate_secret_string"] = generate_secret_string
if secret_name is not None: self._values["secret_name"] = secret_name
@property
def description(self) -> typing.Optional[str]:
"""An optional, human-friendly description of the secret.
default
:default: - No description.
"""
return self._values.get('description')
@property
def encryption_key(self) -> typing.Optional[aws_cdk.aws_kms.IKey]:
"""The customer-managed encryption key to use for encrypting the secret value.
default
:default: - A default KMS key for the account and region is used.
"""
return self._values.get('encryption_key')
@property
def generate_secret_string(self) -> typing.Optional["SecretStringGenerator"]:
"""Configuration for how to generate a secret value.
default
:default:
- 32 characters with upper-case letters, lower-case letters, punctuation and numbers (at least one from each
category), per the default values of ``SecretStringGenerator``.
"""
return self._values.get('generate_secret_string')
@property
def secret_name(self) -> typing.Optional[str]:
"""A name for the secret.
Note that deleting secrets from SecretsManager does not happen immediately, but only after a blackout period of 7 to
30 days. During that period, it is not possible to create another secret that shares the same name.
default
:default: - A name is generated by CloudFormation.
"""
return self._values.get('secret_name')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'SecretProps(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
@jsii.data_type(jsii_type="@aws-cdk/aws-secretsmanager.SecretStringGenerator", jsii_struct_bases=[], name_mapping={'exclude_characters': 'excludeCharacters', 'exclude_lowercase': 'excludeLowercase', 'exclude_numbers': 'excludeNumbers', 'exclude_punctuation': 'excludePunctuation', 'exclude_uppercase': 'excludeUppercase', 'generate_string_key': 'generateStringKey', 'include_space': 'includeSpace', 'password_length': 'passwordLength', 'require_each_included_type': 'requireEachIncludedType', 'secret_string_template': 'secretStringTemplate'})
class SecretStringGenerator():
def __init__(self, *, exclude_characters: typing.Optional[str]=None, exclude_lowercase: typing.Optional[bool]=None, exclude_numbers: typing.Optional[bool]=None, exclude_punctuation: typing.Optional[bool]=None, exclude_uppercase: typing.Optional[bool]=None, generate_string_key: typing.Optional[str]=None, include_space: typing.Optional[bool]=None, password_length: typing.Optional[jsii.Number]=None, require_each_included_type: typing.Optional[bool]=None, secret_string_template: typing.Optional[str]=None):
"""Configuration to generate secrets such as passwords automatically.
:param exclude_characters: A string that includes characters that shouldn't be included in the generated password. The string can be a minimum of ``0`` and a maximum of ``4096`` characters long. Default: no exclusions
:param exclude_lowercase: Specifies that the generated password shouldn't include lowercase letters. Default: false
:param exclude_numbers: Specifies that the generated password shouldn't include digits. Default: false
:param exclude_punctuation: Specifies that the generated password shouldn't include punctuation characters. Default: false
:param exclude_uppercase: Specifies that the generated password shouldn't include uppercase letters. Default: false
:param generate_string_key: The JSON key name that's used to add the generated password to the JSON structure specified by the ``secretStringTemplate`` parameter. If you specify ``generateStringKey`` then ``secretStringTemplate`` must also be specified.
:param include_space: Specifies that the generated password can include the space character. Default: false
:param password_length: The desired length of the generated password. Default: 32
:param require_each_included_type: Specifies whether the generated password must include at least one of every allowed character type. Default: true
:param secret_string_template: A properly structured JSON string that the generated password can be added to. The ``generateStringKey`` is combined with the generated random string and inserted into the JSON structure that's specified by this parameter. The merged JSON string is returned as the completed SecretString of the secret. If you specify ``secretStringTemplate`` then ``generateStringKey`` must also be specified.
"""
self._values = {
}
if exclude_characters is not None: self._values["exclude_characters"] = exclude_characters
if exclude_lowercase is not None: self._values["exclude_lowercase"] = exclude_lowercase
if exclude_numbers is not None: self._values["exclude_numbers"] = exclude_numbers
if exclude_punctuation is not None: self._values["exclude_punctuation"] = exclude_punctuation
if exclude_uppercase is not None: self._values["exclude_uppercase"] = exclude_uppercase
if generate_string_key is not None: self._values["generate_string_key"] = generate_string_key
if include_space is not None: self._values["include_space"] = include_space
if password_length is not None: self._values["password_length"] = password_length
if require_each_included_type is not None: self._values["require_each_included_type"] = require_each_included_type
if secret_string_template is not None: self._values["secret_string_template"] = secret_string_template
@property
def exclude_characters(self) -> typing.Optional[str]:
"""A string that includes characters that shouldn't be included in the generated password.
The string can be a minimum
of ``0`` and a maximum of ``4096`` characters long.
default
:default: no exclusions
"""
return self._values.get('exclude_characters')
@property
def exclude_lowercase(self) -> typing.Optional[bool]:
"""Specifies that the generated password shouldn't include lowercase letters.
default
:default: false
"""
return self._values.get('exclude_lowercase')
@property
def exclude_numbers(self) -> typing.Optional[bool]:
"""Specifies that the generated password shouldn't include digits.
default
:default: false
"""
return self._values.get('exclude_numbers')
@property
def exclude_punctuation(self) -> typing.Optional[bool]:
"""Specifies that the generated password shouldn't include punctuation characters.
default
:default: false
"""
return self._values.get('exclude_punctuation')
@property
def exclude_uppercase(self) -> typing.Optional[bool]:
"""Specifies that the generated password shouldn't include uppercase letters.
default
:default: false
"""
return self._values.get('exclude_uppercase')
@property
def generate_string_key(self) -> typing.Optional[str]:
"""The JSON key name that's used to add the generated password to the JSON structure specified by the ``secretStringTemplate`` parameter.
If you specify ``generateStringKey`` then ``secretStringTemplate``
must also be specified.
"""
return self._values.get('generate_string_key')
@property
def include_space(self) -> typing.Optional[bool]:
"""Specifies that the generated password can include the space character.
default
:default: false
"""
return self._values.get('include_space')
@property
def password_length(self) -> typing.Optional[jsii.Number]:
"""The desired length of the generated password.
default
:default: 32
"""
return self._values.get('password_length')
@property
def require_each_included_type(self) -> typing.Optional[bool]:
"""Specifies whether the generated password must include at least one of every allowed character type.
default
:default: true
"""
return self._values.get('require_each_included_type')
@property
def secret_string_template(self) -> typing.Optional[str]:
"""A properly structured JSON string that the generated password can be added to.
The ``generateStringKey`` is
combined with the generated random string and inserted into the JSON structure that's specified by this parameter.
The merged JSON string is returned as the completed SecretString of the secret. If you specify ``secretStringTemplate``
then ``generateStringKey`` must also be specified.
"""
return self._values.get('secret_string_template')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'SecretStringGenerator(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
@jsii.implements(ISecretTargetAttachment, ISecret)
class SecretTargetAttachment(aws_cdk.core.Resource, metaclass=jsii.JSIIMeta, jsii_type="@aws-cdk/aws-secretsmanager.SecretTargetAttachment"):
"""An attached secret."""
def __init__(self, scope: aws_cdk.core.Construct, id: str, *, secret: "ISecret", target: "ISecretAttachmentTarget") -> None:
"""
:param scope: -
:param id: -
:param props: -
:param secret: The secret to attach to the target.
:param target: The target to attach the secret to.
"""
props = SecretTargetAttachmentProps(secret=secret, target=target)
jsii.create(SecretTargetAttachment, self, [scope, id, props])
@jsii.member(jsii_name="fromSecretTargetAttachmentSecretArn")
@classmethod
def from_secret_target_attachment_secret_arn(cls, scope: aws_cdk.core.Construct, id: str, secret_target_attachment_secret_arn: str) -> "ISecretTargetAttachment":
"""
:param scope: -
:param id: -
:param secret_target_attachment_secret_arn: -
"""
return jsii.sinvoke(cls, "fromSecretTargetAttachmentSecretArn", [scope, id, secret_target_attachment_secret_arn])
@jsii.member(jsii_name="addRotationSchedule")
def add_rotation_schedule(self, id: str, *, rotation_lambda: aws_cdk.aws_lambda.IFunction, automatically_after: typing.Optional[aws_cdk.core.Duration]=None) -> "RotationSchedule":
"""Adds a rotation schedule to the secret.
:param id: -
:param options: -
:param rotation_lambda: The Lambda function that can rotate the secret.
:param automatically_after: Specifies the number of days after the previous rotation before Secrets Manager triggers the next automatic rotation. Default: Duration.days(30)
"""
options = RotationScheduleOptions(rotation_lambda=rotation_lambda, automatically_after=automatically_after)
return jsii.invoke(self, "addRotationSchedule", [id, options])
@jsii.member(jsii_name="grantRead")
def grant_read(self, grantee: aws_cdk.aws_iam.IGrantable, version_stages: typing.Optional[typing.List[str]]=None) -> aws_cdk.aws_iam.Grant:
"""Grants reading the secret value to some role.
:param grantee: -
:param version_stages: -
"""
return jsii.invoke(self, "grantRead", [grantee, version_stages])
@jsii.member(jsii_name="secretValueFromJson")
def secret_value_from_json(self, json_field: str) -> aws_cdk.core.SecretValue:
"""Interpret the secret as a JSON object and return a field's value from it as a ``SecretValue``.
:param json_field: -
"""
return jsii.invoke(self, "secretValueFromJson", [json_field])
@property
@jsii.member(jsii_name="secretArn")
def secret_arn(self) -> str:
"""The ARN of the secret in AWS Secrets Manager."""
return jsii.get(self, "secretArn")
@property
@jsii.member(jsii_name="secretTargetAttachmentSecretArn")
def secret_target_attachment_secret_arn(self) -> str:
"""Same as ``secretArn``.
attribute:
:attribute:: true
"""
return jsii.get(self, "secretTargetAttachmentSecretArn")
@property
@jsii.member(jsii_name="secretValue")
def secret_value(self) -> aws_cdk.core.SecretValue:
"""Retrieve the value of the stored secret as a ``SecretValue``."""
return jsii.get(self, "secretValue")
@property
@jsii.member(jsii_name="encryptionKey")
def encryption_key(self) -> typing.Optional[aws_cdk.aws_kms.IKey]:
"""The customer-managed encryption key that is used to encrypt this secret, if any.
When not specified, the default
KMS key for the account and region is used.
"""
return jsii.get(self, "encryptionKey")
@jsii.data_type(jsii_type="@aws-cdk/aws-secretsmanager.SecretTargetAttachmentProps", jsii_struct_bases=[AttachedSecretOptions], name_mapping={'target': 'target', 'secret': 'secret'})
class SecretTargetAttachmentProps(AttachedSecretOptions):
def __init__(self, *, target: "ISecretAttachmentTarget", secret: "ISecret"):
"""Construction properties for an AttachedSecret.
:param target: The target to attach the secret to.
:param secret: The secret to attach to the target.
"""
self._values = {
'target': target,
'secret': secret,
}
@property
def target(self) -> "ISecretAttachmentTarget":
"""The target to attach the secret to."""
return self._values.get('target')
@property
def secret(self) -> "ISecret":
"""The secret to attach to the target."""
return self._values.get('secret')
def __eq__(self, rhs) -> bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs) -> bool:
return not (rhs == self)
def __repr__(self) -> str:
return 'SecretTargetAttachmentProps(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
__all__ = ["AttachedSecretOptions", "AttachmentTargetType", "CfnResourcePolicy", "CfnResourcePolicyProps", "CfnRotationSchedule", "CfnRotationScheduleProps", "CfnSecret", "CfnSecretProps", "CfnSecretTargetAttachment", "CfnSecretTargetAttachmentProps", "ISecret", "ISecretAttachmentTarget", "ISecretTargetAttachment", "RotationSchedule", "RotationScheduleOptions", "RotationScheduleProps", "Secret", "SecretAttachmentTargetProps", "SecretAttributes", "SecretProps", "SecretStringGenerator", "SecretTargetAttachment", "SecretTargetAttachmentProps", "__jsii_assembly__"]
publication.publish()
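Every generated struct above (``SecretAttachmentTargetProps``, ``SecretAttributes``, ``SecretProps``, ...) follows the same pattern: keyword-only fields stored in a ``_values`` dict, read back through properties, with equality and ``repr`` derived from that dict. A standalone sketch of that pattern (the ``Point`` class is hypothetical, not part of this module):

```python
class Point:
    """Minimal value object in the style of the generated jsii structs above."""

    def __init__(self, *, x: int, y: int):
        # All fields live in a single dict, just like the generated structs.
        self._values = {'x': x, 'y': y}

    @property
    def x(self) -> int:
        return self._values.get('x')

    @property
    def y(self) -> int:
        return self._values.get('y')

    def __eq__(self, rhs) -> bool:
        # Structs compare by value: same class and same field dict.
        return isinstance(rhs, self.__class__) and rhs._values == self._values

    def __ne__(self, rhs) -> bool:
        return not (rhs == self)

    def __repr__(self) -> str:
        return 'Point(%s)' % ', '.join(k + '=' + repr(v) for k, v in self._values.items())
```

Because equality delegates to the ``_values`` dict, two instances built from the same keyword arguments compare equal even though they are distinct objects.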
| 47.189486 | 959 | 0.701524 | 9,216 | 81,685 | 6.048069 | 0.04872 | 0.039183 | 0.012738 | 0.020345 | 0.828791 | 0.805684 | 0.77879 | 0.757674 | 0.734907 | 0.719622 | 0 | 0.001092 | 0.181735 | 81,685 | 1,730 | 960 | 47.216763 | 0.832825 | 0.390623 | 0 | 0.652051 | 0 | 0 | 0.172983 | 0.081254 | 0 | 0 | 0 | 0 | 0 | 1 | 0.272984 | false | 0.014144 | 0.018388 | 0.087694 | 0.570014 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3fe1d1ea99fe0b39fe849d2a02635d2a0e4be5b0 | 16,983 | py | Python | rsi_divergence_finder.py | SpiralDevelopment/RSI-divergence-detector | de88eea98d002fa86d28d96ef1f4e9e3349632ac | [
"MIT"
] | 7 | 2022-03-23T14:44:23.000Z | 2022-03-31T10:44:44.000Z | rsi_divergence_finder.py | b1nhm1nh/RSI-divergence-detector | de88eea98d002fa86d28d96ef1f4e9e3349632ac | [
"MIT"
] | null | null | null | rsi_divergence_finder.py | b1nhm1nh/RSI-divergence-detector | de88eea98d002fa86d28d96ef1f4e9e3349632ac | [
"MIT"
] | 1 | 2022-03-26T08:04:15.000Z | 2022-03-26T08:04:15.000Z | import pandas as pd
from helpers.calculus_helper import *
import logging
from datetime import datetime
from scipy import stats
logger = logging.getLogger(__name__)
RSI_COLUMN = 'rsi'
BASE_COLUMN = 'C'
TIME_COLUMN = 'T'
ANGLE_LIMIT = 45.0 # Limit for angle of divergence lines
def calc_percentage_increase(original, new):
increase = (new - original) / original
return increase * 100
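A quick check of the sign convention: ``calc_percentage_increase`` (restated below for a self-contained run) returns a negative number whenever the new value is lower than the original, which is what the divergence filters further down rely on.

```python
def calc_percentage_increase(original, new):
    # Same formula as above: relative change expressed in percent.
    return (new - original) / original * 100

assert calc_percentage_increase(50, 75) == 50.0    # rose by half
assert calc_percentage_increase(80, 60) == -25.0   # a drop gives a negative change
```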
# cur_candle_idx - index of the candle to which we compare candles in the past to find divergences
def get_rsi_divergences(df, tf, cur_candle_idx=-1):
divergences = []
cur_candle = df.iloc[cur_candle_idx]
cur_rsi_change = calc_percentage_increase(df.iloc[-2][RSI_COLUMN],
cur_candle[RSI_COLUMN])
# 'cur_base_value' is the close price here
cur_base_value_time = cur_candle[TIME_COLUMN]
cur_base_value = cur_candle[BASE_COLUMN]
cur_base_value_rsi = cur_candle[RSI_COLUMN]
# 'candles_to_compare' - Candles in the past to which we compare 'cur_candle' and look for divergences
# We skip the most recent 21 candles because divergence signals formed among 21 (or fewer) candles are not that strong
# We get the other 55 candles before that
candles_to_compare = df[df[TIME_COLUMN] < cur_base_value_time - pd.Timedelta(minutes=tf.value[0] * 21)]
candles_to_compare = candles_to_compare.tail(55)
candles_to_compare_len = candles_to_compare.shape[0]
if candles_to_compare.empty:  # .tail() never returns None; guard against an empty frame instead
return divergences
# The rest is RSI divergence detection part
# Some thresholds are hardcoded below; these are the numbers I found to be most accurate
# Feel free to play around with those numbers
# In the following block, we check if there is bullish divergence
if cur_base_value_rsi <= 37 and cur_rsi_change < 0:
bullish_divs = pd.DataFrame()
for idx, (past_candle_idx, past_candle) in enumerate(candles_to_compare.iterrows()):
try:
past_base_value = past_candle[BASE_COLUMN]
past_base_value_rsi = past_candle[RSI_COLUMN]
past_base_value_time = past_candle[TIME_COLUMN]
if past_base_value_rsi > 32:
continue
is_bullish = False
base_value_change = calc_percentage_increase(past_base_value,
cur_base_value)
rsi_change = calc_percentage_increase(past_base_value_rsi,
cur_base_value_rsi)
df_in_period = df[(past_base_value_time <= df[TIME_COLUMN]) & (df[TIME_COLUMN] <= cur_base_value_time)]
seconds = (df_in_period[TIME_COLUMN] - datetime(1970, 1, 1)).dt.total_seconds()
slope, intercept, r_value, p_value, std_err = stats.linregress(seconds,
df_in_period[
BASE_COLUMN])
if rsi_change >= 6 and base_value_change <= 0 and slope < 0 and pow(r_value, 2) > 0.3:
is_bullish = True
if is_bullish \
and does_any_value_cross_down(df,
past_base_value_rsi,
past_base_value_time,
cur_base_value_rsi,
cur_base_value_time,
diff=1.05,
value_column=RSI_COLUMN) is False \
and does_any_value_cross_down(df,
past_base_value,
past_base_value_time,
cur_base_value,
cur_base_value_time,
diff=1.03,
value_column=BASE_COLUMN) is False \
and get_angle(
past_base_value_rsi,
past_base_value_time,
cur_base_value_rsi,
cur_base_value_time,
tf=tf) <= ANGLE_LIMIT:
bullish_divs = bullish_divs.append(past_candle)
except Exception as e:
logging.exception(str(e))
for index, div in bullish_divs.iterrows():
divergences.append({'start_dtm': div[TIME_COLUMN],
'end_dtm': cur_base_value_time,
'rsi_start': div[RSI_COLUMN],
'rsi_end': cur_base_value_rsi,
'price_start': div[BASE_COLUMN],
'price_end': cur_base_value,
'type': 'bullish'})
# In the following block, we check if there is bearish divergence
elif cur_base_value_rsi >= 63 and 0 < cur_rsi_change:
bearish_divs = pd.DataFrame()
for idx, (past_candle_idx, past_candle) in enumerate(candles_to_compare.iterrows()):
try:
past_base_value_rsi = past_candle[RSI_COLUMN]
if past_base_value_rsi < 68:
continue
past_base_value = past_candle[BASE_COLUMN]
past_base_value_time = past_candle[TIME_COLUMN]
is_bearish = False
base_value_change = calc_percentage_increase(past_base_value,
cur_base_value)
rsi_change = calc_percentage_increase(past_base_value_rsi, cur_base_value_rsi)
df_in_period = df[(past_base_value_time <= df[TIME_COLUMN]) & (df[TIME_COLUMN] <= cur_base_value_time)]
seconds = (df_in_period[TIME_COLUMN] - datetime(1970, 1, 1)).dt.total_seconds()
slope, intercept, r_value, p_value, std_err = stats.linregress(seconds,
df_in_period[
BASE_COLUMN])
if rsi_change <= -6 and 0 <= base_value_change and slope > 0 and pow(r_value, 2) > 0.3:
is_bearish = True
if is_bearish \
and does_any_value_cross_up(df,
past_base_value_rsi,
past_base_value_time,
cur_base_value_rsi,
cur_base_value_time,
diff=1.05,
value_column=RSI_COLUMN) is False \
and does_any_value_cross_up(df,
past_base_value,
past_base_value_time,
cur_base_value,
cur_base_value_time,
diff=1.03,
value_column=BASE_COLUMN) is False \
and get_angle(
past_base_value_rsi,
past_base_value_time,
cur_base_value_rsi,
cur_base_value_time, tf=tf) <= ANGLE_LIMIT:
bearish_divs = bearish_divs.append(past_candle)
except Exception as e:
logging.exception(str(e))
for index, div in bearish_divs.iterrows():
divergences.append({'start_dtm': div[TIME_COLUMN],
'end_dtm': cur_base_value_time,
'rsi_start': div[RSI_COLUMN],
'rsi_end': cur_base_value_rsi,
'price_start': div[BASE_COLUMN],
'price_end': cur_base_value,
'type': 'bearish'})
# In the following block, we check if there is hidden bearish divergence
if 50 < cur_base_value_rsi <= 70 and cur_rsi_change > 0:
h_bearish_divs = pd.DataFrame()
for idx_lcl, (past_candle_idx, past_candle) in enumerate(candles_to_compare.iterrows()):
try:
if idx_lcl in [0, candles_to_compare_len - 1]:
continue
past_base_value = past_candle[BASE_COLUMN]
past_base_value_rsi = past_candle[RSI_COLUMN]
if candles_to_compare.iloc[idx_lcl - 1][RSI_COLUMN] < \
past_base_value_rsi > \
candles_to_compare.iloc[idx_lcl + 1][RSI_COLUMN]:
if not (50 < past_base_value_rsi < 65):
continue
past_base_value_time = past_candle[TIME_COLUMN]
is_bearish = False
base_value_change = calc_percentage_increase(past_base_value,
cur_base_value)
rsi_change = calc_percentage_increase(past_base_value_rsi,
cur_base_value_rsi)
df_in_period = df[
(past_base_value_time <= df[TIME_COLUMN]) & (df[TIME_COLUMN] <= cur_base_value_time)]
seconds = (df_in_period[TIME_COLUMN] - datetime(1970, 1, 1)).dt.total_seconds()
slope, intercept, r_value, p_value, std_err = stats.linregress(seconds,
df_in_period[BASE_COLUMN])
slope2, intercept2, r_value2, p_value2, std_err2 = stats.linregress(seconds,
df_in_period[
RSI_COLUMN])
if rsi_change >= 6 and base_value_change < 0 and slope < 0 < slope2 and pow(r_value, 2) > 0.3:
is_bearish = True
if is_bearish \
and does_any_value_cross_up(df,
past_base_value_rsi,
past_base_value_time,
cur_base_value_rsi,
cur_base_value_time,
diff=1.05,
value_column=RSI_COLUMN) is False \
and does_any_value_cross_up(df,
past_base_value,
past_base_value_time,
cur_base_value,
cur_base_value_time,
diff=1.03,
value_column=BASE_COLUMN) is False \
and get_angle(
past_base_value_rsi,
past_base_value_time,
cur_base_value_rsi,
cur_base_value_time, tf=tf) <= ANGLE_LIMIT:
h_bearish_divs = h_bearish_divs.append(past_candle)
except Exception as e:
logging.exception(str(e))
continue
for index, div in h_bearish_divs.iterrows():
divergences.append({'start_dtm': div[TIME_COLUMN],
'end_dtm': cur_base_value_time,
'rsi_start': div[RSI_COLUMN],
'rsi_end': cur_base_value_rsi,
'price_start': div[BASE_COLUMN],
'price_end': cur_base_value,
'type': 'h_bearish'})
# In the following block, we check if there is hidden bullish divergence
elif 30 < cur_base_value_rsi <= 50 and cur_rsi_change < 0:
h_bullish_divs = pd.DataFrame()
for idx_lcl, (past_candle_idx, past_candle) in enumerate(candles_to_compare.iterrows()):
try:
if idx_lcl in [0, candles_to_compare_len - 1]:
continue
past_base_value = past_candle[BASE_COLUMN]
past_base_value_rsi = past_candle[RSI_COLUMN]
if candles_to_compare.iloc[idx_lcl - 1][RSI_COLUMN] > \
past_base_value_rsi < \
candles_to_compare.iloc[idx_lcl + 1][RSI_COLUMN]:
if not (40 < past_base_value_rsi < 55):
continue
past_base_value_time = past_candle[TIME_COLUMN]
is_bullish = False
base_value_change = calc_percentage_increase(past_base_value,
cur_base_value)
rsi_change = calc_percentage_increase(past_base_value_rsi,
cur_base_value_rsi)
df_in_period = df[
(past_base_value_time <= df[TIME_COLUMN]) & (df[TIME_COLUMN] <= cur_base_value_time)]
seconds = (df_in_period[TIME_COLUMN] - datetime(1970, 1, 1)).dt.total_seconds()
slope, intercept, r_value, p_value, std_err = stats.linregress(seconds,
df_in_period[BASE_COLUMN])
slope2, intercept2, r_value2, p_value2, std_err2 = stats.linregress(seconds,
df_in_period[RSI_COLUMN])
if rsi_change <= -6 and 0 < base_value_change and slope > 0 > slope2 and pow(r_value,
2) > 0.3:
is_bullish = True
if is_bullish \
and does_any_value_cross_down(df,
past_base_value_rsi,
past_base_value_time,
cur_base_value_rsi,
cur_base_value_time,
diff=1.05,
value_column=RSI_COLUMN) is False \
and does_any_value_cross_down(df,
past_base_value,
past_base_value_time,
cur_base_value,
cur_base_value_time,
diff=1.03,
value_column=BASE_COLUMN) is False \
and get_angle(
past_base_value_rsi,
past_base_value_time,
cur_base_value_rsi,
cur_base_value_time, tf=tf) <= ANGLE_LIMIT:
h_bullish_divs = h_bullish_divs.append(past_candle)
except Exception as e:
logging.exception(str(e))
continue
for index, div in h_bullish_divs.iterrows():
divergences.append({'start_dtm': div[TIME_COLUMN],
'end_dtm': cur_base_value_time,
'rsi_start': div[RSI_COLUMN],
'rsi_end': cur_base_value_rsi,
'price_start': div[BASE_COLUMN],
'price_end': cur_base_value,
'type': 'h_bullish'})
return divergences
def get_all_rsi_divergences(df, tf):
all_divergences = []
for idx in range(df.shape[0]):
all_divergences += get_rsi_divergences(df, tf, idx)
return all_divergences
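The functions above assume the DataFrame already carries an ``rsi`` column; the helper that produces it is not shown in this file. A minimal, pure-Python sketch of Wilder's RSI (an assumption about how that column might be computed — the repo may derive it differently):

```python
def wilder_rsi(closes, period=14):
    """Wilder-smoothed RSI of a close-price sequence; returns the final RSI value (0..100)."""
    gains = losses = 0.0
    # Seed the averages with the first `period` price deltas.
    for i in range(1, period + 1):
        delta = closes[i] - closes[i - 1]
        gains += max(delta, 0.0)
        losses += max(-delta, 0.0)
    avg_gain, avg_loss = gains / period, losses / period
    # Wilder smoothing over the remaining candles.
    for i in range(period + 1, len(closes)):
        delta = closes[i] - closes[i - 1]
        avg_gain = (avg_gain * (period - 1) + max(delta, 0.0)) / period
        avg_loss = (avg_loss * (period - 1) + max(-delta, 0.0)) / period
    if avg_loss == 0.0:
        return 100.0  # no down moves at all
    rs = avg_gain / avg_loss
    return 100.0 - 100.0 / (1.0 + rs)
```

A monotonically rising series pins the RSI at 100 and a falling one at 0 — the two extremes that the thresholds above (``<= 37``, ``>= 63``, ...) sit between.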
| 52.416667 | 121 | 0.456928 | 1,661 | 16,983 | 4.257074 | 0.103552 | 0.151464 | 0.096733 | 0.053034 | 0.801443 | 0.783199 | 0.762551 | 0.758591 | 0.758591 | 0.740065 | 0 | 0.016212 | 0.4879 | 16,983 | 323 | 122 | 52.578947 | 0.796826 | 0.051404 | 0 | 0.74031 | 0 | 0 | 0.016215 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.011628 | false | 0 | 0.01938 | 0 | 0.046512 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b751f4b88944743a1adc63b542322edf2faa16cb | 153 | py | Python | blackjax/mcmc/__init__.py | rlouf/blackjax | 07c345a2977ef81fc0f6d2231464bc14e0815b2f | [
"Apache-2.0"
] | null | null | null | blackjax/mcmc/__init__.py | rlouf/blackjax | 07c345a2977ef81fc0f6d2231464bc14e0815b2f | [
"Apache-2.0"
] | null | null | null | blackjax/mcmc/__init__.py | rlouf/blackjax | 07c345a2977ef81fc0f6d2231464bc14e0815b2f | [
"Apache-2.0"
] | null | null | null | from . import elliptical_slice, hmc, mala, nuts, periodic_orbital, rmh
__all__ = ["elliptical_slice", "hmc", "mala", "nuts", "periodic_orbital", "rmh"]
| 38.25 | 80 | 0.712418 | 19 | 153 | 5.315789 | 0.578947 | 0.29703 | 0.356436 | 0.435644 | 0.871287 | 0.871287 | 0.871287 | 0.871287 | 0 | 0 | 0 | 0 | 0.117647 | 153 | 3 | 81 | 51 | 0.748148 | 0 | 0 | 0 | 0 | 0 | 0.300654 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 10 |
b760e12d492d9434e2a6eb2c22a7a20d4c18e2f2 | 390 | py | Python | thinc/v2v.py | adamjm/thinc | 219d8172faee83303bb78e338996d7b6c6d56155 | [
"MIT"
] | 44 | 2015-01-02T22:04:48.000Z | 2020-05-15T21:17:10.000Z | venv/lib/python3.6/site-packages/thinc/v2v.py | assaufianggie/VerraBot | cbe46ccb219c2972871e760268b427e1f8e79f93 | [
"MIT"
] | 3 | 2022-02-13T15:21:42.000Z | 2022-02-27T06:12:44.000Z | venv/lib/python3.6/site-packages/thinc/v2v.py | assaufianggie/VerraBot | cbe46ccb219c2972871e760268b427e1f8e79f93 | [
"MIT"
] | 7 | 2015-06-18T00:50:57.000Z | 2016-02-03T17:08:07.000Z | # coding: utf8
from __future__ import unicode_literals
from .neural._classes.model import Model # noqa: F401
from .neural._classes.affine import Affine # noqa: F401
from .neural._classes.relu import ReLu # noqa: F401
from .neural._classes.maxout import Maxout # noqa: F401
from .neural._classes.softmax import Softmax # noqa: F401
from .neural._classes.selu import SELU # noqa: F401
| 39 | 58 | 0.774359 | 55 | 390 | 5.290909 | 0.309091 | 0.206186 | 0.350515 | 0.309278 | 0.429553 | 0 | 0 | 0 | 0 | 0 | 0 | 0.056886 | 0.14359 | 390 | 9 | 59 | 43.333333 | 0.814371 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
b7c398827a3651bfa1d4ad9b72b898bddf88047e | 25,189 | py | Python | tests/test_unittest_on_hw.py | Accelize/drm | 081ef761de50b526523b692c3a8decf290714ed0 | [
"Apache-2.0"
] | 4 | 2021-02-21T09:11:50.000Z | 2021-11-29T02:34:07.000Z | tests/test_unittest_on_hw.py | Accelize/drm | 081ef761de50b526523b692c3a8decf290714ed0 | [
"Apache-2.0"
] | null | null | null | tests/test_unittest_on_hw.py | Accelize/drm | 081ef761de50b526523b692c3a8decf290714ed0 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Test node-locked behavior of DRM Library.
"""
import pytest
from re import search, IGNORECASE
from time import sleep
from datetime import datetime, timedelta
from flask import request as _request
@pytest.mark.minimum
def test_get_version(accelize_drm):
"""Test the versions of the DRM Lib and its dependencies are well displayed"""
versions = accelize_drm.get_api_version()
assert search(r'\d+\.\d+\.\d+', versions.version) is not None
@pytest.mark.long_run
@pytest.mark.hwtst
def test_activation_and_license_status(accelize_drm, conf_json, cred_json, async_handler):
"""Test status of IP activators"""
driver = accelize_drm.pytest_fpga_driver[0]
async_cb = async_handler.create()
activators = accelize_drm.pytest_fpga_activators[0]
cred_json.set_user('accelize_accelerator_test_02')
with accelize_drm.DrmManager(
conf_json.path,
cred_json.path,
driver.read_register_callback,
driver.write_register_callback,
async_cb.callback
) as drm_manager:
print()
# Test license status on start/stop
# Check all activators are locked
assert not drm_manager.get('license_status'), 'License is not inactive'
activators.autotest(is_activated=False)
# Activate all activators
drm_manager.activate()
# Check all activators are unlocked
assert drm_manager.get('license_status'), 'License is not active'
activators.autotest(is_activated=True)
# Deactivate all activators
drm_manager.deactivate()
# Check all activators are locked again
assert not drm_manager.get('license_status'), 'License is not inactive'
activators.autotest(is_activated=False)
async_cb.assert_NoError()
print('Test license status on start/stop: PASS')
# Test license status on start/pause
# Check all activators are locked
assert not drm_manager.get('license_status'), 'License is not inactive'
activators.autotest(is_activated=False)
# Activate all activators
drm_manager.activate()
start = datetime.now()
# Check all activators are unlocked
assert drm_manager.get('license_status'), 'License is not active'
activators.autotest(is_activated=True)
# Pause all activators
drm_manager.deactivate(True)
# Check all activators are still unlocked
assert drm_manager.get('license_status'), 'License is not active'
activators.autotest(is_activated=True)
async_cb.assert_NoError()
print('Test license status on start/pause: PASS')
# Test license status on resume from valid license/pause
# Check all activators are unlocked
assert drm_manager.get('license_status'), 'License is not active'
activators.autotest(is_activated=True)
# Resume all activators
drm_manager.activate(True)
# Check all activators are still unlocked
assert drm_manager.get('license_status'), 'License is not active'
activators.autotest(is_activated=True)
# Pause all activators
drm_manager.deactivate(True)
# Check all activators are still unlocked
activators.autotest(is_activated=True)
# Wait until license expires
lic_duration = drm_manager.get('license_duration')
wait_period = start + timedelta(seconds=2 * lic_duration + 1) - datetime.now()
sleep(wait_period.total_seconds())
# Check all activators are now locked again
assert not drm_manager.get('license_status'), 'License is not inactive'
activators.autotest(is_activated=False)
async_cb.assert_NoError()
print('Test license status on resume from valid license/pause: PASS')
# Test license status on resume from expired license/pause
# Check all activators are locked
assert not drm_manager.get('license_status'), 'License is not inactive'
activators.autotest(is_activated=False)
# Resume all activators
drm_manager.activate(True)
# Check all activators are unlocked
assert drm_manager.get('license_status'), 'License is not active'
activators.autotest(is_activated=True)
# Pause all activators
drm_manager.deactivate(True)
# Check all activators are still unlocked
assert drm_manager.get('license_status'), 'License is not active'
activators.autotest(is_activated=True)
async_cb.assert_NoError()
print('Test license status on resume from expired license/pause: PASS')
# Test license status on resume/stop
# Check all activators are still unlocked
assert drm_manager.get('license_status'), 'License is not active'
activators.autotest(is_activated=True)
async_cb.assert_NoError()
# Resume all activators
drm_manager.activate(True)
# Check all activators are still unlocked
assert drm_manager.get('license_status'), 'License is not active'
activators.autotest(is_activated=True)
# Deactivate all activators
drm_manager.deactivate()
# Check all activators are locked again
assert not drm_manager.get('license_status'), 'License is not inactive'
activators.autotest(is_activated=False)
async_cb.assert_NoError()
print('Test license status on resume/stop: PASS')
# Test license status on restart from paused session/stop
# Check all activators are locked again
assert not drm_manager.get('license_status'), 'License is not inactive'
activators.autotest(is_activated=False)
async_cb.assert_NoError()
# Activate all activators
drm_manager.activate()
# Check all activators are unlocked
assert drm_manager.get('license_status'), 'License is not active'
activators.autotest(is_activated=True)
# Pause activators
drm_manager.deactivate(True)
# Check all activators are still unlocked
assert drm_manager.get('license_status'), 'License is not active'
activators.autotest(is_activated=True)
# Restart all activators
drm_manager.activate()
# Check all activators are still unlocked
assert drm_manager.get('license_status'), 'License is not active'
activators.autotest(is_activated=True)
async_cb.assert_NoError()
print('Test license status on restart: PASS')
@pytest.mark.long_run
@pytest.mark.hwtst
def test_session_status(accelize_drm, conf_json, cred_json, async_handler):
"""Test status of session"""
driver = accelize_drm.pytest_fpga_driver[0]
async_cb = async_handler.create()
cred_json.set_user('accelize_accelerator_test_02')
with accelize_drm.DrmManager(
conf_json.path,
cred_json.path,
driver.read_register_callback,
driver.write_register_callback,
async_cb.callback
) as drm_manager:
print()
# Test session status on start/stop
# Check no session is running and no ID is available
assert not drm_manager.get('session_status')
assert len(drm_manager.get('session_id')) == 0
# Activate new session
drm_manager.activate()
# Check a session is running with a valid ID
assert drm_manager.get('session_status')
assert len(drm_manager.get('session_id')) == 16
# Deactivate current session
drm_manager.deactivate()
# Check session is closed
assert not drm_manager.get('session_status')
assert len(drm_manager.get('session_id')) == 0
print('Test session status on start/stop: PASS')
# Test session status on start/pause
# Check no session is running and no ID is available
assert not drm_manager.get('session_status')
assert len(drm_manager.get('session_id')) == 0
# Activate new session
drm_manager.activate()
start = datetime.now()
# Check a session is running with a valid ID
assert drm_manager.get('session_status')
id_ref = drm_manager.get('session_id')
assert len(id_ref) == 16, 'No session ID is returned'
# Pause current session
drm_manager.deactivate(True)
# Check a session is still alive with the same ID
assert drm_manager.get('session_status')
session_id = drm_manager.get('session_id')
assert len(session_id) == 16, 'No session ID is returned'
        assert session_id == id_ref, 'Returned a different session ID'
async_cb.assert_NoError()
print('Test session status on start/pause: PASS')
# Test session status on resume from valid license/pause
# Check a session is still alive with the same ID
assert drm_manager.get('session_status')
session_id = drm_manager.get('session_id')
assert len(session_id) == 16, 'No session ID is returned'
        assert session_id == id_ref, 'Returned a different session ID'
# Resume current session
drm_manager.activate(True)
# Check a session is still alive with the same ID
assert drm_manager.get('session_status')
session_id = drm_manager.get('session_id')
assert len(session_id) == 16, 'No session ID is returned'
        assert session_id == id_ref, 'Returned a different session ID'
# Pause current session
drm_manager.deactivate(True)
# Check a session is still alive with the same ID
assert drm_manager.get('session_status')
session_id = drm_manager.get('session_id')
assert len(session_id) == 16, 'No session ID is returned'
        assert session_id == id_ref, 'Returned a different session ID'
# Wait until license expires
lic_duration = drm_manager.get('license_duration')
wait_period = start + timedelta(seconds=2 * lic_duration + 1) - datetime.now()
sleep(wait_period.total_seconds())
# Check a session is still alive with the same ID
assert drm_manager.get('session_status')
session_id = drm_manager.get('session_id')
assert len(session_id) == 16, 'No session ID is returned'
        assert session_id == id_ref, 'Returned a different session ID'
async_cb.assert_NoError()
print('Test session status on resume from valid license/pause: PASS')
# Test session status on resume from expired license/pause
# Check a session is still alive with the same ID
assert drm_manager.get('session_status')
session_id = drm_manager.get('session_id')
assert len(session_id) == 16, 'No session ID is returned'
        assert session_id == id_ref, 'Returned a different session ID'
# Resume current session
drm_manager.activate(True)
# Check a session is still alive with the same ID
assert drm_manager.get('session_status')
session_id = drm_manager.get('session_id')
assert len(session_id) == 16, 'No session ID is returned'
        assert session_id == id_ref, 'Returned a different session ID'
# Pause current session
drm_manager.deactivate(True)
# Check a session is still alive with the same ID
assert drm_manager.get('session_status')
session_id = drm_manager.get('session_id')
assert len(session_id) == 16, 'No session ID is returned'
        assert session_id == id_ref, 'Returned a different session ID'
async_cb.assert_NoError()
print('Test session status on resume from expired license/pause: PASS')
# Test session status on resume/stop
# Check a session is still alive with the same ID
assert drm_manager.get('session_status')
session_id = drm_manager.get('session_id')
assert len(session_id) == 16, 'No session ID is returned'
        assert session_id == id_ref, 'Returned a different session ID'
# Resume current session
drm_manager.activate(True)
# Check a session is still alive with the same ID
assert drm_manager.get('session_status')
session_id = drm_manager.get('session_id')
assert len(session_id) == 16, 'No session ID is returned'
        assert session_id == id_ref, 'Returned a different session ID'
# Close session
drm_manager.deactivate()
# Check session is closed
assert not drm_manager.get('session_status')
assert len(drm_manager.get('session_id')) == 0
async_cb.assert_NoError()
print('Test session status on resume/stop: PASS')
# Test session status on start from paused session/stop
# Check no session is running
assert not drm_manager.get('session_status')
assert len(drm_manager.get('session_id')) == 0
# Start a new session
drm_manager.activate()
# Check a session is alive with a new ID
assert drm_manager.get('session_status')
session_id = drm_manager.get('session_id')
assert len(session_id) == 16, 'No session ID is returned'
        assert session_id != id_ref, 'Should have returned a new session ID'
id_ref = session_id
# Pause session
drm_manager.deactivate(True)
# Check a session is still alive with the same ID
assert drm_manager.get('session_status')
session_id = drm_manager.get('session_id')
assert len(session_id) == 16, 'No session ID is returned'
        assert session_id == id_ref, 'Returned a different session ID'
# Start a new session
drm_manager.activate()
# Check a new session has been created with a new ID
assert drm_manager.get('session_status')
session_id = drm_manager.get('session_id')
assert len(session_id) == 16, 'No session ID is returned'
        assert session_id != id_ref, 'Should have returned a new session ID'
# Close session
drm_manager.deactivate()
# Check session is closed
assert not drm_manager.get('session_status')
assert len(drm_manager.get('session_id')) == 0
async_cb.assert_NoError()
print('Test session status on restart: PASS')
@pytest.mark.long_run
@pytest.mark.hwtst
def test_license_expiration(accelize_drm, conf_json, cred_json, async_handler):
"""Test license expiration"""
driver = accelize_drm.pytest_fpga_driver[0]
async_cb = async_handler.create()
activators = accelize_drm.pytest_fpga_activators[0]
cred_json.set_user('accelize_accelerator_test_02')
with accelize_drm.DrmManager(
conf_json.path,
cred_json.path,
driver.read_register_callback,
driver.write_register_callback,
async_cb.callback
) as drm_manager:
print()
# Test license expires after 2 duration periods when start/pause
# Check no license is running
assert not drm_manager.get('license_status')
activators.autotest(is_activated=False)
# Start
drm_manager.activate()
start = datetime.now()
lic_duration = drm_manager.get('license_duration')
# Pause
sleep(lic_duration/2)
drm_manager.deactivate(True)
        # Check license is still running and activators are all unlocked
assert drm_manager.get('license_status')
activators.autotest(is_activated=True)
# Wait right before expiration
wait_period = start + timedelta(seconds=2*lic_duration-2) - datetime.now()
sleep(wait_period.total_seconds())
# Check license is still running and activators are all unlocked
assert drm_manager.get('license_status')
activators.autotest(is_activated=True)
        # Wait a bit longer to pass the expiration
sleep(3)
# Check no license is running
assert not drm_manager.get('license_status')
activators.autotest(is_activated=False)
drm_manager.deactivate()
# Check no license is running
assert not drm_manager.get('license_status')
activators.autotest(is_activated=False)
async_cb.assert_NoError()
print('Test license expires after 2 duration periods when start/pause/stop: PASS')
# Test license does not expire after 3 duration periods when start
# Check no license is running
assert not drm_manager.get('license_status')
activators.autotest(is_activated=False)
# Start
drm_manager.activate()
start = datetime.now()
# Check license is running
assert drm_manager.get('license_status')
activators.autotest(is_activated=True)
# Wait 3 duration periods
lic_duration = drm_manager.get('license_duration')
wait_period = start + timedelta(seconds=3*lic_duration+2) - datetime.now()
sleep(wait_period.total_seconds())
# Check license is still running
assert drm_manager.get('license_status')
activators.autotest(is_activated=True)
# Stop
drm_manager.deactivate()
# Check no license is running
assert not drm_manager.get('license_status')
activators.autotest(is_activated=False)
async_cb.assert_NoError()
print('Test license does not expire after 3 duration periods when start: PASS')
# Test license does not expire after 3 duration periods when start/pause
# Check no license is running
assert not drm_manager.get('license_status')
activators.autotest(is_activated=False)
# Start
drm_manager.activate()
start = datetime.now()
lic_duration = drm_manager.get('license_duration')
# Check license is running
assert drm_manager.get('license_status')
activators.autotest(is_activated=True)
        # Wait until 1.5 duration periods have elapsed
wait_period = start + timedelta(seconds=lic_duration+lic_duration/2) - datetime.now()
sleep(wait_period.total_seconds())
# Check license is still running
assert drm_manager.get('license_status')
activators.autotest(is_activated=True)
# Pause
drm_manager.deactivate(True)
# Wait right before the next 2 duration periods expire
wait_period = start + timedelta(seconds=3*lic_duration-2) - datetime.now()
sleep(wait_period.total_seconds())
# Check license is still running
assert drm_manager.get('license_status')
activators.autotest(is_activated=True)
        # Wait a bit longer to pass the expiration
sleep(3)
# Check license has expired
assert not drm_manager.get('license_status')
activators.autotest(is_activated=False)
drm_manager.deactivate()
# Check no license is running
assert not drm_manager.get('license_status')
activators.autotest(is_activated=False)
async_cb.assert_NoError()
print('Test license does not expire after 3 duration periods when start/pause: PASS')
@pytest.mark.hwtst
def test_multiple_call(accelize_drm, conf_json, cred_json, async_handler):
"""Test multiple calls to activate and deactivate"""
driver = accelize_drm.pytest_fpga_driver[0]
async_cb = async_handler.create()
cred_json.set_user('accelize_accelerator_test_02')
with accelize_drm.DrmManager(
conf_json.path,
cred_json.path,
driver.read_register_callback,
driver.write_register_callback,
async_cb.callback
) as drm_manager:
print()
# Test multiple activate
# Check license is inactive
assert not drm_manager.get('license_status')
# Start
drm_manager.activate()
# Check license is active
assert drm_manager.get('license_status')
# Check a session is valid
session_id = drm_manager.get('session_id')
assert len(session_id) == 16
# Resume
drm_manager.activate(True)
# Check license is active
assert drm_manager.get('license_status')
# Check a session is valid
session_id2 = drm_manager.get('session_id')
assert len(session_id2) == 16
assert session_id2 == session_id
# Start again
drm_manager.activate()
# Check license is active
assert drm_manager.get('license_status')
# Check a session is valid
session_id = drm_manager.get('session_id')
assert len(session_id) == 16
assert session_id != session_id2
# Start again
drm_manager.activate()
# Check license is active
assert drm_manager.get('license_status')
# Check a session is valid
session_id2 = drm_manager.get('session_id')
assert len(session_id2) == 16
assert session_id2 != session_id
async_cb.assert_NoError()
# Test multiple deactivate
# Check license is active
assert drm_manager.get('license_status')
# Pause
drm_manager.deactivate(True)
# Check license is active
assert drm_manager.get('license_status')
# Check a session is valid
session_id = drm_manager.get('session_id')
assert len(session_id) == 16
assert session_id == session_id2
        # Pause again
drm_manager.deactivate(True)
# Check license is active
assert drm_manager.get('license_status')
# Check a session is valid
session_id = drm_manager.get('session_id')
assert len(session_id) == 16
assert session_id == session_id2
# Stop
drm_manager.deactivate()
        # Check license is inactive
assert not drm_manager.get('license_status')
# Check session ID is invalid
session_id = drm_manager.get('session_id')
assert len(session_id) == 0
# Stop
drm_manager.deactivate()
        # Check license is inactive
assert not drm_manager.get('license_status')
# Check session ID is invalid
session_id = drm_manager.get('session_id')
assert len(session_id) == 0
async_cb.assert_NoError()
def test_security_stop(accelize_drm, conf_json, cred_json, async_handler):
"""
Test the session is stopped in case of abnormal termination
"""
driver = accelize_drm.pytest_fpga_driver[0]
async_cb = async_handler.create()
cred_json.set_user('accelize_accelerator_test_02')
drm_manager0 = accelize_drm.DrmManager(
conf_json.path,
cred_json.path,
driver.read_register_callback,
driver.write_register_callback,
async_cb.callback
)
drm_manager0.activate()
assert drm_manager0.get('session_status')
session_id = drm_manager0.get('session_id')
assert len(session_id) > 0
del drm_manager0
drm_manager1 = accelize_drm.DrmManager(
conf_json.path,
cred_json.path,
driver.read_register_callback,
driver.write_register_callback,
async_cb.callback
)
assert not drm_manager1.get('session_status')
assert len(drm_manager1.get('session_id')) == 0
async_cb.assert_NoError()
@pytest.mark.minimum
def test_curl_host_resolve(accelize_drm, conf_json, cred_json, async_handler):
"""Test host resolve information is taken into account by DRM Library"""
driver = accelize_drm.pytest_fpga_driver[0]
async_cb = async_handler.create()
conf_json.reset()
url = conf_json['licensing']['url']
conf_json['licensing']['host_resolves'] = {'%s:443' % url.replace('https://',''): '78.153.251.226'}
conf_json.save()
async_cb.reset()
with accelize_drm.DrmManager(
conf_json.path,
cred_json.path,
driver.read_register_callback,
driver.write_register_callback,
async_cb.callback
) as drm_manager:
with pytest.raises(accelize_drm.exceptions.DRMExternFail) as excinfo:
drm_manager.activate()
assert 'Failed to perform HTTP request to Accelize webservice' in str(excinfo.value)
assert search(r'peer certificate', str(excinfo.value), IGNORECASE)
assert async_handler.get_error_code(str(excinfo.value)) == accelize_drm.exceptions.DRMExternFail.error_code
async_cb.assert_Error(accelize_drm.exceptions.DRMExternFail.error_code, 'peer certificate')
async_cb.reset()
@pytest.mark.no_parallel
@pytest.mark.minimum
def test_http_header_api_version(accelize_drm, conf_json, cred_json,
async_handler, live_server, request):
"""Test the http header contains the expected API version"""
driver = accelize_drm.pytest_fpga_driver[0]
async_cb = async_handler.create()
async_cb.reset()
conf_json.reset()
conf_json['licensing']['url'] = _request.url + request.function.__name__
conf_json.save()
with accelize_drm.DrmManager(
conf_json.path,
cred_json.path,
driver.read_register_callback,
driver.write_register_callback,
async_cb.callback
) as drm_manager:
drm_manager.activate()
async_cb.assert_NoError()
| 40.366987 | 115 | 0.668387 | 3,188 | 25,189 | 5.072146 | 0.06399 | 0.092764 | 0.079592 | 0.061843 | 0.896413 | 0.88256 | 0.861781 | 0.843043 | 0.830983 | 0.794743 | 0 | 0.006679 | 0.251102 | 25,189 | 623 | 116 | 40.431782 | 0.850509 | 0.188852 | 0 | 0.844444 | 0 | 0 | 0.174319 | 0.006919 | 0 | 0 | 0 | 0 | 0.348148 | 1 | 0.019753 | false | 0.037037 | 0.012346 | 0 | 0.032099 | 0.046914 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b7f79227df0bfb6b999704ff2b612967b4d8d2af | 6,337 | py | Python | python_modules/dagster-graphql/dagster_graphql_tests/graphql/snapshots/snap_test_pipeline_snapshot.py | naralogics/dagster | 16d599daa380b800149474fcaff2311b3f8f269a | [
"Apache-2.0"
] | null | null | null | python_modules/dagster-graphql/dagster_graphql_tests/graphql/snapshots/snap_test_pipeline_snapshot.py | naralogics/dagster | 16d599daa380b800149474fcaff2311b3f8f269a | [
"Apache-2.0"
] | null | null | null | python_modules/dagster-graphql/dagster_graphql_tests/graphql/snapshots/snap_test_pipeline_snapshot.py | naralogics/dagster | 16d599daa380b800149474fcaff2311b3f8f269a | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# snapshottest: v1 - https://goo.gl/zC4yUc
from __future__ import unicode_literals
from snapshottest import Snapshot
snapshots = Snapshot()
snapshots['TestPipelineSnapshotGraphQL.test_fetch_snapshot_success[create_ephemeral_instance] 1'] = '''{
"pipelineSnapshot": {
"__typename": "PipelineSnapshot",
"description": null,
"modes": [
{
"name": "default"
}
],
"name": "noop_pipeline",
"pipelineSnapshotId": "e95e6f29f25ea236ce191a1e8b49ae8601a0afef",
"runtimeTypes": [
{
"key": "Any"
},
{
"key": "Bool"
},
{
"key": "Float"
},
{
"key": "Int"
},
{
"key": "Nothing"
},
{
"key": "String"
}
],
"solidHandles": [
{
"handleID": "noop_solid"
}
],
"solids": [
{
"name": "noop_solid"
}
],
"tags": []
}
}'''
snapshots['TestPipelineSnapshotGraphQL.test_fetch_snapshot_success[create_local_temp_instance] 1'] = '''{
"pipelineSnapshot": {
"__typename": "PipelineSnapshot",
"description": null,
"modes": [
{
"name": "default"
}
],
"name": "noop_pipeline",
"pipelineSnapshotId": "e95e6f29f25ea236ce191a1e8b49ae8601a0afef",
"runtimeTypes": [
{
"key": "Any"
},
{
"key": "Bool"
},
{
"key": "Float"
},
{
"key": "Int"
},
{
"key": "Nothing"
},
{
"key": "String"
}
],
"solidHandles": [
{
"handleID": "noop_solid"
}
],
"solids": [
{
"name": "noop_solid"
}
],
"tags": []
}
}'''
snapshots['TestPipelineSnapshotGraphQL.test_fetch_snapshot_or_error_by_snapshot_id_success[create_ephemeral_instance] 1'] = '''{
"pipelineSnapshotOrError": {
"__typename": "PipelineSnapshot",
"description": null,
"modes": [
{
"name": "default"
}
],
"name": "noop_pipeline",
"pipelineSnapshotId": "e95e6f29f25ea236ce191a1e8b49ae8601a0afef",
"runtimeTypes": [
{
"key": "Any"
},
{
"key": "Bool"
},
{
"key": "Float"
},
{
"key": "Int"
},
{
"key": "Nothing"
},
{
"key": "String"
}
],
"solidHandles": [
{
"handleID": "noop_solid"
}
],
"solids": [
{
"name": "noop_solid"
}
],
"tags": []
}
}'''
snapshots['TestPipelineSnapshotGraphQL.test_fetch_snapshot_or_error_by_snapshot_id_success[create_local_temp_instance] 1'] = '''{
"pipelineSnapshotOrError": {
"__typename": "PipelineSnapshot",
"description": null,
"modes": [
{
"name": "default"
}
],
"name": "noop_pipeline",
"pipelineSnapshotId": "e95e6f29f25ea236ce191a1e8b49ae8601a0afef",
"runtimeTypes": [
{
"key": "Any"
},
{
"key": "Bool"
},
{
"key": "Float"
},
{
"key": "Int"
},
{
"key": "Nothing"
},
{
"key": "String"
}
],
"solidHandles": [
{
"handleID": "noop_solid"
}
],
"solids": [
{
"name": "noop_solid"
}
],
"tags": []
}
}'''
snapshots['TestPipelineSnapshotGraphQL.test_fetch_snapshot_or_error_by_snapshot_id_snapshot_not_found[create_ephemeral_instance] 1'] = '''{
"pipelineSnapshotOrError": {
"__typename": "PipelineSnapshotNotFoundError",
"snapshotId": "notthere"
}
}'''
snapshots['TestPipelineSnapshotGraphQL.test_fetch_snapshot_or_error_by_snapshot_id_snapshot_not_found[create_local_temp_instance] 1'] = '''{
"pipelineSnapshotOrError": {
"__typename": "PipelineSnapshotNotFoundError",
"snapshotId": "notthere"
}
}'''
snapshots['TestPipelineSnapshotGraphQL.test_fetch_snapshot_or_error_by_active_pipeline_name_success[create_ephemeral_instance] 1'] = '''{
"pipelineSnapshotOrError": {
"__typename": "PipelineSnapshot",
"description": null,
"modes": [
{
"name": "default"
}
],
"name": "csv_hello_world",
"pipelineSnapshotId": "d21414f48707616b77d78a4151b4dc90b0f5406e",
"runtimeTypes": [
{
"key": "Any"
},
{
"key": "Bool"
},
{
"key": "Float"
},
{
"key": "Int"
},
{
"key": "Nothing"
},
{
"key": "PoorMansDataFrame"
},
{
"key": "String"
}
],
"solidHandles": [
{
"handleID": "sum_solid"
},
{
"handleID": "sum_sq_solid"
}
],
"solids": [
{
"name": "sum_solid"
},
{
"name": "sum_sq_solid"
}
],
"tags": []
}
}'''
snapshots['TestPipelineSnapshotGraphQL.test_fetch_snapshot_or_error_by_active_pipeline_name_success[create_local_temp_instance] 1'] = '''{
"pipelineSnapshotOrError": {
"__typename": "PipelineSnapshot",
"description": null,
"modes": [
{
"name": "default"
}
],
"name": "csv_hello_world",
"pipelineSnapshotId": "d21414f48707616b77d78a4151b4dc90b0f5406e",
"runtimeTypes": [
{
"key": "Any"
},
{
"key": "Bool"
},
{
"key": "Float"
},
{
"key": "Int"
},
{
"key": "Nothing"
},
{
"key": "PoorMansDataFrame"
},
{
"key": "String"
}
],
"solidHandles": [
{
"handleID": "sum_solid"
},
{
"handleID": "sum_sq_solid"
}
],
"solids": [
{
"name": "sum_solid"
},
{
"name": "sum_sq_solid"
}
],
"tags": []
}
}'''
snapshots['TestPipelineSnapshotGraphQL.test_fetch_snapshot_or_error_by_active_pipeline_name_not_found[create_ephemeral_instance] 1'] = '''{
"pipelineSnapshotOrError": {
"__typename": "PipelineNotFoundError"
}
}'''
snapshots['TestPipelineSnapshotGraphQL.test_fetch_snapshot_or_error_by_active_pipeline_name_not_found[create_local_temp_instance] 1'] = '''{
"pipelineSnapshotOrError": {
"__typename": "PipelineNotFoundError"
}
}'''
| 19.680124 | 140 | 0.511283 | 423 | 6,337 | 7.286052 | 0.163121 | 0.116807 | 0.129786 | 0.146009 | 0.961713 | 0.945165 | 0.942245 | 0.916613 | 0.897794 | 0.878975 | 0 | 0.037289 | 0.327126 | 6,337 | 321 | 141 | 19.741433 | 0.685507 | 0.009784 | 0 | 0.534202 | 0 | 0 | 0.945631 | 0.28396 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.006515 | 0 | 0.006515 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
4d050a04b1193940ec9648f3fd87f17b44bbb49f | 166 | py | Python | test/run/t447.py | timmartin/skulpt | 2e3a3fbbaccc12baa29094a717ceec491a8a6750 | [
"MIT"
] | 2,671 | 2015-01-03T08:23:25.000Z | 2022-03-31T06:15:48.000Z | test/run/t447.py | timmartin/skulpt | 2e3a3fbbaccc12baa29094a717ceec491a8a6750 | [
"MIT"
] | 972 | 2015-01-05T08:11:00.000Z | 2022-03-29T13:47:15.000Z | test/run/t447.py | timmartin/skulpt | 2e3a3fbbaccc12baa29094a717ceec491a8a6750 | [
"MIT"
] | 845 | 2015-01-03T19:53:36.000Z | 2022-03-29T18:34:22.000Z | print ((1+3)/2)
print ((4/2)+3)
print ((1L+3L)/2L)
print ((1L+3L)/3L)
print ((1L/2L)+1L)
print ((4L/2L)+3L)
print 4L**2L-5L+3L/7L%2L
print (4L**2L)-5L+((3L/7L)%2L)
| 15.090909 | 30 | 0.566265 | 38 | 166 | 2.473684 | 0.289474 | 0.223404 | 0.287234 | 0.234043 | 0.361702 | 0.361702 | 0.361702 | 0 | 0 | 0 | 0 | 0.202703 | 0.108434 | 166 | 10 | 31 | 16.6 | 0.432432 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 1 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
4d09c455e74216befc7e9fe2ba320e84291595fc | 5,417 | py | Python | test/pyaz/cdn/endpoint/__init__.py | bigdatamoore/py-az-cli | 54383a4ee7cc77556f6183e74e992eec95b28e01 | [
"MIT"
] | null | null | null | test/pyaz/cdn/endpoint/__init__.py | bigdatamoore/py-az-cli | 54383a4ee7cc77556f6183e74e992eec95b28e01 | [
"MIT"
] | 9 | 2021-09-24T16:37:24.000Z | 2021-12-24T00:39:19.000Z | test/pyaz/cdn/endpoint/__init__.py | bigdatamoore/py-az-cli | 54383a4ee7cc77556f6183e74e992eec95b28e01 | [
"MIT"
] | null | null | null | import json, subprocess
from ... pyaz_utils import get_cli_name, get_params
def start(resource_group, profile_name, name, no_wait=None):
params = get_params(locals())
command = "az cdn endpoint start " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def stop(resource_group, profile_name, name, no_wait=None):
params = get_params(locals())
command = "az cdn endpoint stop " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def delete(resource_group, profile_name, name, no_wait=None):
params = get_params(locals())
command = "az cdn endpoint delete " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def show(resource_group, profile_name, name):
params = get_params(locals())
command = "az cdn endpoint show " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def list(resource_group, profile_name):
params = get_params(locals())
command = "az cdn endpoint list " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
    if stdout:
        return json.loads(stdout)
    else:
        raise Exception(stderr)
def load(resource_group, profile_name, name, content_paths, no_wait=None):
params = get_params(locals())
command = "az cdn endpoint load " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
if stdout:
return json.loads(stdout)
print(stdout)
else:
raise Exception(stderr)
print(stderr)
def purge(resource_group, profile_name, name, content_paths, no_wait=None):
params = get_params(locals())
command = "az cdn endpoint purge " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
if stdout:
return json.loads(stdout)
print(stdout)
else:
raise Exception(stderr)
print(stderr)
def validate_custom_domain(resource_group, profile_name, name, host_name):
params = get_params(locals())
command = "az cdn endpoint validate-custom-domain " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
if stdout:
return json.loads(stdout)
print(stdout)
else:
raise Exception(stderr)
print(stderr)
def create(resource_group, profile_name, name, origin, location=None, origin_host_header=None, origin_path=None, content_types_to_compress=None, enable_compression=None, no_http=None, no_https=None, query_string_caching_behavior=None, tags=None, no_wait=None):
params = get_params(locals())
command = "az cdn endpoint create " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
if stdout:
return json.loads(stdout)
print(stdout)
else:
raise Exception(stderr)
print(stderr)
def update(resource_group, profile_name, name, origin_host_header=None, origin_path=None, content_types_to_compress=None, enable_compression=None, no_http=None, no_https=None, query_string_caching=None, default_origin_group=None, tags=None, set=None, add=None, remove=None, force_string=None, no_wait=None):
params = get_params(locals())
command = "az cdn endpoint update " + params
print(command)
output = subprocess.run(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout = output.stdout.decode("utf-8")
stderr = output.stderr.decode("utf-8")
if stdout:
return json.loads(stdout)
print(stdout)
else:
raise Exception(stderr)
print(stderr)
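
# Illustrative usage sketch (not part of the original module; kept as a
# comment so importing the module stays side-effect free). It assumes the
# Azure CLI (`az`) is installed and logged in, that `get_params` -- defined
# elsewhere in this module -- turns the locals() dict into "--flag value"
# CLI arguments, and that the resource names below are placeholders:
#
#     endpoint = show("my-resource-group", "my-cdn-profile", "my-endpoint")
#     print(endpoint)
#     purge("my-resource-group", "my-cdn-profile", "my-endpoint",
#           content_paths="/images/*")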


# ---------------------------------------------------------------------------
# File: DeterministicParticleFlowControl/score_estimators/score_function_estimators.py
# Repo: dimitra-maoutsa/DeterministicParticleFlowControl (MIT license)
# ---------------------------------------------------------------------------

# -*- coding: utf-8 -*-
#Created on Sun Dec 12 03:35:29 2021
#@author: maout
### calculate score function from empirical distribution
### uses RBF kernel
import math
import numpy as np
from functools import reduce
from scipy.spatial.distance import cdist
import numba
__all__ = ["my_cdist", "score_function_multid_seperate",
           "score_function_multid_seperate_all_dims",
           "score_function_multid_seperate_old"]
#%%
@numba.njit(parallel=True,fastmath=True)
def my_cdist(r, y, output, dist='euclidean'):
    """
    Fast computation of pairwise distances between data points in r and y matrices.
    Stores the distances in the output array.
    Available distances: 'euclidean', 'sqeuclidean' and 'l1'.

    Parameters
    ----------
    r : NxM array
        First set of N points of dimension M.
    y : N2xM array
        Second set of N2 points of dimension M.
    output : NxN2 array
        Placeholder for storing the output of the computed distances.
    dist : type of distance, optional
        Select 'euclidean', 'sqeuclidean' or 'l1' for Euclidean, squared
        Euclidean or Manhattan distances. The default is 'euclidean'.

    Returns
    -------
    None. (The result is stored in place in the provided array "output".)
    """
    N, M = r.shape
    N2, M2 = y.shape
    #assert M == M2, 'The two inputs have different second dimension! Input should be N1xM and N2xM'
    if dist == 'euclidean':
        for i in numba.prange(N):
            for j in numba.prange(N2):
                tmp = 0.0
                for k in range(M):
                    tmp += (r[i, k] - y[j, k])**2
                output[i, j] = math.sqrt(tmp)
    elif dist == 'sqeuclidean':
        for i in numba.prange(N):
            for j in numba.prange(N2):
                tmp = 0.0
                for k in range(M):
                    tmp += (r[i, k] - y[j, k])**2
                output[i, j] = tmp
    elif dist == 'l1':
        for i in numba.prange(N):
            for j in numba.prange(N2):
                tmp = 0.0
                for k in range(M):
                    # Manhattan distance (the original duplicated the euclidean branch here)
                    tmp += abs(r[i, k] - y[j, k])
                output[i, j] = tmp
    return 0
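
# Illustrative check (hypothetical shapes, kept as a comment so that importing
# this module has no side effects): `my_cdist` fills a preallocated array in
# place and should agree with scipy's cdist for the same metric.
#
#     a = np.random.rand(5, 3)
#     b = np.random.rand(4, 3)
#     out = np.zeros((5, 4))
#     my_cdist(a, b, out, 'sqeuclidean')
#     assert np.allclose(out, cdist(a, b, 'sqeuclidean'))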
def score_function_multid_seperate(X, Z, func_out=False, C=0.001, kern='RBF', l=1, which=1, which_dim=1):
    """
    Sparse kernel based estimation of the multidimensional logarithmic gradient of an
    empirical density represented by samples X, across dimension "which_dim" only.

    - When `func_out == False`: computes grad-log at the sample points.
    - When `func_out == True`: returns a function for the grad-log, to be
      employed for interpolation/estimation of the logarithmic gradient
      in the vicinity of the samples.

    See Also
    --------
    score_function_multid_seperate_all_dims :
        estimation across all dimensions simultaneously.

    Parameters
    ----------
    X: N x dim array,
        N samples from the density (N x dim), where dim>=2 is the dimensionality of the system.
    Z: M x dim array,
        inducing points (M x dim).
    func_out : Boolean,
        True returns a function; if False, returns grad-log-p on the data points.
    l: float or array-like,
        lengthscale of the RBF kernel (scalar or vector of size dim).
    C: float,
        weighting constant (leave it at the default value to avoid
        unreasonable contraction of deterministic trajectories).
    which: (deprecated),
        do not use.
    which_dim: int,
        which gradient of the log density we want to compute
        (starts from 1 for the 0-th dimension).

    Returns
    -------
    res1: array with the logarithmic gradient of the density along the given dimension (N_s x 1),
        or a function that accepts as input 2-dimensional arrays of dimension (K x dim), where K>=1.
    """
    if kern == 'RBF':
        """
        #@numba.njit(parallel=True,fastmath=True)
        def Knumba(x,y,l,res,multil=False): #version of kernel in the numba form when the call already includes the output matrix
            if multil:
                for ii in range(len(l)):
                    tempi = np.zeros((x[:,ii].size, y[:,ii].size), dtype=np.float64)
                    ##puts into tempi the cdist result
                    my_cdist(x[:,ii:ii+1], y[:,ii:ii+1], tempi, 'sqeuclidean')
                    res = np.multiply(res, np.exp(-tempi/(2*l[ii]*l[ii])))
            else:
                tempi = np.zeros((x.shape[0], y.shape[0]), dtype=np.float64)
                my_cdist(x, y, tempi, 'sqeuclidean') #this sets into the array tempi the cdist result
                res = np.exp(-tempi/(2*l*l))
            #return 0
        """

        def K(x, y, l, multil=False):
            if multil:
                res = np.ones((x.shape[0], y.shape[0]))
                for ii in range(len(l)):
                    tempi = cdist(x[:, ii:ii+1], y[:, ii:ii+1], 'sqeuclidean')
                    res = np.multiply(res, np.exp(-tempi/(2*l[ii]*l[ii])))
                return res
            else:
                tempi = np.zeros((x.shape[0], y.shape[0]))
                my_cdist(x, y, tempi, 'sqeuclidean')  # sets into the array tempi the cdist result
                return np.exp(-tempi/(2*l*l))

        def K1(x, y, l, multil=False):
            if multil:
                res = np.ones((x.shape[0], y.shape[0]))
                for ii in range(len(l)):
                    res = np.multiply(res, np.exp(-cdist(x[:, ii].reshape(-1, 1), y[:, ii].reshape(-1, 1), 'sqeuclidean')/(2*l[ii]*l[ii])))
                return res
            else:
                return np.exp(-cdist(x, y, 'sqeuclidean')/(2*l*l))

        def grdx_K(x, y, l, which_dim=1, multil=False):  # gradient with respect to the 1st argument - only which_dim
            _, dim = x.shape
            diffs = x[:, None] - y
            ii = which_dim - 1
            if multil:
                redifs = np.multiply(diffs[:, :, ii], K(x, y, l, True))/(l[ii]*l[ii])
            else:
                redifs = np.multiply(diffs[:, :, ii], K(x, y, l))/(l*l)
            return redifs

        """
        def grdy_K(x,y): # gradient with respect to the second argument
            _,dim = x.shape
            diffs = x[:,None]-y
            ii = which_dim - 1
            redifs = np.multiply(diffs[:,:,ii], K(x,y,l))/(l*l)
            return -redifs

        def ggrdxy_K(x,y):
            N,dim = Z.shape
            diffs = x[:,None]-y
            redifs = np.zeros((N,N))
            for ii in range(which_dim-1, which_dim):
                for jj in range(which_dim-1, which_dim):
                    redifs[ii, jj] = np.multiply(np.multiply(diffs[:,:,ii], diffs[:,:,jj]) + (l*l)*(ii==jj), K(x,y))/(l**4)
            return -redifs
        """
    ###########################################################################
    elif kern == 'periodic':
        ### periodic kernel -- do not use yet!!!
        ## K(x,y)  = exp( -2 * sin^2( pi*|x-y| / (2*pi) ) / l^2 )
        ## Kx(x,y) = (K(x,y) * (x - y) cos(abs(x - y)/2) sin(abs(x - y)/2)) / (l^2 abs(x - y))
        ##         = -(2 K(x,y) π (x - y) sin((2 π abs(x - y))/per)) / (l^2 s abs(x - y))
        ## per = 2*np.pi  ## period of the kernel
        def K(x, y, l, multil=False):
            if multil:
                res = np.ones((x.shape[0], y.shape[0]))
                for ii in range(len(l)):
                    res = np.multiply(res, np.exp(-2*(np.sin(cdist(x[:, ii].reshape(-1, 1), y[:, ii].reshape(-1, 1), 'minkowski', p=1)/2)**2)/(l[ii]*l[ii])))
                return -res
            else:
                res = np.exp(-2*(np.sin(cdist(x, y, 'minkowski', p=1)/2)**2)/(l*l))
                return res

        def grdx_K(x, y, l, which_dim=1, multil=False):  # gradient with respect to the 1st argument - only which_dim
            diffs = x[:, None] - y
            ii = which_dim - 1
            if multil:
                redifs = np.divide(np.multiply(np.multiply(np.multiply(-2*K(x, y, l, True), diffs[:, :, ii]), np.sin(np.abs(diffs[:, :, ii])/2)), np.cos(np.abs(diffs[:, :, ii])/2)), (l[ii]*l[ii]*np.abs(diffs[:, :, ii])))
            else:
                redifs = np.divide(np.multiply(np.multiply(np.multiply(-2*diffs[:, :, ii], np.sin(np.abs(diffs[:, :, ii])/2)), K(x, y, l)), np.cos(np.abs(diffs[:, :, ii])/2)), (l*l*np.abs(diffs[:, :, ii])))
            return -redifs

    if isinstance(l, (list, tuple, np.ndarray)):
        ### different lengthscale for each dimension
        multil = True
        K_xz = K(X, Z, l, multil=True)
        Ks = K(Z, Z, l, multil=True)
        Ksinv = np.linalg.inv(Ks + 1e-3 * np.eye(Z.shape[0]))
        A = K_xz.T @ K_xz
        gradx_K = -grdx_K(X, Z, l, which_dim=which_dim, multil=True)
    else:
        multil = False
        K_xz = K(X, Z, l, multil=False)
        Ks = K(Z, Z, l, multil=False)
        Ksinv = np.linalg.inv(Ks + 1e-3 * np.eye(Z.shape[0]))
        A = K_xz.T @ K_xz
        gradx_K = -grdx_K(X, Z, l, which_dim=which_dim, multil=False)
    sumgradx_K = np.sum(gradx_K, axis=0)
    if func_out == False:  # output evaluated at the data points
        res1 = -K_xz @ np.linalg.inv(C*np.eye(Z.shape[0], Z.shape[0]) + Ksinv @ A + 1e-3 * np.eye(Z.shape[0])) @ Ksinv @ sumgradx_K
    else:
        #### function output
        if multil:
            if kern == 'RBF':
                K_sz = lambda x: reduce(np.multiply, [np.exp(-cdist(x[:, iii].reshape(-1, 1), Z[:, iii].reshape(-1, 1), 'sqeuclidean')/(2*l[iii]*l[iii])) for iii in range(x.shape[1])])
            elif kern == 'periodic':
                K_sz = lambda x: np.multiply(np.exp(-2*(np.sin(cdist(x[:, 0].reshape(-1, 1), Z[:, 0].reshape(-1, 1), 'minkowski', p=2)/(l[0]*l[0])))), np.exp(-2*(np.sin(cdist(x[:, 1].reshape(-1, 1), Z[:, 1].reshape(-1, 1), 'sqeuclidean')/(l[1]*l[1])))))
        else:
            if kern == 'RBF':
                K_sz = lambda x: np.exp(-cdist(x, Z, 'sqeuclidean')/(2*l*l))
            elif kern == 'periodic':
                K_sz = lambda x: np.exp(-2*(np.sin(cdist(x, Z, 'minkowski', p=1)/2)**2)/(l*l))
        res1 = lambda x: K_sz(x) @ (-np.linalg.inv(C*np.eye(Z.shape[0], Z.shape[0]) + Ksinv @ A + 1e-3 * np.eye(Z.shape[0]))) @ Ksinv @ sumgradx_K
    return res1
def score_function_multid_seperate_all_dims(X, Z, func_out=False, C=0.001, kern='RBF', l=1):
    """
    Sparse kernel based estimation of the multidimensional logarithmic gradient of an
    empirical density represented by samples X, for all dimensions simultaneously.

    - When `func_out == False`: computes grad-log at the sample points.
    - When `func_out == True`: returns a function for the grad-log, to be employed for
      interpolation/estimation of the grad-log in the vicinity of the samples.

    Parameters
    ----------
    X: N x dim array,
        N samples from the density (N x dim), where dim>=2 is the
        dimensionality of the system.
    Z: M x dim array,
        inducing points (M x dim).
    func_out : Boolean,
        True returns a function;
        if False, returns grad-log-p evaluated on the samples X.
    l: float or array-like,
        lengthscale of the RBF kernel (scalar or vector of size dim).
    C: float,
        weighting constant
        (leave it at the default value to avoid unreasonable contraction
        of deterministic trajectories).
    kern: string,
        options:
        - 'RBF': radial basis function/Gaussian kernel
        - 'periodic': periodic kernel, not functional yet.

    Returns
    -------
    res1: array with the logarithmic gradient of the density (N_s x dim), or a function
        that accepts as input 2-dimensional arrays of dimension (K x dim), where K>=1.
    """
    if kern == 'RBF':
        def K(x, y, l, multil=False):
            if multil:
                res = np.ones((x.shape[0], y.shape[0]))
                for ii in range(len(l)):
                    tempi = np.zeros((x[:, ii].size, y[:, ii].size))
                    # puts into tempi the cdist result
                    my_cdist(x[:, ii].reshape(-1, 1), y[:, ii].reshape(-1, 1), tempi, 'sqeuclidean')
                    res = np.multiply(res, np.exp(-tempi/(2*l[ii]*l[ii])))
                return res
            else:
                tempi = np.zeros((x.shape[0], y.shape[0]))
                my_cdist(x, y, tempi, 'sqeuclidean')  # sets into the array tempi the cdist result
                return np.exp(-tempi/(2*l*l))

        def grdx_K_all(x, y, l, multil=False):  # gradient with respect to the 1st argument - all dimensions
            N, dim = x.shape
            M, _ = y.shape
            diffs = x[:, None] - y
            redifs = np.zeros((N, M, dim))
            for ii in range(dim):
                if multil:
                    redifs[:, :, ii] = np.multiply(diffs[:, :, ii], K(x, y, l, True))/(l[ii]*l[ii])
                else:
                    redifs[:, :, ii] = np.multiply(diffs[:, :, ii], K(x, y, l))/(l*l)
            return redifs

        def grdx_K(x, y, l, which_dim=1, multil=False):  # gradient with respect to the 1st argument - only which_dim
            diffs = x[:, None] - y
            ii = which_dim - 1
            if multil:
                redifs = np.multiply(diffs[:, :, ii], K(x, y, l, True))/(l[ii]*l[ii])
            else:
                redifs = np.multiply(diffs[:, :, ii], K(x, y, l))/(l*l)
            return redifs
    ###########################################################################
    elif kern == 'periodic':
        ### DO NOT USE "periodic" yet!!!
        ### periodic kernel
        ## K(x,y)  = exp( -2 * sin^2( pi*|x-y| / (2*pi) ) / l^2 )
        ## Kx(x,y) = (K(x,y) * (x - y) cos(abs(x - y)/2) sin(abs(x - y)/2)) / (l^2 abs(x - y))
        ##         = -(2 K(x,y) π (x - y) sin((2 π abs(x - y))/per)) / (l^2 s abs(x - y))
        ## per = 2*np.pi  ## period of the kernel
        def K(x, y, l, multil=False):
            if multil:
                res = np.ones((x.shape[0], y.shape[0]))
                for ii in range(len(l)):
                    res = np.multiply(res, np.exp(-2*(np.sin(cdist(x[:, ii].reshape(-1, 1), y[:, ii].reshape(-1, 1), 'minkowski', p=1)/2)**2)/(l[ii]*l[ii])))
                return -res
            else:
                res = np.exp(-2*(np.sin(cdist(x, y, 'minkowski', p=1)/2)**2)/(l*l))
                return res

        def grdx_K(x, y, l, which_dim=1, multil=False):  # gradient with respect to the 1st argument - only which_dim
            diffs = x[:, None] - y
            ii = which_dim - 1
            if multil:
                redifs = np.divide(np.multiply(np.multiply(np.multiply(-2*K(x, y, l, True), diffs[:, :, ii]), np.sin(np.abs(diffs[:, :, ii])/2)), np.cos(np.abs(diffs[:, :, ii])/2)), (l[ii]*l[ii]*np.abs(diffs[:, :, ii])))
            else:
                redifs = np.divide(np.multiply(np.multiply(np.multiply(-2*diffs[:, :, ii], np.sin(np.abs(diffs[:, :, ii])/2)), K(x, y, l)), np.cos(np.abs(diffs[:, :, ii])/2)), (l*l*np.abs(diffs[:, :, ii])))
            return -redifs

    if isinstance(l, (list, tuple, np.ndarray)):
        multil = True
        ### different lengthscale for each dimension
        K_xz = K(X, Z, l, multil=True)
        Ks = K(Z, Z, l, multil=True)
        Ksinv = np.linalg.inv(Ks + 1e-3 * np.eye(Z.shape[0]))
        A = K_xz.T @ K_xz
        gradx_K = -grdx_K_all(X, Z, l, multil=True)
    else:
        multil = False
        K_xz = K(X, Z, l, multil=False)
        Ks = K(Z, Z, l, multil=False)
        Ksinv = np.linalg.inv(Ks + 1e-3 * np.eye(Z.shape[0]))
        A = K_xz.T @ K_xz
        gradx_K = -grdx_K_all(X, Z, l, multil=False)  # shape: (N, M, dim)
    sumgradx_K = np.sum(gradx_K, axis=0)  # last axis holds the gradient for each dimension; shape (M, dim)
    if func_out == False:  # output evaluated at the data points
        res1 = -K_xz @ np.linalg.inv(C*np.eye(Z.shape[0], Z.shape[0]) + Ksinv @ A + 1e-3 * np.eye(Z.shape[0])) @ Ksinv @ sumgradx_K
    else:
        #### function output
        if multil:
            if kern == 'RBF':
                K_sz = lambda x: reduce(np.multiply, [np.exp(-cdist(x[:, iii].reshape(-1, 1), Z[:, iii].reshape(-1, 1), 'sqeuclidean')/(2*l[iii]*l[iii])) for iii in range(x.shape[1])])
            elif kern == 'periodic':
                K_sz = lambda x: np.multiply(np.exp(-2*(np.sin(cdist(x[:, 0].reshape(-1, 1), Z[:, 0].reshape(-1, 1), 'minkowski', p=2)/(l[0]*l[0])))), np.exp(-2*(np.sin(cdist(x[:, 1].reshape(-1, 1), Z[:, 1].reshape(-1, 1), 'sqeuclidean')/(l[1]*l[1])))))
        else:
            if kern == 'RBF':
                K_sz = lambda x: np.exp(-cdist(x, Z, 'sqeuclidean')/(2*l*l))
            elif kern == 'periodic':
                K_sz = lambda x: np.exp(-2*(np.sin(cdist(x, Z, 'minkowski', p=1)/2)**2)/(l*l))
        res1 = lambda x: K_sz(x) @ (-np.linalg.inv(C*np.eye(Z.shape[0], Z.shape[0]) + Ksinv @ A + 1e-3 * np.eye(Z.shape[0]))) @ Ksinv @ sumgradx_K
    return res1  # shape out: N x dim
def score_function_multid_seperate_old(X, Z, func_out=False, C=0.001, kern='RBF', l=1, which=1, which_dim=1):
    """
    .. warning:: This version computes distances with cdist from scipy.
       If numba is not available, use this estimator.

    Sparse kernel based estimation of the multidimensional logarithmic gradient of an
    empirical density represented by samples X, across dimension "which_dim" only.

    - When `func_out == False`: computes grad-log at the sample points.
    - When `func_out == True`: returns a function for the grad-log, to be employed for
      interpolation/estimation of the grad-log in the vicinity of the samples.

    Parameters
    ----------
    X: N samples from the density (N x dim), where dim>=2 is the dimensionality of the system.
    Z: inducing points (M x dim).
    func_out : Boolean; True returns a function, if False returns grad-log-p on the data points.
    l: lengthscale of the RBF kernel (scalar or vector of size dim).
    C: weighting constant (leave it at the default value to avoid unreasonable contraction of deterministic trajectories).
    which: returns 1: grad log p(x).
    which_dim: which gradient of the log density we want to compute (starts from 1 for the 0-th dimension).

    Returns
    -------
    res1: array with the gradient of the log density along the given dimension (N_s x 1), or a function
        that accepts as input 2-dimensional arrays of dimension (K x dim), where K>=1.

    See Also
    --------
    score_function_multid_seperate_all_dims :
        estimation across all dimensions simultaneously.
    """
    if kern == 'RBF':
        def K(x, y, l, multil=False):
            if multil:
                res = np.ones((x.shape[0], y.shape[0]))
                for ii in range(len(l)):
                    res = np.multiply(res, np.exp(-cdist(x[:, ii].reshape(-1, 1), y[:, ii].reshape(-1, 1), 'sqeuclidean')/(2*l[ii]*l[ii])))
                return res
            else:
                return np.exp(-cdist(x, y, 'sqeuclidean')/(2*l*l))

        def grdx_K(x, y, l, which_dim=1, multil=False):  # gradient with respect to the 1st argument - only which_dim
            diffs = x[:, None] - y
            ii = which_dim - 1
            if multil:
                redifs = np.multiply(diffs[:, :, ii], K(x, y, l, True))/(l[ii]*l[ii])
            else:
                redifs = np.multiply(diffs[:, :, ii], K(x, y, l))/(l*l)
            return redifs

        """
        def grdy_K(x,y): # gradient with respect to the second argument
            diffs = x[:,None]-y
            ii = which_dim - 1
            redifs = np.multiply(diffs[:,:,ii], K(x,y,l))/(l*l)
            return -redifs

        def ggrdxy_K(x,y):
            N,dim = Z.shape
            diffs = x[:,None]-y
            redifs = np.zeros((N,N))
            for ii in range(which_dim-1, which_dim):
                for jj in range(which_dim-1, which_dim):
                    redifs[ii, jj] = np.multiply(np.multiply(diffs[:,:,ii], diffs[:,:,jj]) + (l*l)*(ii==jj), K(x,y))/(l**4)
            return -redifs
        """

    if isinstance(l, (list, tuple, np.ndarray)):
        ### different lengthscale for each dimension
        K_xz = K(X, Z, l, multil=True)
        Ks = K(Z, Z, l, multil=True)
        multil = True  # just a boolean to keep track of whether l is scalar or vector
        Ksinv = np.linalg.inv(Ks + 1e-3 * np.eye(Z.shape[0]))
        A = K_xz.T @ K_xz
        gradx_K = -grdx_K(X, Z, l, which_dim=which_dim, multil=True)
    else:
        multil = False
        K_xz = K(X, Z, l, multil=False)
        Ks = K(Z, Z, l, multil=False)
        Ksinv = np.linalg.inv(Ks + 1e-3 * np.eye(Z.shape[0]))
        A = K_xz.T @ K_xz
        gradx_K = -grdx_K(X, Z, l, which_dim=which_dim, multil=False)
    sumgradx_K = np.sum(gradx_K, axis=0)
    if func_out == False:  # evaluation at the data points
        res1 = -K_xz @ np.linalg.inv(C*np.eye(Z.shape[0], Z.shape[0]) + Ksinv @ A + 1e-3 * np.eye(Z.shape[0])) @ Ksinv @ sumgradx_K
    else:
        #### functional output
        if multil:
            if kern == 'RBF':
                K_sz = lambda x: reduce(np.multiply, [np.exp(-cdist(x[:, iii].reshape(-1, 1), Z[:, iii].reshape(-1, 1), 'sqeuclidean')/(2*l[iii]*l[iii])) for iii in range(x.shape[1])])
        else:
            K_sz = lambda x: np.exp(-cdist(x, Z, 'sqeuclidean')/(2*l*l))
        res1 = lambda x: K_sz(x) @ (-np.linalg.inv(C*np.eye(Z.shape[0], Z.shape[0]) + Ksinv @ A + 1e-3 * np.eye(Z.shape[0]))) @ Ksinv @ sumgradx_K
    return res1
#%%
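
# Illustrative usage sketch (hypothetical data, kept as a comment so that
# importing this module has no side effects). For a standard normal density
# the score is grad log p(x) = -x, which the estimator should roughly recover:
#
#     X = np.random.randn(500, 2)                        # samples from N(0, I)
#     Z = X[np.random.choice(500, 50, replace=False)]    # inducing points
#     score = score_function_multid_seperate_all_dims(X, Z, l=1.0)
#     # score[:, d] ≈ -X[:, d] for d = 0, 1 (up to estimation error)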


# ---------------------------------------------------------------------------
# File: Module 3/Chapter 5/ch5_3.py
# Repo: PacktPublishing/Natural-Language-Processing-Python-and-NLTK (MIT license)
# ---------------------------------------------------------------------------

import nltk
from nltk.corpus import treebank_chunk
print(treebank_chunk.chunked_sents()[1])
treebank_chunk.chunked_sents()[1].draw()


# ---------------------------------------------------------------------------
# File: autologin.py
# Repo: ayush016/Hadoop-Mapreduce-Cluster-Auto-configure (MIT license)
# ---------------------------------------------------------------------------

import os
import sys

myip = sys.argv[1]

f = open('/etc/hadoop/hdfs-site.xml', 'r')
contents = f.readlines()
f.close()
count = 0
for i in contents:
    count = count + 1
    print(i)
    if(i == "<configuration>\n" or i == "<configuration>"):
        break
os.system("rm -rf /data")
f = open('/etc/hadoop/hdfs-site.xml', 'w')
x = 0
for i in contents:
    if(x < count):
        f.write(i)
    if(i == "</configuration>\n"):
        f.write(i)
    x = x + 1
f.close()
f = open('/etc/hadoop/hdfs-site.xml', 'r')
contents = f.readlines()
f.close()
f = open("/etc/hadoop/hdfs-site.xml", "w")
count = 0
for i in contents:
    count = count + 1
    if(i == "<configuration>\n" or i == "<configuration>"):
        break
contents.insert(count, "<property>\n<name>dfs.data.dir</name>\n<value>/data</value>\n</property>\n")
contents = "".join(contents)
f.write(contents)
f.close()

########################################################
datanode_core = open("/etc/hadoop/core-site.xml", "r")
contents = datanode_core.readlines()
datanode_core.close()
count = 0
for i in contents:
    count = count + 1
    if(i == "<configuration>\n" or i == "<configuration>"):
        break
datanode_core = open("/etc/hadoop/core-site.xml", "w")
x = 0
for i in contents:
    if(x < count):
        datanode_core.write(i)
    if(i == "</configuration>\n"):
        datanode_core.write(i)
    x = x + 1
datanode_core.close()  # close before reopening for read (missing in the original)
datanode_core = open('/etc/hadoop/core-site.xml', 'r')
contents = datanode_core.readlines()
datanode_core.close()
datanode_core = open("/etc/hadoop/core-site.xml", "w")
contents.insert(count, "<property>\n<name>fs.default.name</name>\n<value>hdfs://" + myip + ":9001</value>\n</property>\n")
contents = "".join(contents)
datanode_core.write(contents)
datanode_core.close()
os.system("iptables -F")
os.system("hadoop-daemon.sh start datanode")

#######################################################################################
datanode_core = open("/etc/hadoop/mapred-site.xml", "r")
contents = datanode_core.readlines()
datanode_core.close()
count = 0
for i in contents:
    count = count + 1
    if(i == "<configuration>\n" or i == "<configuration>"):
        break
datanode_core = open("/etc/hadoop/mapred-site.xml", "w")
x = 0
for i in contents:
    if(x < count):
        datanode_core.write(i)
    if(i == "</configuration>\n"):
        datanode_core.write(i)
    x = x + 1
datanode_core.close()  # close before reopening for read (missing in the original)
datanode_core = open('/etc/hadoop/mapred-site.xml', 'r')
contents = datanode_core.readlines()
datanode_core.close()
datanode_core = open("/etc/hadoop/mapred-site.xml", "w")
contents.insert(count, "<property>\n<name>mapred.job.tracker</name>\n<value>hdfs://" + sys.argv[2] + ":9002</value>\n</property>\n")
contents = "".join(contents)
datanode_core.write(contents)
datanode_core.close()
os.system("iptables -F")
os.system("hadoop-daemon.sh start tasktracker")
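
# Usage sketch (illustrative; argument order inferred from the code above):
#
#     python autologin.py <this-node-ip> <jobtracker-ip>
#
# sys.argv[1] is written into core-site.xml as the fs.default.name host
# (port 9001) and sys.argv[2] into mapred-site.xml as the mapred.job.tracker
# host (port 9002), before the datanode and tasktracker daemons are started.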
| 23.067227 | 128 | 0.650638 | 417 | 2,745 | 4.22542 | 0.139089 | 0.163451 | 0.088536 | 0.086266 | 0.874007 | 0.868899 | 0.837684 | 0.817821 | 0.807037 | 0.729852 | 0 | 0.009728 | 0.101275 | 2,745 | 118 | 129 | 23.262712 | 0.704499 | 0.024044 | 0 | 0.804598 | 0 | 0.034483 | 0.33452 | 0.218664 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.022989 | null | null | 0.011494 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
4d86ceaa51929882b37b42d49de567ff846616cb | 17,458 | py | Python | main_app/tests/basic/test_units.py | pflis98/idkwhatiamdoing | d77f1039ef96fd09dce4c38264e5424c61a0b908 | [
"bzip2-1.0.6"
] | 3 | 2019-04-20T13:56:42.000Z | 2019-09-29T11:26:42.000Z | main_app/tests/basic/test_units.py | pflis98/idkwhatiamdoing | d77f1039ef96fd09dce4c38264e5424c61a0b908 | [
"bzip2-1.0.6"
] | 7 | 2020-06-05T20:27:56.000Z | 2022-02-10T07:04:36.000Z | main_app/tests/basic/test_units.py | pflis98/idkwhatiamdoing | d77f1039ef96fd09dce4c38264e5424c61a0b908 | [
"bzip2-1.0.6"
] | 7 | 2019-04-19T20:08:21.000Z | 2019-07-16T19:32:33.000Z | from django.urls import reverse
from main_app.tests.TestCaseSpecialUser import *
from django.test import tag
from unittest import skip
from main_app.tests.TestSetupDatabase import *
# region add
@tag('unit', 'add', 'superuser')
class AddUnitViewTestSuperuser(TestCaseSuperuser):
def test_view_url_exists_at_desired_location(self):
response = self.client.get('/unit/new')
self.assertEqual(response.status_code, 200)
def test_view_url_accessible_by_name(self):
response = self.client.get(reverse('add_ingredient'))
self.assertEqual(response.status_code, 200)
def test_view_uses_correct_template(self):
response = self.client.get(reverse('add_unit'))
self.assertEqual(response.status_code, 200)
self.assertTemplateUsed(response, 'main_app/unit_form.html')
def test_view_regular_add(self):
response = self.client.post(reverse('add_unit'),
{'name': 'Gram',
'amount': 100})
self.assertEqual(response.status_code, 302)
self.assertTrue(Unit.objects.filter(name='Gram').exists())
self.assertEqual(response.url, '/unit/')
def test_view_correct_redirection(self):
response = self.client.post(reverse('add_unit'),
{'name': 'Gram',
'amount': 100},
follow=True)
self.assertRedirects(response,
reverse('unit'),
status_code=302,
target_status_code=200)
@tag('unit', 'add', 'normal_user')
class AddUnitViewTestNormalUser(TestCase):
def test_view_correct_redirection_get(self):
response = self.client.get(reverse('add_unit'), follow=True)
self.assertRedirects(response,
reverse('superuser_required') + "?next=" + reverse('add_unit'),
status_code=302,
target_status_code=200)
def test_view_correct_redirection_post(self):
response = self.client.post(reverse('add_unit'), {'name': 'Gram', 'amount': 100}, follow=True)
self.assertRedirects(response,
reverse('superuser_required') + "?next=" + reverse('add_unit'),
status_code=302,
target_status_code=200)
self.assertFalse(Unit.objects.filter(name='Gram').exists())
@tag('unit', 'add', 'logged_user')
class AddUnitViewTestLoggedUser(TestCaseLoggedUser):
def test_view_correct_redirection_get(self):
response = self.client.get(reverse('add_unit'), follow=True)
self.assertRedirects(response,
reverse('superuser_required') + "?next=" + reverse('add_unit'),
status_code=302,
target_status_code=200)
def test_view_correct_redirection_post(self):
response = self.client.post(reverse('add_unit'), {'name': 'Gram', 'amount': 100}, follow=True)
self.assertRedirects(response,
reverse('superuser_required') + "?next=" + reverse('add_unit'),
status_code=302,
target_status_code=200)
self.assertFalse(Unit.objects.filter(name='Gram').exists())
# endregion
# region delete
@skip("")
@tag('unit', 'delete', 'superuser')
class DeleteUnitViewTestSuperuser(TestCaseSuperuser):
def test_view_url_exists_at_desired_location_id_doesnt_exist(self):
response = self.client.get('/unit/999/delete')
self.assertEqual(response.status_code, 404)
def test_view_url_exists_at_desired_location_id_exists(self):
TestDatabase.create_default_test_database(units=True)
item = Unit.objects.only('id').get(name='Gram').id
response = self.client.get('/unit/'+str(item)+'/delete')
self.assertEqual(response.status_code, 302)
def test_view_url_accessible_by_name(self):
TestDatabase.create_default_test_database(units=True)
item = Unit.objects.only('id').get(name='Gram').id
response = self.client.get(reverse('unit_delete', kwargs={'object_id': item}))
self.assertEqual(response.status_code, 302)
def test_view_deletes_properly(self):
TestDatabase.create_default_test_database(units=True, ingredients=True, recipes=True, tools=True)
item = Unit.objects.only('id').get(name='Gram').id
response = self.client.get(reverse('unit_delete', kwargs={'object_id': item}))
self.assertEqual(response.status_code, 302)
# only the unit itself should be deleted; related ingredients and recipes must survive
self.assertTrue(Ingredient.objects.filter(name='Woda').exists())
self.assertTrue(Recipe.objects.filter(name='Lemoniada').exists())
self.assertFalse(Unit.objects.filter(name='Gram').exists())
def test_view_redirects_properly(self):
TestDatabase.create_default_test_database(units=True)
item = Unit.objects.only('id').get(name='Gram').id
response = self.client.get(reverse('unit_delete', kwargs={'object_id': item}), follow=True)
self.assertRedirects(response, reverse('unit'))
@tag('unit', 'delete', 'logged_user')
class DeleteUnitViewTestLoggedUser(TestCaseLoggedUser):
def test_view_deletes(self):
TestDatabase.create_default_test_database(units=True)
item = Unit.objects.only('id').get(name='Gram').id
response = self.client.get(reverse('unit_delete', kwargs={'object_id': item}), follow=True)
self.assertRedirects(response,
reverse('superuser_required') + "?next=" + reverse('unit_delete',
kwargs={'object_id': item}),
status_code=302,
target_status_code=200)
self.assertTrue(Unit.objects.filter(name='Gram').exists())
def test_view_deletes_doesnt_exist(self):
response = self.client.get(reverse('unit_delete', kwargs={'object_id': 999}), follow=True)
self.assertRedirects(response,
reverse('superuser_required') + "?next=" + reverse('unit_delete',
kwargs={'object_id': 999}),
status_code=302,
target_status_code=200)
@tag('unit', 'delete', 'normal_user')
class DeleteUnitViewTestNormalUser(TestCase):
def test_view_deletes(self):
TestDatabase.create_default_test_database(units=True)
item = Unit.objects.only('id').get(name='Gram').id
response = self.client.get(reverse('unit_delete', kwargs={'object_id': item}), follow=True)
self.assertRedirects(response,
reverse('superuser_required') + "?next=" + reverse('unit_delete',
kwargs={'object_id': item}),
status_code=302,
target_status_code=200)
self.assertTrue(Unit.objects.filter(name='Gram').exists())
def test_view_deletes_doesnt_exist(self):
response = self.client.get(reverse('unit_delete', kwargs={'object_id': 999}), follow=True)
self.assertRedirects(response,
reverse('superuser_required') + "?next=" + reverse('unit_delete',
kwargs={'object_id': 999}),
status_code=302,
target_status_code=200)
# endregion
# region getid
@tag('unit', 'id', 'superuser')
class UnitIDViewTest(TestCaseSuperuser):
@classmethod
def setUpTestData(cls):
TestDatabase.create_default_test_database(units=True)
def test_view_url_exists_at_desired_location_id_doesnt_exist(self):
response = self.client.get('/unit/999')
self.assertEqual(response.status_code, 404)
def test_view_url_exists_at_desired_location_id_exists(self):
item = Unit.objects.only('id').get(name='Gram').id
response = self.client.get('/unit/' + str(item))
self.assertEqual(response.status_code, 200)
def test_view_url_accessible_by_name(self):
item = Unit.objects.only('id').get(name='Gram').id
response = self.client.get(reverse('unit_id', kwargs={'object_id': item}))
self.assertEqual(response.status_code, 200)
def test_view_uses_correct_template(self):
item = Unit.objects.only('id').get(name='Gram').id
response = self.client.get(reverse('unit_id', kwargs={'object_id': item}))
self.assertTemplateUsed(response, 'main_app/unit_detail.html')
def test_view_correct_texts(self):
item = Unit.objects.only('id').get(name='Kilogram').id
response = self.client.get(reverse('unit_id', kwargs={'object_id': item}))
self.assertContains(response, "10") # amount
self.assertContains(response, "Kilogram") # name
@tag('unit', 'id', 'normal_user')
class UnitIDViewTestNormalUser(TestCase):
@classmethod
def setUpTestData(cls):
TestDatabase.create_default_test_database(units=True)
def test_view_correct_redirection(self):
item = Unit.objects.only('id').get(name='Kilogram').id
response = self.client.get(reverse('unit_id', kwargs={'object_id': item}), follow=True)
self.assertRedirects(response,
reverse('superuser_required') + "?next=" + reverse('unit_id',
kwargs={'object_id': item}))
def test_view_correct_redirection_doesnt_exist(self):
response = self.client.get(reverse('unit_id', kwargs={'object_id': 999}), follow=True)
self.assertRedirects(response,
reverse('superuser_required') + "?next=" + reverse('unit_id',
kwargs={'object_id': 999}))
@tag('unit', 'id', 'logged_user')
class UnitIDViewTestLoggedUser(TestCaseLoggedUser):
@classmethod
def setUpTestData(cls):
TestDatabase.create_default_test_database(units=True)
def test_view_correct_redirection(self):
item = Unit.objects.only('id').get(name='Kilogram').id
response = self.client.get(reverse('unit_id', kwargs={'object_id': item}), follow=True)
self.assertRedirects(response,
reverse('superuser_required') + "?next=" + reverse('unit_id',
kwargs={'object_id': item}))
def test_view_correct_redirection_doesnt_exist(self):
response = self.client.get(reverse('unit_id', kwargs={'object_id': 999}), follow=True)
self.assertRedirects(response,
reverse('superuser_required') + "?next=" + reverse('unit_id',
kwargs={'object_id': 999}))
# endregion
# region update
@tag('unit', 'update', 'superuser')
class UpdateUnitViewTestSuperuser(TestCaseSuperuser):
def test_view_url_exists_at_desired_location_id_doesnt_exist(self):
response = self.client.get('/unit/999/update')
self.assertEqual(response.status_code, 404)
def test_view_url_exists_at_desired_location_id_exists(self):
TestDatabase.create_default_test_database(units=True)
item = Unit.objects.only('id').get(name='Kilogram').id
response = self.client.get('/unit/'+str(item)+'/update')
self.assertEqual(response.status_code, 200)
def test_view_url_accessible_by_name(self):
TestDatabase.create_default_test_database(units=True)
item = Unit.objects.only('id').get(name='Kilogram').id
response = self.client.get(reverse('unit_update', kwargs={'object_id': item}))
self.assertEqual(response.status_code, 200)
def test_view_updates_default_values(self):
TestDatabase.create_default_test_database(units=True)
item = Unit.objects.only('id').get(name='Kilogram').id
response = self.client.get(reverse('unit_update', kwargs={'object_id': item}))
self.assertEqual(response.context['form'].initial['name'], 'Kilogram')
self.assertEqual(response.context['form'].initial['amount'], 10)
def test_view_updates_properly_no_modifications(self):
TestDatabase.create_default_test_database(units=True, ingredients=True, recipes=True)
item = Unit.objects.only('id').get(name='Kilogram').id
response_get = self.client.get(reverse('unit_update', kwargs={'object_id': item}))
unit_data = response_get.context['form'].initial
response = self.client.post(reverse('unit_update', kwargs={'object_id': item}),
unit_data)
self.assertEqual(response.status_code, 302)
self.assertTrue(Unit.objects.filter(id=item).exists())
self.assertEqual(Unit.objects.get(id=item).name, 'Kilogram')
self.assertEqual(Unit.objects.get(id=item).amount, 10)
self.assertTrue(Ingredient.objects.filter(name='Woda').exists())
self.assertTrue(Ingredient.objects.get(name='Woda').
units.filter(name='Kilogram').exists())
self.assertTrue(Recipe.objects.filter(name='Lemoniada').exists())
self.assertTrue(Recipe.objects.get(name='Lemoniada').
ingredients.filter(name='Cytryna').exists())
self.assertTrue(Recipe.objects.filter(name='Lemoniada',
recipeingredient__ingredient__name='Cytryna',
recipeingredient__unit__name='Kilogram'))
def test_view_updates_properly_with_modifications(self):
TestDatabase.create_default_test_database(units=True, ingredients=True, recipes=True)
item = Unit.objects.only('id').get(name='Kilogram').id
response_get = self.client.get(reverse('unit_update', kwargs={'object_id': item}))
unit_data = response_get.context['form'].initial
unit_data['name'] = "Litr"
unit_data['amount'] = 50
response = self.client.post(reverse('unit_update', kwargs={'object_id': item}),
unit_data)
self.assertEqual(response.status_code, 302)
self.assertTrue(Unit.objects.filter(id=item).exists())
self.assertEqual(Unit.objects.get(id=item).name, 'Litr')
self.assertEqual(Unit.objects.get(id=item).amount, 50)
self.assertTrue(Ingredient.objects.filter(name='Woda').exists())
self.assertFalse(Ingredient.objects.get(name='Woda').
units.filter(name='Kilogram').exists())
self.assertTrue(Ingredient.objects.get(name='Woda').
units.filter(name='Litr').exists())
self.assertTrue(Recipe.objects.filter(name='Lemoniada').exists())
self.assertTrue(Recipe.objects.get(name='Lemoniada').
ingredients.filter(name='Cytryna').exists())
self.assertTrue(Recipe.objects.filter(name='Lemoniada',
recipeingredient__ingredient__name='Cytryna',
recipeingredient__unit__name='Litr'))
@tag('unit', 'update', 'logged_user')
class UpdateUnitViewTestLoggedUser(TestCaseLoggedUser):
def test_view_doesnt_exist(self):
response = self.client.get(reverse('unit_update', kwargs={'object_id': 999}), follow=True)
self.assertRedirects(response,
reverse('superuser_required') + "?next=" + reverse(
'unit_update',
kwargs={'object_id': 999}))
def test_view(self):
TestDatabase.create_default_test_database(units=True)
item = Unit.objects.only('id').get(name='Kilogram').id
response = self.client.get(reverse('unit_update', kwargs={'object_id': item}), follow=True)
self.assertRedirects(response,
reverse('superuser_required') + "?next=" + reverse(
'unit_update',
kwargs={'object_id': item}))
# todo post?
@tag('unit', 'update', 'normal_user')
class UpdateUnitViewTestNormalUser(TestCase):
def test_view_doesnt_exist(self):
response = self.client.get(reverse('unit_update', kwargs={'object_id': 999}), follow=True)
self.assertRedirects(response,
reverse('superuser_required') + "?next=" + reverse(
'unit_update',
kwargs={'object_id': 999}))
def test_view(self):
TestDatabase.create_default_test_database(units=True)
item = Unit.objects.only('id').get(name='Kilogram').id
response = self.client.get(reverse('unit_update', kwargs={'object_id': item}), follow=True)
self.assertRedirects(response,
reverse('superuser_required') + "?next=" + reverse(
'unit_update',
kwargs={'object_id': item}))
# todo post?
# endregion
| 49.039326 | 108 | 0.607973 | 1,838 | 17,458 | 5.55876 | 0.073449 | 0.038172 | 0.039836 | 0.063717 | 0.877459 | 0.87237 | 0.842615 | 0.83919 | 0.823627 | 0.79818 | 0 | 0.013149 | 0.263776 | 17,458 | 355 | 109 | 49.177465 | 0.781763 | 0.00905 | 0 | 0.760563 | 0 | 0 | 0.114544 | 0.002777 | 0 | 0 | 0 | 0.002817 | 0.232394 | 1 | 0.140845 | false | 0 | 0.017606 | 0 | 0.200704 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
12fda8ceaf80c019e224e8480aa573dfc0fd5fef | 91 | py | Python | vlnce_baselines/__init__.py | awesomericky/VLN-CE | 896d7b7775a42e6060f679620d232fdba8e0137a | [
"MIT"
] | 2 | 2020-11-05T09:12:51.000Z | 2021-12-19T02:44:32.000Z | vlnce_baselines/__init__.py | awesomericky/VLN-CE | 896d7b7775a42e6060f679620d232fdba8e0137a | [
"MIT"
] | null | null | null | vlnce_baselines/__init__.py | awesomericky/VLN-CE | 896d7b7775a42e6060f679620d232fdba8e0137a | [
"MIT"
] | null | null | null | from vlnce_baselines import dagger_trainer
from vlnce_baselines.common import environments
| 30.333333 | 47 | 0.901099 | 12 | 91 | 6.583333 | 0.666667 | 0.227848 | 0.455696 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.087912 | 91 | 2 | 48 | 45.5 | 0.951807 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
42a4da0f92901025133bd831bfa412871928a296 | 5,279 | py | Python | generator/tests/test_generate.py | horvathandris/phenoflow | d0109f3702bc180954051170a56e017af52636fb | [
"MIT"
] | null | null | null | generator/tests/test_generate.py | horvathandris/phenoflow | d0109f3702bc180954051170a56e017af52636fb | [
"MIT"
] | null | null | null | generator/tests/test_generate.py | horvathandris/phenoflow | d0109f3702bc180954051170a56e017af52636fb | [
"MIT"
] | null | null | null | import unittest, json
from starlette.testclient import TestClient
from api import routes
import oyaml as yaml
class BasicTests(unittest.TestCase):
def test_generate(self):
client = TestClient(routes.app)
response = client.post('/generate')
assert response.status_code == 200
@staticmethod
def generate_twosteps():
client = TestClient(routes.app)
response = client.post('/generate', json=[
{"id":1,"name":"stepName","doc":"doc","type":"type","position":1,"createdAt":"2020-04-02T10:11:47.805Z","updatedAt":"2020-04-02T10:11:47.805Z","workflowId":1,
"inputs":[
{"id":1,"doc":"doc","createdAt":"2020-04-02T10:11:47.829Z","updatedAt":"2020-04-02T10:11:47.829Z","stepId":1}
],
"outputs":[
{"id":1,"doc":"doc","extension":"extension","createdAt":"2020-04-02T10:11:47.850Z","updatedAt":"2020-04-02T10:11:47.850Z","stepId":1}
],
"implementation":{"id":1,"fileName":"hello-world.py","language":"python","createdAt":"2020-04-02T10:11:47.891Z","updatedAt":"2020-04-02T10:11:47.891Z","stepId":1}
},
{"id":2,"name":"stepName","doc":"doc","type":"type","position":2,"createdAt":"2020-04-02T10:11:47.899Z","updatedAt":"2020-04-02T10:11:47.899Z","workflowId":1,
"inputs":[
{"id":2,"doc":"doc","createdAt":"2020-04-02T10:11:47.908Z","updatedAt":"2020-04-02T10:11:47.908Z","stepId":2}
],
"outputs":[
{"id":2,"doc":"doc","extension":"extension","createdAt":"2020-04-02T10:11:47.915Z","updatedAt":"2020-04-02T10:11:47.915Z","stepId":2}
],
"implementation":{"id":2,"fileName":"hello-world.py","language":"python","createdAt":"2020-04-02T10:11:47.931Z","updatedAt":"2020-04-02T10:11:47.931Z","stepId":2}
}
])
return response
def test_generate_twosteps(self):
response = BasicTests.generate_twosteps()
assert response.status_code == 200
def test_generate_nested(self):
twosteps_response = BasicTests.generate_twosteps()
assert twosteps_response.status_code == 200
client = TestClient(routes.app)
response = client.post('/generate', json=[
{"id":1,"name":"stepName","doc":"doc","type":"type","position":1,"createdAt":"2020-04-02T10:11:47.805Z","updatedAt":"2020-04-02T10:11:47.805Z","workflowId":1,
"inputs":[
{"id":1,"doc":"doc","createdAt":"2020-04-02T10:11:47.829Z","updatedAt":"2020-04-02T10:11:47.829Z","stepId":1}
],
"outputs":[
{"id":1,"doc":"doc","extension":"extension","createdAt":"2020-04-02T10:11:47.850Z","updatedAt":"2020-04-02T10:11:47.850Z","stepId":1}
],
"implementation":{"id":1,"fileName":"hello-world-outer.py","language":"python","createdAt":"2020-04-02T10:11:47.891Z","updatedAt":"2020-04-02T10:11:47.891Z","stepId":1}
},
{"id":2,"name":"stepName","doc":"doc","type":"type","position":2,"createdAt":"2020-04-02T10:11:47.899Z","updatedAt":"2020-04-02T10:11:47.899Z","workflowId":1,
"implementation": {
"steps": [
{"id":1,"name":"stepName","doc":"doc","type":"type","position":1,"createdAt":"2020-04-02T10:11:47.805Z","updatedAt":"2020-04-02T10:11:47.805Z","workflowId":1,
"inputs":[
{"id":1,"doc":"doc","createdAt":"2020-04-02T10:11:47.829Z","updatedAt":"2020-04-02T10:11:47.829Z","stepId":1}
],
"outputs":[
{"id":1,"doc":"doc","extension":"extension","createdAt":"2020-04-02T10:11:47.850Z","updatedAt":"2020-04-02T10:11:47.850Z","stepId":1}
],
"implementation":{"id":1,"fileName":"hello-world.py","language":"python","createdAt":"2020-04-02T10:11:47.891Z","updatedAt":"2020-04-02T10:11:47.891Z","stepId":1}
},
{"id":2,"name":"stepName","doc":"doc","type":"type","position":2,"createdAt":"2020-04-02T10:11:47.899Z","updatedAt":"2020-04-02T10:11:47.899Z","workflowId":1,
"inputs":[
{"id":2,"doc":"doc","createdAt":"2020-04-02T10:11:47.908Z","updatedAt":"2020-04-02T10:11:47.908Z","stepId":2}
],
"outputs":[
{"id":2,"doc":"doc","extension":"extension","createdAt":"2020-04-02T10:11:47.915Z","updatedAt":"2020-04-02T10:11:47.915Z","stepId":2}
],
"implementation":{"id":2,"fileName":"hello-world.py","language":"python","createdAt":"2020-04-02T10:11:47.931Z","updatedAt":"2020-04-02T10:11:47.931Z","stepId":2}
}
]
}
},
{"id":3,"name":"stepName","doc":"doc","type":"type","position":3,"createdAt":"2020-04-02T10:11:47.899Z","updatedAt":"2020-04-02T10:11:47.899Z","workflowId":1,
"inputs":[
{"id":3,"doc":"doc","createdAt":"2020-04-02T10:11:47.908Z","updatedAt":"2020-04-02T10:11:47.908Z","stepId":3}
],
"outputs":[
{"id":3,"doc":"doc","extension":"extension","createdAt":"2020-04-02T10:11:47.915Z","updatedAt":"2020-04-02T10:11:47.915Z","stepId":3}
],
"implementation":{"id":3,"fileName":"hello-world-outer.py","language":"python","createdAt":"2020-04-02T10:11:47.931Z","updatedAt":"2020-04-02T10:11:47.931Z","stepId":3}
}
])
assert response.status_code == 200
if __name__ == "__main__":
unittest.main()
| 56.763441 | 176 | 0.593862 | 701 | 5,279 | 4.440799 | 0.099857 | 0.09637 | 0.176678 | 0.208802 | 0.886605 | 0.836813 | 0.836813 | 0.825891 | 0.809509 | 0.809509 | 0 | 0.205685 | 0.153628 | 5,279 | 92 | 177 | 57.380435 | 0.491047 | 0 | 0 | 0.62069 | 1 | 0 | 0.519038 | 0.227316 | 0 | 0 | 0 | 0 | 0.045977 | 1 | 0.045977 | false | 0 | 0.045977 | 0 | 0.114943 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
35faa208685572766bad419805425f10c57a7173 | 106 | py | Python | tests_conftest.py | berpress/qa_python_courses | d8e670ef5b5a2b20390ca0b9324764908ced7f00 | [
"Apache-2.0"
] | null | null | null | tests_conftest.py | berpress/qa_python_courses | d8e670ef5b5a2b20390ca0b9324764908ced7f00 | [
"Apache-2.0"
] | null | null | null | tests_conftest.py | berpress/qa_python_courses | d8e670ef5b5a2b20390ca0b9324764908ced7f00 | [
"Apache-2.0"
] | 2 | 2021-11-18T15:37:24.000Z | 2021-12-01T20:50:23.000Z | def test_one(start):
print('start test one')
def test_two(start):
print('start test two')
| 15.142857 | 28 | 0.622642 | 16 | 106 | 4 | 0.375 | 0.21875 | 0.46875 | 0.59375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.245283 | 106 | 6 | 29 | 17.666667 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0.28 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0 | 0.5 | 0.5 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
c403f83dba7fdf640aaccd9a53250cfe9688659a | 108 | py | Python | easy_nlp/feature_extraction/__init__.py | Moumeneb1/IRIT_INTERNSHIP | 6a443508e9a6e26e46354c2d8282e360afdc02e7 | [
"MIT"
] | 4 | 2021-06-04T09:21:21.000Z | 2022-02-05T17:32:39.000Z | easy_nlp/feature_extraction/__init__.py | Moumeneb1/IRIT_INTERNSHIP | 6a443508e9a6e26e46354c2d8282e360afdc02e7 | [
"MIT"
] | null | null | null | easy_nlp/feature_extraction/__init__.py | Moumeneb1/IRIT_INTERNSHIP | 6a443508e9a6e26e46354c2d8282e360afdc02e7 | [
"MIT"
] | null | null | null | from .bert_input import *
from .linguistic_feature_extractor import *
from .meta_feature_extractor import *
| 27 | 43 | 0.833333 | 14 | 108 | 6.071429 | 0.571429 | 0.235294 | 0.517647 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 108 | 3 | 44 | 36 | 0.885417 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
c40e6e457b11a734d2aff07fe5b8a74538c8b6a7 | 170 | py | Python | my_bookshelf/my_bookshelf/utils.py | vhadzhiev/my_bookshelf | 0265e44616a027b406bc1914232ed00ed0b85b57 | [
"MIT"
] | null | null | null | my_bookshelf/my_bookshelf/utils.py | vhadzhiev/my_bookshelf | 0265e44616a027b406bc1914232ed00ed0b85b57 | [
"MIT"
] | null | null | null | my_bookshelf/my_bookshelf/utils.py | vhadzhiev/my_bookshelf | 0265e44616a027b406bc1914232ed00ed0b85b57 | [
"MIT"
] | null | null | null | import os
def is_production():
return os.getenv('APP_ENVIRONMENT') == 'Production'
def is_development():
return os.getenv('APP_ENVIRONMENT') == 'Development'
| 17 | 56 | 0.705882 | 20 | 170 | 5.8 | 0.5 | 0.086207 | 0.241379 | 0.293103 | 0.482759 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.152941 | 170 | 9 | 57 | 18.888889 | 0.805556 | 0 | 0 | 0 | 0 | 0 | 0.3 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | true | 0 | 0.2 | 0.4 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 7 |
c411072ae0548a0668ecd167bc0c54a6721f2aae | 13,733 | py | Python | app/grandchallenge/annotations/migrations/0001_initial.py | njmhendrix/grand-challenge.org | 9bc36f5e26561a78bd405e8ea5e4c0f86c95f011 | [
"Apache-2.0"
] | 1 | 2021-02-09T10:30:44.000Z | 2021-02-09T10:30:44.000Z | app/grandchallenge/annotations/migrations/0001_initial.py | njmhendrix/grand-challenge.org | 9bc36f5e26561a78bd405e8ea5e4c0f86c95f011 | [
"Apache-2.0"
] | null | null | null | app/grandchallenge/annotations/migrations/0001_initial.py | njmhendrix/grand-challenge.org | 9bc36f5e26561a78bd405e8ea5e4c0f86c95f011 | [
"Apache-2.0"
] | null | null | null | # Generated by Django 2.1.5 on 2019-02-04 15:17
import uuid
import django.contrib.postgres.fields
import django.utils.timezone
from django.conf import settings
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
("cases", "0008_auto_20190201_1312"),
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name="BooleanClassificationAnnotation",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("modified", models.DateTimeField(auto_now=True)),
(
"created",
models.DateTimeField(default=django.utils.timezone.now),
),
("name", models.CharField(max_length=255)),
("value", models.BooleanField()),
(
"grader",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to=settings.AUTH_USER_MODEL,
),
),
(
"image",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="cases.Image",
),
),
],
options={"abstract": False},
),
migrations.CreateModel(
name="CoordinateListAnnotation",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("modified", models.DateTimeField(auto_now=True)),
(
"created",
models.DateTimeField(default=django.utils.timezone.now),
),
("name", models.CharField(max_length=255)),
(
"value",
django.contrib.postgres.fields.ArrayField(
base_field=django.contrib.postgres.fields.ArrayField(
base_field=models.FloatField(), size=2
),
size=None,
),
),
(
"grader",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to=settings.AUTH_USER_MODEL,
),
),
(
"image",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="cases.Image",
),
),
],
options={"abstract": False},
),
migrations.CreateModel(
name="ETDRSGridAnnotation",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("modified", models.DateTimeField(auto_now=True)),
(
"created",
models.DateTimeField(default=django.utils.timezone.now),
),
(
"fovea",
django.contrib.postgres.fields.ArrayField(
base_field=models.FloatField(), size=2
),
),
(
"optic_disk",
django.contrib.postgres.fields.ArrayField(
base_field=models.FloatField(), size=2
),
),
(
"grader",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to=settings.AUTH_USER_MODEL,
),
),
(
"image",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="cases.Image",
),
),
],
),
migrations.CreateModel(
name="IntegerClassificationAnnotation",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("modified", models.DateTimeField(auto_now=True)),
(
"created",
models.DateTimeField(default=django.utils.timezone.now),
),
("name", models.CharField(max_length=255)),
("value", models.IntegerField()),
(
"grader",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to=settings.AUTH_USER_MODEL,
),
),
(
"image",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="cases.Image",
),
),
],
options={"abstract": False},
),
migrations.CreateModel(
name="LandmarkAnnotationSet",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("modified", models.DateTimeField(auto_now=True)),
(
"created",
models.DateTimeField(default=django.utils.timezone.now),
),
(
"grader",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to=settings.AUTH_USER_MODEL,
),
),
],
),
migrations.CreateModel(
name="MeasurementAnnotation",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("modified", models.DateTimeField(auto_now=True)),
(
"created",
models.DateTimeField(default=django.utils.timezone.now),
),
(
"start_voxel",
django.contrib.postgres.fields.ArrayField(
base_field=models.FloatField(), size=2
),
),
(
"end_voxel",
django.contrib.postgres.fields.ArrayField(
base_field=models.FloatField(), size=2
),
),
(
"grader",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to=settings.AUTH_USER_MODEL,
),
),
(
"image",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="cases.Image",
),
),
],
),
migrations.CreateModel(
name="PolygonAnnotationSet",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("modified", models.DateTimeField(auto_now=True)),
(
"created",
models.DateTimeField(default=django.utils.timezone.now),
),
("name", models.CharField(max_length=255)),
(
"grader",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to=settings.AUTH_USER_MODEL,
),
),
(
"image",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="cases.Image",
),
),
],
options={"abstract": False},
),
migrations.CreateModel(
name="SingleLandmarkAnnotation",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("created", models.DateTimeField(auto_now_add=True)),
("modified", models.DateTimeField(auto_now=True)),
(
"landmarks",
django.contrib.postgres.fields.ArrayField(
base_field=django.contrib.postgres.fields.ArrayField(
base_field=models.FloatField(), size=2
),
size=None,
),
),
(
"annotation_set",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="annotations.LandmarkAnnotationSet",
),
),
(
"image",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="cases.Image",
),
),
],
),
migrations.CreateModel(
name="SinglePolygonAnnotation",
fields=[
(
"id",
models.UUIDField(
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
),
),
("created", models.DateTimeField(auto_now_add=True)),
("modified", models.DateTimeField(auto_now=True)),
(
"value",
django.contrib.postgres.fields.ArrayField(
base_field=django.contrib.postgres.fields.ArrayField(
base_field=models.FloatField(), size=2
),
size=None,
),
),
(
"annotation_set",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
to="annotations.PolygonAnnotationSet",
),
),
],
options={"abstract": False},
),
migrations.AlterUniqueTogether(
name="singlelandmarkannotation",
unique_together={("image", "annotation_set")},
),
migrations.AlterUniqueTogether(
name="polygonannotationset",
unique_together={("image", "grader", "created", "name")},
),
migrations.AlterUniqueTogether(
name="measurementannotation",
unique_together={
("image", "grader", "created", "start_voxel", "end_voxel")
},
),
migrations.AlterUniqueTogether(
name="landmarkannotationset",
unique_together={("grader", "created")},
),
migrations.AlterUniqueTogether(
name="integerclassificationannotation",
unique_together={("image", "grader", "created", "name")},
),
migrations.AlterUniqueTogether(
name="etdrsgridannotation",
unique_together={("image", "grader", "created")},
),
migrations.AlterUniqueTogether(
name="coordinatelistannotation",
unique_together={("image", "grader", "created", "name")},
),
migrations.AlterUniqueTogether(
name="booleanclassificationannotation",
unique_together={("image", "grader", "created", "name")},
),
]
| 34.767089 | 77 | 0.386951 | 802 | 13,733 | 6.516209 | 0.125935 | 0.065442 | 0.055109 | 0.073479 | 0.801186 | 0.773823 | 0.766935 | 0.766935 | 0.766935 | 0.727325 | 0 | 0.008933 | 0.519042 | 13,733 | 394 | 78 | 34.85533 | 0.782286 | 0.003277 | 0 | 0.782383 | 1 | 0 | 0.082493 | 0.030323 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.012953 | 0 | 0.023316 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
c472b4efcbea05f0db6f9414cef8457e88b2f03d | 123 | py | Python | models/__init__.py | altosaar/rankfromsets | f38a843e7bd77d8a8b577864bc0244c3c8024179 | [
"MIT"
] | 5 | 2020-11-07T15:39:57.000Z | 2021-05-17T23:46:39.000Z | models/__init__.py | altosaar/rankfromsets | f38a843e7bd77d8a8b577864bc0244c3c8024179 | [
"MIT"
] | null | null | null | models/__init__.py | altosaar/rankfromsets | f38a843e7bd77d8a8b577864bc0244c3c8024179 | [
"MIT"
] | null | null | null | from .rank_from_sets import InnerProduct
from .rank_from_sets import Deep
from .rank_from_sets import ResidualInnerProduct
| 30.75 | 48 | 0.878049 | 18 | 123 | 5.666667 | 0.388889 | 0.235294 | 0.352941 | 0.470588 | 0.647059 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.097561 | 123 | 3 | 49 | 41 | 0.918919 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
c47d24d8875d38d250b41ddb240d602f30cebeff | 90,343 | py | Python | pb/trade_db_model_pb2.py | zheng-zy/ot_root | 920236b48458aeed72968bc7ec8e01a084b15951 | [
"Artistic-2.0"
] | 1 | 2016-03-23T07:54:55.000Z | 2016-03-23T07:54:55.000Z | pb/trade_db_model_pb2.py | zheng-zy/ot_root | 920236b48458aeed72968bc7ec8e01a084b15951 | [
"Artistic-2.0"
] | null | null | null | pb/trade_db_model_pb2.py | zheng-zy/ot_root | 920236b48458aeed72968bc7ec8e01a084b15951 | [
"Artistic-2.0"
] | 1 | 2020-07-23T18:27:53.000Z | 2020-07-23T18:27:53.000Z | # Generated by the protocol buffer compiler. DO NOT EDIT!
# source: trade_db_model.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
from google.protobuf import descriptor_pb2
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor.FileDescriptor(
name='trade_db_model.proto',
package='liyi.trade_db',
serialized_pb=_b('\n\x14trade_db_model.proto\x12\rliyi.trade_db\"\xf0\x01\n\nStockAsset\x12\x0f\n\x07\x66und_id\x18\x01 \x02(\t\x12\x12\n\nmoney_type\x18\x02 \x01(\t\x12\r\n\x05\x61sset\x18\x03 \x01(\x03\x12\x17\n\x0f\x63\x61pital_balance\x18\x04 \x01(\x03\x12\x19\n\x11\x63\x61pital_avaliable\x18\x05 \x01(\x03\x12\x1e\n\x16\x63\x61pital_freezed_by_buy\x18\x06 \x01(\x03\x12!\n\x19\x63\x61pital_freezed_by_others\x18\x07 \x01(\x03\x12\x14\n\x0cmarket_value\x18\x08 \x01(\x03\x12\x0c\n\x04info\x18\t \x01(\t\x12\x13\n\x0b\x63reate_time\x18\n \x02(\x04\"\xe3\x07\n\rStockPosition\x12\x0f\n\x07\x66und_id\x18\x01 \x02(\t\x12\x0e\n\x06market\x18\x02 \x02(\x05\x12\x10\n\x08stock_id\x18\x03 \x02(\t\x12\x12\n\nstock_name\x18\x04 \x01(\t\x12\x12\n\npre_volume\x18\x05 \x01(\x03\x12\x0e\n\x06volume\x18\x06 \x02(\x03\x12\x17\n\x0f\x63\x61n_sell_volume\x18\x07 \x01(\x03\x12\x1e\n\x16volume_freezed_by_sell\x18\x08 \x01(\x03\x12\x16\n\x0e\x63ur_buy_volume\x18\t \x01(\x03\x12\x17\n\x0f\x63ur_sell_volume\x18\n \x01(\x03\x12\x14\n\x0cmarket_value\x18\x0b \x01(\x03\x12\x12\n\ncost_price\x18\x0c \x01(\x03\x12!\n\x19stock_can_purchase_volume\x18\r \x01(\x03\x12#\n\x1bstock_can_sell_and_purchase\x18\x0e \x01(\x03\x12#\n\x1bstock_can_sell_but_purchase\x18\x0f \x01(\x03\x12$\n\x1c\x66reezed_by_sell_and_purchase\x18\x10 \x01(\x03\x12$\n\x1c\x66reezed_by_sell_but_purchase\x18\x11 \x01(\x03\x12#\n\x1bstock_can_purchase_but_sell\x18\x12 \x01(\x03\x12!\n\x19stock_freezed_by_purchase\x18\x13 \x01(\x03\x12$\n\x1cstock_cur_purchase_reduction\x18\x14 \x01(\x03\x12!\n\x19stock_cur_redeem_addition\x18\x15 \x01(\x03\x12\x1f\n\x17\x65tf_can_sell_and_redeem\x18\x16 \x01(\x03\x12\x1f\n\x17\x65tf_can_sell_but_redeem\x18\x17 \x01(\x03\x12\"\n\x1a\x66reezed_by_sell_and_redeem\x18\x18 \x01(\x03\x12\"\n\x1a\x66reezed_by_sell_but_redeem\x18\x19 \x01(\x03\x12\x1d\n\x15\x65tf_can_redeem_volume\x18\x1a \x01(\x03\x12\x1f\n\x17\x65tf_can_redeem_but_sell\x18\x1b \x01(\x03\x12\x1d\n\x15\x65tf_freezed_by_redeem\x18\x1c 
\x01(\x03\x12\"\n\x1a\x66reezed_by_redeem_and_sell\x18\x1d \x01(\x03\x12\"\n\x1a\x66reezed_by_redeem_but_sell\x18\x1e \x01(\x03\x12!\n\x19\x65tf_cur_purchase_addition\x18\x1f \x01(\x03\x12 \n\x18\x65tf_cur_redeem_reduction\x18 \x01(\x03\x12\x13\n\x0b\x63reate_time\x18! \x02(\x04\"\xaa\x06\n\nStockOrder\x12\n\n\x02id\x18\x01 \x02(\x0c\x12\x0f\n\x07\x66und_id\x18\x02 \x02(\t\x12\x0e\n\x06market\x18\x03 \x02(\x05\x12\x0f\n\x07\x62s_flag\x18\x04 \x02(\x05\x12\x10\n\x08order_no\x18\x05 \x01(\t\x12\x10\n\x08\x62\x61tch_no\x18\x06 \x01(\t\x12\x10\n\x08stock_id\x18\x07 \x02(\t\x12\x12\n\nstock_name\x18\x08 \x01(\t\x12\r\n\x05price\x18\t \x01(\x03\x12\x0e\n\x06volume\x18\n \x02(\x03\x12\x15\n\rfreeze_amount\x18\x0b \x01(\x03\x12\r\n\x05state\x18\x0c \x01(\x05\x12\x14\n\x0cknock_volume\x18\r \x01(\x03\x12\x14\n\x0cknock_amount\x18\x0e \x01(\x03\x12\x13\n\x0bknock_price\x18\x0f \x01(\x03\x12\x17\n\x0fwithdraw_volume\x18\x10 \x01(\x03\x12\x1b\n\x13knock_volume_by_get\x18\x11 \x01(\x03\x12\x1b\n\x13knock_amount_by_get\x18\x12 \x01(\x03\x12\x1e\n\x16withdraw_volume_by_get\x18\x13 \x01(\x03\x12\x1c\n\x14knock_volume_by_push\x18\x14 \x01(\x03\x12\x1c\n\x14knock_amount_by_push\x18\x15 \x01(\x03\x12\x1f\n\x17withdraw_volume_by_push\x18\x16 \x01(\x03\x12\x12\n\nstate_time\x18\x17 \x01(\x04\x12\x1a\n\x12order_request_time\x18\x18 \x01(\x04\x12\x1b\n\x13order_response_time\x18\x19 \x01(\x04\x12!\n\x19knock_begin_exchange_time\x18\x1a \x01(\r\x12\x1d\n\x15knock_begin_recv_time\x18\x1b \x01(\r\x12\x1f\n\x17knock_end_exchange_time\x18\x1c \x01(\r\x12\x1b\n\x13knock_end_recv_time\x18\x1d \x01(\r\x12\x11\n\tpolicy_id\x18\x1e \x01(\x0c\x12\x11\n\ttrader_id\x18\x1f \x01(\t\x12\x11\n\ttrader_ip\x18 \x01(\t\x12\x0c\n\x04info\x18! 
\x01(\t\x12\x13\n\x0b\x63reate_time\x18\" \x02(\x04\x12\x15\n\rbasket_amount\x18# \x01(\x03\"\xeb\x02\n\rStockWithdraw\x12\n\n\x02id\x18\x01 \x02(\t\x12\x0f\n\x07\x66und_id\x18\x02 \x02(\t\x12\x12\n\nrequest_no\x18\x03 \x02(\t\x12\x10\n\x08order_no\x18\x04 \x01(\t\x12\x10\n\x08\x62\x61tch_no\x18\x05 \x01(\t\x12\r\n\x05state\x18\x06 \x01(\x05\x12\x12\n\nstate_time\x18\x07 \x01(\r\x12\x1d\n\x15withdraw_request_time\x18\x08 \x01(\r\x12\x1e\n\x16withdraw_response_time\x18\t \x01(\r\x12\x1b\n\x13knock_exchange_time\x18\n \x01(\r\x12\x17\n\x0fknock_recv_time\x18\x0b \x01(\r\x12\x11\n\tpolicy_id\x18\x0c \x01(\x0c\x12\x11\n\ttrader_id\x18\r \x01(\t\x12\x11\n\ttrader_ip\x18\x0e \x01(\t\x12\x0c\n\x04info\x18\x0f \x01(\t\x12\x11\n\tdata_date\x18\x10 \x02(\r\x12\x13\n\x0b\x63reate_time\x18\x11 \x02(\x04\"\xbc\x04\n\nStockKnock\x12\x0f\n\x07\x66und_id\x18\x01 \x02(\t\x12\x0f\n\x07\x62s_flag\x18\x02 \x02(\x05\x12\x10\n\x08order_no\x18\x03 \x02(\t\x12\x0e\n\x06market\x18\x04 \x02(\x05\x12\x10\n\x08stock_id\x18\x05 \x02(\t\x12\x12\n\nstock_name\x18\x06 \x01(\t\x12\x12\n\nmatch_type\x18\x07 \x01(\x05\x12\x10\n\x08\x62\x61tch_no\x18\x08 \x01(\t\x12\x10\n\x08match_no\x18\t \x01(\t\x12\x16\n\x0einner_match_no\x18\n \x02(\t\x12\x12\n\nmatch_time\x18\x0b \x02(\r\x12\x13\n\x0bmatch_price\x18\x0c \x01(\x03\x12\x14\n\x0cmatch_volume\x18\r \x01(\x03\x12\x14\n\x0cmatch_amount\x18\x0e \x01(\x03\x12\x14\n\x0c\x63lear_amount\x18\x0f \x01(\x03\x12\x0e\n\x06\x62roker\x18\x10 \x01(\x05\x12\x15\n\rorder_bs_flag\x18\x11 \x01(\x05\x12\x14\n\x0corder_volume\x18\x12 \x01(\x03\x12\x13\n\x0border_price\x18\x13 \x01(\x03\x12\x11\n\tpolicy_id\x18\x14 \x01(\x0c\x12\x11\n\ttrader_id\x18\x15 \x01(\t\x12\x11\n\ttrader_ip\x18\x16 \x01(\t\x12\x19\n\x11\x62\x65longed_etf_code\x18\x17 \x01(\t\x12\x11\n\trecv_time\x18\x18 \x01(\r\x12\x0c\n\x04info\x18\x19 \x01(\t\x12\x11\n\tdata_date\x18\x1a \x02(\r\x12\x13\n\x0b\x63reate_time\x18\x1b \x02(\x04\x12\x1a\n\x12total_knock_volume\x18\x1c 
\x01(\x03\"\xa0\x01\n\x0b\x46utureAsset\x12\x0f\n\x07\x66und_id\x18\x01 \x02(\t\x12\x12\n\nmoney_type\x18\x02 \x01(\x05\x12\r\n\x05\x61sset\x18\x03 \x01(\x03\x12\x17\n\x0f\x63\x61pital_balance\x18\x04 \x01(\x03\x12\x19\n\x11\x63\x61pital_avaliable\x18\x05 \x01(\x03\x12\x14\n\x0cmarket_value\x18\x06 \x01(\x03\x12\x13\n\x0b\x63reate_time\x18\x07 \x02(\x04\"\xee\x01\n\x0e\x46uturePosition\x12\x0f\n\x07\x66und_id\x18\x01 \x02(\t\x12\x0e\n\x06market\x18\x02 \x01(\t\x12\x15\n\rinstrument_id\x18\x03 \x02(\t\x12\x17\n\x0finstrument_name\x18\x04 \x01(\t\x12\x13\n\x0blong_volume\x18\x05 \x01(\x03\x12\x12\n\nlong_price\x18\x06 \x01(\x03\x12\x14\n\x0cshort_volume\x18\x07 \x01(\x03\x12\x13\n\x0bshort_price\x18\x08 \x01(\x03\x12\x14\n\x0chedging_flag\x18\t \x01(\t\x12\x0c\n\x04info\x18\n \x01(\t\x12\x13\n\x0b\x63reate_time\x18\x0b \x02(\x04\"\xe3\x03\n\x0b\x46utureOrder\x12\x0f\n\x07\x66und_id\x18\x01 \x02(\t\x12\x0e\n\x06market\x18\x02 \x01(\t\x12\x0f\n\x07\x62s_flag\x18\x03 \x01(\x05\x12\x10\n\x08order_no\x18\x04 \x01(\t\x12\x15\n\rinstrument_id\x18\x05 \x02(\t\x12\x17\n\x0finstrument_name\x18\x06 \x01(\t\x12\r\n\x05price\x18\x07 \x01(\x03\x12\x0b\n\x03qty\x18\x08 \x01(\x03\x12\x0e\n\x06status\x18\t \x01(\x05\x12\x14\n\x0cknock_volume\x18\n \x01(\x03\x12\x14\n\x0cknock_amount\x18\x0b \x01(\x03\x12\x13\n\x0bknock_price\x18\x0c \x01(\x03\x12\x16\n\x0eunknock_volume\x18\r \x01(\x03\x12\x13\n\x0bsubmit_time\x18\x0e \x01(\r\x12\x0c\n\x04info\x18\x0f \x01(\t\x12\x12\n\norder_date\x18\x10 \x02(\r\x12\x12\n\norder_time\x18\x11 \x01(\r\x12\x13\n\x0b\x63reate_time\x18\x12 \x02(\x04\x12\x11\n\tpolicy_id\x18\x13 \x01(\x0c\x12\x11\n\ttrader_id\x18\x14 \x01(\t\x12\x11\n\ttrader_ip\x18\x15 \x01(\t\x12\x17\n\x0fopen_close_flag\x18\x16 \x01(\x05\x12\x13\n\x0bupdate_time\x18\x17 \x01(\x04\x12\x14\n\x0crequest_time\x18\x18 \x01(\x04\"\x99\x04\n\x0b\x46utureKnock\x12\x0f\n\x07\x66und_id\x18\x01 \x02(\t\x12\x0e\n\x06market\x18\x02 \x02(\t\x12\x0f\n\x07\x62s_flag\x18\x03 
\x02(\x05\x12\x10\n\x08order_no\x18\x04 \x02(\t\x12\x15\n\rinstrument_id\x18\x05 \x02(\t\x12\x17\n\x0finstrument_name\x18\x06 \x01(\t\x12\x14\n\x0corder_volume\x18\x07 \x01(\x03\x12\x13\n\x0border_price\x18\x08 \x01(\x03\x12\x12\n\nmatch_type\x18\t \x01(\x05\x12\x10\n\x08match_no\x18\n \x01(\t\x12\x16\n\x0einner_match_no\x18\x0b \x02(\t\x12\x12\n\nmatch_time\x18\x0c \x01(\r\x12\x13\n\x0bmatch_price\x18\r \x01(\x03\x12\x14\n\x0cmatch_volume\x18\x0e \x01(\x03\x12\x11\n\tmatch_sum\x18\x0f \x01(\x03\x12\x14\n\x0cmatch_amount\x18\x10 \x01(\x03\x12\x14\n\x0c\x63lear_amount\x18\x11 \x01(\x03\x12\x0e\n\x06\x62roker\x18\x12 \x01(\x05\x12\x11\n\trecv_time\x18\x13 \x01(\r\x12\x16\n\x0e\x63tp_status_msg\x18\x14 \x01(\t\x12\x12\n\norder_date\x18\x15 \x01(\r\x12\x12\n\ntrade_date\x18\x16 \x02(\r\x12\x13\n\x0b\x63reate_time\x18\x17 \x02(\x04\x12\x11\n\tpolicy_id\x18\x18 \x01(\x0c\x12\x11\n\ttrader_id\x18\x19 \x01(\t\x12\x11\n\ttrader_ip\x18\x1a \x01(\t\"\xae\x02\n\x10\x46utureInstrument\x12\x15\n\rinstrument_id\x18\x01 \x02(\t\x12\x17\n\x0finstrument_name\x18\x02 \x01(\t\x12\x13\n\x0b\x65xchange_id\x18\x03 \x01(\t\x12\x19\n\x11long_margin_ratio\x18\x04 \x01(\x01\x12\x1a\n\x12short_margin_ratio\x18\x05 \x01(\x01\x12\x17\n\x0fvolume_multiple\x18\x06 \x01(\x05\x12\x1e\n\x16max_limit_order_volume\x18\x07 \x01(\x05\x12\x1e\n\x16min_limit_order_volume\x18\x08 \x01(\x05\x12\x18\n\x10start_deliv_date\x18\t \x01(\t\x12\x16\n\x0e\x65nd_deliv_date\x18\n \x01(\t\x12\x13\n\x0b\x65xpire_date\x18\x0b \x01(\t\"\'\n\x0c\x42\x61seResponse\x12\n\n\x02rc\x18\x01 \x02(\x05\x12\x0b\n\x03msg\x18\x02 \x01(\t')
)
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
_STOCKASSET = _descriptor.Descriptor(
name='StockAsset',
full_name='liyi.trade_db.StockAsset',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='fund_id', full_name='liyi.trade_db.StockAsset.fund_id', index=0,
number=1, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='money_type', full_name='liyi.trade_db.StockAsset.money_type', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='asset', full_name='liyi.trade_db.StockAsset.asset', index=2,
number=3, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='capital_balance', full_name='liyi.trade_db.StockAsset.capital_balance', index=3,
number=4, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='capital_avaliable', full_name='liyi.trade_db.StockAsset.capital_avaliable', index=4,
number=5, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='capital_freezed_by_buy', full_name='liyi.trade_db.StockAsset.capital_freezed_by_buy', index=5,
number=6, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='capital_freezed_by_others', full_name='liyi.trade_db.StockAsset.capital_freezed_by_others', index=6,
number=7, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='market_value', full_name='liyi.trade_db.StockAsset.market_value', index=7,
number=8, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='info', full_name='liyi.trade_db.StockAsset.info', index=8,
number=9, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='create_time', full_name='liyi.trade_db.StockAsset.create_time', index=9,
number=10, type=4, cpp_type=4, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=40,
serialized_end=280,
)
_STOCKPOSITION = _descriptor.Descriptor(
name='StockPosition',
full_name='liyi.trade_db.StockPosition',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='fund_id', full_name='liyi.trade_db.StockPosition.fund_id', index=0,
number=1, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='market', full_name='liyi.trade_db.StockPosition.market', index=1,
number=2, type=5, cpp_type=1, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='stock_id', full_name='liyi.trade_db.StockPosition.stock_id', index=2,
number=3, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='stock_name', full_name='liyi.trade_db.StockPosition.stock_name', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='pre_volume', full_name='liyi.trade_db.StockPosition.pre_volume', index=4,
number=5, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='volume', full_name='liyi.trade_db.StockPosition.volume', index=5,
number=6, type=3, cpp_type=2, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='can_sell_volume', full_name='liyi.trade_db.StockPosition.can_sell_volume', index=6,
number=7, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='volume_freezed_by_sell', full_name='liyi.trade_db.StockPosition.volume_freezed_by_sell', index=7,
number=8, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='cur_buy_volume', full_name='liyi.trade_db.StockPosition.cur_buy_volume', index=8,
number=9, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='cur_sell_volume', full_name='liyi.trade_db.StockPosition.cur_sell_volume', index=9,
number=10, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='market_value', full_name='liyi.trade_db.StockPosition.market_value', index=10,
number=11, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='cost_price', full_name='liyi.trade_db.StockPosition.cost_price', index=11,
number=12, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='stock_can_purchase_volume', full_name='liyi.trade_db.StockPosition.stock_can_purchase_volume', index=12,
number=13, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='stock_can_sell_and_purchase', full_name='liyi.trade_db.StockPosition.stock_can_sell_and_purchase', index=13,
number=14, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='stock_can_sell_but_purchase', full_name='liyi.trade_db.StockPosition.stock_can_sell_but_purchase', index=14,
number=15, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='freezed_by_sell_and_purchase', full_name='liyi.trade_db.StockPosition.freezed_by_sell_and_purchase', index=15,
number=16, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='freezed_by_sell_but_purchase', full_name='liyi.trade_db.StockPosition.freezed_by_sell_but_purchase', index=16,
number=17, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='stock_can_purchase_but_sell', full_name='liyi.trade_db.StockPosition.stock_can_purchase_but_sell', index=17,
number=18, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='stock_freezed_by_purchase', full_name='liyi.trade_db.StockPosition.stock_freezed_by_purchase', index=18,
number=19, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='stock_cur_purchase_reduction', full_name='liyi.trade_db.StockPosition.stock_cur_purchase_reduction', index=19,
number=20, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='stock_cur_redeem_addition', full_name='liyi.trade_db.StockPosition.stock_cur_redeem_addition', index=20,
number=21, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='etf_can_sell_and_redeem', full_name='liyi.trade_db.StockPosition.etf_can_sell_and_redeem', index=21,
number=22, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='etf_can_sell_but_redeem', full_name='liyi.trade_db.StockPosition.etf_can_sell_but_redeem', index=22,
number=23, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='freezed_by_sell_and_redeem', full_name='liyi.trade_db.StockPosition.freezed_by_sell_and_redeem', index=23,
number=24, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='freezed_by_sell_but_redeem', full_name='liyi.trade_db.StockPosition.freezed_by_sell_but_redeem', index=24,
number=25, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='etf_can_redeem_volume', full_name='liyi.trade_db.StockPosition.etf_can_redeem_volume', index=25,
number=26, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='etf_can_redeem_but_sell', full_name='liyi.trade_db.StockPosition.etf_can_redeem_but_sell', index=26,
number=27, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='etf_freezed_by_redeem', full_name='liyi.trade_db.StockPosition.etf_freezed_by_redeem', index=27,
number=28, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='freezed_by_redeem_and_sell', full_name='liyi.trade_db.StockPosition.freezed_by_redeem_and_sell', index=28,
number=29, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='freezed_by_redeem_but_sell', full_name='liyi.trade_db.StockPosition.freezed_by_redeem_but_sell', index=29,
number=30, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='etf_cur_purchase_addition', full_name='liyi.trade_db.StockPosition.etf_cur_purchase_addition', index=30,
number=31, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='etf_cur_redeem_reduction', full_name='liyi.trade_db.StockPosition.etf_cur_redeem_reduction', index=31,
number=32, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='create_time', full_name='liyi.trade_db.StockPosition.create_time', index=32,
number=33, type=4, cpp_type=4, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=283,
serialized_end=1278,
)
_STOCKORDER = _descriptor.Descriptor(
name='StockOrder',
full_name='liyi.trade_db.StockOrder',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='liyi.trade_db.StockOrder.id', index=0,
number=1, type=12, cpp_type=9, label=2,
has_default_value=False, default_value=_b(""),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='fund_id', full_name='liyi.trade_db.StockOrder.fund_id', index=1,
number=2, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='market', full_name='liyi.trade_db.StockOrder.market', index=2,
number=3, type=5, cpp_type=1, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='bs_flag', full_name='liyi.trade_db.StockOrder.bs_flag', index=3,
number=4, type=5, cpp_type=1, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='order_no', full_name='liyi.trade_db.StockOrder.order_no', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='batch_no', full_name='liyi.trade_db.StockOrder.batch_no', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='stock_id', full_name='liyi.trade_db.StockOrder.stock_id', index=6,
number=7, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='stock_name', full_name='liyi.trade_db.StockOrder.stock_name', index=7,
number=8, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='price', full_name='liyi.trade_db.StockOrder.price', index=8,
number=9, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='volume', full_name='liyi.trade_db.StockOrder.volume', index=9,
number=10, type=3, cpp_type=2, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='freeze_amount', full_name='liyi.trade_db.StockOrder.freeze_amount', index=10,
number=11, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='state', full_name='liyi.trade_db.StockOrder.state', index=11,
number=12, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='knock_volume', full_name='liyi.trade_db.StockOrder.knock_volume', index=12,
number=13, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='knock_amount', full_name='liyi.trade_db.StockOrder.knock_amount', index=13,
number=14, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='knock_price', full_name='liyi.trade_db.StockOrder.knock_price', index=14,
number=15, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='withdraw_volume', full_name='liyi.trade_db.StockOrder.withdraw_volume', index=15,
number=16, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='knock_volume_by_get', full_name='liyi.trade_db.StockOrder.knock_volume_by_get', index=16,
number=17, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='knock_amount_by_get', full_name='liyi.trade_db.StockOrder.knock_amount_by_get', index=17,
number=18, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='withdraw_volume_by_get', full_name='liyi.trade_db.StockOrder.withdraw_volume_by_get', index=18,
number=19, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='knock_volume_by_push', full_name='liyi.trade_db.StockOrder.knock_volume_by_push', index=19,
number=20, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='knock_amount_by_push', full_name='liyi.trade_db.StockOrder.knock_amount_by_push', index=20,
number=21, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='withdraw_volume_by_push', full_name='liyi.trade_db.StockOrder.withdraw_volume_by_push', index=21,
number=22, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='state_time', full_name='liyi.trade_db.StockOrder.state_time', index=22,
number=23, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='order_request_time', full_name='liyi.trade_db.StockOrder.order_request_time', index=23,
number=24, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='order_response_time', full_name='liyi.trade_db.StockOrder.order_response_time', index=24,
number=25, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='knock_begin_exchange_time', full_name='liyi.trade_db.StockOrder.knock_begin_exchange_time', index=25,
number=26, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='knock_begin_recv_time', full_name='liyi.trade_db.StockOrder.knock_begin_recv_time', index=26,
number=27, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='knock_end_exchange_time', full_name='liyi.trade_db.StockOrder.knock_end_exchange_time', index=27,
number=28, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='knock_end_recv_time', full_name='liyi.trade_db.StockOrder.knock_end_recv_time', index=28,
number=29, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='policy_id', full_name='liyi.trade_db.StockOrder.policy_id', index=29,
number=30, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=_b(""),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='trader_id', full_name='liyi.trade_db.StockOrder.trader_id', index=30,
number=31, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='trader_ip', full_name='liyi.trade_db.StockOrder.trader_ip', index=31,
number=32, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='info', full_name='liyi.trade_db.StockOrder.info', index=32,
number=33, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='create_time', full_name='liyi.trade_db.StockOrder.create_time', index=33,
number=34, type=4, cpp_type=4, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='basket_amount', full_name='liyi.trade_db.StockOrder.basket_amount', index=34,
number=35, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=1281,
serialized_end=2091,
)
_STOCKWITHDRAW = _descriptor.Descriptor(
name='StockWithdraw',
full_name='liyi.trade_db.StockWithdraw',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='liyi.trade_db.StockWithdraw.id', index=0,
number=1, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='fund_id', full_name='liyi.trade_db.StockWithdraw.fund_id', index=1,
number=2, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='request_no', full_name='liyi.trade_db.StockWithdraw.request_no', index=2,
number=3, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='order_no', full_name='liyi.trade_db.StockWithdraw.order_no', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='batch_no', full_name='liyi.trade_db.StockWithdraw.batch_no', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='state', full_name='liyi.trade_db.StockWithdraw.state', index=5,
number=6, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='state_time', full_name='liyi.trade_db.StockWithdraw.state_time', index=6,
number=7, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='withdraw_request_time', full_name='liyi.trade_db.StockWithdraw.withdraw_request_time', index=7,
number=8, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='withdraw_response_time', full_name='liyi.trade_db.StockWithdraw.withdraw_response_time', index=8,
number=9, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='knock_exchange_time', full_name='liyi.trade_db.StockWithdraw.knock_exchange_time', index=9,
number=10, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='knock_recv_time', full_name='liyi.trade_db.StockWithdraw.knock_recv_time', index=10,
number=11, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='policy_id', full_name='liyi.trade_db.StockWithdraw.policy_id', index=11,
number=12, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=_b(""),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='trader_id', full_name='liyi.trade_db.StockWithdraw.trader_id', index=12,
number=13, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='trader_ip', full_name='liyi.trade_db.StockWithdraw.trader_ip', index=13,
number=14, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='info', full_name='liyi.trade_db.StockWithdraw.info', index=14,
number=15, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='data_date', full_name='liyi.trade_db.StockWithdraw.data_date', index=15,
number=16, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='create_time', full_name='liyi.trade_db.StockWithdraw.create_time', index=16,
number=17, type=4, cpp_type=4, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=2094,
serialized_end=2457,
)
_STOCKKNOCK = _descriptor.Descriptor(
name='StockKnock',
full_name='liyi.trade_db.StockKnock',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='fund_id', full_name='liyi.trade_db.StockKnock.fund_id', index=0,
number=1, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='bs_flag', full_name='liyi.trade_db.StockKnock.bs_flag', index=1,
number=2, type=5, cpp_type=1, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='order_no', full_name='liyi.trade_db.StockKnock.order_no', index=2,
number=3, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='market', full_name='liyi.trade_db.StockKnock.market', index=3,
number=4, type=5, cpp_type=1, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='stock_id', full_name='liyi.trade_db.StockKnock.stock_id', index=4,
number=5, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='stock_name', full_name='liyi.trade_db.StockKnock.stock_name', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='match_type', full_name='liyi.trade_db.StockKnock.match_type', index=6,
number=7, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='batch_no', full_name='liyi.trade_db.StockKnock.batch_no', index=7,
number=8, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='match_no', full_name='liyi.trade_db.StockKnock.match_no', index=8,
number=9, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='inner_match_no', full_name='liyi.trade_db.StockKnock.inner_match_no', index=9,
number=10, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='match_time', full_name='liyi.trade_db.StockKnock.match_time', index=10,
number=11, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='match_price', full_name='liyi.trade_db.StockKnock.match_price', index=11,
number=12, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='match_volume', full_name='liyi.trade_db.StockKnock.match_volume', index=12,
number=13, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='match_amount', full_name='liyi.trade_db.StockKnock.match_amount', index=13,
number=14, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='clear_amount', full_name='liyi.trade_db.StockKnock.clear_amount', index=14,
number=15, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='broker', full_name='liyi.trade_db.StockKnock.broker', index=15,
number=16, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='order_bs_flag', full_name='liyi.trade_db.StockKnock.order_bs_flag', index=16,
number=17, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='order_volume', full_name='liyi.trade_db.StockKnock.order_volume', index=17,
number=18, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='order_price', full_name='liyi.trade_db.StockKnock.order_price', index=18,
number=19, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='policy_id', full_name='liyi.trade_db.StockKnock.policy_id', index=19,
number=20, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=_b(""),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='trader_id', full_name='liyi.trade_db.StockKnock.trader_id', index=20,
number=21, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='trader_ip', full_name='liyi.trade_db.StockKnock.trader_ip', index=21,
number=22, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='belonged_etf_code', full_name='liyi.trade_db.StockKnock.belonged_etf_code', index=22,
number=23, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='recv_time', full_name='liyi.trade_db.StockKnock.recv_time', index=23,
number=24, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='info', full_name='liyi.trade_db.StockKnock.info', index=24,
number=25, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='data_date', full_name='liyi.trade_db.StockKnock.data_date', index=25,
number=26, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='create_time', full_name='liyi.trade_db.StockKnock.create_time', index=26,
number=27, type=4, cpp_type=4, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='total_knock_volume', full_name='liyi.trade_db.StockKnock.total_knock_volume', index=27,
number=28, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=2460,
serialized_end=3032,
)
_FUTUREASSET = _descriptor.Descriptor(
name='FutureAsset',
full_name='liyi.trade_db.FutureAsset',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='fund_id', full_name='liyi.trade_db.FutureAsset.fund_id', index=0,
number=1, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='money_type', full_name='liyi.trade_db.FutureAsset.money_type', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='asset', full_name='liyi.trade_db.FutureAsset.asset', index=2,
number=3, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='capital_balance', full_name='liyi.trade_db.FutureAsset.capital_balance', index=3,
number=4, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='capital_avaliable', full_name='liyi.trade_db.FutureAsset.capital_avaliable', index=4,
number=5, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='market_value', full_name='liyi.trade_db.FutureAsset.market_value', index=5,
number=6, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='create_time', full_name='liyi.trade_db.FutureAsset.create_time', index=6,
number=7, type=4, cpp_type=4, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=3035,
serialized_end=3195,
)
_FUTUREPOSITION = _descriptor.Descriptor(
name='FuturePosition',
full_name='liyi.trade_db.FuturePosition',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='fund_id', full_name='liyi.trade_db.FuturePosition.fund_id', index=0,
number=1, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='market', full_name='liyi.trade_db.FuturePosition.market', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='instrument_id', full_name='liyi.trade_db.FuturePosition.instrument_id', index=2,
number=3, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='instrument_name', full_name='liyi.trade_db.FuturePosition.instrument_name', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='long_volume', full_name='liyi.trade_db.FuturePosition.long_volume', index=4,
number=5, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='long_price', full_name='liyi.trade_db.FuturePosition.long_price', index=5,
number=6, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='short_volume', full_name='liyi.trade_db.FuturePosition.short_volume', index=6,
number=7, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='short_price', full_name='liyi.trade_db.FuturePosition.short_price', index=7,
number=8, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='hedging_flag', full_name='liyi.trade_db.FuturePosition.hedging_flag', index=8,
number=9, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='info', full_name='liyi.trade_db.FuturePosition.info', index=9,
number=10, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='create_time', full_name='liyi.trade_db.FuturePosition.create_time', index=10,
number=11, type=4, cpp_type=4, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=3198,
serialized_end=3436,
)
_FUTUREORDER = _descriptor.Descriptor(
name='FutureOrder',
full_name='liyi.trade_db.FutureOrder',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='fund_id', full_name='liyi.trade_db.FutureOrder.fund_id', index=0,
number=1, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='market', full_name='liyi.trade_db.FutureOrder.market', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='bs_flag', full_name='liyi.trade_db.FutureOrder.bs_flag', index=2,
number=3, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='order_no', full_name='liyi.trade_db.FutureOrder.order_no', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='instrument_id', full_name='liyi.trade_db.FutureOrder.instrument_id', index=4,
number=5, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='instrument_name', full_name='liyi.trade_db.FutureOrder.instrument_name', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='price', full_name='liyi.trade_db.FutureOrder.price', index=6,
number=7, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='qty', full_name='liyi.trade_db.FutureOrder.qty', index=7,
number=8, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='status', full_name='liyi.trade_db.FutureOrder.status', index=8,
number=9, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='knock_volume', full_name='liyi.trade_db.FutureOrder.knock_volume', index=9,
number=10, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='knock_amount', full_name='liyi.trade_db.FutureOrder.knock_amount', index=10,
number=11, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='knock_price', full_name='liyi.trade_db.FutureOrder.knock_price', index=11,
number=12, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='unknock_volume', full_name='liyi.trade_db.FutureOrder.unknock_volume', index=12,
number=13, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='submit_time', full_name='liyi.trade_db.FutureOrder.submit_time', index=13,
number=14, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='info', full_name='liyi.trade_db.FutureOrder.info', index=14,
number=15, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='order_date', full_name='liyi.trade_db.FutureOrder.order_date', index=15,
number=16, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='order_time', full_name='liyi.trade_db.FutureOrder.order_time', index=16,
number=17, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='create_time', full_name='liyi.trade_db.FutureOrder.create_time', index=17,
number=18, type=4, cpp_type=4, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='policy_id', full_name='liyi.trade_db.FutureOrder.policy_id', index=18,
number=19, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=_b(""),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='trader_id', full_name='liyi.trade_db.FutureOrder.trader_id', index=19,
number=20, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='trader_ip', full_name='liyi.trade_db.FutureOrder.trader_ip', index=20,
number=21, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='open_close_flag', full_name='liyi.trade_db.FutureOrder.open_close_flag', index=21,
number=22, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='update_time', full_name='liyi.trade_db.FutureOrder.update_time', index=22,
number=23, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='request_time', full_name='liyi.trade_db.FutureOrder.request_time', index=23,
number=24, type=4, cpp_type=4, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=3439,
serialized_end=3922,
)
_FUTUREKNOCK = _descriptor.Descriptor(
name='FutureKnock',
full_name='liyi.trade_db.FutureKnock',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='fund_id', full_name='liyi.trade_db.FutureKnock.fund_id', index=0,
number=1, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='market', full_name='liyi.trade_db.FutureKnock.market', index=1,
number=2, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='bs_flag', full_name='liyi.trade_db.FutureKnock.bs_flag', index=2,
number=3, type=5, cpp_type=1, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='order_no', full_name='liyi.trade_db.FutureKnock.order_no', index=3,
number=4, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='instrument_id', full_name='liyi.trade_db.FutureKnock.instrument_id', index=4,
number=5, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='instrument_name', full_name='liyi.trade_db.FutureKnock.instrument_name', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='order_volume', full_name='liyi.trade_db.FutureKnock.order_volume', index=6,
number=7, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='order_price', full_name='liyi.trade_db.FutureKnock.order_price', index=7,
number=8, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='match_type', full_name='liyi.trade_db.FutureKnock.match_type', index=8,
number=9, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='match_no', full_name='liyi.trade_db.FutureKnock.match_no', index=9,
number=10, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='inner_match_no', full_name='liyi.trade_db.FutureKnock.inner_match_no', index=10,
number=11, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='match_time', full_name='liyi.trade_db.FutureKnock.match_time', index=11,
number=12, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='match_price', full_name='liyi.trade_db.FutureKnock.match_price', index=12,
number=13, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='match_volume', full_name='liyi.trade_db.FutureKnock.match_volume', index=13,
number=14, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='match_sum', full_name='liyi.trade_db.FutureKnock.match_sum', index=14,
number=15, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='match_amount', full_name='liyi.trade_db.FutureKnock.match_amount', index=15,
number=16, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='clear_amount', full_name='liyi.trade_db.FutureKnock.clear_amount', index=16,
number=17, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='broker', full_name='liyi.trade_db.FutureKnock.broker', index=17,
number=18, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='recv_time', full_name='liyi.trade_db.FutureKnock.recv_time', index=18,
number=19, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='ctp_status_msg', full_name='liyi.trade_db.FutureKnock.ctp_status_msg', index=19,
number=20, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='order_date', full_name='liyi.trade_db.FutureKnock.order_date', index=20,
number=21, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='trade_date', full_name='liyi.trade_db.FutureKnock.trade_date', index=21,
number=22, type=13, cpp_type=3, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='create_time', full_name='liyi.trade_db.FutureKnock.create_time', index=22,
number=23, type=4, cpp_type=4, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='policy_id', full_name='liyi.trade_db.FutureKnock.policy_id', index=23,
number=24, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=_b(""),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='trader_id', full_name='liyi.trade_db.FutureKnock.trader_id', index=24,
number=25, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='trader_ip', full_name='liyi.trade_db.FutureKnock.trader_ip', index=25,
number=26, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=3925,
serialized_end=4462,
)
_FUTUREINSTRUMENT = _descriptor.Descriptor(
name='FutureInstrument',
full_name='liyi.trade_db.FutureInstrument',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='instrument_id', full_name='liyi.trade_db.FutureInstrument.instrument_id', index=0,
number=1, type=9, cpp_type=9, label=2,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='instrument_name', full_name='liyi.trade_db.FutureInstrument.instrument_name', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='exchange_id', full_name='liyi.trade_db.FutureInstrument.exchange_id', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='long_margin_ratio', full_name='liyi.trade_db.FutureInstrument.long_margin_ratio', index=3,
number=4, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='short_margin_ratio', full_name='liyi.trade_db.FutureInstrument.short_margin_ratio', index=4,
number=5, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='volume_multiple', full_name='liyi.trade_db.FutureInstrument.volume_multiple', index=5,
number=6, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='max_limit_order_volume', full_name='liyi.trade_db.FutureInstrument.max_limit_order_volume', index=6,
number=7, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='min_limit_order_volume', full_name='liyi.trade_db.FutureInstrument.min_limit_order_volume', index=7,
number=8, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='start_deliv_date', full_name='liyi.trade_db.FutureInstrument.start_deliv_date', index=8,
number=9, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='end_deliv_date', full_name='liyi.trade_db.FutureInstrument.end_deliv_date', index=9,
number=10, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='expire_date', full_name='liyi.trade_db.FutureInstrument.expire_date', index=10,
number=11, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=4465,
serialized_end=4767,
)
_BASERESPONSE = _descriptor.Descriptor(
name='BaseResponse',
full_name='liyi.trade_db.BaseResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='rc', full_name='liyi.trade_db.BaseResponse.rc', index=0,
number=1, type=5, cpp_type=1, label=2,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='msg', full_name='liyi.trade_db.BaseResponse.msg', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
extension_ranges=[],
oneofs=[
],
serialized_start=4769,
serialized_end=4808,
)
DESCRIPTOR.message_types_by_name['StockAsset'] = _STOCKASSET
DESCRIPTOR.message_types_by_name['StockPosition'] = _STOCKPOSITION
DESCRIPTOR.message_types_by_name['StockOrder'] = _STOCKORDER
DESCRIPTOR.message_types_by_name['StockWithdraw'] = _STOCKWITHDRAW
DESCRIPTOR.message_types_by_name['StockKnock'] = _STOCKKNOCK
DESCRIPTOR.message_types_by_name['FutureAsset'] = _FUTUREASSET
DESCRIPTOR.message_types_by_name['FuturePosition'] = _FUTUREPOSITION
DESCRIPTOR.message_types_by_name['FutureOrder'] = _FUTUREORDER
DESCRIPTOR.message_types_by_name['FutureKnock'] = _FUTUREKNOCK
DESCRIPTOR.message_types_by_name['FutureInstrument'] = _FUTUREINSTRUMENT
DESCRIPTOR.message_types_by_name['BaseResponse'] = _BASERESPONSE
StockAsset = _reflection.GeneratedProtocolMessageType('StockAsset', (_message.Message,), dict(
DESCRIPTOR = _STOCKASSET,
__module__ = 'trade_db_model_pb2'
# @@protoc_insertion_point(class_scope:liyi.trade_db.StockAsset)
))
_sym_db.RegisterMessage(StockAsset)
StockPosition = _reflection.GeneratedProtocolMessageType('StockPosition', (_message.Message,), dict(
DESCRIPTOR = _STOCKPOSITION,
__module__ = 'trade_db_model_pb2'
# @@protoc_insertion_point(class_scope:liyi.trade_db.StockPosition)
))
_sym_db.RegisterMessage(StockPosition)
StockOrder = _reflection.GeneratedProtocolMessageType('StockOrder', (_message.Message,), dict(
DESCRIPTOR = _STOCKORDER,
__module__ = 'trade_db_model_pb2'
# @@protoc_insertion_point(class_scope:liyi.trade_db.StockOrder)
))
_sym_db.RegisterMessage(StockOrder)
StockWithdraw = _reflection.GeneratedProtocolMessageType('StockWithdraw', (_message.Message,), dict(
DESCRIPTOR = _STOCKWITHDRAW,
__module__ = 'trade_db_model_pb2'
# @@protoc_insertion_point(class_scope:liyi.trade_db.StockWithdraw)
))
_sym_db.RegisterMessage(StockWithdraw)
StockKnock = _reflection.GeneratedProtocolMessageType('StockKnock', (_message.Message,), dict(
DESCRIPTOR = _STOCKKNOCK,
__module__ = 'trade_db_model_pb2'
# @@protoc_insertion_point(class_scope:liyi.trade_db.StockKnock)
))
_sym_db.RegisterMessage(StockKnock)
FutureAsset = _reflection.GeneratedProtocolMessageType('FutureAsset', (_message.Message,), dict(
DESCRIPTOR = _FUTUREASSET,
__module__ = 'trade_db_model_pb2'
# @@protoc_insertion_point(class_scope:liyi.trade_db.FutureAsset)
))
_sym_db.RegisterMessage(FutureAsset)
FuturePosition = _reflection.GeneratedProtocolMessageType('FuturePosition', (_message.Message,), dict(
DESCRIPTOR = _FUTUREPOSITION,
__module__ = 'trade_db_model_pb2'
# @@protoc_insertion_point(class_scope:liyi.trade_db.FuturePosition)
))
_sym_db.RegisterMessage(FuturePosition)
FutureOrder = _reflection.GeneratedProtocolMessageType('FutureOrder', (_message.Message,), dict(
DESCRIPTOR = _FUTUREORDER,
__module__ = 'trade_db_model_pb2'
# @@protoc_insertion_point(class_scope:liyi.trade_db.FutureOrder)
))
_sym_db.RegisterMessage(FutureOrder)
FutureKnock = _reflection.GeneratedProtocolMessageType('FutureKnock', (_message.Message,), dict(
DESCRIPTOR = _FUTUREKNOCK,
__module__ = 'trade_db_model_pb2'
# @@protoc_insertion_point(class_scope:liyi.trade_db.FutureKnock)
))
_sym_db.RegisterMessage(FutureKnock)
FutureInstrument = _reflection.GeneratedProtocolMessageType('FutureInstrument', (_message.Message,), dict(
DESCRIPTOR = _FUTUREINSTRUMENT,
__module__ = 'trade_db_model_pb2'
# @@protoc_insertion_point(class_scope:liyi.trade_db.FutureInstrument)
))
_sym_db.RegisterMessage(FutureInstrument)
BaseResponse = _reflection.GeneratedProtocolMessageType('BaseResponse', (_message.Message,), dict(
DESCRIPTOR = _BASERESPONSE,
__module__ = 'trade_db_model_pb2'
# @@protoc_insertion_point(class_scope:liyi.trade_db.BaseResponse)
))
_sym_db.RegisterMessage(BaseResponse)
# @@protoc_insertion_point(module_scope)
from neural_layout.network_graph import Network


def vgg16_network():
net = Network()
net.add_layer('input', [1, 224, 224])
net.add_layer('l1', [1, 224, 224])
net.add_layer('l2', [1, 224, 224])
net.add_layer('l3', [2, 112, 112])
net.add_layer('l4', [2, 112, 112])
net.add_layer('l5', [4, 56, 56])
net.add_layer('l6', [4, 56, 56])
net.add_layer('l7', [4, 56, 56])
net.add_layer('l8', [8, 28, 28])
net.add_layer('l9', [8, 28, 28])
net.add_layer('l10', [8, 28, 28])
net.add_layer('l11', [8, 14, 14])
net.add_layer('l12', [8, 14, 14])
net.add_layer('l13', [8, 14, 14])
net.add_layer('l14', [64, 1, 1])
net.add_layer('l15', [64, 1, 1])
net.add_layer('output', [16, 1, 1])
net.add_conv2d_connections('input', 'l1', stride=(1, 1),
kernel_size=(3, 3), padding=(1, 1, 1, 1))
net.add_conv2d_connections('l1', 'l2', stride=(1, 1),
kernel_size=(3, 3), padding=(1, 1, 1, 1))
net.add_conv2d_connections('l2', 'l3', stride=(2, 2),
kernel_size=(3, 3), padding=(1, 1, 1, 1))
net.add_conv2d_connections('l3', 'l4', stride=(1, 1),
kernel_size=(3, 3), padding=(1, 1, 1, 1))
net.add_conv2d_connections('l4', 'l5', stride=(2, 2),
kernel_size=(3, 3), padding=(1, 1, 1, 1))
net.add_conv2d_connections('l5', 'l6', stride=(1, 1),
kernel_size=(3, 3), padding=(1, 1, 1, 1))
net.add_conv2d_connections('l6', 'l7', stride=(1, 1),
kernel_size=(3, 3), padding=(1, 1, 1, 1))
net.add_conv2d_connections('l7', 'l8', stride=(2, 2),
kernel_size=(3, 3), padding=(1, 1, 1, 1))
net.add_conv2d_connections('l8', 'l9', stride=(1, 1),
kernel_size=(3, 3), padding=(1, 1, 1, 1))
net.add_conv2d_connections('l9', 'l10', stride=(1, 1),
kernel_size=(3, 3), padding=(1, 1, 1, 1))
net.add_conv2d_connections('l10', 'l11', stride=(2, 2),
kernel_size=(3, 3), padding=(1, 1, 1, 1))
net.add_conv2d_connections('l11', 'l12', stride=(1, 1),
kernel_size=(3, 3), padding=(1, 1, 1, 1))
net.add_conv2d_connections('l12', 'l13', stride=(1, 1),
kernel_size=(3, 3), padding=(1, 1, 1, 1))
net.add_full_connections('l13', 'l14')
net.add_full_connections('l14', 'l15')
net.add_full_connections('l15', 'output')
return net
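
# The layer shapes declared above all follow the standard convolution
# size formula. A throwaway helper (`conv_out_size` is not part of this
# repo, just an illustration) makes the arithmetic easy to verify:
#
#   out = (in + pad_left + pad_right - kernel) // stride + 1
#
# e.g. 224 with kernel 3, stride 2, padding (1, 1) gives 112, matching
# the 'l2' -> 'l3' transition.

```python
def conv_out_size(size, kernel_size, stride=1, padding=(0, 0)):
    """Spatial output size of a convolution (hypothetical helper,
    shown only to check the shapes used in the networks above)."""
    return (size + padding[0] + padding[1] - kernel_size) // stride + 1


# 'l2' [224] -> 'l3' [112]: kernel 3, stride 2, padding (1, 1)
print(conv_out_size(224, 3, stride=2, padding=(1, 1)))  # -> 112
# 'input' [224] -> 'l1' [224]: kernel 3, stride 1, padding (1, 1)
print(conv_out_size(224, 3, stride=1, padding=(1, 1)))  # -> 224
```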
def vgg16_1d_network():
net = Network()
net.add_layer('input', [1, 224])
net.add_layer('l1', [8, 224])
net.add_layer('l2', [8, 224])
net.add_layer('l3', [16, 112])
net.add_layer('l4', [16, 112])
net.add_layer('l5', [32, 56])
net.add_layer('l6', [32, 56])
net.add_layer('l7', [32, 56])
net.add_layer('l8', [64, 28])
net.add_layer('l9', [64, 28])
net.add_layer('l10', [64, 28])
net.add_layer('l11', [64, 14])
net.add_layer('l12', [64, 14])
net.add_layer('l13', [64, 14])
net.add_layer('l14', [512, 1])
net.add_layer('l15', [512, 1])
net.add_layer('output', [128, 1])
    net.add_conv1d_connections('input', 'l1', stride=1,
                               kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('l1', 'l2', stride=1,
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('l2', 'l3', stride=2,
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('l3', 'l4', stride=1,
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('l4', 'l5', stride=2,
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('l5', 'l6', stride=1,
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('l6', 'l7', stride=1,
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('l7', 'l8', stride=2,
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('l8', 'l9', stride=1,
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('l9', 'l10', stride=1,
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('l10', 'l11', stride=2,
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('l11', 'l12', stride=1,
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('l12', 'l13', stride=1,
kernel_size=3, padding=(1, 1))
net.add_full_connections('l13', 'l14')
net.add_full_connections('l14', 'l15')
net.add_full_connections('l15', 'output')
return net
def resnet18_1d_network():
net = Network()
net.add_layer('input', [1, 224])
net.add_layer('conv1', [64, 112])
net.add_layer('conv2_1_a', [64, 56])
net.add_layer('conv2_1_b', [64, 56])
net.add_layer('conv2_2_a', [64, 56])
net.add_layer('conv2_2_b', [64, 56])
net.add_layer('conv3_1_a', [128, 28])
net.add_layer('conv3_1_b', [128, 28])
net.add_layer('conv3_2_a', [128, 28])
net.add_layer('conv3_2_b', [128, 28])
net.add_layer('conv4_1_a', [256, 14])
net.add_layer('conv4_1_b', [256, 14])
net.add_layer('conv4_2_a', [256, 14])
net.add_layer('conv4_2_b', [256, 14])
net.add_layer('conv5_1_a', [512, 7])
net.add_layer('conv5_1_b', [512, 7])
net.add_layer('conv5_2_a', [512, 7])
net.add_layer('conv5_2_b', [512, 7])
net.add_layer('average_pool', [512, 1])
net.add_layer('fully_connected', [1000, 1])
net.add_conv1d_connections('input', 'conv1', stride=2,
kernel_size=7, padding=(3, 3))
# conv2
net.add_conv1d_connections('conv1', 'conv2_1_a', stride=2,
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('conv2_1_a', 'conv2_1_b',
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('conv1', 'conv2_1_b', stride=2,
kernel_size=1)
net.add_conv1d_connections('conv2_1_b', 'conv2_2_a',
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('conv2_2_a', 'conv2_2_b',
kernel_size=3, padding=(1, 1))
net.add_one_to_one_connections('conv2_1_b', 'conv2_2_b')
# conv3
net.add_conv1d_connections('conv2_2_b', 'conv3_1_a', stride=2,
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('conv3_1_a', 'conv3_1_b',
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('conv2_2_b', 'conv3_1_b', stride=2,
kernel_size=1)
net.add_conv1d_connections('conv3_1_b', 'conv3_2_a',
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('conv3_2_a', 'conv3_2_b',
kernel_size=3, padding=(1, 1))
net.add_one_to_one_connections('conv3_1_b', 'conv3_2_b')
# conv4
net.add_conv1d_connections('conv3_2_b', 'conv4_1_a', stride=2,
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('conv4_1_a', 'conv4_1_b',
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('conv3_2_b', 'conv4_1_b', stride=2,
kernel_size=1)
net.add_conv1d_connections('conv4_1_b', 'conv4_2_a',
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('conv4_2_a', 'conv4_2_b',
kernel_size=3, padding=(1, 1))
net.add_one_to_one_connections('conv4_1_b', 'conv4_2_b')
# conv5
net.add_conv1d_connections('conv4_2_b', 'conv5_1_a', stride=2,
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('conv5_1_a', 'conv5_1_b',
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('conv4_2_b', 'conv5_1_b', stride=2,
kernel_size=1)
net.add_conv1d_connections('conv5_1_b', 'conv5_2_a',
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('conv5_2_a', 'conv5_2_b',
kernel_size=3, padding=(1, 1))
net.add_one_to_one_connections('conv5_1_b', 'conv5_2_b')
net.add_conv1d_connections('conv5_2_b', 'average_pool', kernel_size=7)
net.add_full_connections('average_pool', 'fully_connected')
return net
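
# The four ResNet stages above follow one regular pattern (channels
# double, sequence length halves, two residual blocks of two convs
# each), so the layer specs could equally be generated in a loop. This
# is a sketch, not repo code; `resnet18_1d_layer_specs` is a made-up
# name used only to show the pattern.

```python
def resnet18_1d_layer_specs():
    """Generate the (name, shape) pairs of the conv2..conv5 stages,
    mirroring the explicit add_layer calls above."""
    specs = []
    channels, length = 64, 56
    for stage in range(2, 6):          # stages conv2 .. conv5
        for block in (1, 2):           # two residual blocks per stage
            for sub in ('a', 'b'):     # two convolutions per block
                specs.append(('conv%d_%d_%s' % (stage, block, sub),
                              [channels, length]))
        channels *= 2
        length //= 2
    return specs


print(resnet18_1d_layer_specs()[0])   # -> ('conv2_1_a', [64, 56])
print(resnet18_1d_layer_specs()[-1])  # -> ('conv5_2_b', [512, 7])
```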
def scalogram_resnet_network():
net = Network()
net.add_layer('scalogram', [2, 292])
net.add_layer('scalogram_block_0_main_conv_1', [32, 228])
net.add_layer('scalogram_block_0_main_conv_2', [32, 114])
net.add_layer('scalogram_block_1_main_conv_1', [32, 114])
net.add_layer('scalogram_block_1_main_conv_2', [32, 114])
net.add_layer('scalogram_block_2_main_conv_1', [64, 82])
net.add_layer('scalogram_block_2_main_conv_2', [64, 41])
net.add_layer('scalogram_block_3_main_conv_1', [64, 41])
net.add_layer('scalogram_block_3_main_conv_2', [64, 41])
net.add_layer('scalogram_block_4_main_conv_1', [128, 26])
net.add_layer('scalogram_block_4_main_conv_2', [128, 13])
net.add_layer('scalogram_block_5_main_conv_1', [128, 13])
net.add_layer('scalogram_block_5_main_conv_2', [128, 13])
net.add_layer('scalogram_block_6_main_conv_1', [256, 5])
net.add_layer('scalogram_block_6_main_conv_2', [256, 5])
net.add_layer('scalogram_block_7_main_conv_1', [512, 3])
net.add_layer('scalogram_block_7_main_conv_2', [512, 1])
net.add_layer('ar_block_0', [512, 1])
net.add_layer('ar_block_1', [512, 1])
net.add_layer('ar_block_2', [512, 1])
net.add_layer('ar_block_3', [512, 1])
net.add_layer('ar_block_4', [256, 1])
net.add_layer('ar_block_5', [256, 1])
net.add_layer('ar_block_6', [256, 1])
net.add_layer('ar_block_7', [256, 1])
net.add_layer('ar_block_8', [256, 1])
# Encoder
# BLOCK 0
net.add_conv1d_connections('scalogram', 'scalogram_block_0_main_conv_1',
kernel_size=65)
net.add_conv1d_connections('scalogram_block_0_main_conv_1', 'scalogram_block_0_main_conv_2',
kernel_size=3, stride=2, padding=(1, 1))
net.add_conv1d_connections('scalogram', 'scalogram_block_0_main_conv_2',
kernel_size=1, stride=2)
# BLOCK 1
net.add_conv1d_connections('scalogram_block_0_main_conv_2', 'scalogram_block_1_main_conv_1',
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('scalogram_block_1_main_conv_1', 'scalogram_block_1_main_conv_2',
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('scalogram_block_0_main_conv_2', 'scalogram_block_1_main_conv_2',
kernel_size=1)
# BLOCK 2
net.add_conv1d_connections('scalogram_block_1_main_conv_2', 'scalogram_block_2_main_conv_1',
kernel_size=33)
net.add_conv1d_connections('scalogram_block_2_main_conv_1', 'scalogram_block_2_main_conv_2',
kernel_size=3, stride=2, padding=(1, 1))
net.add_conv1d_connections('scalogram_block_1_main_conv_2', 'scalogram_block_2_main_conv_2',
kernel_size=1, stride=2)
# BLOCK 3
net.add_conv1d_connections('scalogram_block_2_main_conv_2', 'scalogram_block_3_main_conv_1',
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('scalogram_block_3_main_conv_1', 'scalogram_block_3_main_conv_2',
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('scalogram_block_2_main_conv_2', 'scalogram_block_3_main_conv_2',
kernel_size=1)
# BLOCK 4
net.add_conv1d_connections('scalogram_block_3_main_conv_2', 'scalogram_block_4_main_conv_1',
kernel_size=16)
net.add_conv1d_connections('scalogram_block_4_main_conv_1', 'scalogram_block_4_main_conv_2',
kernel_size=3, stride=2, padding=(1, 1))
net.add_conv1d_connections('scalogram_block_3_main_conv_2', 'scalogram_block_4_main_conv_2',
kernel_size=1, stride=2)
# BLOCK 5
net.add_conv1d_connections('scalogram_block_4_main_conv_2', 'scalogram_block_5_main_conv_1',
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('scalogram_block_5_main_conv_1', 'scalogram_block_5_main_conv_2',
kernel_size=3, padding=(1, 1))
net.add_conv1d_connections('scalogram_block_4_main_conv_2', 'scalogram_block_5_main_conv_2',
kernel_size=1)
# BLOCK 6
net.add_conv1d_connections('scalogram_block_5_main_conv_2', 'scalogram_block_6_main_conv_1',
kernel_size=9)
net.add_conv1d_connections('scalogram_block_6_main_conv_1', 'scalogram_block_6_main_conv_2',
kernel_size=3, stride=1, padding=(1, 1))
net.add_conv1d_connections('scalogram_block_5_main_conv_2', 'scalogram_block_6_main_conv_2',
kernel_size=1, stride=2)
# BLOCK 7
net.add_conv1d_connections('scalogram_block_6_main_conv_2', 'scalogram_block_7_main_conv_1',
kernel_size=3)
net.add_conv1d_connections('scalogram_block_7_main_conv_1', 'scalogram_block_7_main_conv_2',
kernel_size=3)
net.add_conv1d_connections('scalogram_block_6_main_conv_2', 'scalogram_block_7_main_conv_2',
kernel_size=1)
# Autoregressive model
# BLOCK 0
net.add_conv1d_connections('scalogram_block_7_main_conv_2', 'ar_block_0',
kernel_size=1)
# BLOCK 1
net.add_conv1d_connections('ar_block_0', 'ar_block_1',
kernel_size=1)
# BLOCK 2
net.add_conv1d_connections('ar_block_1', 'ar_block_2',
kernel_size=1)
# BLOCK 3
net.add_conv1d_connections('ar_block_2', 'ar_block_3',
kernel_size=1)
# BLOCK 4
net.add_conv1d_connections('ar_block_3', 'ar_block_4',
kernel_size=1)
# BLOCK 5
net.add_conv1d_connections('ar_block_4', 'ar_block_5',
kernel_size=1)
    # BLOCK 6
net.add_conv1d_connections('ar_block_5', 'ar_block_6',
kernel_size=1)
    # BLOCK 7
net.add_conv1d_connections('ar_block_6', 'ar_block_7',
kernel_size=1)
    # BLOCK 8
net.add_conv1d_connections('ar_block_7', 'ar_block_8',
kernel_size=1)
# scoring
net.add_conv1d_connections('ar_block_8', 'scalogram_block_7_main_conv_2',
kernel_size=1)
return net
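
# In the encoder above, blocks 0, 2 and 4 each shrink the sequence
# length twice: once with a large unpadded kernel (65, 33, 16) and once
# with a strided kernel-3 convolution. The sketch below (a hypothetical
# `encoder_lengths` helper, not repo code) traces that length sequence
# and reproduces the shapes declared in the add_layer calls:
# 292 -> 228 -> 114 -> 82 -> 41 -> 26 -> 13.

```python
def encoder_lengths(length=292, large_kernels=(65, 33, 16)):
    """Trace the sequence lengths through the downsampling blocks
    (illustrative only; mirrors the shapes declared above)."""
    lengths = [length]
    for k in large_kernels:
        length = length - k + 1              # unpadded conv, large kernel
        lengths.append(length)
        length = (length + 2 - 3) // 2 + 1   # kernel 3, stride 2, padding (1, 1)
        lengths.append(length)
    return lengths


print(encoder_lengths())  # -> [292, 228, 114, 82, 41, 26, 13]
```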

def scalogram_resnet_network_smaller():
    net = Network()

    net.add_layer('scalogram', [2, 216])
    net.add_layer('scalogram_block_0_main_conv_1', [8, 108])
    net.add_layer('scalogram_block_0_main_conv_2', [8, 84])
    net.add_layer('scalogram_block_1_main_conv_1', [16, 84])
    net.add_layer('scalogram_block_1_main_conv_2', [16, 84])
    net.add_layer('scalogram_block_2_main_conv_1', [32, 84])
    net.add_layer('scalogram_block_2_main_conv_2', [32, 60])
    net.add_layer('scalogram_block_3_main_conv_1', [64, 60])
    net.add_layer('scalogram_block_3_main_conv_2', [64, 60])
    net.add_layer('scalogram_block_4_main_conv_1', [128, 30])
    net.add_layer('scalogram_block_4_main_conv_2', [128, 6])
    net.add_layer('scalogram_block_5_main_conv_1', [256, 6])
    net.add_layer('scalogram_block_5_main_conv_2', [256, 6])
    net.add_layer('scalogram_block_6_main_conv_1', [512, 6])
    net.add_layer('scalogram_block_6_main_conv_2', [512, 3])
    net.add_layer('scalogram_block_7_main_conv_1', [512, 3])
    net.add_layer('scalogram_block_7_main_conv_2', [512, 1])

    net.add_layer('ar_block_0', [512, 1])
    net.add_layer('ar_block_1', [512, 1])
    net.add_layer('ar_block_2', [512, 1])
    net.add_layer('ar_block_3', [512, 1])
    net.add_layer('ar_block_4', [256, 1])
    net.add_layer('ar_block_5', [256, 1])
    net.add_layer('ar_block_6', [256, 1])
    net.add_layer('ar_block_7', [256, 1])
    net.add_layer('ar_block_8', [256, 1])

    # Encoder
    # BLOCK 0
    net.add_conv1d_connections('scalogram', 'scalogram_block_0_main_conv_1',
                               kernel_size=3, stride=2, padding=(1, 1))
    net.add_conv1d_connections('scalogram_block_0_main_conv_1', 'scalogram_block_0_main_conv_2',
                               kernel_size=25)
    net.add_conv1d_connections('scalogram', 'scalogram_block_0_main_conv_2',
                               kernel_size=1, stride=2)

    # BLOCK 1
    net.add_conv1d_connections('scalogram_block_0_main_conv_2', 'scalogram_block_1_main_conv_1',
                               kernel_size=3, padding=(1, 1))
    net.add_conv1d_connections('scalogram_block_1_main_conv_1', 'scalogram_block_1_main_conv_2',
                               kernel_size=3, padding=(1, 1))
    net.add_conv1d_connections('scalogram_block_0_main_conv_2', 'scalogram_block_1_main_conv_2',
                               kernel_size=1)

    # BLOCK 2
    net.add_conv1d_connections('scalogram_block_1_main_conv_2', 'scalogram_block_2_main_conv_1',
                               kernel_size=3, padding=(1, 1))
    net.add_conv1d_connections('scalogram_block_2_main_conv_1', 'scalogram_block_2_main_conv_2',
                               kernel_size=25)
    net.add_conv1d_connections('scalogram_block_1_main_conv_2', 'scalogram_block_2_main_conv_2',
                               kernel_size=1, stride=2)

    # BLOCK 3
    net.add_conv1d_connections('scalogram_block_2_main_conv_2', 'scalogram_block_3_main_conv_1',
                               kernel_size=3, padding=(1, 1))
    net.add_conv1d_connections('scalogram_block_3_main_conv_1', 'scalogram_block_3_main_conv_2',
                               kernel_size=3, padding=(1, 1))
    net.add_conv1d_connections('scalogram_block_2_main_conv_2', 'scalogram_block_3_main_conv_2',
                               kernel_size=1)

    # BLOCK 4
    net.add_conv1d_connections('scalogram_block_3_main_conv_2', 'scalogram_block_4_main_conv_1',
                               kernel_size=3, stride=2, padding=(1, 1))
    net.add_conv1d_connections('scalogram_block_4_main_conv_1', 'scalogram_block_4_main_conv_2',
                               kernel_size=25)
    net.add_conv1d_connections('scalogram_block_3_main_conv_2', 'scalogram_block_4_main_conv_2',
                               kernel_size=1, stride=2)

    # BLOCK 5
    net.add_conv1d_connections('scalogram_block_4_main_conv_2', 'scalogram_block_5_main_conv_1',
                               kernel_size=3, padding=(1, 1))
    net.add_conv1d_connections('scalogram_block_5_main_conv_1', 'scalogram_block_5_main_conv_2',
                               kernel_size=3, padding=(1, 1))
    net.add_conv1d_connections('scalogram_block_4_main_conv_2', 'scalogram_block_5_main_conv_2',
                               kernel_size=1)

    # BLOCK 6
    net.add_conv1d_connections('scalogram_block_5_main_conv_2', 'scalogram_block_6_main_conv_1',
                               kernel_size=3, padding=(1, 1))
    net.add_conv1d_connections('scalogram_block_6_main_conv_1', 'scalogram_block_6_main_conv_2',
                               kernel_size=4)
    net.add_conv1d_connections('scalogram_block_5_main_conv_2', 'scalogram_block_6_main_conv_2',
                               kernel_size=1, stride=2)

    # BLOCK 7
    net.add_conv1d_connections('scalogram_block_6_main_conv_2', 'scalogram_block_7_main_conv_1',
                               kernel_size=3, padding=(1, 1))
    net.add_conv1d_connections('scalogram_block_7_main_conv_1', 'scalogram_block_7_main_conv_2',
                               kernel_size=3)
    net.add_conv1d_connections('scalogram_block_6_main_conv_2', 'scalogram_block_7_main_conv_2',
                               kernel_size=1)

    # Autoregressive model
    # BLOCK 0
    net.add_conv1d_connections('scalogram_block_7_main_conv_2', 'ar_block_0',
                               kernel_size=1)
    # BLOCK 1
    net.add_conv1d_connections('ar_block_0', 'ar_block_1',
                               kernel_size=1)
    # BLOCK 2
    net.add_conv1d_connections('ar_block_1', 'ar_block_2',
                               kernel_size=1)
    # BLOCK 3
    net.add_conv1d_connections('ar_block_2', 'ar_block_3',
                               kernel_size=1)
    # BLOCK 4
    net.add_conv1d_connections('ar_block_3', 'ar_block_4',
                               kernel_size=1)
    # BLOCK 5
    net.add_conv1d_connections('ar_block_4', 'ar_block_5',
                               kernel_size=1)
    # BLOCK 6
    net.add_conv1d_connections('ar_block_5', 'ar_block_6',
                               kernel_size=1)
    # BLOCK 7
    net.add_conv1d_connections('ar_block_6', 'ar_block_7',
                               kernel_size=1)
    # BLOCK 8
    net.add_conv1d_connections('ar_block_7', 'ar_block_8',
                               kernel_size=1)
    # scoring
    net.add_conv1d_connections('ar_block_8', 'scalogram_block_7_main_conv_2',
                               kernel_size=1)

    return net
import astropy.units as u
import pytest
import numpy as np


def test_create_tk(lc, TK):
    sim_lc = TK.sample_from_lc(lc)
    fit_sim_lc = sim_lc.fit_PSD().to_dict()
    fit_lc = lc.fit_PSD().to_dict()

    assert sim_lc.original_length == lc.interp_length
    np.testing.assert_almost_equal(sim_lc.interp_flux_mean, lc.interp_flux_mean, decimal=2)
    np.testing.assert_allclose(fit_sim_lc["alpha_low"], fit_lc["alpha_low"], rtol=0.5)
    np.testing.assert_allclose(fit_sim_lc["alpha_high"], fit_lc["alpha_high"], rtol=0.5)
    np.testing.assert_allclose(fit_sim_lc["f_bend"], fit_lc["f_bend"], rtol=5)


def test_tk_red_noise(lc):
    from emmanoulopoulos.emmanoulopoulos_lc_simulation import TimmerKoenig

    TK = TimmerKoenig(red_noise_factor=100)
    sim_lc = TK.sample_from_lc(lc)
    fit_sim_lc = sim_lc.fit_PSD().to_dict()
    fit_lc = lc.fit_PSD().to_dict()

    assert sim_lc.original_length == lc.interp_length
    np.testing.assert_almost_equal(sim_lc.interp_flux_mean, lc.interp_flux_mean, decimal=1)
    np.testing.assert_allclose(fit_sim_lc["alpha_low"], fit_lc["alpha_low"], rtol=0.5)
    np.testing.assert_allclose(fit_sim_lc["alpha_high"], fit_lc["alpha_high"], rtol=0.5)
    np.testing.assert_allclose(fit_sim_lc["f_bend"], fit_lc["f_bend"], rtol=5)


def test_tk_alias(lc):
    from emmanoulopoulos.emmanoulopoulos_lc_simulation import TimmerKoenig

    TK = TimmerKoenig(alias_tbin=10)
    sim_lc = TK.sample_from_lc(lc)
    fit_sim_lc = sim_lc.fit_PSD().to_dict()
    fit_lc = lc.fit_PSD().to_dict()

    assert sim_lc.original_length == lc.interp_length
    np.testing.assert_almost_equal(sim_lc.interp_flux_mean, lc.interp_flux_mean, decimal=2)
    np.testing.assert_allclose(fit_sim_lc["alpha_low"], fit_lc["alpha_low"], rtol=0.5)
    np.testing.assert_allclose(fit_sim_lc["alpha_high"], fit_lc["alpha_high"], rtol=0.5)
    np.testing.assert_allclose(fit_sim_lc["f_bend"], fit_lc["f_bend"], rtol=5)


def test_tk_alias_red_noise(lc):
    from emmanoulopoulos.emmanoulopoulos_lc_simulation import TimmerKoenig

    TK = TimmerKoenig(red_noise_factor=100, alias_tbin=10)
    sim_lc = TK.sample_from_lc(lc)
    fit_sim_lc = sim_lc.fit_PSD().to_dict()
    fit_lc = lc.fit_PSD().to_dict()

    assert sim_lc.original_length == lc.interp_length
    np.testing.assert_almost_equal(sim_lc.interp_flux_mean, lc.interp_flux_mean, decimal=2)
    np.testing.assert_allclose(fit_sim_lc["alpha_low"], fit_lc["alpha_low"], rtol=0.5)
    np.testing.assert_allclose(fit_sim_lc["alpha_high"], fit_lc["alpha_high"], rtol=0.5)
    np.testing.assert_allclose(fit_sim_lc["f_bend"], fit_lc["f_bend"], rtol=5)


def test_tk_sample_from_psd(TK):
    from emmanoulopoulos.emmanoulopoulos_lc_simulation import TimmerKoenig

    TK = TimmerKoenig(red_noise_factor=1000, alias_tbin=10)
    psd_params = {"A": 0.02, "alpha_low": 1, "alpha_high": 5, "f_bend": 0.01, "c": 0}
    lc_tk = TK.sample_from_psd(psd_params, tbin=1 * u.day, N=1000)
    fit_psd = lc_tk.fit_PSD()

    assert lc_tk.original_length == 1000
    np.testing.assert_allclose(fit_psd["alpha_low"], psd_params["alpha_low"], rtol=2)
    np.testing.assert_allclose(fit_psd["alpha_high"], psd_params["alpha_high"], rtol=0.2)
    np.testing.assert_almost_equal(fit_psd["f_bend"], psd_params["f_bend"], decimal=2)
    np.testing.assert_allclose(fit_psd["A"], psd_params["A"], rtol=5)
    np.testing.assert_allclose(fit_psd["c"], psd_params["c"], atol=0.1)
##
# Copyright (c) 2005-2017 Apple Inc. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
##
from pycalendar.datetime import DateTime
from pycalendar.timezone import Timezone
from txweb2 import responsecode
from txweb2.http import HTTPError
from twisted.internet import reactor
from twisted.internet.defer import succeed, inlineCallbacks, returnValue
from twisted.internet.task import deferLater
from twisted.trial.unittest import TestCase
from twistedcaldav.config import config
from twistedcaldav.ical import Component
from twistedcaldav.timezones import TimezoneCache
from txdav.caldav.datastore.scheduling.cuaddress import LocalCalendarUser
from txdav.caldav.datastore.scheduling.implicit import ImplicitScheduler
from txdav.caldav.datastore.scheduling.scheduler import ScheduleResponseQueue
from txdav.caldav.icalendarstore import AttendeeAllowedError, \
    ComponentUpdateState
from txdav.caldav.datastore.sql import CalendarObject
from txdav.common.datastore.test.util import CommonCommonTests, populateCalendarsFrom
from twext.enterprise.jobs.jobitem import JobItem
from twext.python.clsprop import classproperty
import hashlib
import sys
class FakeScheduler(object):
    """
    A fake CalDAVScheduler that does nothing except track who messages were sent to.
    """

    def __init__(self, recipients):
        self.recipients = recipients

    def doSchedulingViaPUT(self, originator, recipients, calendar, internal_request=False, suppress_refresh=False):
        self.recipients.extend(recipients)
        return succeed(ScheduleResponseQueue("FAKE", responsecode.OK))

class Implicit(CommonCommonTests, TestCase):
    """
    iCalendar support tests
    """

    @inlineCallbacks
    def setUp(self):
        yield super(Implicit, self).setUp()
        yield self.buildStoreAndDirectory()

    @inlineCallbacks
    def test_removed_attendees(self):

        data = (
            (
                "#1.1 Simple component, no change",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
END:VEVENT
END:VCALENDAR
""",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
END:VEVENT
END:VCALENDAR
""",
                (),
            ),
            (
                "#1.2 Simple component, one removal",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
END:VEVENT
END:VCALENDAR
""",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
END:VEVENT
END:VCALENDAR
""",
                (("mailto:user02@example.com", None),),
            ),
            (
                "#1.3 Simple component, two removals",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
END:VEVENT
END:VCALENDAR
""",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
END:VEVENT
END:VCALENDAR
""",
                (
                    ("mailto:user02@example.com", None),
                    ("mailto:user03@example.com", None),
                ),
            ),
            (
                "#2.1 Simple recurring component, two removals",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
END:VEVENT
END:VCALENDAR
""",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
RRULE:FREQ=MONTHLY
END:VEVENT
END:VCALENDAR
""",
                (
                    ("mailto:user02@example.com", None),
                    ("mailto:user03@example.com", None),
                ),
            ),
            (
                "#2.2 Simple recurring component, add exdate",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
END:VEVENT
END:VCALENDAR
""",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
EXDATE:20080801T120000Z
END:VEVENT
END:VCALENDAR
""",
                (
                    ("mailto:user01@example.com", DateTime(2008, 8, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                    ("mailto:user02@example.com", DateTime(2008, 8, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                    ("mailto:user03@example.com", DateTime(2008, 8, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                ),
            ),
            (
                "#2.3 Simple recurring component, add multiple comma exdates",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
END:VEVENT
END:VCALENDAR
""",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
EXDATE:20080801T120000Z,20080901T120000Z
END:VEVENT
END:VCALENDAR
""",
                (
                    ("mailto:user01@example.com", DateTime(2008, 8, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                    ("mailto:user02@example.com", DateTime(2008, 8, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                    ("mailto:user03@example.com", DateTime(2008, 8, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                    ("mailto:user01@example.com", DateTime(2008, 9, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                    ("mailto:user02@example.com", DateTime(2008, 9, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                    ("mailto:user03@example.com", DateTime(2008, 9, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                ),
            ),
            (
                "#2.4 Simple recurring component, add multiple comma/property exdates",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
END:VEVENT
END:VCALENDAR
""",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
EXDATE:20080801T120000Z,20080901T120000Z
EXDATE:20081201T120000Z
END:VEVENT
END:VCALENDAR
""",
                (
                    ("mailto:user01@example.com", DateTime(2008, 8, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                    ("mailto:user02@example.com", DateTime(2008, 8, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                    ("mailto:user03@example.com", DateTime(2008, 8, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                    ("mailto:user01@example.com", DateTime(2008, 9, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                    ("mailto:user02@example.com", DateTime(2008, 9, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                    ("mailto:user03@example.com", DateTime(2008, 9, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                    ("mailto:user01@example.com", DateTime(2008, 12, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                    ("mailto:user02@example.com", DateTime(2008, 12, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                    ("mailto:user03@example.com", DateTime(2008, 12, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                ),
            ),
            (
                "#3.1 Complex recurring component with same attendees, no change",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
END:VEVENT
BEGIN:VEVENT
UID:12345-67890
RECURRENCE-ID:20080801T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
END:VEVENT
END:VCALENDAR
""",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
END:VEVENT
BEGIN:VEVENT
UID:12345-67890
RECURRENCE-ID:20080801T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
END:VEVENT
END:VCALENDAR
""",
                (),
            ),
            (
                "#3.2 Complex recurring component with same attendees, change master/override",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
END:VEVENT
BEGIN:VEVENT
UID:12345-67890
RECURRENCE-ID:20080801T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
END:VEVENT
END:VCALENDAR
""",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
RRULE:FREQ=MONTHLY
END:VEVENT
BEGIN:VEVENT
UID:12345-67890
RECURRENCE-ID:20080801T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
END:VEVENT
END:VCALENDAR
""",
                (
                    ("mailto:user03@example.com", None),
                    ("mailto:user03@example.com", DateTime(2008, 8, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                ),
            ),
            (
                "#3.3 Complex recurring component with same attendees, change override",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
END:VEVENT
BEGIN:VEVENT
UID:12345-67890
RECURRENCE-ID:20080801T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
END:VEVENT
END:VCALENDAR
""",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
END:VEVENT
BEGIN:VEVENT
UID:12345-67890
RECURRENCE-ID:20080801T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
END:VEVENT
END:VCALENDAR
""",
                (
                    ("mailto:user03@example.com", DateTime(2008, 8, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                ),
            ),
            (
                "#3.4 Complex recurring component with same attendees, change master",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
END:VEVENT
BEGIN:VEVENT
UID:12345-67890
RECURRENCE-ID:20080801T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
END:VEVENT
END:VCALENDAR
""",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
RRULE:FREQ=MONTHLY
END:VEVENT
BEGIN:VEVENT
UID:12345-67890
RECURRENCE-ID:20080801T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
END:VEVENT
END:VCALENDAR
""",
                (
                    ("mailto:user03@example.com", None),
                ),
            ),
            (
                "#3.5 Complex recurring component with same attendees, remove override - no exdate",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
END:VEVENT
BEGIN:VEVENT
UID:12345-67890
RECURRENCE-ID:20080801T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
END:VEVENT
END:VCALENDAR
""",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
END:VEVENT
END:VCALENDAR
""",
                (),
            ),
            (
                "#3.6 Complex recurring component with same attendees, remove override - exdate",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
END:VEVENT
BEGIN:VEVENT
UID:12345-67890
RECURRENCE-ID:20080801T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
END:VEVENT
END:VCALENDAR
""",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
EXDATE:20080801T120000Z
END:VEVENT
END:VCALENDAR
""",
                (
                    ("mailto:user01@example.com", DateTime(2008, 8, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                    ("mailto:user02@example.com", DateTime(2008, 8, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                    ("mailto:user03@example.com", DateTime(2008, 8, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                ),
            ),
            (
                "#4.1 Complex recurring component with different attendees, change master/override",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
END:VEVENT
BEGIN:VEVENT
UID:12345-67890
RECURRENCE-ID:20080801T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user04@example.com
END:VEVENT
END:VCALENDAR
""",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
RRULE:FREQ=MONTHLY
END:VEVENT
BEGIN:VEVENT
UID:12345-67890
RECURRENCE-ID:20080801T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
END:VEVENT
END:VCALENDAR
""",
                (
                    ("mailto:user03@example.com", None),
                    ("mailto:user04@example.com", DateTime(2008, 8, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                ),
            ),
            (
                "#4.2 Complex recurring component with different attendees, remove override - no exdate",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
END:VEVENT
BEGIN:VEVENT
UID:12345-67890
RECURRENCE-ID:20080801T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user04@example.com
END:VEVENT
END:VCALENDAR
""",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
END:VEVENT
END:VCALENDAR
""",
                (
                    ("mailto:user04@example.com", DateTime(2008, 8, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                ),
            ),
            (
                "#4.3 Complex recurring component with different attendees, remove override - exdate",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
END:VEVENT
BEGIN:VEVENT
UID:12345-67890
RECURRENCE-ID:20080801T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user04@example.com
END:VEVENT
END:VCALENDAR
""",
                """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=MONTHLY
EXDATE:20080801T120000Z
END:VEVENT
END:VCALENDAR
""",
                (
                    ("mailto:user01@example.com", DateTime(2008, 8, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                    ("mailto:user02@example.com", DateTime(2008, 8, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                    ("mailto:user04@example.com", DateTime(2008, 8, 1, 12, 0, 0, tzid=Timezone.UTCTimezone)),
                ),
            ),
        )

        for description, calendar1, calendar2, result in data:
            scheduler = ImplicitScheduler()
            scheduler.resource = None
            scheduler.oldcalendar = Component.fromString(calendar1)
            scheduler.oldAttendeesByInstance = scheduler.oldcalendar.getAttendeesByInstance(True, onlyScheduleAgentServer=True)
            scheduler.oldInstances = set(scheduler.oldcalendar.getComponentInstances())
            scheduler.calendar = Component.fromString(calendar2)

            txn = self.transactionUnderTest()
            scheduler.txn = txn
            scheduler.calendar_home = yield self.homeUnderTest(txn=txn, name=u"user01", create=True)

            yield scheduler.extractCalendarData()
            scheduler.findRemovedAttendees()

            self.assertEqual(scheduler.cancelledAttendees, set(result), msg=description)

            yield self.commit()
    @inlineCallbacks
    def test_process_request_excludes_includes(self):
        """
        Test that processRequests correctly excludes or includes the specified attendees.
        """

        data = (
            ((), None, 3, ("mailto:user02@example.com", "mailto:user03@example.com", "mailto:user04@example.com",),),
            (("mailto:user02@example.com",), None, 2, ("mailto:user03@example.com", "mailto:user04@example.com",),),
            ((), ("mailto:user02@example.com", "mailto:user04@example.com",), 2, ("mailto:user02@example.com", "mailto:user04@example.com",),),
        )

        calendar = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
ATTENDEE:mailto:user04@example.com
END:VEVENT
END:VCALENDAR
"""
for excludes, includes, result_count, result_set in data:
scheduler = ImplicitScheduler()
scheduler.resource = None
scheduler.calendar = Component.fromString(calendar)
scheduler.state = "organizer"
scheduler.action = "modify"
scheduler.internal_request = True
scheduler.except_attendees = excludes
scheduler.only_refresh_attendees = includes
scheduler.changed_rids = None
scheduler.reinvites = None
txn = self.transactionUnderTest()
scheduler.txn = txn
scheduler.calendar_home = yield self.homeUnderTest(txn=txn, name=u"user01", create=True)
# Get some useful information from the calendar
yield scheduler.extractCalendarData()
record = yield self.directory.recordWithUID(scheduler.calendar_home.uid())
scheduler.organizerAddress = LocalCalendarUser(
"mailto:user01@example.com",
record,
)
recipients = []
def makeFakeScheduler():
return FakeScheduler(recipients)
scheduler.makeScheduler = makeFakeScheduler
count = (yield scheduler.processRequests())
self.assertEqual(count, result_count)
self.assertEqual(len(recipients), result_count)
self.assertEqual(set(recipients), set(result_set))
yield self.commit()
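# The exclude/include behavior exercised by test_process_request_excludes_includes
# can be modeled in isolation: attendees in except_attendees are skipped, and when
# only_refresh_attendees is non-empty only those attendees are processed. This is a
# simplified standalone sketch (hypothetical helper, not the scheduler's actual code):
def _filterRefreshAttendees(attendees, excludes=(), includes=None):
    """Drop attendees listed in excludes; if includes is given, keep only those."""
    result = []
    for cuaddr in attendees:
        if cuaddr in excludes:
            continue
        if includes and cuaddr not in includes:
            continue
        result.append(cuaddr)
    return result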
class ImplicitRequests(CommonCommonTests, TestCase):
"""
Test txdav.caldav.datastore.scheduling.implicit.
"""
@inlineCallbacks
def setUp(self):
yield super(ImplicitRequests, self).setUp()
yield self.buildStoreAndDirectory()
yield self.populate()
@inlineCallbacks
def populate(self):
yield populateCalendarsFrom(self.requirements, self.storeUnderTest())
self.notifierFactory.reset()
@classproperty(cache=False)
def requirements(cls): # @NoSelf
return {
"user01": {
"calendar_1": {
},
"inbox": {
},
},
"user02": {
"calendar_1": {
},
"inbox": {
},
},
"user03": {
"calendar_1": {
},
"inbox": {
},
},
}
@inlineCallbacks
def _createCalendarObject(self, data, user, name):
calendar_collection = (yield self.calendarUnderTest(home=user))
yield calendar_collection.createCalendarObjectWithName("test.ics", Component.fromString(data))
yield self.commit()
@inlineCallbacks
def _listCalendarObjects(self, user, collection_name="calendar_1"):
collection = (yield self.calendarUnderTest(name=collection_name, home=user))
items = (yield collection.listCalendarObjects())
yield self.commit()
returnValue(items)
@inlineCallbacks
def _getCalendarData(self, user, name=None):
if name is None:
items = (yield self._listCalendarObjects(user))
name = items[0]
calendar_resource = (yield self.calendarObjectUnderTest(name=name, home=user))
calendar = (yield calendar_resource.component())
yield self.commit()
returnValue(str(calendar).replace("\r\n ", ""))
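# Note: the replace() in _getCalendarData() above unfolds iCalendar "folded"
# lines. Per RFC 5545, long content lines are wrapped by inserting CRLF plus a
# single space; removing that two-character sequence restores each logical line.
# Standalone illustration (hypothetical helper, not part of this class):
def _unfoldiCalendar(text):
    """Join RFC 5545 folded lines back into single logical lines."""
    return text.replace("\r\n ", "")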
@inlineCallbacks
def _setCalendarData(self, data, user, name=None):
if name is None:
items = (yield self._listCalendarObjects(user))
name = items[0]
calendar_resource = (yield self.calendarObjectUnderTest(name=name, home=user))
yield calendar_resource.setComponent(Component.fromString(data))
yield self.commit()
@inlineCallbacks
def test_testImplicitSchedulingPUT_ScheduleState(self):
"""
Test that checkImplicitState() always returns True for any organizer, valid or not.
"""
data = (
(
"""BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
END:VEVENT
END:VCALENDAR
""",
False,
),
(
"""BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
END:VEVENT
END:VCALENDAR
""",
True,
),
(
"""BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:bogus@bogus.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:bogus@bogus.com
END:VEVENT
END:VCALENDAR
""",
True,
),
)
calendar_collection = (yield self.calendarUnderTest(home="user01"))
for calendar, result in data:
calendar = Component.fromString(calendar)
scheduler = ImplicitScheduler()
doAction, isScheduleObject = (yield scheduler.testImplicitSchedulingPUT(calendar_collection, None, calendar, False))
self.assertEqual(doAction, result)
self.assertEqual(isScheduleObject, result)
@inlineCallbacks
def test_testImplicitSchedulingPUT_FixScheduleState(self):
"""
Test that testImplicitSchedulingPUT will fix an old cached schedule object state by
re-evaluating the calendar data.
"""
calendarOld = Component.fromString("""BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTAMP:20080601T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
END:VEVENT
END:VCALENDAR
""")
calendarNew = Component.fromString("""BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTAMP:20080601T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
END:VEVENT
END:VCALENDAR
""")
calendar_collection = (yield self.calendarUnderTest(home="user01"))
calresource = (yield calendar_collection.createCalendarObjectWithName(
"1.ics", calendarOld
))
calresource.isScheduleObject = False
scheduler = ImplicitScheduler()
try:
doAction, isScheduleObject = (yield scheduler.testImplicitSchedulingPUT(calendar_collection, calresource, calendarNew, False))
except Exception as e:
self.fail("Exception must not be raised: %s" % (e,))
self.assertTrue(doAction)
self.assertTrue(isScheduleObject)
@inlineCallbacks
def test_testImplicitSchedulingPUT_NoChangeScheduleState(self):
"""
Test that testImplicitSchedulingPUT will prevent attendees from changing the
schedule object state.
"""
calendarOld = Component.fromString("""BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTAMP:20080601T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
END:VEVENT
END:VCALENDAR
""")
calendarNew = Component.fromString("""BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTAMP:20080601T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 02":mailto:user02@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
END:VEVENT
END:VCALENDAR
""")
calendar_collection = (yield self.calendarUnderTest(home="user01"))
calresource = (yield calendar_collection.createCalendarObjectWithName(
"1.ics", calendarOld
))
calresource.isScheduleObject = False
scheduler = ImplicitScheduler()
try:
yield scheduler.testImplicitSchedulingPUT(calendar_collection, calresource, calendarNew, False)
except HTTPError:
pass
except:
self.fail("Wrong exception raised: %s" % (sys.exc_info()[0].__name__,))
else:
self.fail("HTTPError exception must be raised")
@inlineCallbacks
def test_doImplicitScheduling_NewOrganizerEvent(self):
"""
Test that doImplicitScheduling delivers scheduling messages to attendees.
"""
data = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTAMP:20080601T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
END:VEVENT
END:VCALENDAR
"""
yield self._createCalendarObject(data, "user01", "test.ics")
list2 = (yield self._listCalendarObjects("user02"))
self.assertEqual(len(list2), 1)
self.assertTrue(list2[0].startswith(hashlib.md5("12345-67890").hexdigest()))
list2 = (yield self._listCalendarObjects("user02", "inbox"))
self.assertEqual(len(list2), 1)
self.assertTrue(list2[0].startswith(hashlib.md5("12345-67890").hexdigest()))
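# The startswith() assertions above rely on the store naming auto-created
# scheduling resources after the MD5 hex digest of the event UID. A standalone
# sketch of that naming convention (hypothetical helper; accepts str or bytes
# so it also runs under Python 3, unlike the bare md5("...") calls in this suite):
def _scheduledResourcePrefix(uid):
    """Return the MD5 hex digest used as the resource-name prefix for a UID."""
    import hashlib
    if not isinstance(uid, bytes):
        uid = uid.encode("utf-8")
    return hashlib.md5(uid).hexdigest()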
@inlineCallbacks
def test_doImplicitScheduling_UpdateOrganizerEvent(self):
"""
Test that doImplicitScheduling delivers scheduling messages to attendees.
"""
data1 = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTAMP:20080601T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
END:VEVENT
END:VCALENDAR
"""
data2 = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTAMP:20080601T120000Z
DTSTART:20080601T130000Z
DTEND:20080601T140000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
END:VEVENT
END:VCALENDAR
"""
yield self._createCalendarObject(data1, "user01", "test.ics")
yield self._setCalendarData(data2, "user01", "test.ics")
list2 = (yield self._listCalendarObjects("user02"))
self.assertEqual(len(list2), 1)
self.assertTrue(list2[0].startswith(hashlib.md5("12345-67890").hexdigest()))
list2 = (yield self._listCalendarObjects("user02", "inbox"))
self.assertEqual(len(list2), 2)
self.assertTrue(list2[0].startswith(hashlib.md5("12345-67890").hexdigest()))
self.assertTrue(list2[1].startswith(hashlib.md5("12345-67890").hexdigest()))
@inlineCallbacks
def test_doImplicitScheduling_DeleteOrganizerEvent(self):
"""
Test that doImplicitScheduling delivers scheduling messages to attendees.
"""
data1 = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTAMP:20080601T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
END:VEVENT
END:VCALENDAR
"""
yield self._createCalendarObject(data1, "user01", "test.ics")
calendar_resource = (yield self.calendarObjectUnderTest(name="test.ics", home="user01"))
yield calendar_resource.remove()
yield self.commit()
list2 = (yield self._listCalendarObjects("user02"))
self.assertEqual(len(list2), 1)
self.assertTrue(list2[0].startswith(hashlib.md5("12345-67890").hexdigest()))
list2 = (yield self._listCalendarObjects("user02", "inbox"))
self.assertEqual(len(list2), 2)
self.assertTrue(list2[0].startswith(hashlib.md5("12345-67890").hexdigest()))
self.assertTrue(list2[1].startswith(hashlib.md5("12345-67890").hexdigest()))
@inlineCallbacks
def test_doImplicitScheduling_UpdateMailtoOrganizerEvent(self):
"""
Test that doImplicitScheduling works when the existing calendar data contains a non-normalized
organizer calendar user address.
"""
data1 = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTAMP:20080601T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01";SCHEDULE-AGENT=NONE:mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
END:VEVENT
END:VCALENDAR
"""
data2 = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTAMP:20080601T120000Z
DTSTART:20080601T130000Z
DTEND:20080601T140000Z
ORGANIZER;CN="User 01";SCHEDULE-AGENT=NONE:mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
END:VEVENT
END:VCALENDAR
"""
self.patch(CalendarObject.CalendarObjectUpgradeWork, "delay", 1)
yield self._createCalendarObject(data1, "user01", "test.ics")
cobj = yield self.calendarObjectUnderTest(home="user01", name="test.ics")
actualVersion = CalendarObject._currentDataVersion
self.patch(CalendarObject, "_currentDataVersion", 0)
yield cobj._setComponentInternal(Component.fromString(data1), internal_state=ComponentUpdateState.RAW)
CalendarObject._currentDataVersion = actualVersion
yield self.commit()
cobj = yield self.calendarObjectUnderTest(home="user01", name="test.ics")
comp = yield cobj.component()
# Because CUA normalization happens in component() now too...
self.assertTrue(comp.getOrganizer().startswith("urn:x-uid:"))
self.assertFalse(comp.getOrganizerScheduleAgent())
yield self.commit()
yield JobItem.waitEmpty(self.storeUnderTest().newTransaction, reactor, 60)
cobj = yield self.calendarObjectUnderTest(home="user01", name="test.ics")
comp = yield cobj.component()
# Because CUA normalization happens in component() now too...
self.assertTrue(comp.getOrganizer().startswith("urn:x-uid:"))
self.assertFalse(comp.getOrganizerScheduleAgent())
yield self.commit()
cobj = yield self.calendarObjectUnderTest(home="user01", name="test.ics")
actualVersion = CalendarObject._currentDataVersion
self.patch(CalendarObject, "_currentDataVersion", 0)
yield cobj.setComponent(Component.fromString(data2))
CalendarObject._currentDataVersion = actualVersion
yield self.commit()
cobj = yield self.calendarObjectUnderTest(home="user01", name="test.ics")
comp = yield cobj.component()
self.assertTrue(comp.getOrganizer().startswith("urn:x-uid:"))
self.assertTrue(comp.getOrganizerScheduleAgent())
yield self.commit()
yield JobItem.waitEmpty(self.storeUnderTest().newTransaction, reactor, 60)
cobj = yield self.calendarObjectUnderTest(home="user01", name="test.ics")
comp = yield cobj.component()
self.assertTrue(comp.getOrganizer().startswith("urn:x-uid:"))
self.assertTrue(comp.getOrganizerScheduleAgent())
yield self.commit()
@inlineCallbacks
def test_doImplicitScheduling_AttendeeEventNoOrganizerEvent(self):
"""
Test that doImplicitScheduling handles an attendee reply with no organizer event.
"""
data = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890-attendee-no-organizer
DTSTAMP:20080601T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE;PARTSTAT=ACCEPTED:mailto:user02@example.com
END:VEVENT
END:VCALENDAR
"""
try:
yield self._createCalendarObject(data, "user02", "test.ics")
except AttendeeAllowedError:
pass
except:
self.fail("Wrong exception raised: %s" % (sys.exc_info()[0].__name__,))
else:
self.fail("Exception not raised")
list1 = (yield self._listCalendarObjects("user01", "inbox"))
self.assertEqual(len(list1), 0)
@inlineCallbacks
def test_doImplicitScheduling_AttendeeReply(self):
"""
Test that doImplicitScheduling delivers scheduling messages to attendees who can then reply.
"""
data1 = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890-attendee-reply
DTSTAMP:20080601T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
END:VEVENT
END:VCALENDAR
"""
data2 = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890-attendee-reply
DTSTAMP:20080601T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE;PARTSTAT=ACCEPTED:mailto:user02@example.com
END:VEVENT
END:VCALENDAR
"""
yield self._createCalendarObject(data1, "user01", "test.ics")
calendar1 = (yield self._getCalendarData("user01", "test.ics"))
self.assertTrue("SCHEDULE-STATUS=1.2" in calendar1)
list2 = (yield self._listCalendarObjects("user02", "inbox"))
self.assertEqual(len(list2), 1)
yield self._setCalendarData(data2, "user02")
yield JobItem.waitEmpty(self.storeUnderTest().newTransaction, reactor, 60)
list1 = (yield self._listCalendarObjects("user01", "inbox"))
self.assertEqual(len(list1), 1)
calendar1 = (yield self._getCalendarData("user01", "test.ics"))
self.assertTrue("SCHEDULE-STATUS=2.0" in calendar1)
self.assertTrue("PARTSTAT=ACCEPTED" in calendar1)
@inlineCallbacks
def test_doImplicitScheduling_refreshAllAttendeesExceptSome(self):
"""
Test that doImplicitScheduling delivers scheduling messages to attendees who can then reply.
"""
data1 = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890-attendee-reply
DTSTAMP:20080601T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
END:VEVENT
END:VCALENDAR
"""
data2 = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890-attendee-reply
DTSTAMP:20080601T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE;PARTSTAT=ACCEPTED:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
END:VEVENT
END:VCALENDAR
"""
# Need refreshes to occur immediately, not via reactor.callLater
self.patch(config.Scheduling.Options, "AttendeeRefreshBatch", False)
yield self._createCalendarObject(data1, "user01", "test.ics")
list1 = (yield self._listCalendarObjects("user01", "inbox"))
self.assertEqual(len(list1), 0)
calendar1 = (yield self._getCalendarData("user01", "test.ics"))
self.assertTrue("SCHEDULE-STATUS=1.2" in calendar1)
list2 = (yield self._listCalendarObjects("user02", "inbox"))
self.assertEqual(len(list2), 1)
calendar2 = (yield self._getCalendarData("user02"))
self.assertTrue("PARTSTAT=ACCEPTED" not in calendar2)
list3 = (yield self._listCalendarObjects("user03", "inbox"))
self.assertEqual(len(list3), 1)
calendar3 = (yield self._getCalendarData("user03"))
self.assertTrue("PARTSTAT=ACCEPTED" not in calendar3)
yield self._setCalendarData(data2, "user02")
yield JobItem.waitEmpty(self.storeUnderTest().newTransaction, reactor, 60)
list1 = (yield self._listCalendarObjects("user01", "inbox"))
self.assertEqual(len(list1), 1)
calendar1 = (yield self._getCalendarData("user01", "test.ics"))
self.assertTrue("SCHEDULE-STATUS=2.0" in calendar1)
self.assertTrue("PARTSTAT=ACCEPTED" in calendar1)
list2 = (yield self._listCalendarObjects("user02", "inbox"))
self.assertEqual(len(list2), 1)
calendar2 = (yield self._getCalendarData("user02"))
self.assertTrue("PARTSTAT=ACCEPTED" in calendar2)
list3 = (yield self._listCalendarObjects("user03", "inbox"))
self.assertEqual(len(list3), 1)
calendar3 = (yield self._getCalendarData("user03"))
self.assertTrue("PARTSTAT=ACCEPTED" in calendar3)
@inlineCallbacks
def test_doImplicitScheduling_refreshAllAttendeesExceptSome_Batched(self):
"""
Test that doImplicitScheduling delivers scheduling messages to attendees who can then reply.
Verify that batched refreshing is working.
"""
data1 = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890-attendee-reply
DTSTAMP:20080601T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
END:VEVENT
END:VCALENDAR
"""
data2 = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890-attendee-reply
DTSTAMP:20080601T120000Z
DTSTART:20080601T120000Z
DTEND:20080601T130000Z
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE;PARTSTAT=ACCEPTED:mailto:user02@example.com
ATTENDEE:mailto:user03@example.com
END:VEVENT
END:VCALENDAR
"""
# Force batched refreshes, with a small batch size and short delay so they
# complete during the test
self.patch(config.Scheduling.Options, "AttendeeRefreshBatch", 5)
self.patch(config.Scheduling.Options.WorkQueues, "AttendeeRefreshBatchDelaySeconds", 1)
yield self._createCalendarObject(data1, "user01", "test.ics")
list1 = (yield self._listCalendarObjects("user01", "inbox"))
self.assertEqual(len(list1), 0)
calendar1 = (yield self._getCalendarData("user01", "test.ics"))
self.assertTrue("SCHEDULE-STATUS=1.2" in calendar1)
list2 = (yield self._listCalendarObjects("user02", "inbox"))
self.assertEqual(len(list2), 1)
calendar2 = (yield self._getCalendarData("user02"))
self.assertTrue("PARTSTAT=ACCEPTED" not in calendar2)
list3 = (yield self._listCalendarObjects("user03", "inbox"))
self.assertEqual(len(list3), 1)
calendar3 = (yield self._getCalendarData("user03"))
self.assertTrue("PARTSTAT=ACCEPTED" not in calendar3)
yield self._setCalendarData(data2, "user02")
yield JobItem.waitEmpty(self.storeUnderTest().newTransaction, reactor, 60)
list1 = (yield self._listCalendarObjects("user01", "inbox"))
self.assertEqual(len(list1), 1)
calendar1 = (yield self._getCalendarData("user01", "test.ics"))
self.assertTrue("SCHEDULE-STATUS=2.0" in calendar1)
self.assertTrue("PARTSTAT=ACCEPTED" in calendar1)
list2 = (yield self._listCalendarObjects("user02", "inbox"))
self.assertEqual(len(list2), 1)
calendar2 = (yield self._getCalendarData("user02"))
self.assertTrue("PARTSTAT=ACCEPTED" in calendar2)
@inlineCallbacks
def _test_user03_refresh():
list3 = (yield self._listCalendarObjects("user03", "inbox"))
self.assertEqual(len(list3), 1)
calendar3 = (yield self._getCalendarData("user03"))
self.assertTrue("PARTSTAT=ACCEPTED" in calendar3)
yield deferLater(reactor, 2.0, _test_user03_refresh)
@inlineCallbacks
def test_doImplicitScheduling_OrganizerEventTimezoneDST(self):
"""
Test that doImplicitScheduling delivers scheduling messages to attendees. This test
creates an exception close to a DST transition to make sure timezone DST handling
is correct.
"""
data1 = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTAMP:20080601T120000Z
DTSTART;TZID=America/Los_Angeles:20140302T190000
DTEND;TZID=America/Los_Angeles:20140302T193000
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
RRULE:FREQ=DAILY;UNTIL=20140309T075959Z
END:VEVENT
END:VCALENDAR
"""
data2 = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTAMP:20080601T120000Z
DTSTART;TZID=America/Los_Angeles:20140302T190000
DTEND;TZID=America/Los_Angeles:20140302T193000
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
RRULE:FREQ=DAILY;UNTIL=20140309T075959Z
END:VEVENT
BEGIN:VEVENT
UID:12345-67890
DTSTAMP:20080601T120000Z
RECURRENCE-ID;TZID=America/Los_Angeles:20140308T190000
DTSTART;TZID=America/Los_Angeles:20140308T190000
DTEND;TZID=America/Los_Angeles:20140308T193000
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
END:VEVENT
END:VCALENDAR
"""
TimezoneCache.create()
yield self._createCalendarObject(data1, "user01", "test.ics")
yield self._setCalendarData(data2, "user01", "test.ics")
list2 = (yield self._listCalendarObjects("user02"))
self.assertEqual(len(list2), 1)
self.assertTrue(list2[0].startswith(hashlib.md5("12345-67890").hexdigest()))
list2 = (yield self._listCalendarObjects("user02", "inbox"))
self.assertEqual(len(list2), 2)
self.assertTrue(list2[0].startswith(hashlib.md5("12345-67890").hexdigest()))
self.assertTrue(list2[1].startswith(hashlib.md5("12345-67890").hexdigest()))
@inlineCallbacks
def test_doImplicitScheduling_MissingAttendeeWithInvalidUser(self):
"""
Test that doImplicitMissingAttendee works when the event contains an
invalid attendee.
"""
data1 = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTAMP:20080601T120000Z
DTSTART:20140302T190000Z
DURATION:PT1H
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:foo@bar.com
RRULE:FREQ=DAILY;UNTIL=20140309T075959Z
END:VEVENT
END:VCALENDAR
"""
yield self._createCalendarObject(data1, "user02", "test.ics")
list2 = (yield self._listCalendarObjects("user02"))
self.assertEqual(len(list2), 1)
yield self._setCalendarData(data1, "user02", "test.ics")
list2 = (yield self._listCalendarObjects("user02"))
self.assertEqual(len(list2), 1)
@inlineCallbacks
def test_doImplicitScheduling_MissingAttendeeWithiMIP(self):
"""
Test that doImplicitMissingAttendee works when iMIP is enabled and the event
contains an iMIP attendee.
"""
data1 = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTAMP:20080601T120000Z
DTSTART:20140302T190000Z
DURATION:PT1H
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:foo@bar.com
RRULE:FREQ=DAILY;UNTIL=20140309T075959Z
END:VEVENT
END:VCALENDAR
"""
self.patch(config.Scheduling.iMIP, "Enabled", True)
self.patch(config.Scheduling.iMIP, "AddressPatterns", ["mailto:.*"])
yield self._createCalendarObject(data1, "user02", "test.ics")
list2 = (yield self._listCalendarObjects("user02"))
self.assertEqual(len(list2), 1)
yield self._setCalendarData(data1, "user02", "test.ics")
list2 = (yield self._listCalendarObjects("user02"))
self.assertEqual(len(list2), 1)
@inlineCallbacks
def test_sendAttendeeReply_ScheduleAgentNone(self):
"""
Test that sendAttendeeReply does nothing when the Organizer has
SCHEDULE-AGENT=NONE.
"""
data1 = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTAMP:20080601T120000Z
DTSTART:20140302T190000Z
DURATION:PT1H
ORGANIZER;SCHEDULE-AGENT=NONE;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
RRULE:FREQ=DAILY;UNTIL=20140309T075959Z
END:VEVENT
END:VCALENDAR
"""
yield self._createCalendarObject(data1, "user02", "test.ics")
cobj = yield self.calendarObjectUnderTest(home="user02", name="test.ics",)
result = yield ImplicitScheduler().sendAttendeeReply(cobj._txn, cobj)
self.assertFalse(result)
@inlineCallbacks
def test_sendAttendeeReply_ScheduleAgentClient(self):
"""
Test that sendAttendeeReply does nothing when the Organizer has
SCHEDULE-AGENT=CLIENT.
"""
data1 = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTAMP:20080601T120000Z
DTSTART:20140302T190000Z
DURATION:PT1H
ORGANIZER;SCHEDULE-AGENT=CLIENT;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user02@example.com
RRULE:FREQ=DAILY;UNTIL=20140309T075959Z
END:VEVENT
END:VCALENDAR
"""
yield self._createCalendarObject(data1, "user02", "test.ics")
cobj = yield self.calendarObjectUnderTest(home="user02", name="test.ics",)
result = yield ImplicitScheduler().sendAttendeeReply(cobj._txn, cobj)
self.assertFalse(result)
@inlineCallbacks
def test_sendAttendeeReply_NoAttendee(self):
"""
Test that sendAttendeeReply does nothing when the Attendee is not
listed in the event. This should not normally be possible, but a case
like this was seen due to a processing error elsewhere.
"""
data1 = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CALENDARSERVER.ORG//NONSGML Version 1//EN
BEGIN:VEVENT
UID:12345-67890
DTSTAMP:20080601T120000Z
DTSTART:20140302T190000Z
DURATION:PT1H
ORGANIZER;CN="User 01":mailto:user01@example.com
ATTENDEE:mailto:user01@example.com
ATTENDEE:mailto:user03@example.com
RRULE:FREQ=DAILY;UNTIL=20140309T075959Z
END:VEVENT
END:VCALENDAR
"""
yield self._createCalendarObject(data1, "user02", "test.ics")
cobj = yield self.calendarObjectUnderTest(home="user02", name="test.ics",)
# Need to remove SCHEDULE-AGENT=NONE on ORGANIZER as that will have been added during the store operation
cal = yield cobj.componentForUser()
cal.removePropertyParameters("ORGANIZER", ("SCHEDULE-AGENT", "SCHEDULE-STATUS",))
result = yield ImplicitScheduler().sendAttendeeReply(cobj._txn, cobj)
self.assertFalse(result)
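# The two SCHEDULE-AGENT tests above follow from the RFC 6638 rule that the
# server only acts on an ORGANIZER whose SCHEDULE-AGENT is SERVER (the default
# when the parameter is absent); NONE and CLIENT suppress server-side replies.
# A minimal standalone model of that gate (hypothetical helper, not the real API):
def _serverShouldSendReply(schedule_agent=None):
    """True only when the effective SCHEDULE-AGENT is SERVER (the default)."""
    return (schedule_agent or "SERVER").upper() == "SERVER"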
class ScheduleAgentFixBase(CommonCommonTests, TestCase):
"""
Common base class for schedule state fix-up tests of
txdav.caldav.datastore.scheduling.implicit.
"""
@inlineCallbacks
def setUp(self):
yield super(ScheduleAgentFixBase, self).setUp()
yield self.buildStoreAndDirectory()
yield self.populate()
self.patch(config.Scheduling.Options, "AttendeeRefreshBatch", 0)
@inlineCallbacks
def populate(self):
yield populateCalendarsFrom(self.requirements, self.storeUnderTest())
self.notifierFactory.reset()
metadata = {
"accessMode": "PUBLIC",
"isScheduleObject": True,
"scheduleTag": "abc",
"scheduleEtags": (),
"hasPrivateComment": False,
}
@classproperty(cache=False)
def requirements(cls): # @NoSelf
return {
"user01": {
"calendar_1": {
"organizer.ics": (cls.organizer_data, cls.metadata),
},
"inbox": {
},
},
"user02": {
"calendar_1": {
"attendee2.ics": (cls.attendee2_data, cls.metadata),
},
"inbox": {
},
},
"user03": {
"calendar_1": {
"attendee3.ics": (cls.attendee3_data, cls.metadata),
},
"inbox": {
},
},
}
class ScheduleAgentFix(ScheduleAgentFixBase):
"""
Test that an attendee copy mixing SCHEDULE-AGENT=CLIENT and
SCHEDULE-AGENT=SERVER on its ORGANIZER properties is corrected when the
attendee updates.
"""
organizer_data = """BEGIN:VCALENDAR
CALSCALE:GREGORIAN
PRODID:-//Example Inc.//Example Calendar//EN
VERSION:2.0
BEGIN:VEVENT
DTSTAMP:20051222T205953Z
CREATED:20060101T150000Z
DTSTART:20140101T100000Z
DURATION:PT1H
SUMMARY:event 1
UID:event1@ninevah.local
ORGANIZER:urn:x-uid:user01
ATTENDEE:urn:x-uid:user01
ATTENDEE:urn:x-uid:user03
RRULE:FREQ=DAILY
END:VEVENT
BEGIN:VEVENT
DTSTAMP:20051222T205953Z
CREATED:20060101T150000Z
RECURRENCE-ID:20140102T100000Z
DTSTART:20140102T100000Z
DURATION:PT1H
SUMMARY:event 1
UID:event1@ninevah.local
ORGANIZER:urn:x-uid:user01
ATTENDEE:urn:x-uid:user01
ATTENDEE:urn:x-uid:user02
ATTENDEE:urn:x-uid:user03
END:VEVENT
END:VCALENDAR
"""
attendee2_data = """BEGIN:VCALENDAR
CALSCALE:GREGORIAN
PRODID:-//Example Inc.//Example Calendar//EN
VERSION:2.0
BEGIN:VEVENT
DTSTAMP:20051222T205953Z
CREATED:20060101T150000Z
DTSTART:20140101T100000Z
DURATION:PT1H
SUMMARY:event 1
UID:event1@ninevah.local
ORGANIZER;SCHEDULE-AGENT=CLIENT:urn:x-uid:user01
ATTENDEE:urn:x-uid:user01
ATTENDEE:urn:x-uid:user03
RRULE:FREQ=DAILY
END:VEVENT
BEGIN:VEVENT
DTSTAMP:20051222T205953Z
CREATED:20060101T150000Z
RECURRENCE-ID:20140102T100000Z
DTSTART:20140102T100000Z
DURATION:PT1H
SUMMARY:event 1
UID:event1@ninevah.local
ORGANIZER;SCHEDULE-AGENT=SERVER:urn:x-uid:user01
ATTENDEE:urn:x-uid:user01
ATTENDEE:urn:x-uid:user02
ATTENDEE:urn:x-uid:user03
END:VEVENT
END:VCALENDAR
"""
attendee2_update_data = """BEGIN:VCALENDAR
CALSCALE:GREGORIAN
PRODID:-//Example Inc.//Example Calendar//EN
VERSION:2.0
BEGIN:VEVENT
DTSTAMP:20051222T205953Z
CREATED:20060101T150000Z
DTSTART:20140101T100000Z
DURATION:PT1H
SUMMARY:event 1
UID:event1@ninevah.local
ORGANIZER;SCHEDULE-AGENT=CLIENT:urn:x-uid:user01
ATTENDEE:urn:x-uid:user01
ATTENDEE:urn:x-uid:user03
RRULE:FREQ=DAILY
END:VEVENT
BEGIN:VEVENT
DTSTAMP:20051222T205953Z
CREATED:20060101T150000Z
RECURRENCE-ID:20140102T100000Z
DTSTART:20140102T100000Z
DURATION:PT1H
SUMMARY:event 1
UID:event1@ninevah.local
ORGANIZER;SCHEDULE-AGENT=SERVER:urn:x-uid:user01
ATTENDEE:urn:x-uid:user01
ATTENDEE;PARTSTAT=ACCEPTED:urn:x-uid:user02
ATTENDEE:urn:x-uid:user03
END:VEVENT
END:VCALENDAR
"""
attendee3_data = """BEGIN:VCALENDAR
CALSCALE:GREGORIAN
PRODID:-//Example Inc.//Example Calendar//EN
VERSION:2.0
BEGIN:VEVENT
DTSTAMP:20051222T205953Z
CREATED:20060101T150000Z
DTSTART:20140101T100000Z
DURATION:PT1H
SUMMARY:event 1
UID:event1@ninevah.local
ORGANIZER:urn:x-uid:user01
ATTENDEE:urn:x-uid:user01
ATTENDEE:urn:x-uid:user03
RRULE:FREQ=DAILY
END:VEVENT
BEGIN:VEVENT
DTSTAMP:20051222T205953Z
CREATED:20060101T150000Z
RECURRENCE-ID:20140102T100000Z
DTSTART:20140102T100000Z
DURATION:PT1H
SUMMARY:event 1
UID:event1@ninevah.local
ORGANIZER:urn:x-uid:user01
ATTENDEE:urn:x-uid:user01
ATTENDEE:urn:x-uid:user02
ATTENDEE:urn:x-uid:user03
END:VEVENT
END:VCALENDAR
"""
@inlineCallbacks
def test_doImplicitScheduling(self):
"""
Test that doImplicitScheduling fixes an inconsistent schedule-agent state when an
attendee stores their data.
"""
cobj = yield self.calendarObjectUnderTest(home="user02", name="attendee2.ics")
yield cobj.setComponent(Component.fromString(self.attendee2_update_data))
yield self.commit()
cobj = yield self.calendarObjectUnderTest(home="user02", name="attendee2.ics")
comp = yield cobj.component()
self.assertIsNone(comp.masterComponent())
self.assertTrue(comp.getOrganizerScheduleAgent())
inbox = yield self.calendarUnderTest(home="user01", name="inbox")
cobjs = yield inbox.calendarObjects()
self.assertEqual(len(cobjs), 1)
class MissingOrganizerFix(ScheduleAgentFixBase):
"""
Test that an attendee's copy of an event lacking ORGANIZER and ATTENDEE
properties is corrected when the organizer updates the event.
"""
organizer_data = """BEGIN:VCALENDAR
CALSCALE:GREGORIAN
PRODID:-//Example Inc.//Example Calendar//EN
VERSION:2.0
BEGIN:VEVENT
DTSTAMP:20051222T205953Z
CREATED:20060101T150000Z
DTSTART:20140101T100000Z
DURATION:PT1H
SUMMARY:event 1
UID:event1@ninevah.local
ORGANIZER:urn:x-uid:user01
ATTENDEE:urn:x-uid:user01
ATTENDEE:urn:x-uid:user03
END:VEVENT
END:VCALENDAR
"""

    organizer_update_data = """BEGIN:VCALENDAR
CALSCALE:GREGORIAN
PRODID:-//Example Inc.//Example Calendar//EN
VERSION:2.0
BEGIN:VEVENT
DTSTAMP:20051222T205953Z
CREATED:20060101T150000Z
DTSTART:20140101T100000Z
DURATION:PT1H
SUMMARY:event 1
UID:event1@ninevah.local
ORGANIZER:urn:x-uid:user01
ATTENDEE:urn:x-uid:user01
ATTENDEE:urn:x-uid:user02
ATTENDEE:urn:x-uid:user03
END:VEVENT
END:VCALENDAR
"""
attendee2_data = """BEGIN:VCALENDAR
CALSCALE:GREGORIAN
PRODID:-//Example Inc.//Example Calendar//EN
VERSION:2.0
BEGIN:VEVENT
DTSTAMP:20051222T205953Z
CREATED:20060101T150000Z
DTSTART:20140101T100000Z
DURATION:PT1H
SUMMARY:event 1
UID:event1@ninevah.local
END:VEVENT
END:VCALENDAR
"""
attendee3_data = """BEGIN:VCALENDAR
CALSCALE:GREGORIAN
PRODID:-//Example Inc.//Example Calendar//EN
VERSION:2.0
BEGIN:VEVENT
DTSTAMP:20051222T205953Z
CREATED:20060101T150000Z
DTSTART:20140101T100000Z
DURATION:PT1H
SUMMARY:event 1
UID:event1@ninevah.local
ORGANIZER:urn:x-uid:user01
ATTENDEE:urn:x-uid:user01
ATTENDEE:urn:x-uid:user03
END:VEVENT
END:VCALENDAR
"""
    @inlineCallbacks
    def test_doImplicitScheduling(self):
        """
        Test that doImplicitScheduling restores the missing organizer in the attendee's
        copy of the event when the organizer updates their data.
        """
        cobj = yield self.calendarObjectUnderTest(home="user02", name="attendee2.ics")
        comp = yield cobj.component()
        self.assertTrue(comp.getOrganizer() is None)
        yield self.commit()

        cobj = yield self.calendarObjectUnderTest(home="user01", name="organizer.ics")
        yield cobj.setComponent(Component.fromString(self.organizer_update_data))
        yield self.commit()

        cal = yield self.calendarUnderTest(home="user02")
        cobjs = yield cal.calendarObjects()
        self.assertTrue(len(cobjs) == 2)
        for cobj in cobjs:
            comp = yield cobj.component()
            if comp.resourceUID() == "event1@ninevah.local":
                self.assertTrue(comp.getOrganizer() is not None)
            else:
                self.assertTrue(comp.getOrganizer() is None)

        inbox = yield self.calendarUnderTest(home="user02", name="inbox")
        cobjs = yield inbox.calendarObjects()
        self.assertTrue(len(cobjs) == 1)
# Generated by Django 3.1.5 on 2021-04-03 08:13
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    initial = True

    dependencies = [
    ]

    operations = [
        migrations.CreateModel(
            name='AboutPageNames',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('about_project', models.TextField()),
                ('about_project_pl', models.TextField(null=True)),
                ('about_project_en', models.TextField(null=True)),
                ('send_email', models.CharField(max_length=200)),
                ('send_email_pl', models.CharField(max_length=200, null=True)),
                ('send_email_en', models.CharField(max_length=200, null=True)),
                ('gitter', models.CharField(max_length=200)),
                ('gitter_pl', models.CharField(max_length=200, null=True)),
                ('gitter_en', models.CharField(max_length=200, null=True)),
                ('github', models.CharField(max_length=200)),
                ('github_pl', models.CharField(max_length=200, null=True)),
                ('github_en', models.CharField(max_length=200, null=True)),
                ('login_to_see', models.CharField(max_length=200)),
                ('login_to_see_pl', models.CharField(max_length=200, null=True)),
                ('login_to_see_en', models.CharField(max_length=200, null=True)),
                ('curr_prog_includes', models.CharField(blank=True, max_length=40, null=True)),
                ('curr_prog_includes_pl', models.CharField(blank=True, max_length=40, null=True)),
                ('curr_prog_includes_en', models.CharField(blank=True, max_length=40, null=True)),
                ('over', models.CharField(blank=True, max_length=30, null=True)),
                ('over_pl', models.CharField(blank=True, max_length=30, null=True)),
                ('over_en', models.CharField(blank=True, max_length=30, null=True)),
                ('plants', models.CharField(blank=True, max_length=30, null=True)),
                ('plants_pl', models.CharField(blank=True, max_length=30, null=True)),
                ('plants_en', models.CharField(blank=True, max_length=30, null=True)),
                ('coming_from', models.CharField(blank=True, max_length=30, null=True)),
                ('coming_from_pl', models.CharField(blank=True, max_length=30, null=True)),
                ('coming_from_en', models.CharField(blank=True, max_length=30, null=True)),
                ('families', models.CharField(blank=True, max_length=30, null=True)),
                ('families_pl', models.CharField(blank=True, max_length=30, null=True)),
                ('families_en', models.CharField(blank=True, max_length=30, null=True)),
                ('marked_by', models.CharField(blank=True, max_length=30, null=True)),
                ('marked_by_pl', models.CharField(blank=True, max_length=30, null=True)),
                ('marked_by_en', models.CharField(blank=True, max_length=30, null=True)),
                ('categories', models.CharField(blank=True, max_length=30, null=True)),
                ('categories_pl', models.CharField(blank=True, max_length=30, null=True)),
                ('categories_en', models.CharField(blank=True, max_length=30, null=True)),
                ('and_over', models.CharField(blank=True, max_length=30, null=True)),
                ('and_over_pl', models.CharField(blank=True, max_length=30, null=True)),
                ('and_over_en', models.CharField(blank=True, max_length=30, null=True)),
                ('unique_interactions', models.CharField(blank=True, max_length=30, null=True)),
                ('unique_interactions_pl', models.CharField(blank=True, max_length=30, null=True)),
                ('unique_interactions_en', models.CharField(blank=True, max_length=30, null=True)),
                ('described_by', models.CharField(blank=True, max_length=30, null=True)),
                ('described_by_pl', models.CharField(blank=True, max_length=30, null=True)),
                ('described_by_en', models.CharField(blank=True, max_length=30, null=True)),
                ('sources', models.CharField(blank=True, max_length=30, null=True)),
                ('sources_pl', models.CharField(blank=True, max_length=30, null=True)),
                ('sources_en', models.CharField(blank=True, max_length=30, null=True)),
            ],
            options={
                'verbose_name_plural': 'About Page Names',
            },
        ),
        migrations.CreateModel(
            name='BasicElement',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=50)),
                ('name_pl', models.CharField(max_length=50, null=True)),
                ('name_en', models.CharField(max_length=50, null=True)),
                ('latin_name', models.CharField(max_length=50)),
                ('symbol', models.CharField(max_length=2)),
                ('image', models.ImageField(blank=True, null=True, upload_to='images')),
                ('is_trace_element', models.BooleanField(default=True)),
                ('descr', models.TextField()),
                ('descr_pl', models.TextField(null=True)),
                ('descr_en', models.TextField(null=True)),
            ],
            options={
                'ordering': ['name'],
            },
        ),
        migrations.CreateModel(
            name='ElementDataString',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('title', models.CharField(max_length=150)),
                ('title_pl', models.CharField(max_length=150, null=True)),
                ('title_en', models.CharField(max_length=150, null=True)),
                ('part1', models.CharField(blank=True, max_length=500, null=True)),
                ('part1_pl', models.CharField(blank=True, max_length=500, null=True)),
                ('part1_en', models.CharField(blank=True, max_length=500, null=True)),
                ('link', models.CharField(blank=True, max_length=500, null=True)),
                ('link_pl', models.CharField(blank=True, max_length=500, null=True)),
                ('link_en', models.CharField(blank=True, max_length=500, null=True)),
            ],
            options={
                'ordering': ['title'],
            },
        ),
        migrations.CreateModel(
            name='Fertilizer',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=50)),
                ('name_pl', models.CharField(max_length=50, null=True)),
                ('name_en', models.CharField(max_length=50, null=True)),
                ('image', models.ImageField(blank=True, null=True, upload_to='images')),
                ('descr', models.TextField()),
                ('descr_pl', models.TextField(null=True)),
                ('descr_en', models.TextField(null=True)),
                ('is_natural', models.BooleanField(default=False)),
                ('contains_elements', models.ManyToManyField(blank=True, related_name='contains_elements', to='strona.BasicElement')),
                ('image_source', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='set_image_eds_fertilizer', to='strona.elementdatastring')),
            ],
            options={
                'ordering': ['name'],
            },
        ),
        migrations.CreateModel(
            name='FertilizerPageNames',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('title', models.CharField(max_length=50)),
                ('title_pl', models.CharField(max_length=50, null=True)),
                ('title_en', models.CharField(max_length=50, null=True)),
                ('descr', models.TextField()),
                ('descr_pl', models.TextField(null=True)),
                ('descr_en', models.TextField(null=True)),
                ('elements_head', models.CharField(blank=True, max_length=50, null=True)),
                ('elements_head_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('elements_head_en', models.CharField(blank=True, max_length=50, null=True)),
                ('makro_head', models.CharField(blank=True, max_length=50, null=True)),
                ('makro_head_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('makro_head_en', models.CharField(blank=True, max_length=50, null=True)),
                ('makro_descr', models.TextField(blank=True, null=True)),
                ('makro_descr_pl', models.TextField(blank=True, null=True)),
                ('makro_descr_en', models.TextField(blank=True, null=True)),
                ('micro_head', models.CharField(blank=True, max_length=50, null=True)),
                ('micro_head_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('micro_head_en', models.CharField(blank=True, max_length=50, null=True)),
                ('micro_descr', models.TextField(blank=True, null=True)),
                ('micro_descr_pl', models.TextField(blank=True, null=True)),
                ('micro_descr_en', models.TextField(blank=True, null=True)),
                ('fertilizers_head', models.CharField(blank=True, max_length=50, null=True)),
                ('fertilizers_head_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('fertilizers_head_en', models.CharField(blank=True, max_length=50, null=True)),
                ('natural_fertilizers_head', models.CharField(blank=True, max_length=50, null=True)),
                ('natural_fertilizers_head_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('natural_fertilizers_head_en', models.CharField(blank=True, max_length=50, null=True)),
                ('natural_fertilizers_descr', models.TextField(blank=True, null=True)),
                ('natural_fertilizers_descr_pl', models.TextField(blank=True, null=True)),
                ('natural_fertilizers_descr_en', models.TextField(blank=True, null=True)),
                ('artificial_fertilizers_head', models.CharField(blank=True, max_length=50, null=True)),
                ('artificial_fertilizers_head_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('artificial_fertilizers_head_en', models.CharField(blank=True, max_length=50, null=True)),
                ('artificial_fertilizers_descr', models.TextField(blank=True, null=True)),
                ('artificial_fertilizers_descr_pl', models.TextField(blank=True, null=True)),
                ('artificial_fertilizers_descr_en', models.TextField(blank=True, null=True)),
            ],
            options={
                'verbose_name_plural': 'Fertilizer Page Names',
            },
        ),
        migrations.CreateModel(
            name='PageNames',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('lang_flag', models.ImageField(upload_to='images')),
                ('lang_flag_pl', models.ImageField(null=True, upload_to='images')),
                ('lang_flag_en', models.ImageField(null=True, upload_to='images')),
                ('lang_flag_id', models.CharField(blank=True, max_length=20, null=True)),
                ('lang_flag_id_pl', models.CharField(blank=True, max_length=20, null=True)),
                ('lang_flag_id_en', models.CharField(blank=True, max_length=20, null=True)),
                ('headtitle', models.CharField(max_length=200)),
                ('headtitle_pl', models.CharField(max_length=200, null=True)),
                ('headtitle_en', models.CharField(max_length=200, null=True)),
                ('mainpage', models.CharField(max_length=200)),
                ('mainpage_pl', models.CharField(max_length=200, null=True)),
                ('mainpage_en', models.CharField(max_length=200, null=True)),
                ('all_plants', models.CharField(blank=True, max_length=200, null=True)),
                ('all_plants_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('all_plants_en', models.CharField(blank=True, max_length=200, null=True)),
                ('about', models.CharField(max_length=200)),
                ('about_pl', models.CharField(max_length=200, null=True)),
                ('about_en', models.CharField(max_length=200, null=True)),
                ('contact', models.CharField(max_length=200)),
                ('contact_pl', models.CharField(max_length=200, null=True)),
                ('contact_en', models.CharField(max_length=200, null=True)),
                ('logout', models.CharField(max_length=200)),
                ('logout_pl', models.CharField(max_length=200, null=True)),
                ('logout_en', models.CharField(max_length=200, null=True)),
                ('login', models.CharField(max_length=200)),
                ('login_pl', models.CharField(max_length=200, null=True)),
                ('login_en', models.CharField(max_length=200, null=True)),
                ('register', models.CharField(max_length=50)),
                ('register_pl', models.CharField(max_length=50, null=True)),
                ('register_en', models.CharField(max_length=50, null=True)),
                ('my_plans', models.CharField(blank=True, max_length=200, null=True)),
                ('my_plans_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('my_plans_en', models.CharField(blank=True, max_length=200, null=True)),
                ('all_plans', models.CharField(blank=True, max_length=200, null=True)),
                ('all_plans_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('all_plans_en', models.CharField(blank=True, max_length=200, null=True)),
                ('see_more', models.CharField(blank=True, max_length=200, null=True)),
                ('see_more_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('see_more_en', models.CharField(blank=True, max_length=200, null=True)),
                ('of_steps', models.CharField(blank=True, max_length=200, null=True)),
                ('of_steps_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('of_steps_en', models.CharField(blank=True, max_length=200, null=True)),
                ('of_plants', models.CharField(blank=True, max_length=200, null=True)),
                ('of_plants_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('of_plants_en', models.CharField(blank=True, max_length=200, null=True)),
                ('by_crops', models.CharField(blank=True, max_length=50, null=True)),
                ('by_crops_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('by_crops_en', models.CharField(blank=True, max_length=50, null=True)),
                ('by_families', models.CharField(blank=True, max_length=50, null=True)),
                ('by_families_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('by_families_en', models.CharField(blank=True, max_length=50, null=True)),
                ('by_tags', models.CharField(blank=True, max_length=50, null=True)),
                ('by_tags_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('by_tags_en', models.CharField(blank=True, max_length=50, null=True)),
            ],
            options={
                'verbose_name_plural': 'Page Names',
            },
        ),
        migrations.CreateModel(
            name='PageSkin',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('themetitle', models.CharField(max_length=200)),
                ('position', models.IntegerField()),
                ('planimagedefault', models.ImageField(blank=True, null=True, upload_to='skins')),
                ('rotatorlogo_main', models.ImageField(blank=True, null=True, upload_to='skins')),
            ],
            options={
                'verbose_name_plural': 'Page Skins',
                'ordering': ['position'],
            },
        ),
        migrations.CreateModel(
            name='RegNames',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('password', models.CharField(blank=True, max_length=50, null=True)),
                ('password_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('password_en', models.CharField(blank=True, max_length=50, null=True)),
                ('re_password', models.CharField(blank=True, max_length=50, null=True)),
                ('re_password_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('re_password_en', models.CharField(blank=True, max_length=50, null=True)),
                ('name', models.CharField(blank=True, max_length=50, null=True)),
                ('name_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('name_en', models.CharField(blank=True, max_length=50, null=True)),
                ('refresh', models.CharField(blank=True, max_length=50, null=True)),
                ('refresh_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('refresh_en', models.CharField(blank=True, max_length=50, null=True)),
                ('passwd_too_simple', models.CharField(blank=True, max_length=250, null=True)),
                ('passwd_too_simple_pl', models.CharField(blank=True, max_length=250, null=True)),
                ('passwd_too_simple_en', models.CharField(blank=True, max_length=250, null=True)),
                ('register', models.CharField(blank=True, max_length=50, null=True)),
                ('register_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('register_en', models.CharField(blank=True, max_length=50, null=True)),
            ],
            options={
                'verbose_name_plural': 'Registry Names',
            },
        ),
        migrations.CreateModel(
            name='RotatorEditorPageNames',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('new_plan', models.CharField(max_length=200)),
                ('new_plan_pl', models.CharField(max_length=200, null=True)),
                ('new_plan_en', models.CharField(max_length=200, null=True)),
                ('new_step', models.CharField(max_length=200)),
                ('new_step_pl', models.CharField(max_length=200, null=True)),
                ('new_step_en', models.CharField(max_length=200, null=True)),
                ('name_plan', models.CharField(max_length=200)),
                ('name_plan_pl', models.CharField(max_length=200, null=True)),
                ('name_plan_en', models.CharField(max_length=200, null=True)),
                ('name_step', models.CharField(max_length=200)),
                ('name_step_pl', models.CharField(max_length=200, null=True)),
                ('name_step_en', models.CharField(max_length=200, null=True)),
                ('plan_remove', models.CharField(max_length=200)),
                ('plan_remove_pl', models.CharField(max_length=200, null=True)),
                ('plan_remove_en', models.CharField(max_length=200, null=True)),
                ('step_remove', models.CharField(max_length=200)),
                ('step_remove_pl', models.CharField(max_length=200, null=True)),
                ('step_remove_en', models.CharField(max_length=200, null=True)),
                ('remove_warning', models.CharField(blank=True, max_length=200, null=True)),
                ('remove_warning_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('remove_warning_en', models.CharField(blank=True, max_length=200, null=True)),
                ('remove_permanent', models.CharField(blank=True, max_length=200, null=True)),
                ('remove_permanent_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('remove_permanent_en', models.CharField(blank=True, max_length=200, null=True)),
                ('dont_remove', models.CharField(blank=True, max_length=200, null=True)),
                ('dont_remove_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('dont_remove_en', models.CharField(blank=True, max_length=200, null=True)),
                ('editme', models.CharField(max_length=200)),
                ('editme_pl', models.CharField(max_length=200, null=True)),
                ('editme_en', models.CharField(max_length=200, null=True)),
                ('switch_places', models.CharField(max_length=200)),
                ('switch_places_pl', models.CharField(max_length=200, null=True)),
                ('switch_places_en', models.CharField(max_length=200, null=True)),
                ('switch_with', models.CharField(max_length=200)),
                ('switch_with_pl', models.CharField(max_length=200, null=True)),
                ('switch_with_en', models.CharField(max_length=200, null=True)),
                ('switch_text', models.CharField(max_length=200)),
                ('switch_text_pl', models.CharField(max_length=200, null=True)),
                ('switch_text_en', models.CharField(max_length=200, null=True)),
                ('u_edit_step_no', models.CharField(max_length=200)),
                ('u_edit_step_no_pl', models.CharField(max_length=200, null=True)),
                ('u_edit_step_no_en', models.CharField(max_length=200, null=True)),
                ('title', models.CharField(max_length=200)),
                ('title_pl', models.CharField(max_length=200, null=True)),
                ('title_en', models.CharField(max_length=200, null=True)),
                ('descr', models.CharField(max_length=200)),
                ('descr_pl', models.CharField(max_length=200, null=True)),
                ('descr_en', models.CharField(max_length=200, null=True)),
                ('early_crop', models.CharField(max_length=200)),
                ('early_crop_pl', models.CharField(max_length=200, null=True)),
                ('early_crop_en', models.CharField(max_length=200, null=True)),
                ('middle_crop', models.CharField(blank=True, max_length=200, null=True)),
                ('middle_crop_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('middle_crop_en', models.CharField(blank=True, max_length=200, null=True)),
                ('late_crop', models.CharField(max_length=200)),
                ('late_crop_pl', models.CharField(max_length=200, null=True)),
                ('late_crop_en', models.CharField(max_length=200, null=True)),
                ('destroy_early_crop', models.CharField(max_length=200)),
                ('destroy_early_crop_pl', models.CharField(max_length=200, null=True)),
                ('destroy_early_crop_en', models.CharField(max_length=200, null=True)),
                ('destroy_middle_crop', models.CharField(blank=True, max_length=200, null=True)),
                ('destroy_middle_crop_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('destroy_middle_crop_en', models.CharField(blank=True, max_length=200, null=True)),
                ('destroy_late_crop', models.CharField(max_length=200)),
                ('destroy_late_crop_pl', models.CharField(max_length=200, null=True)),
                ('destroy_late_crop_en', models.CharField(max_length=200, null=True)),
                ('add_fertilizer', models.CharField(max_length=200)),
                ('add_fertilizer_pl', models.CharField(max_length=200, null=True)),
                ('add_fertilizer_en', models.CharField(max_length=200, null=True)),
                ('add_fertilizer_onhover', models.CharField(max_length=800)),
                ('add_fertilizer_onhover_pl', models.CharField(max_length=800, null=True)),
                ('add_fertilizer_onhover_en', models.CharField(max_length=800, null=True)),
                ('change', models.CharField(max_length=200)),
                ('change_pl', models.CharField(max_length=200, null=True)),
                ('change_en', models.CharField(max_length=200, null=True)),
                ('publish', models.CharField(blank=True, max_length=200, null=True)),
                ('publish_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('publish_en', models.CharField(blank=True, max_length=200, null=True)),
                ('unpublish', models.CharField(blank=True, max_length=200, null=True)),
                ('unpublish_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('unpublish_en', models.CharField(blank=True, max_length=200, null=True)),
                ('publish_text', models.CharField(blank=True, max_length=200, null=True)),
                ('publish_text_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('publish_text_en', models.CharField(blank=True, max_length=200, null=True)),
                ('unpublish_text', models.CharField(blank=True, max_length=200, null=True)),
                ('unpublish_text_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('unpublish_text_en', models.CharField(blank=True, max_length=200, null=True)),
                ('publish_onhover', models.CharField(blank=True, max_length=900, null=True)),
                ('publish_onhover_pl', models.CharField(blank=True, max_length=900, null=True)),
                ('publish_onhover_en', models.CharField(blank=True, max_length=900, null=True)),
                ('unpublish_onhover', models.CharField(blank=True, max_length=900, null=True)),
                ('unpublish_onhover_pl', models.CharField(blank=True, max_length=900, null=True)),
                ('unpublish_onhover_en', models.CharField(blank=True, max_length=900, null=True)),
                ('more_info', models.CharField(blank=True, max_length=900, null=True)),
                ('more_info_pl', models.CharField(blank=True, max_length=900, null=True)),
                ('more_info_en', models.CharField(blank=True, max_length=900, null=True)),
                ('option_select', models.CharField(blank=True, max_length=200, null=True)),
                ('option_select_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('option_select_en', models.CharField(blank=True, max_length=200, null=True)),
                ('in_this_plan', models.CharField(blank=True, max_length=200, null=True)),
                ('in_this_plan_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('in_this_plan_en', models.CharField(blank=True, max_length=200, null=True)),
                ('fabs_and', models.CharField(blank=True, max_length=200, null=True)),
                ('fabs_and_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('fabs_and_en', models.CharField(blank=True, max_length=200, null=True)),
                ('should_be_fabs', models.CharField(blank=True, max_length=200, null=True)),
                ('should_be_fabs_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('should_be_fabs_en', models.CharField(blank=True, max_length=200, null=True)),
                ('error_len', models.CharField(blank=True, max_length=200, null=True)),
                ('error_len_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('error_len_en', models.CharField(blank=True, max_length=200, null=True)),
                ('len_required', models.CharField(blank=True, max_length=200, null=True)),
                ('len_required_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('len_required_en', models.CharField(blank=True, max_length=200, null=True)),
                ('remove_or_add', models.CharField(blank=True, max_length=200, null=True)),
                ('remove_or_add_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('remove_or_add_en', models.CharField(blank=True, max_length=200, null=True)),
                ('plan_limit_reached', models.TextField(blank=True, null=True)),
                ('plan_limit_reached_pl', models.TextField(blank=True, null=True)),
                ('plan_limit_reached_en', models.TextField(blank=True, null=True)),
                ('family', models.CharField(blank=True, max_length=200, null=True)),
                ('family_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('family_en', models.CharField(blank=True, max_length=200, null=True)),
                ('species', models.CharField(blank=True, max_length=200, null=True)),
                ('species_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('species_en', models.CharField(blank=True, max_length=200, null=True)),
                ('sources', models.CharField(blank=True, max_length=200, null=True)),
                ('sources_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('sources_en', models.CharField(blank=True, max_length=200, null=True)),
                ('notes', models.CharField(blank=True, max_length=200, null=True)),
                ('notes_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('notes_en', models.CharField(blank=True, max_length=200, null=True)),
                ('allelopatic_conflict', models.CharField(blank=True, max_length=200, null=True)),
                ('allelopatic_conflict_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('allelopatic_conflict_en', models.CharField(blank=True, max_length=200, null=True)),
                ('harms', models.CharField(blank=True, max_length=200, null=True)),
                ('harms_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('harms_en', models.CharField(blank=True, max_length=200, null=True)),
                ('in_step', models.CharField(blank=True, max_length=200, null=True)),
                ('in_step_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('in_step_en', models.CharField(blank=True, max_length=200, null=True)),
                ('well_cooperates', models.CharField(blank=True, max_length=200, null=True)),
                ('well_cooperates_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('well_cooperates_en', models.CharField(blank=True, max_length=200, null=True)),
                ('collides', models.CharField(blank=True, max_length=200, null=True)),
                ('collides_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('collides_en', models.CharField(blank=True, max_length=200, null=True)),
                ('image_source', models.CharField(blank=True, max_length=200, null=True)),
                ('image_source_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('image_source_en', models.CharField(blank=True, max_length=200, null=True)),
                ('add_fertilizer_main', models.CharField(blank=True, max_length=200, null=True)),
                ('add_fertilizer_main_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('add_fertilizer_main_en', models.CharField(blank=True, max_length=200, null=True)),
                ('add_fertilizer_onhover_main', models.CharField(blank=True, max_length=300, null=True)),
                ('add_fertilizer_onhover_main_pl', models.CharField(blank=True, max_length=300, null=True)),
                ('add_fertilizer_onhover_main_en', models.CharField(blank=True, max_length=300, null=True)),
                ('infl_type', models.CharField(blank=True, max_length=100, null=True)),
                ('infl_type_pl', models.CharField(blank=True, max_length=100, null=True)),
                ('infl_type_en', models.CharField(blank=True, max_length=100, null=True)),
                ('companion', models.CharField(blank=True, max_length=100, null=True)),
                ('companion_pl', models.CharField(blank=True, max_length=100, null=True)),
                ('companion_en', models.CharField(blank=True, max_length=100, null=True)),
                ('following', models.CharField(blank=True, max_length=100, null=True)),
                ('following_pl', models.CharField(blank=True, max_length=100, null=True)),
                ('following_en', models.CharField(blank=True, max_length=100, null=True)),
                ('allelopatic', models.CharField(blank=True, max_length=150, null=True)),
                ('allelopatic_pl', models.CharField(blank=True, max_length=150, null=True)),
                ('allelopatic_en', models.CharField(blank=True, max_length=150, null=True)),
                ('source_button', models.CharField(blank=True, max_length=50, null=True)),
                ('source_button_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('source_button_en', models.CharField(blank=True, max_length=50, null=True)),
                ('known_interactions', models.CharField(blank=True, max_length=200, null=True)),
                ('known_interactions_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('known_interactions_en', models.CharField(blank=True, max_length=200, null=True)),
                ('plant_to_other', models.CharField(blank=True, max_length=200, null=True)),
                ('plant_to_other_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('plant_to_other_en', models.CharField(blank=True, max_length=200, null=True)),
                ('other_to_plant', models.CharField(blank=True, max_length=200, null=True)),
                ('other_to_plant_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('other_to_plant_en', models.CharField(blank=True, max_length=200, null=True)),
                ('family_to_other', models.CharField(blank=True, max_length=200, null=True)),
                ('family_to_other_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('family_to_other_en', models.CharField(blank=True, max_length=200, null=True)),
                ('other_to_family', models.CharField(blank=True, max_length=200, null=True)),
                ('other_to_family_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('other_to_family_en', models.CharField(blank=True, max_length=200, null=True)),
                ('category_to_other', models.CharField(blank=True, max_length=200, null=True)),
                ('category_to_other_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('category_to_other_en', models.CharField(blank=True, max_length=200, null=True)),
                ('other_to_category', models.CharField(blank=True, max_length=200, null=True)),
                ('other_to_category_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('other_to_category_en', models.CharField(blank=True, max_length=200, null=True)),
                ('annual', models.CharField(blank=True, max_length=50, null=True)),
                ('annual_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('annual_en', models.CharField(blank=True, max_length=50, null=True)),
                ('perennial', models.CharField(blank=True, max_length=50, null=True)),
                ('perennial_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('perennial_en', models.CharField(blank=True, max_length=50, null=True)),
                ('evaluate_button', models.CharField(blank=True, max_length=50, null=True)),
                ('evaluate_button_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('evaluate_button_en', models.CharField(blank=True, max_length=50, null=True)),
                ('analysis_by_text', models.CharField(blank=True, max_length=200, null=True)),
                ('analysis_by_text_pl', models.CharField(blank=True, max_length=200, null=True)),
                ('analysis_by_text_en', models.CharField(blank=True, max_length=200, null=True)),
                ('remove_element', models.CharField(blank=True, max_length=50, null=True)),
                ('remove_element_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('remove_element_en', models.CharField(blank=True, max_length=50, null=True)),
                ('add_element', models.CharField(blank=True, max_length=50, null=True)),
                ('add_element_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('add_element_en', models.CharField(blank=True, max_length=50, null=True)),
                ('return_to_plan', models.CharField(blank=True, max_length=50, null=True)),
                ('return_to_plan_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('return_to_plan_en', models.CharField(blank=True, max_length=50, null=True)),
                ('categories', models.CharField(blank=True, max_length=50, null=True)),
                ('categories_pl', models.CharField(blank=True, max_length=50, null=True)),
                ('categories_en', models.CharField(blank=True, max_length=50, null=True)),
            ],
options={
'verbose_name_plural': 'Rotator Editor Page Names',
},
),
migrations.CreateModel(
name='FertilizerDataSource',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('title', models.CharField(max_length=150)),
('title_pl', models.CharField(max_length=150, null=True)),
('title_en', models.CharField(max_length=150, null=True)),
('descr', models.TextField(blank=True, null=True)),
('descr_pl', models.TextField(blank=True, null=True)),
('descr_en', models.TextField(blank=True, null=True)),
('pages_from', models.IntegerField(blank=True, null=True)),
('pages_to', models.IntegerField(blank=True, null=True)),
('at_data_string', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='element_data_string_set', to='strona.elementdatastring')),
('from_fertilizer', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='fertilizer_source_set', to='strona.fertilizer')),
],
options={
'ordering': ['from_fertilizer', 'title'],
},
),
migrations.AddField(
model_name='basicelement',
name='image_source',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='set_image_eds_basic', to='strona.elementdatastring'),
),
]
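The repeated `field` / `field_pl` / `field_en` triplets above follow a `<base>_<language>` column-naming pattern, of the kind produced by translation tooling such as django-modeltranslation (an assumption; the migration itself does not name the tool). A minimal sketch of the convention, using a hypothetical helper name:

```python
# Sketch of the "<base>_<lang>" naming pattern seen in the migration above.
# translated_column_names is a hypothetical helper, not part of the project;
# the language list ("pl", "en") is inferred from the field suffixes.
def translated_column_names(base, languages=("pl", "en")):
    """Return the base column name plus one per-language variant."""
    return [base] + ["{}_{}".format(base, lang) for lang in languages]

print(translated_column_names("known_interactions"))
# ['known_interactions', 'known_interactions_pl', 'known_interactions_en']
```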
# File: ads/tests/stubdata/metrics.py (repo: cespinosa/ads, license: MIT)
# coding=utf-8
"""
Metrics service example response
"""
example_metrics_response = '''{
"basic stats": {
"average number of downloads": 79.0,
"average number of reads": 111.0,
"median number of downloads": 79.0,
"median number of reads": 111.0,
"normalized paper count": 1.0,
"number of papers": 1,
"recent number of downloads": 0,
"recent number of reads": 2,
"total number of downloads": 79,
"total number of reads": 111
},
"basic stats refereed": {
"average number of downloads": 79.0,
"average number of reads": 111.0,
"median number of downloads": 79.0,
"median number of reads": 111.0,
"normalized paper count": 1.0,
"number of papers": 1,
"recent number of downloads": 0,
"recent number of reads": 2,
"total number of downloads": 79,
"total number of reads": 111
},
"citation stats": {
"average number of citations": 66.0,
"average number of refereed citations": 64.0,
"median number of citations": 66.0,
"median number of refereed citations": 64.0,
"normalized number of citations": 66.0,
"normalized number of refereed citations": 64.0,
"number of citing papers": 66,
"number of self-citations": 0,
"total number of citations": 66,
"total number of refereed citations": 64
},
"citation stats refereed": {
"average number of citations": 66.0,
"average number of refereed citations": 64.0,
"median number of citations": 66.0,
"median number of refereed citations": 64.0,
"normalized number of citations": 66.0,
"normalized number of refereed citations": 64.0,
"number of citing papers": 66,
"number of self-citations": 0,
"total number of citations": 66,
"total number of refereed citations": 64
},
"histograms": {
"citations": {
"nonrefereed to nonrefereed": {
"1981": 0,
"1982": 0,
"1983": 0,
"1984": 0,
"1985": 0,
"1986": 0,
"1987": 0,
"1988": 0,
"1989": 0,
"1990": 0,
"1991": 0,
"1992": 0,
"1993": 0,
"1994": 0,
"1995": 0,
"1996": 0,
"1997": 0,
"1998": 0,
"1999": 0,
"2000": 0,
"2001": 0,
"2002": 0,
"2003": 0,
"2004": 0,
"2005": 0,
"2006": 0,
"2007": 0,
"2008": 0,
"2009": 0,
"2010": 0,
"2011": 0,
"2012": 0,
"2013": 0,
"2014": 0,
"2015": 0
},
"nonrefereed to nonrefereed normalized": {
"1981": 0,
"1982": 0,
"1983": 0,
"1984": 0,
"1985": 0,
"1986": 0,
"1987": 0,
"1988": 0,
"1989": 0,
"1990": 0,
"1991": 0,
"1992": 0,
"1993": 0,
"1994": 0,
"1995": 0,
"1996": 0,
"1997": 0,
"1998": 0,
"1999": 0,
"2000": 0,
"2001": 0,
"2002": 0,
"2003": 0,
"2004": 0,
"2005": 0,
"2006": 0,
"2007": 0,
"2008": 0,
"2009": 0,
"2010": 0,
"2011": 0,
"2012": 0,
"2013": 0,
"2014": 0,
"2015": 0
},
"nonrefereed to refereed": {
"1981": 0,
"1982": 0,
"1983": 0,
"1984": 0,
"1985": 0,
"1986": 0,
"1987": 1,
"1988": 0,
"1989": 0,
"1990": 0,
"1991": 0,
"1992": 0,
"1993": 0,
"1994": 0,
"1995": 0,
"1996": 0,
"1997": 0,
"1998": 0,
"1999": 0,
"2000": 1,
"2001": 0,
"2002": 0,
"2003": 0,
"2004": 0,
"2005": 0,
"2006": 0,
"2007": 0,
"2008": 0,
"2009": 0,
"2010": 0,
"2011": 0,
"2012": 0,
"2013": 0,
"2014": 0,
"2015": 0
},
"nonrefereed to refereed normalized": {
"1981": 0,
"1982": 0,
"1983": 0,
"1984": 0,
"1985": 0,
"1986": 0,
"1987": 1.0,
"1988": 0,
"1989": 0,
"1990": 0,
"1991": 0,
"1992": 0,
"1993": 0,
"1994": 0,
"1995": 0,
"1996": 0,
"1997": 0,
"1998": 0,
"1999": 0,
"2000": 1.0,
"2001": 0,
"2002": 0,
"2003": 0,
"2004": 0,
"2005": 0,
"2006": 0,
"2007": 0,
"2008": 0,
"2009": 0,
"2010": 0,
"2011": 0,
"2012": 0,
"2013": 0,
"2014": 0,
"2015": 0
},
"refereed to nonrefereed": {
"1981": 0,
"1982": 0,
"1983": 0,
"1984": 0,
"1985": 0,
"1986": 0,
"1987": 0,
"1988": 0,
"1989": 0,
"1990": 0,
"1991": 0,
"1992": 0,
"1993": 0,
"1994": 0,
"1995": 0,
"1996": 0,
"1997": 0,
"1998": 0,
"1999": 0,
"2000": 0,
"2001": 0,
"2002": 0,
"2003": 0,
"2004": 0,
"2005": 0,
"2006": 0,
"2007": 0,
"2008": 0,
"2009": 0,
"2010": 0,
"2011": 0,
"2012": 0,
"2013": 0,
"2014": 0,
"2015": 0
},
"refereed to nonrefereed normalized": {
"1981": 0,
"1982": 0,
"1983": 0,
"1984": 0,
"1985": 0,
"1986": 0,
"1987": 0,
"1988": 0,
"1989": 0,
"1990": 0,
"1991": 0,
"1992": 0,
"1993": 0,
"1994": 0,
"1995": 0,
"1996": 0,
"1997": 0,
"1998": 0,
"1999": 0,
"2000": 0,
"2001": 0,
"2002": 0,
"2003": 0,
"2004": 0,
"2005": 0,
"2006": 0,
"2007": 0,
"2008": 0,
"2009": 0,
"2010": 0,
"2011": 0,
"2012": 0,
"2013": 0,
"2014": 0,
"2015": 0
},
"refereed to refereed": {
"1981": 4,
"1982": 3,
"1983": 3,
"1984": 3,
"1985": 7,
"1986": 4,
"1987": 6,
"1988": 7,
"1989": 1,
"1990": 4,
"1991": 2,
"1992": 1,
"1993": 1,
"1994": 4,
"1995": 0,
"1996": 2,
"1997": 1,
"1998": 2,
"1999": 3,
"2000": 1,
"2001": 1,
"2002": 1,
"2003": 0,
"2004": 0,
"2005": 0,
"2006": 1,
"2007": 0,
"2008": 0,
"2009": 1,
"2010": 0,
"2011": 0,
"2012": 1,
"2013": 0,
"2014": 0,
"2015": 0
},
"refereed to refereed normalized": {
"1981": 4.0,
"1982": 3.0,
"1983": 3.0,
"1984": 3.0,
"1985": 7.0,
"1986": 4.0,
"1987": 6.0,
"1988": 7.0,
"1989": 1.0,
"1990": 4.0,
"1991": 2.0,
"1992": 1.0,
"1993": 1.0,
"1994": 4.0,
"1995": 0,
"1996": 2.0,
"1997": 1.0,
"1998": 2.0,
"1999": 3.0,
"2000": 1.0,
"2001": 1.0,
"2002": 1.0,
"2003": 0,
"2004": 0,
"2005": 0,
"2006": 1.0,
"2007": 0,
"2008": 0,
"2009": 1.0,
"2010": 0,
"2011": 0,
"2012": 1.0,
"2013": 0,
"2014": 0,
"2015": 0
}
},
"downloads": {
"all downloads": {
"1996": 0,
"1997": 0,
"1998": 0,
"1999": 2,
"2000": 4,
"2001": 2,
"2002": 9,
"2003": 4,
"2004": 12,
"2005": 4,
"2006": 6,
"2007": 3,
"2008": 11,
"2009": 4,
"2010": 2,
"2011": 7,
"2012": 4,
"2013": 2,
"2014": 3,
"2015": 0
},
"all downloads normalized": {
"1996": 0.0,
"1997": 0.0,
"1998": 0.0,
"1999": 2.0,
"2000": 4.0,
"2001": 2.0,
"2002": 9.0,
"2003": 4.0,
"2004": 12.0,
"2005": 4.0,
"2006": 6.0,
"2007": 3.0,
"2008": 11.0,
"2009": 4.0,
"2010": 2.0,
"2011": 7.0,
"2012": 4.0,
"2013": 2.0,
"2014": 3.0,
"2015": 0.0
},
"refereed downloads": {
"1996": 0,
"1997": 0,
"1998": 0,
"1999": 2,
"2000": 4,
"2001": 2,
"2002": 9,
"2003": 4,
"2004": 12,
"2005": 4,
"2006": 6,
"2007": 3,
"2008": 11,
"2009": 4,
"2010": 2,
"2011": 7,
"2012": 4,
"2013": 2,
"2014": 3,
"2015": 0
},
"refereed downloads normalized": {
"1996": 0.0,
"1997": 0.0,
"1998": 0.0,
"1999": 2.0,
"2000": 4.0,
"2001": 2.0,
"2002": 9.0,
"2003": 4.0,
"2004": 12.0,
"2005": 4.0,
"2006": 6.0,
"2007": 3.0,
"2008": 11.0,
"2009": 4.0,
"2010": 2.0,
"2011": 7.0,
"2012": 4.0,
"2013": 2.0,
"2014": 3.0,
"2015": 0.0
}
},
"publications": {
"all publications": {
"1980": 1,
"1981": 0,
"1982": 0,
"1983": 0,
"1984": 0,
"1985": 0,
"1986": 0,
"1987": 0,
"1988": 0,
"1989": 0,
"1990": 0,
"1991": 0,
"1992": 0,
"1993": 0,
"1994": 0,
"1995": 0,
"1996": 0,
"1997": 0,
"1998": 0,
"1999": 0,
"2000": 0,
"2001": 0,
"2002": 0,
"2003": 0,
"2004": 0,
"2005": 0,
"2006": 0,
"2007": 0,
"2008": 0,
"2009": 0,
"2010": 0,
"2011": 0,
"2012": 0,
"2013": 0,
"2014": 0,
"2015": 0
},
"all publications normalized": {
"1980": 1.0,
"1981": 0,
"1982": 0,
"1983": 0,
"1984": 0,
"1985": 0,
"1986": 0,
"1987": 0,
"1988": 0,
"1989": 0,
"1990": 0,
"1991": 0,
"1992": 0,
"1993": 0,
"1994": 0,
"1995": 0,
"1996": 0,
"1997": 0,
"1998": 0,
"1999": 0,
"2000": 0,
"2001": 0,
"2002": 0,
"2003": 0,
"2004": 0,
"2005": 0,
"2006": 0,
"2007": 0,
"2008": 0,
"2009": 0,
"2010": 0,
"2011": 0,
"2012": 0,
"2013": 0,
"2014": 0,
"2015": 0
},
"refereed publications": {
"1980": 1,
"1981": 0,
"1982": 0,
"1983": 0,
"1984": 0,
"1985": 0,
"1986": 0,
"1987": 0,
"1988": 0,
"1989": 0,
"1990": 0,
"1991": 0,
"1992": 0,
"1993": 0,
"1994": 0,
"1995": 0,
"1996": 0,
"1997": 0,
"1998": 0,
"1999": 0,
"2000": 0,
"2001": 0,
"2002": 0,
"2003": 0,
"2004": 0,
"2005": 0,
"2006": 0,
"2007": 0,
"2008": 0,
"2009": 0,
"2010": 0,
"2011": 0,
"2012": 0,
"2013": 0,
"2014": 0,
"2015": 0
},
"refereed publications normalized": {
"1980": 1.0,
"1981": 0,
"1982": 0,
"1983": 0,
"1984": 0,
"1985": 0,
"1986": 0,
"1987": 0,
"1988": 0,
"1989": 0,
"1990": 0,
"1991": 0,
"1992": 0,
"1993": 0,
"1994": 0,
"1995": 0,
"1996": 0,
"1997": 0,
"1998": 0,
"1999": 0,
"2000": 0,
"2001": 0,
"2002": 0,
"2003": 0,
"2004": 0,
"2005": 0,
"2006": 0,
"2007": 0,
"2008": 0,
"2009": 0,
"2010": 0,
"2011": 0,
"2012": 0,
"2013": 0,
"2014": 0,
"2015": 0
}
},
"reads": {
"all reads": {
"1996": 0,
"1997": 0,
"1998": 2,
"1999": 5,
"2000": 4,
"2001": 4,
"2002": 14,
"2003": 5,
"2004": 14,
"2005": 4,
"2006": 8,
"2007": 5,
"2008": 13,
"2009": 7,
"2010": 3,
"2011": 9,
"2012": 5,
"2013": 3,
"2014": 4,
"2015": 2
},
"all reads normalized": {
"1996": 0.0,
"1997": 0.0,
"1998": 2.0,
"1999": 5.0,
"2000": 4.0,
"2001": 4.0,
"2002": 14.0,
"2003": 5.0,
"2004": 14.0,
"2005": 4.0,
"2006": 8.0,
"2007": 5.0,
"2008": 13.0,
"2009": 7.0,
"2010": 3.0,
"2011": 9.0,
"2012": 5.0,
"2013": 3.0,
"2014": 4.0,
"2015": 2.0
},
"refereed reads": {
"1996": 0,
"1997": 0,
"1998": 2,
"1999": 5,
"2000": 4,
"2001": 4,
"2002": 14,
"2003": 5,
"2004": 14,
"2005": 4,
"2006": 8,
"2007": 5,
"2008": 13,
"2009": 7,
"2010": 3,
"2011": 9,
"2012": 5,
"2013": 3,
"2014": 4,
"2015": 2
},
"refereed reads normalized": {
"1996": 0.0,
"1997": 0.0,
"1998": 2.0,
"1999": 5.0,
"2000": 4.0,
"2001": 4.0,
"2002": 14.0,
"2003": 5.0,
"2004": 14.0,
"2005": 4.0,
"2006": 8.0,
"2007": 5.0,
"2008": 13.0,
"2009": 7.0,
"2010": 3.0,
"2011": 9.0,
"2012": 5.0,
"2013": 3.0,
"2014": 4.0,
"2015": 2.0
}
}
},
"indicators": {
"g": 1,
"h": 1,
"i10": 1,
"i100": 0,
"m": 0.027777777777777776,
"read10": 0,
"riq": 47,
"tori": 2.9132693515578443
},
"indicators refereed": {
"g": 1,
"h": 1,
"i10": 1,
"i100": 0,
"m": 0.027777777777777776,
"read10": 0,
"riq": 47,
"tori": 2.9132693515578443
},
"skipped bibcodes": [],
"time series": {
"g": {
"1980": 0,
"1981": 0,
"1982": 0,
"1983": 0,
"1984": 0,
"1985": 0,
"1986": 0,
"1987": 0,
"1988": 0,
"1989": 0,
"1990": 0,
"1991": 0,
"1992": 0,
"1993": 0,
"1994": 0,
"1995": 0,
"1996": 0,
"1997": 0,
"1998": 0,
"1999": 0,
"2000": 0,
"2001": 0,
"2002": 0,
"2003": 0,
"2004": 0,
"2005": 0,
"2006": 0,
"2007": 0,
"2008": 0,
"2009": 0,
"2010": 0,
"2011": 0,
"2012": 0,
"2013": 0,
"2014": 0,
"2015": 0
},
"h": {
"1980": 0,
"1981": 0,
"1982": 0,
"1983": 0,
"1984": 0,
"1985": 0,
"1986": 0,
"1987": 0,
"1988": 0,
"1989": 0,
"1990": 0,
"1991": 0,
"1992": 0,
"1993": 0,
"1994": 0,
"1995": 0,
"1996": 0,
"1997": 0,
"1998": 0,
"1999": 0,
"2000": 0,
"2001": 0,
"2002": 0,
"2003": 0,
"2004": 0,
"2005": 0,
"2006": 0,
"2007": 0,
"2008": 0,
"2009": 0,
"2010": 0,
"2011": 0,
"2012": 0,
"2013": 0,
"2014": 0,
"2015": 0
},
"i10": {
"1980": 0,
"1981": 0,
"1982": 0,
"1983": 1,
"1984": 1,
"1985": 1,
"1986": 1,
"1987": 1,
"1988": 1,
"1989": 1,
"1990": 1,
"1991": 1,
"1992": 1,
"1993": 1,
"1994": 1,
"1995": 1,
"1996": 1,
"1997": 1,
"1998": 1,
"1999": 1,
"2000": 1,
"2001": 1,
"2002": 1,
"2003": 1,
"2004": 1,
"2005": 1,
"2006": 1,
"2007": 1,
"2008": 1,
"2009": 1,
"2010": 1,
"2011": 1,
"2012": 1,
"2013": 1,
"2014": 1,
"2015": 1
},
"i100": {
"1980": 0,
"1981": 0,
"1982": 0,
"1983": 0,
"1984": 0,
"1985": 0,
"1986": 0,
"1987": 0,
"1988": 0,
"1989": 0,
"1990": 0,
"1991": 0,
"1992": 0,
"1993": 0,
"1994": 0,
"1995": 0,
"1996": 0,
"1997": 0,
"1998": 0,
"1999": 0,
"2000": 0,
"2001": 0,
"2002": 0,
"2003": 0,
"2004": 0,
"2005": 0,
"2006": 0,
"2007": 0,
"2008": 0,
"2009": 0,
"2010": 0,
"2011": 0,
"2012": 0,
"2013": 0,
"2014": 0,
"2015": 0
},
"read10": {
"1980": 0.0,
"1981": 0.0,
"1982": 0.0,
"1983": 0.0,
"1984": 0.0,
"1985": 0.0,
"1986": 0.0,
"1987": 0.0,
"1988": 0.0,
"1989": 0.0,
"1990": 0.0,
"1991": 0.0,
"1992": 0.0,
"1993": 0.0,
"1994": 0.0,
"1995": 0.0,
"1996": 0,
"1997": 0,
"1998": 0,
"1999": 0,
"2000": 0,
"2001": 0,
"2002": 0,
"2003": 0,
"2004": 0,
"2005": 0,
"2006": 0,
"2007": 0,
"2008": 0,
"2009": 0,
"2010": 0,
"2011": 0,
"2012": 0,
"2013": 0,
"2014": 0,
"2015": 0.0
},
"tori": {
"1980": 0.0,
"1981": 0.4720238095238095,
"1982": 0.5856664212076583,
"1983": 0.8958737577307843,
"1984": 1.0222508289354417,
"1985": 1.2338053118559293,
"1986": 1.3881897427207974,
"1987": 1.7795326046518947,
"1988": 2.021415312515987,
"1989": 2.061415312515987,
"1990": 2.3636195410360052,
"1991": 2.3976064691405803,
"1992": 2.4270182338464625,
"1993": 2.479649812793831,
"1994": 2.574924230722711,
"1995": 2.574924230722711,
"1996": 2.6006982366947367,
"1997": 2.614983950980451,
"1998": 2.6688457395983374,
"1999": 2.6995311545967735,
"2000": 2.8718000621597985,
"2001": 2.879262748726963,
"2002": 2.8956561913499135,
"2003": 2.8956561913499135,
"2004": 2.8956561913499135,
"2005": 2.8956561913499135,
"2006": 2.902799048492771,
"2007": 2.902799048492771,
"2008": 2.902799048492771,
"2009": 2.9116486060148947,
"2010": 2.9116486060148947,
"2011": 2.9116486060148947,
"2012": 2.9132693515578443,
"2013": 2.9132693515578443,
"2014": 2.9132693515578443,
"2015": 2.9132693515578443
}
}
}'''
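Because the fixture above is stored as a raw JSON string, a test can materialize it with the standard `json` module. The sketch below inlines a reduced stand-in for the full `example_metrics_response` payload:

```python
import json

# Reduced stand-in for the example_metrics_response fixture above.
sample = '''{
    "basic stats": {"number of papers": 1, "total number of reads": 111},
    "indicators": {"g": 1, "h": 1, "i10": 1, "riq": 47}
}'''

metrics = json.loads(sample)
print(metrics["basic stats"]["total number of reads"])  # 111
print(metrics["indicators"]["riq"])  # 47
```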
# File: Pythonjunior2020/Woche1/Aufgabe_1_7_2_Zusatz.py (repo: Zeyecx/HPI-Potsdam, license: MIT)
# 1.7.2 Week 1, Block 7, Exercise 2
# Imports
import time
from turtle import *
shape("turtle")
bgcolor("black")
pencolor("white")
fillcolor("white")
# Base lines (axes)
penup()
goto(0, 200)
pendown()
goto(0, -200)
penup()
goto(-200, 0)
pendown()
goto(200, 0)

# Each quadrant draws 20 string-art chords in 10-pixel steps; the repeated
# penup/goto/pendown/goto blocks collapse into one loop per quadrant.
STEP = 10

# First quadrant: chords from (0, 200 - STEP*i) to (STEP*(i + 1), 0)
for i in range(20):
    penup()
    goto(0, 200 - STEP * i)
    pendown()
    goto(STEP * (i + 1), 0)
# Second quadrant: chords from (200 - STEP*i, 0) to (0, -STEP*(i + 1))
for i in range(20):
    penup()
    goto(200 - STEP * i, 0)
    pendown()
    goto(0, -STEP * (i + 1))
# Third quadrant: chords from (0, -200 + STEP*i) to (-STEP*(i + 1), 0)
for i in range(20):
    penup()
    goto(0, -200 + STEP * i)
    pendown()
    goto(-STEP * (i + 1), 0)
# Fourth quadrant: chords from (-200 + STEP*i, 0) to (0, STEP*(i + 1))
for i in range(20):
    penup()
    goto(-200 + STEP * i, 0)
    pendown()
    goto(0, STEP * (i + 1))
# Pause
time.sleep(10)
# File: tools/zonedbpy/zone_policies.py (repo: facchinm/AceTime, license: MIT)
# This file was generated by the following script:
#
# $ ../tzcompiler.py
# --input_dir /home/brian/dev/tz
# --output_dir /home/brian/src/AceTime/tools/zonedbpy
# --tz_version 2021a
# --action zonedb
# --language python
# --scope extended
# --ignore_buf_size_too_large
# --start_year 1974
# --until_year 2050
#
# using the TZ Database files
#
# africa
# antarctica
# asia
# australasia
# backward
# etcetera
# europe
# northamerica
# southamerica
#
# from https://github.com/eggert/tz/releases/tag/2021a
#
# DO NOT EDIT
# numPolicies: 118
# numRules: 1142
#---------------------------------------------------------------------------
# Policy name: AN
# Rule count: 16
#---------------------------------------------------------------------------
ZONE_RULES_AN = [
# Rule AN 1971 1985 - Oct lastSun 2:00s 1:00 D
{
'from_year': 1971,
'to_year': 1985,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AN 1972 only - Feb 27 2:00s 0 S
{
'from_year': 1972,
'to_year': 1972,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 27,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AN 1973 1981 - Mar Sun>=1 2:00s 0 S
{
'from_year': 1973,
'to_year': 1981,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AN 1982 only - Apr Sun>=1 2:00s 0 S
{
'from_year': 1982,
'to_year': 1982,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AN 1983 1985 - Mar Sun>=1 2:00s 0 S
{
'from_year': 1983,
'to_year': 1985,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AN 1986 1989 - Mar Sun>=15 2:00s 0 S
{
'from_year': 1986,
'to_year': 1989,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AN 1986 only - Oct 19 2:00s 1:00 D
{
'from_year': 1986,
'to_year': 1986,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 19,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AN 1987 1999 - Oct lastSun 2:00s 1:00 D
{
'from_year': 1987,
'to_year': 1999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AN 1990 1995 - Mar Sun>=1 2:00s 0 S
{
'from_year': 1990,
'to_year': 1995,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AN 1996 2005 - Mar lastSun 2:00s 0 S
{
'from_year': 1996,
'to_year': 2005,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AN 2000 only - Aug lastSun 2:00s 1:00 D
{
'from_year': 2000,
'to_year': 2000,
'in_month': 8,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AN 2001 2007 - Oct lastSun 2:00s 1:00 D
{
'from_year': 2001,
'to_year': 2007,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AN 2006 only - Apr Sun>=1 2:00s 0 S
{
'from_year': 2006,
'to_year': 2006,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AN 2007 only - Mar lastSun 2:00s 0 S
{
'from_year': 2007,
'to_year': 2007,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AN 2008 max - Apr Sun>=1 2:00s 0 S
{
'from_year': 2008,
'to_year': 9999,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AN 2008 max - Oct Sun>=1 2:00s 1:00 D
{
'from_year': 2008,
'to_year': 9999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
]
ZONE_POLICY_AN = {
'name': 'AN',
'rules': ZONE_RULES_AN
}
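Comparing the generated dicts with their source comments above shows how each TZ rule's ON field is encoded: `lastSun` becomes `on_day_of_week=7, on_day_of_month=0`, `Sun>=15` becomes `7, 15`, and a bare day such as `19` becomes `0, 19`. A decoding sketch inferred from those pairings (not taken from tzcompiler.py itself):

```python
# Map weekday codes back to names; 0 means "no weekday constraint".
# The mapping is inferred from the rule comments above.
WEEKDAY_NAMES = {1: "Mon", 2: "Tue", 3: "Wed", 4: "Thu",
                 5: "Fri", 6: "Sat", 7: "Sun"}

def describe_on_day(on_day_of_week, on_day_of_month):
    """Render an (on_day_of_week, on_day_of_month) pair as TZ 'ON' syntax."""
    if on_day_of_week == 0:
        return str(on_day_of_month)                    # e.g. "19"
    if on_day_of_month == 0:
        return "last" + WEEKDAY_NAMES[on_day_of_week]  # e.g. "lastSun"
    return "{}>={}".format(WEEKDAY_NAMES[on_day_of_week], on_day_of_month)

print(describe_on_day(7, 0))   # lastSun
print(describe_on_day(7, 15))  # Sun>=15
print(describe_on_day(0, 19))  # 19
```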
#---------------------------------------------------------------------------
# Policy name: AQ
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_AQ = [
# Rule AQ 1972 only - Feb lastSun 2:00s 0 S
{
'from_year': 1972,
'to_year': 1972,
'in_month': 2,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AQ 1989 1991 - Oct lastSun 2:00s 1:00 D
{
'from_year': 1989,
'to_year': 1991,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AQ 1990 1992 - Mar Sun>=1 2:00s 0 S
{
'from_year': 1990,
'to_year': 1992,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_AQ = {
'name': 'AQ',
'rules': ZONE_RULES_AQ
}
#---------------------------------------------------------------------------
# Policy name: AS
# Rule count: 15
#---------------------------------------------------------------------------
ZONE_RULES_AS = [
# Rule AS 1971 1985 - Oct lastSun 2:00s 1:00 D
{
'from_year': 1971,
'to_year': 1985,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AS 1986 only - Oct 19 2:00s 1:00 D
{
'from_year': 1986,
'to_year': 1986,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 19,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AS 1987 2007 - Oct lastSun 2:00s 1:00 D
{
'from_year': 1987,
'to_year': 2007,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AS 1972 only - Feb 27 2:00s 0 S
{
'from_year': 1972,
'to_year': 1972,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 27,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AS 1973 1985 - Mar Sun>=1 2:00s 0 S
{
'from_year': 1973,
'to_year': 1985,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AS 1986 1990 - Mar Sun>=15 2:00s 0 S
{
'from_year': 1986,
'to_year': 1990,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AS 1991 only - Mar 3 2:00s 0 S
{
'from_year': 1991,
'to_year': 1991,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 3,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AS 1992 only - Mar 22 2:00s 0 S
{
'from_year': 1992,
'to_year': 1992,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 22,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AS 1993 only - Mar 7 2:00s 0 S
{
'from_year': 1993,
'to_year': 1993,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 7,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AS 1994 only - Mar 20 2:00s 0 S
{
'from_year': 1994,
'to_year': 1994,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AS 1995 2005 - Mar lastSun 2:00s 0 S
{
'from_year': 1995,
'to_year': 2005,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AS 2006 only - Apr 2 2:00s 0 S
{
'from_year': 2006,
'to_year': 2006,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AS 2007 only - Mar lastSun 2:00s 0 S
{
'from_year': 2007,
'to_year': 2007,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AS 2008 max - Apr Sun>=1 2:00s 0 S
{
'from_year': 2008,
'to_year': 9999,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AS 2008 max - Oct Sun>=1 2:00s 1:00 D
{
'from_year': 2008,
'to_year': 9999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
]
ZONE_POLICY_AS = {
'name': 'AS',
'rules': ZONE_RULES_AS
}
#---------------------------------------------------------------------------
# Policy name: AT
# Rule count: 16
#---------------------------------------------------------------------------
ZONE_RULES_AT = [
# Rule AT 1968 1985 - Oct lastSun 2:00s 1:00 D
{
'from_year': 1968,
'to_year': 1985,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AT 1972 only - Feb lastSun 2:00s 0 S
{
'from_year': 1972,
'to_year': 1972,
'in_month': 2,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AT 1973 1981 - Mar Sun>=1 2:00s 0 S
{
'from_year': 1973,
'to_year': 1981,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AT 1982 1983 - Mar lastSun 2:00s 0 S
{
'from_year': 1982,
'to_year': 1983,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AT 1984 1986 - Mar Sun>=1 2:00s 0 S
{
'from_year': 1984,
'to_year': 1986,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AT 1986 only - Oct Sun>=15 2:00s 1:00 D
{
'from_year': 1986,
'to_year': 1986,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AT 1987 1990 - Mar Sun>=15 2:00s 0 S
{
'from_year': 1987,
'to_year': 1990,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AT 1987 only - Oct Sun>=22 2:00s 1:00 D
{
'from_year': 1987,
'to_year': 1987,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 22,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AT 1988 1990 - Oct lastSun 2:00s 1:00 D
{
'from_year': 1988,
'to_year': 1990,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AT 1991 1999 - Oct Sun>=1 2:00s 1:00 D
{
'from_year': 1991,
'to_year': 1999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AT 1991 2005 - Mar lastSun 2:00s 0 S
{
'from_year': 1991,
'to_year': 2005,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AT 2000 only - Aug lastSun 2:00s 1:00 D
{
'from_year': 2000,
'to_year': 2000,
'in_month': 8,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AT 2001 max - Oct Sun>=1 2:00s 1:00 D
{
'from_year': 2001,
'to_year': 9999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AT 2006 only - Apr Sun>=1 2:00s 0 S
{
'from_year': 2006,
'to_year': 2006,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AT 2007 only - Mar lastSun 2:00s 0 S
{
'from_year': 2007,
'to_year': 2007,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AT 2008 max - Apr Sun>=1 2:00s 0 S
{
'from_year': 2008,
'to_year': 9999,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_AT = {
'name': 'AT',
'rules': ZONE_RULES_AT
}
#---------------------------------------------------------------------------
# Policy name: AV
# Rule count: 14
#---------------------------------------------------------------------------
ZONE_RULES_AV = [
# Rule AV 1971 1985 - Oct lastSun 2:00s 1:00 D
{
'from_year': 1971,
'to_year': 1985,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AV 1972 only - Feb lastSun 2:00s 0 S
{
'from_year': 1972,
'to_year': 1972,
'in_month': 2,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AV 1973 1985 - Mar Sun>=1 2:00s 0 S
{
'from_year': 1973,
'to_year': 1985,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AV 1986 1990 - Mar Sun>=15 2:00s 0 S
{
'from_year': 1986,
'to_year': 1990,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AV 1986 1987 - Oct Sun>=15 2:00s 1:00 D
{
'from_year': 1986,
'to_year': 1987,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AV 1988 1999 - Oct lastSun 2:00s 1:00 D
{
'from_year': 1988,
'to_year': 1999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AV 1991 1994 - Mar Sun>=1 2:00s 0 S
{
'from_year': 1991,
'to_year': 1994,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AV 1995 2005 - Mar lastSun 2:00s 0 S
{
'from_year': 1995,
'to_year': 2005,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AV 2000 only - Aug lastSun 2:00s 1:00 D
{
'from_year': 2000,
'to_year': 2000,
'in_month': 8,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AV 2001 2007 - Oct lastSun 2:00s 1:00 D
{
'from_year': 2001,
'to_year': 2007,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AV 2006 only - Apr Sun>=1 2:00s 0 S
{
'from_year': 2006,
'to_year': 2006,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AV 2007 only - Mar lastSun 2:00s 0 S
{
'from_year': 2007,
'to_year': 2007,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AV 2008 max - Apr Sun>=1 2:00s 0 S
{
'from_year': 2008,
'to_year': 9999,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AV 2008 max - Oct Sun>=1 2:00s 1:00 D
{
'from_year': 2008,
'to_year': 9999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
]
ZONE_POLICY_AV = {
'name': 'AV',
'rules': ZONE_RULES_AV
}
#---------------------------------------------------------------------------
# Policy name: AW
# Rule count: 10
#---------------------------------------------------------------------------
ZONE_RULES_AW = [
# Anchor: Rule AW 1975 only - Mar Sun>=1 2:00s 0 S
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AW 1974 only - Oct lastSun 2:00s 1:00 D
{
'from_year': 1974,
'to_year': 1974,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AW 1975 only - Mar Sun>=1 2:00s 0 S
{
'from_year': 1975,
'to_year': 1975,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AW 1983 only - Oct lastSun 2:00s 1:00 D
{
'from_year': 1983,
'to_year': 1983,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AW 1984 only - Mar Sun>=1 2:00s 0 S
{
'from_year': 1984,
'to_year': 1984,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AW 1991 only - Nov 17 2:00s 1:00 D
{
'from_year': 1991,
'to_year': 1991,
'in_month': 11,
'on_day_of_week': 0,
'on_day_of_month': 17,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AW 1992 only - Mar Sun>=1 2:00s 0 S
{
'from_year': 1992,
'to_year': 1992,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AW 2006 only - Dec 3 2:00s 1:00 D
{
'from_year': 2006,
'to_year': 2006,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 3,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule AW 2007 2009 - Mar lastSun 2:00s 0 S
{
'from_year': 2007,
'to_year': 2009,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule AW 2007 2008 - Oct lastSun 2:00s 1:00 D
{
'from_year': 2007,
'to_year': 2008,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
]
ZONE_POLICY_AW = {
'name': 'AW',
'rules': ZONE_RULES_AW
}
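#---------------------------------------------------------------------------
# Illustrative sketch (an assumption, not generated output): some policies
# above, such as ZONE_POLICY_AW, begin with an "Anchor" placeholder whose
# from_year/to_year are 0; it records the default delta_seconds and letter
# in effect before the first real rule. A consumer selecting the rules
# active in a given year would typically skip those sentinels. The
# active_rules() helper below is hypothetical, written only to show how the
# year-0 convention might be handled.
#---------------------------------------------------------------------------
def active_rules(policy, year):
    """Yield the rules of `policy` whose [from_year, to_year] span covers
    `year`, skipping year-0 anchor placeholders."""
    for rule in policy['rules']:
        if rule['from_year'] == 0:
            continue  # anchor placeholder, not a real transition
        if rule['from_year'] <= year <= rule['to_year']:
            yield rule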
#---------------------------------------------------------------------------
# Policy name: Albania
# Rule count: 22
#---------------------------------------------------------------------------
ZONE_RULES_Albania = [
# Rule Albania 1943 only - Apr 10 3:00 0 -
{
'from_year': 1943,
'to_year': 1943,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 10,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Albania 1974 only - May 4 0:00 1:00 S
{
'from_year': 1974,
'to_year': 1974,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 4,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Albania 1974 only - Oct 2 0:00 0 -
{
'from_year': 1974,
'to_year': 1974,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Albania 1975 only - May 1 0:00 1:00 S
{
'from_year': 1975,
'to_year': 1975,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Albania 1975 only - Oct 2 0:00 0 -
{
'from_year': 1975,
'to_year': 1975,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Albania 1976 only - May 2 0:00 1:00 S
{
'from_year': 1976,
'to_year': 1976,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Albania 1976 only - Oct 3 0:00 0 -
{
'from_year': 1976,
'to_year': 1976,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 3,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Albania 1977 only - May 8 0:00 1:00 S
{
'from_year': 1977,
'to_year': 1977,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 8,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Albania 1977 only - Oct 2 0:00 0 -
{
'from_year': 1977,
'to_year': 1977,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Albania 1978 only - May 6 0:00 1:00 S
{
'from_year': 1978,
'to_year': 1978,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 6,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Albania 1978 only - Oct 1 0:00 0 -
{
'from_year': 1978,
'to_year': 1978,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Albania 1979 only - May 5 0:00 1:00 S
{
'from_year': 1979,
'to_year': 1979,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 5,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Albania 1979 only - Sep 30 0:00 0 -
{
'from_year': 1979,
'to_year': 1979,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Albania 1980 only - May 3 0:00 1:00 S
{
'from_year': 1980,
'to_year': 1980,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 3,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Albania 1980 only - Oct 4 0:00 0 -
{
'from_year': 1980,
'to_year': 1980,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 4,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Albania 1981 only - Apr 26 0:00 1:00 S
{
'from_year': 1981,
'to_year': 1981,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 26,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Albania 1981 only - Sep 27 0:00 0 -
{
'from_year': 1981,
'to_year': 1981,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 27,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Albania 1982 only - May 2 0:00 1:00 S
{
'from_year': 1982,
'to_year': 1982,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Albania 1982 only - Oct 3 0:00 0 -
{
'from_year': 1982,
'to_year': 1982,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 3,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Albania 1983 only - Apr 18 0:00 1:00 S
{
'from_year': 1983,
'to_year': 1983,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 18,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Albania 1983 only - Oct 1 0:00 0 -
{
'from_year': 1983,
'to_year': 1983,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Albania 1984 only - Apr 1 0:00 1:00 S
{
'from_year': 1984,
'to_year': 1984,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
]
ZONE_POLICY_Albania = {
'name': 'Albania',
'rules': ZONE_RULES_Albania
}
#---------------------------------------------------------------------------
# Policy name: Algeria
# Rule count: 7
#---------------------------------------------------------------------------
ZONE_RULES_Algeria = [
# Rule Algeria 1971 only - Sep 26 23:00s 0 -
{
'from_year': 1971,
'to_year': 1971,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 26,
'at_seconds': 82800,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Algeria 1977 only - May 6 0:00 1:00 S
{
'from_year': 1977,
'to_year': 1977,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 6,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Algeria 1977 only - Oct 21 0:00 0 -
{
'from_year': 1977,
'to_year': 1977,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Algeria 1978 only - Mar 24 1:00 1:00 S
{
'from_year': 1978,
'to_year': 1978,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 24,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Algeria 1978 only - Sep 22 3:00 0 -
{
'from_year': 1978,
'to_year': 1978,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 22,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Algeria 1980 only - Apr 25 0:00 1:00 S
{
'from_year': 1980,
'to_year': 1980,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 25,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Algeria 1980 only - Oct 31 2:00 0 -
{
'from_year': 1980,
'to_year': 1980,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 31,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Algeria = {
'name': 'Algeria',
'rules': ZONE_RULES_Algeria
}
#---------------------------------------------------------------------------
# Policy name: Arg
# Rule count: 11
#---------------------------------------------------------------------------
ZONE_RULES_Arg = [
# Rule Arg 1968 1969 - Apr Sun>=1 0:00 0 -
{
'from_year': 1968,
'to_year': 1969,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Arg 1974 only - Jan 23 0:00 1:00 -
{
'from_year': 1974,
'to_year': 1974,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 23,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Arg 1974 only - May 1 0:00 0 -
{
'from_year': 1974,
'to_year': 1974,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Arg 1988 only - Dec 1 0:00 1:00 -
{
'from_year': 1988,
'to_year': 1988,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Arg 1989 1993 - Mar Sun>=1 0:00 0 -
{
'from_year': 1989,
'to_year': 1993,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Arg 1989 1992 - Oct Sun>=15 0:00 1:00 -
{
'from_year': 1989,
'to_year': 1992,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Arg 1999 only - Oct Sun>=1 0:00 1:00 -
{
'from_year': 1999,
'to_year': 1999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Arg 2000 only - Mar 3 0:00 0 -
{
'from_year': 2000,
'to_year': 2000,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 3,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Arg 2007 only - Dec 30 0:00 1:00 -
{
'from_year': 2007,
'to_year': 2007,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Arg 2008 2009 - Mar Sun>=15 0:00 0 -
{
'from_year': 2008,
'to_year': 2009,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Arg 2008 only - Oct Sun>=15 0:00 1:00 -
{
'from_year': 2008,
'to_year': 2008,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
]
ZONE_POLICY_Arg = {
'name': 'Arg',
'rules': ZONE_RULES_Arg
}
#---------------------------------------------------------------------------
# Policy name: Armenia
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_Armenia = [
# Anchor: Rule Armenia 2011 only - Oct lastSun 2:00s 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Armenia 2011 only - Mar lastSun 2:00s 1:00 -
{
'from_year': 2011,
'to_year': 2011,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Armenia 2011 only - Oct lastSun 2:00s 0 -
{
'from_year': 2011,
'to_year': 2011,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Armenia = {
'name': 'Armenia',
'rules': ZONE_RULES_Armenia
}
#---------------------------------------------------------------------------
# Policy name: Aus
# Rule count: 1
#---------------------------------------------------------------------------
ZONE_RULES_Aus = [
# Rule Aus 1943 1944 - Mar lastSun 2:00s 0 S
{
'from_year': 1943,
'to_year': 1944,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Aus = {
'name': 'Aus',
'rules': ZONE_RULES_Aus
}
#---------------------------------------------------------------------------
# Policy name: Austria
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_Austria = [
# Rule Austria 1947 1948 - Oct Sun>=1 2:00s 0 -
{
'from_year': 1947,
'to_year': 1948,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Austria 1980 only - Apr 6 0:00 1:00 S
{
'from_year': 1980,
'to_year': 1980,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 6,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Austria 1980 only - Sep 28 0:00 0 -
{
'from_year': 1980,
'to_year': 1980,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 28,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Austria = {
'name': 'Austria',
'rules': ZONE_RULES_Austria
}
#---------------------------------------------------------------------------
# Policy name: Azer
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_Azer = [
# Anchor: Rule Azer 1997 2015 - Oct lastSun 5:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Azer 1997 2015 - Mar lastSun 4:00 1:00 -
{
'from_year': 1997,
'to_year': 2015,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 14400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Azer 1997 2015 - Oct lastSun 5:00 0 -
{
'from_year': 1997,
'to_year': 2015,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 18000,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Azer = {
'name': 'Azer',
'rules': ZONE_RULES_Azer
}
#---------------------------------------------------------------------------
# Policy name: Bahamas
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_Bahamas = [
# Rule Bahamas 1945 only - Oct 17 24:00 0 S
{
'from_year': 1945,
'to_year': 1945,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 17,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Bahamas 1964 1975 - Oct lastSun 2:00 0 S
{
'from_year': 1964,
'to_year': 1975,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Bahamas 1964 1975 - Apr lastSun 2:00 1:00 D
{
'from_year': 1964,
'to_year': 1975,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
]
ZONE_POLICY_Bahamas = {
'name': 'Bahamas',
'rules': ZONE_RULES_Bahamas
}
#---------------------------------------------------------------------------
# Policy name: Barb
# Rule count: 6
#---------------------------------------------------------------------------
ZONE_RULES_Barb = [
# Anchor: Rule Barb 1977 1978 - Oct Sun>=1 2:00 0 S
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Barb 1977 only - Jun 12 2:00 1:00 D
{
'from_year': 1977,
'to_year': 1977,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 12,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Barb 1977 1978 - Oct Sun>=1 2:00 0 S
{
'from_year': 1977,
'to_year': 1978,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Barb 1978 1980 - Apr Sun>=15 2:00 1:00 D
{
'from_year': 1978,
'to_year': 1980,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Barb 1979 only - Sep 30 2:00 0 S
{
'from_year': 1979,
'to_year': 1979,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Barb 1980 only - Sep 25 2:00 0 S
{
'from_year': 1980,
'to_year': 1980,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 25,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Barb = {
'name': 'Barb',
'rules': ZONE_RULES_Barb
}
#---------------------------------------------------------------------------
# Policy name: Belgium
# Rule count: 1
#---------------------------------------------------------------------------
ZONE_RULES_Belgium = [
# Rule Belgium 1946 only - Oct 7 2:00s 0 -
{
'from_year': 1946,
'to_year': 1946,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 7,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Belgium = {
'name': 'Belgium',
'rules': ZONE_RULES_Belgium
}
#---------------------------------------------------------------------------
# Policy name: Belize
# Rule count: 5
#---------------------------------------------------------------------------
ZONE_RULES_Belize = [
# Rule Belize 1948 1968 - Feb Sat>=8 24:00 0 CST
{
'from_year': 1948,
'to_year': 1968,
'in_month': 2,
'on_day_of_week': 6,
'on_day_of_month': 8,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'CST',
},
# Rule Belize 1973 only - Dec 5 0:00 1:00 CDT
{
'from_year': 1973,
'to_year': 1973,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 5,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'CDT',
},
# Rule Belize 1974 only - Feb 9 0:00 0 CST
{
'from_year': 1974,
'to_year': 1974,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 9,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'CST',
},
# Rule Belize 1982 only - Dec 18 0:00 1:00 CDT
{
'from_year': 1982,
'to_year': 1982,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 18,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'CDT',
},
# Rule Belize 1983 only - Feb 12 0:00 0 CST
{
'from_year': 1983,
'to_year': 1983,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 12,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'CST',
},
]
ZONE_POLICY_Belize = {
'name': 'Belize',
'rules': ZONE_RULES_Belize
}
#---------------------------------------------------------------------------
# Policy name: Bermuda
# Rule count: 1
#---------------------------------------------------------------------------
ZONE_RULES_Bermuda = [
# Rule Bermuda 1956 only - Oct lastSun 2:00 0 S
{
'from_year': 1956,
'to_year': 1956,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Bermuda = {
'name': 'Bermuda',
'rules': ZONE_RULES_Bermuda
}
#---------------------------------------------------------------------------
# Policy name: Brazil
# Rule count: 44
#---------------------------------------------------------------------------
ZONE_RULES_Brazil = [
# Rule Brazil 1966 1968 - Mar 1 0:00 0 -
{
'from_year': 1966,
'to_year': 1968,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 1985 only - Nov 2 0:00 1:00 -
{
'from_year': 1985,
'to_year': 1985,
'in_month': 11,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Brazil 1986 only - Mar 15 0:00 0 -
{
'from_year': 1986,
'to_year': 1986,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 1986 only - Oct 25 0:00 1:00 -
{
'from_year': 1986,
'to_year': 1986,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 25,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Brazil 1987 only - Feb 14 0:00 0 -
{
'from_year': 1987,
'to_year': 1987,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 14,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 1987 only - Oct 25 0:00 1:00 -
{
'from_year': 1987,
'to_year': 1987,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 25,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Brazil 1988 only - Feb 7 0:00 0 -
{
'from_year': 1988,
'to_year': 1988,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 7,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 1988 only - Oct 16 0:00 1:00 -
{
'from_year': 1988,
'to_year': 1988,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 16,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Brazil 1989 only - Jan 29 0:00 0 -
{
'from_year': 1989,
'to_year': 1989,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 29,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 1989 only - Oct 15 0:00 1:00 -
{
'from_year': 1989,
'to_year': 1989,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Brazil 1990 only - Feb 11 0:00 0 -
{
'from_year': 1990,
'to_year': 1990,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 11,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 1990 only - Oct 21 0:00 1:00 -
{
'from_year': 1990,
'to_year': 1990,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Brazil 1991 only - Feb 17 0:00 0 -
{
'from_year': 1991,
'to_year': 1991,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 17,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 1991 only - Oct 20 0:00 1:00 -
{
'from_year': 1991,
'to_year': 1991,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Brazil 1992 only - Feb 9 0:00 0 -
{
'from_year': 1992,
'to_year': 1992,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 9,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 1992 only - Oct 25 0:00 1:00 -
{
'from_year': 1992,
'to_year': 1992,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 25,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Brazil 1993 only - Jan 31 0:00 0 -
{
'from_year': 1993,
'to_year': 1993,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 31,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 1993 1995 - Oct Sun>=11 0:00 1:00 -
{
'from_year': 1993,
'to_year': 1995,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 11,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Brazil 1994 1995 - Feb Sun>=15 0:00 0 -
{
'from_year': 1994,
'to_year': 1995,
'in_month': 2,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 1996 only - Feb 11 0:00 0 -
{
'from_year': 1996,
'to_year': 1996,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 11,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 1996 only - Oct 6 0:00 1:00 -
{
'from_year': 1996,
'to_year': 1996,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 6,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Brazil 1997 only - Feb 16 0:00 0 -
{
'from_year': 1997,
'to_year': 1997,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 16,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 1997 only - Oct 6 0:00 1:00 -
{
'from_year': 1997,
'to_year': 1997,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 6,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Brazil 1998 only - Mar 1 0:00 0 -
{
'from_year': 1998,
'to_year': 1998,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 1998 only - Oct 11 0:00 1:00 -
{
'from_year': 1998,
'to_year': 1998,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 11,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Brazil 1999 only - Feb 21 0:00 0 -
{
'from_year': 1999,
'to_year': 1999,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 1999 only - Oct 3 0:00 1:00 -
{
'from_year': 1999,
'to_year': 1999,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 3,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Brazil 2000 only - Feb 27 0:00 0 -
{
'from_year': 2000,
'to_year': 2000,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 27,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 2000 2001 - Oct Sun>=8 0:00 1:00 -
{
'from_year': 2000,
'to_year': 2001,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Brazil 2001 2006 - Feb Sun>=15 0:00 0 -
{
'from_year': 2001,
'to_year': 2006,
'in_month': 2,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 2002 only - Nov 3 0:00 1:00 -
{
'from_year': 2002,
'to_year': 2002,
'in_month': 11,
'on_day_of_week': 0,
'on_day_of_month': 3,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Brazil 2003 only - Oct 19 0:00 1:00 -
{
'from_year': 2003,
'to_year': 2003,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 19,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Brazil 2004 only - Nov 2 0:00 1:00 -
{
'from_year': 2004,
'to_year': 2004,
'in_month': 11,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Brazil 2005 only - Oct 16 0:00 1:00 -
{
'from_year': 2005,
'to_year': 2005,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 16,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Brazil 2006 only - Nov 5 0:00 1:00 -
{
'from_year': 2006,
'to_year': 2006,
'in_month': 11,
'on_day_of_week': 0,
'on_day_of_month': 5,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Brazil 2007 only - Feb 25 0:00 0 -
{
'from_year': 2007,
'to_year': 2007,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 25,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 2007 only - Oct Sun>=8 0:00 1:00 -
{
'from_year': 2007,
'to_year': 2007,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Brazil 2008 2017 - Oct Sun>=15 0:00 1:00 -
{
'from_year': 2008,
'to_year': 2017,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Brazil 2008 2011 - Feb Sun>=15 0:00 0 -
{
'from_year': 2008,
'to_year': 2011,
'in_month': 2,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 2012 only - Feb Sun>=22 0:00 0 -
{
'from_year': 2012,
'to_year': 2012,
'in_month': 2,
'on_day_of_week': 7,
'on_day_of_month': 22,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 2013 2014 - Feb Sun>=15 0:00 0 -
{
'from_year': 2013,
'to_year': 2014,
'in_month': 2,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 2015 only - Feb Sun>=22 0:00 0 -
{
'from_year': 2015,
'to_year': 2015,
'in_month': 2,
'on_day_of_week': 7,
'on_day_of_month': 22,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 2016 2019 - Feb Sun>=15 0:00 0 -
{
'from_year': 2016,
'to_year': 2019,
'in_month': 2,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Brazil 2018 only - Nov Sun>=1 0:00 1:00 -
{
'from_year': 2018,
'to_year': 2018,
'in_month': 11,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
]
ZONE_POLICY_Brazil = {
'name': 'Brazil',
'rules': ZONE_RULES_Brazil
}
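# The rule encoding above can be decoded as follows (a minimal illustrative
# sketch, not part of the generated data; the helper name is hypothetical):
# on_day_of_week == 0 with a positive on_day_of_month is an exact date
# ("Feb 17"); on_day_of_week == 7 with a positive on_day_of_month is
# "Sun>=N"; on_day_of_week == 7 with on_day_of_month == 0 is "lastSun".

```python
import calendar
import datetime


def rule_transition_date(year, in_month, on_day_of_week, on_day_of_month):
    """Resolve a rule's (on_day_of_week, on_day_of_month) pair to a date.

    Assumed encoding, inferred from the generated comments above:
      * on_day_of_week == 0: the rule fires on exactly on_day_of_month.
      * on_day_of_week != 0, on_day_of_month > 0: "Sun>=N" style, i.e. the
        first matching weekday on or after day N (7 == Sunday, matching
        Python's isoweekday()).
      * on_day_of_week != 0, on_day_of_month == 0: "lastSun" style, i.e. the
        last matching weekday of the month.
    """
    if on_day_of_week == 0:
        return datetime.date(year, in_month, on_day_of_month)
    if on_day_of_month == 0:
        # "lastSun": search backwards from the last day of the month.
        day = calendar.monthrange(year, in_month)[1]
        step = -1
    else:
        # "Sun>=N": search forwards from day N.
        day = on_day_of_month
        step = 1
    d = datetime.date(year, in_month, day)
    while d.isoweekday() != on_day_of_week:
        d += datetime.timedelta(days=step)
    return d
```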
#---------------------------------------------------------------------------
# Policy name: Bulg
# Rule count: 6
#---------------------------------------------------------------------------
ZONE_RULES_Bulg = [
# Anchor: Rule Bulg 1979 only - Oct 1 1:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Bulg 1979 only - Mar 31 23:00 1:00 S
{
'from_year': 1979,
'to_year': 1979,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 31,
'at_seconds': 82800,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Bulg 1979 only - Oct 1 1:00 0 -
{
'from_year': 1979,
'to_year': 1979,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Bulg 1980 1982 - Apr Sat>=1 23:00 1:00 S
{
'from_year': 1980,
'to_year': 1982,
'in_month': 4,
'on_day_of_week': 6,
'on_day_of_month': 1,
'at_seconds': 82800,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Bulg 1980 only - Sep 29 1:00 0 -
{
'from_year': 1980,
'to_year': 1980,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 29,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Bulg 1981 only - Sep 27 2:00 0 -
{
'from_year': 1981,
'to_year': 1981,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 27,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Bulg = {
'name': 'Bulg',
'rules': ZONE_RULES_Bulg
}
#---------------------------------------------------------------------------
# Policy name: C_Eur
# Rule count: 7
#---------------------------------------------------------------------------
ZONE_RULES_C_Eur = [
# Rule C-Eur 1945 only - Sep 16 2:00s 0 -
{
'from_year': 1945,
'to_year': 1945,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 16,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule C-Eur 1977 1980 - Apr Sun>=1 2:00s 1:00 S
{
'from_year': 1977,
'to_year': 1980,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule C-Eur 1977 only - Sep lastSun 2:00s 0 -
{
'from_year': 1977,
'to_year': 1977,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule C-Eur 1978 only - Oct 1 2:00s 0 -
{
'from_year': 1978,
'to_year': 1978,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule C-Eur 1979 1995 - Sep lastSun 2:00s 0 -
{
'from_year': 1979,
'to_year': 1995,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule C-Eur 1981 max - Mar lastSun 2:00s 1:00 S
{
'from_year': 1981,
'to_year': 9999,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule C-Eur 1996 max - Oct lastSun 2:00s 0 -
{
'from_year': 1996,
'to_year': 9999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_C_Eur = {
'name': 'C_Eur',
'rules': ZONE_RULES_C_Eur
}
#---------------------------------------------------------------------------
# Policy name: CO
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_CO = [
# Anchor: Rule CO 1993 only - Apr 4 0:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule CO 1992 only - May 3 0:00 1:00 -
{
'from_year': 1992,
'to_year': 1992,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 3,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule CO 1993 only - Apr 4 0:00 0 -
{
'from_year': 1993,
'to_year': 1993,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 4,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_CO = {
'name': 'CO',
'rules': ZONE_RULES_CO
}
#---------------------------------------------------------------------------
# Policy name: CR
# Rule count: 6
#---------------------------------------------------------------------------
ZONE_RULES_CR = [
# Anchor: Rule CR 1979 1980 - Jun Sun>=1 0:00 0 S
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule CR 1979 1980 - Feb lastSun 0:00 1:00 D
{
'from_year': 1979,
'to_year': 1980,
'in_month': 2,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule CR 1979 1980 - Jun Sun>=1 0:00 0 S
{
'from_year': 1979,
'to_year': 1980,
'in_month': 6,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule CR 1991 1992 - Jan Sat>=15 0:00 1:00 D
{
'from_year': 1991,
'to_year': 1992,
'in_month': 1,
'on_day_of_week': 6,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule CR 1991 only - Jul 1 0:00 0 S
{
'from_year': 1991,
'to_year': 1991,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule CR 1992 only - Mar 15 0:00 0 S
{
'from_year': 1992,
'to_year': 1992,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_CR = {
'name': 'CR',
'rules': ZONE_RULES_CR
}
#---------------------------------------------------------------------------
# Policy name: Canada
# Rule count: 6
#---------------------------------------------------------------------------
ZONE_RULES_Canada = [
# Rule Canada 1945 only - Sep 30 2:00 0 S
{
'from_year': 1945,
'to_year': 1945,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Canada 1974 1986 - Apr lastSun 2:00 1:00 D
{
'from_year': 1974,
'to_year': 1986,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Canada 1974 2006 - Oct lastSun 2:00 0 S
{
'from_year': 1974,
'to_year': 2006,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Canada 1987 2006 - Apr Sun>=1 2:00 1:00 D
{
'from_year': 1987,
'to_year': 2006,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Canada 2007 max - Mar Sun>=8 2:00 1:00 D
{
'from_year': 2007,
'to_year': 9999,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Canada 2007 max - Nov Sun>=1 2:00 0 S
{
'from_year': 2007,
'to_year': 9999,
'in_month': 11,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Canada = {
'name': 'Canada',
'rules': ZONE_RULES_Canada
}
#---------------------------------------------------------------------------
# Policy name: Chatham
# Rule count: 10
#---------------------------------------------------------------------------
ZONE_RULES_Chatham = [
# Anchor: Rule Chatham 1975 only - Feb lastSun 2:45s 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Chatham 1974 only - Nov Sun>=1 2:45s 1:00 -
{
'from_year': 1974,
'to_year': 1974,
'in_month': 11,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 9900,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Chatham 1975 only - Feb lastSun 2:45s 0 -
{
'from_year': 1975,
'to_year': 1975,
'in_month': 2,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 9900,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Chatham 1975 1988 - Oct lastSun 2:45s 1:00 -
{
'from_year': 1975,
'to_year': 1988,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 9900,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Chatham 1976 1989 - Mar Sun>=1 2:45s 0 -
{
'from_year': 1976,
'to_year': 1989,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 9900,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Chatham 1989 only - Oct Sun>=8 2:45s 1:00 -
{
'from_year': 1989,
'to_year': 1989,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 9900,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Chatham 1990 2006 - Oct Sun>=1 2:45s 1:00 -
{
'from_year': 1990,
'to_year': 2006,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 9900,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Chatham 1990 2007 - Mar Sun>=15 2:45s 0 -
{
'from_year': 1990,
'to_year': 2007,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 9900,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Chatham 2007 max - Sep lastSun 2:45s 1:00 -
{
'from_year': 2007,
'to_year': 9999,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 9900,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Chatham 2008 max - Apr Sun>=1 2:45s 0 -
{
'from_year': 2008,
'to_year': 9999,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 9900,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Chatham = {
'name': 'Chatham',
'rules': ZONE_RULES_Chatham
}
#---------------------------------------------------------------------------
# Policy name: Chile
# Rule count: 27
#---------------------------------------------------------------------------
ZONE_RULES_Chile = [
# Rule Chile 1970 1972 - Oct Sun>=9 4:00u 1:00 -
{
'from_year': 1970,
'to_year': 1972,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 9,
'at_seconds': 14400,
'at_time_suffix': 'u',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Chile 1972 1986 - Mar Sun>=9 3:00u 0 -
{
'from_year': 1972,
'to_year': 1986,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 9,
'at_seconds': 10800,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule Chile 1973 only - Sep 30 4:00u 1:00 -
{
'from_year': 1973,
'to_year': 1973,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 14400,
'at_time_suffix': 'u',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Chile 1974 1987 - Oct Sun>=9 4:00u 1:00 -
{
'from_year': 1974,
'to_year': 1987,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 9,
'at_seconds': 14400,
'at_time_suffix': 'u',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Chile 1987 only - Apr 12 3:00u 0 -
{
'from_year': 1987,
'to_year': 1987,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 12,
'at_seconds': 10800,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule Chile 1988 1990 - Mar Sun>=9 3:00u 0 -
{
'from_year': 1988,
'to_year': 1990,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 9,
'at_seconds': 10800,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule Chile 1988 1989 - Oct Sun>=9 4:00u 1:00 -
{
'from_year': 1988,
'to_year': 1989,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 9,
'at_seconds': 14400,
'at_time_suffix': 'u',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Chile 1990 only - Sep 16 4:00u 1:00 -
{
'from_year': 1990,
'to_year': 1990,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 16,
'at_seconds': 14400,
'at_time_suffix': 'u',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Chile 1991 1996 - Mar Sun>=9 3:00u 0 -
{
'from_year': 1991,
'to_year': 1996,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 9,
'at_seconds': 10800,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule Chile 1991 1997 - Oct Sun>=9 4:00u 1:00 -
{
'from_year': 1991,
'to_year': 1997,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 9,
'at_seconds': 14400,
'at_time_suffix': 'u',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Chile 1997 only - Mar 30 3:00u 0 -
{
'from_year': 1997,
'to_year': 1997,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 10800,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule Chile 1998 only - Mar Sun>=9 3:00u 0 -
{
'from_year': 1998,
'to_year': 1998,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 9,
'at_seconds': 10800,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule Chile 1998 only - Sep 27 4:00u 1:00 -
{
'from_year': 1998,
'to_year': 1998,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 27,
'at_seconds': 14400,
'at_time_suffix': 'u',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Chile 1999 only - Apr 4 3:00u 0 -
{
'from_year': 1999,
'to_year': 1999,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 4,
'at_seconds': 10800,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule Chile 1999 2010 - Oct Sun>=9 4:00u 1:00 -
{
'from_year': 1999,
'to_year': 2010,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 9,
'at_seconds': 14400,
'at_time_suffix': 'u',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Chile 2000 2007 - Mar Sun>=9 3:00u 0 -
{
'from_year': 2000,
'to_year': 2007,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 9,
'at_seconds': 10800,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule Chile 2008 only - Mar 30 3:00u 0 -
{
'from_year': 2008,
'to_year': 2008,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 10800,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule Chile 2009 only - Mar Sun>=9 3:00u 0 -
{
'from_year': 2009,
'to_year': 2009,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 9,
'at_seconds': 10800,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule Chile 2010 only - Apr Sun>=1 3:00u 0 -
{
'from_year': 2010,
'to_year': 2010,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 10800,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule Chile 2011 only - May Sun>=2 3:00u 0 -
{
'from_year': 2011,
'to_year': 2011,
'in_month': 5,
'on_day_of_week': 7,
'on_day_of_month': 2,
'at_seconds': 10800,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule Chile 2011 only - Aug Sun>=16 4:00u 1:00 -
{
'from_year': 2011,
'to_year': 2011,
'in_month': 8,
'on_day_of_week': 7,
'on_day_of_month': 16,
'at_seconds': 14400,
'at_time_suffix': 'u',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Chile 2012 2014 - Apr Sun>=23 3:00u 0 -
{
'from_year': 2012,
'to_year': 2014,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 23,
'at_seconds': 10800,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule Chile 2012 2014 - Sep Sun>=2 4:00u 1:00 -
{
'from_year': 2012,
'to_year': 2014,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 2,
'at_seconds': 14400,
'at_time_suffix': 'u',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Chile 2016 2018 - May Sun>=9 3:00u 0 -
{
'from_year': 2016,
'to_year': 2018,
'in_month': 5,
'on_day_of_week': 7,
'on_day_of_month': 9,
'at_seconds': 10800,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule Chile 2016 2018 - Aug Sun>=9 4:00u 1:00 -
{
'from_year': 2016,
'to_year': 2018,
'in_month': 8,
'on_day_of_week': 7,
'on_day_of_month': 9,
'at_seconds': 14400,
'at_time_suffix': 'u',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Chile 2019 max - Apr Sun>=2 3:00u 0 -
{
'from_year': 2019,
'to_year': 9999,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 2,
'at_seconds': 10800,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule Chile 2019 max - Sep Sun>=2 4:00u 1:00 -
{
'from_year': 2019,
'to_year': 9999,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 2,
'at_seconds': 14400,
'at_time_suffix': 'u',
'delta_seconds': 3600,
'letter': '-',
},
]
ZONE_POLICY_Chile = {
'name': 'Chile',
'rules': ZONE_RULES_Chile
}
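# The at_seconds/at_time_suffix pair follows tzdb AT-column conventions:
# 'w' is wall-clock time, 's' is local standard time, 'u' is UTC. A minimal
# sketch (helper name hypothetical) converting a rule's trigger time to
# seconds relative to 00:00 UTC, given the zone's standard offset and the
# DST offset in effect just before the transition:

```python
def at_time_to_utc_seconds(at_seconds, at_time_suffix,
                           std_offset_seconds, dst_offset_seconds):
    """Convert a rule's trigger time to seconds relative to 00:00 UTC.

    'w' (wall) subtracts both offsets, 's' (standard) subtracts only the
    standard offset, and 'u' (UTC) needs no adjustment.
    """
    if at_time_suffix == 'w':
        return at_seconds - std_offset_seconds - dst_offset_seconds
    if at_time_suffix == 's':
        return at_seconds - std_offset_seconds
    return at_seconds  # 'u'
```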
#---------------------------------------------------------------------------
# Policy name: Cook
# Rule count: 4
#---------------------------------------------------------------------------
ZONE_RULES_Cook = [
# Anchor: Rule Cook 1979 1991 - Mar Sun>=1 0:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Cook 1978 only - Nov 12 0:00 0:30 -
{
'from_year': 1978,
'to_year': 1978,
'in_month': 11,
'on_day_of_week': 0,
'on_day_of_month': 12,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 1800,
'letter': '-',
},
# Rule Cook 1979 1991 - Mar Sun>=1 0:00 0 -
{
'from_year': 1979,
'to_year': 1991,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Cook 1979 1990 - Oct lastSun 0:00 0:30 -
{
'from_year': 1979,
'to_year': 1990,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 1800,
'letter': '-',
},
]
ZONE_POLICY_Cook = {
'name': 'Cook',
'rules': ZONE_RULES_Cook
}
#---------------------------------------------------------------------------
# Policy name: Cuba
# Rule count: 26
#---------------------------------------------------------------------------
ZONE_RULES_Cuba = [
# Rule Cuba 1969 1977 - Apr lastSun 0:00 1:00 D
{
'from_year': 1969,
'to_year': 1977,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Cuba 1969 1971 - Oct lastSun 0:00 0 S
{
'from_year': 1969,
'to_year': 1971,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Cuba 1972 1974 - Oct 8 0:00 0 S
{
'from_year': 1972,
'to_year': 1974,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 8,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Cuba 1975 1977 - Oct lastSun 0:00 0 S
{
'from_year': 1975,
'to_year': 1977,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Cuba 1978 only - May 7 0:00 1:00 D
{
'from_year': 1978,
'to_year': 1978,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 7,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Cuba 1978 1990 - Oct Sun>=8 0:00 0 S
{
'from_year': 1978,
'to_year': 1990,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Cuba 1979 1980 - Mar Sun>=15 0:00 1:00 D
{
'from_year': 1979,
'to_year': 1980,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Cuba 1981 1985 - May Sun>=5 0:00 1:00 D
{
'from_year': 1981,
'to_year': 1985,
'in_month': 5,
'on_day_of_week': 7,
'on_day_of_month': 5,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Cuba 1986 1989 - Mar Sun>=14 0:00 1:00 D
{
'from_year': 1986,
'to_year': 1989,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 14,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Cuba 1990 1997 - Apr Sun>=1 0:00 1:00 D
{
'from_year': 1990,
'to_year': 1997,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Cuba 1991 1995 - Oct Sun>=8 0:00s 0 S
{
'from_year': 1991,
'to_year': 1995,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Cuba 1996 only - Oct 6 0:00s 0 S
{
'from_year': 1996,
'to_year': 1996,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 6,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Cuba 1997 only - Oct 12 0:00s 0 S
{
'from_year': 1997,
'to_year': 1997,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 12,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Cuba 1998 1999 - Mar lastSun 0:00s 1:00 D
{
'from_year': 1998,
'to_year': 1999,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Cuba 1998 2003 - Oct lastSun 0:00s 0 S
{
'from_year': 1998,
'to_year': 2003,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Cuba 2000 2003 - Apr Sun>=1 0:00s 1:00 D
{
'from_year': 2000,
'to_year': 2003,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Cuba 2004 only - Mar lastSun 0:00s 1:00 D
{
'from_year': 2004,
'to_year': 2004,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Cuba 2006 2010 - Oct lastSun 0:00s 0 S
{
'from_year': 2006,
'to_year': 2010,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Cuba 2007 only - Mar Sun>=8 0:00s 1:00 D
{
'from_year': 2007,
'to_year': 2007,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Cuba 2008 only - Mar Sun>=15 0:00s 1:00 D
{
'from_year': 2008,
'to_year': 2008,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Cuba 2009 2010 - Mar Sun>=8 0:00s 1:00 D
{
'from_year': 2009,
'to_year': 2010,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Cuba 2011 only - Mar Sun>=15 0:00s 1:00 D
{
'from_year': 2011,
'to_year': 2011,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Cuba 2011 only - Nov 13 0:00s 0 S
{
'from_year': 2011,
'to_year': 2011,
'in_month': 11,
'on_day_of_week': 0,
'on_day_of_month': 13,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Cuba 2012 only - Apr 1 0:00s 1:00 D
{
'from_year': 2012,
'to_year': 2012,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Cuba 2012 max - Nov Sun>=1 0:00s 0 S
{
'from_year': 2012,
'to_year': 9999,
'in_month': 11,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Cuba 2013 max - Mar Sun>=8 0:00s 1:00 D
{
'from_year': 2013,
'to_year': 9999,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
]
ZONE_POLICY_Cuba = {
'name': 'Cuba',
'rules': ZONE_RULES_Cuba
}
#---------------------------------------------------------------------------
# Policy name: Cyprus
# Rule count: 10
#---------------------------------------------------------------------------
ZONE_RULES_Cyprus = [
# Anchor: Rule Cyprus 1975 only - Oct 12 0:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Cyprus 1975 only - Apr 13 0:00 1:00 S
{
'from_year': 1975,
'to_year': 1975,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 13,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Cyprus 1975 only - Oct 12 0:00 0 -
{
'from_year': 1975,
'to_year': 1975,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 12,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Cyprus 1976 only - May 15 0:00 1:00 S
{
'from_year': 1976,
'to_year': 1976,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Cyprus 1976 only - Oct 11 0:00 0 -
{
'from_year': 1976,
'to_year': 1976,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 11,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Cyprus 1977 1980 - Apr Sun>=1 0:00 1:00 S
{
'from_year': 1977,
'to_year': 1980,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Cyprus 1977 only - Sep 25 0:00 0 -
{
'from_year': 1977,
'to_year': 1977,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 25,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Cyprus 1978 only - Oct 2 0:00 0 -
{
'from_year': 1978,
'to_year': 1978,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Cyprus 1979 1997 - Sep lastSun 0:00 0 -
{
'from_year': 1979,
'to_year': 1997,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Cyprus 1981 1998 - Mar lastSun 0:00 1:00 S
{
'from_year': 1981,
'to_year': 1998,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
]
ZONE_POLICY_Cyprus = {
'name': 'Cyprus',
'rules': ZONE_RULES_Cyprus
}
#---------------------------------------------------------------------------
# Policy name: Czech
# Rule count: 1
#---------------------------------------------------------------------------
ZONE_RULES_Czech = [
# Rule Czech 1946 1949 - Oct Sun>=1 2:00s 0 -
{
'from_year': 1946,
'to_year': 1949,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Czech = {
'name': 'Czech',
'rules': ZONE_RULES_Czech
}
#---------------------------------------------------------------------------
# Policy name: DR
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_DR = [
# Rule DR 1969 1973 - Oct lastSun 0:00 0:30 -0430
{
'from_year': 1969,
'to_year': 1973,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 1800,
'letter': '-0430',
},
# Rule DR 1971 only - Jan 20 0:00 0 EST
{
'from_year': 1971,
'to_year': 1971,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'EST',
},
# Rule DR 1972 1974 - Jan 21 0:00 0 EST
{
'from_year': 1972,
'to_year': 1974,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'EST',
},
]
ZONE_POLICY_DR = {
'name': 'DR',
'rules': ZONE_RULES_DR
}
#---------------------------------------------------------------------------
# Policy name: Denmark
# Rule count: 1
#---------------------------------------------------------------------------
ZONE_RULES_Denmark = [
# Rule Denmark 1948 only - Aug 8 2:00s 0 -
{
'from_year': 1948,
'to_year': 1948,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 8,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Denmark = {
'name': 'Denmark',
'rules': ZONE_RULES_Denmark
}
#---------------------------------------------------------------------------
# Policy name: Dhaka
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_Dhaka = [
# Anchor: Rule Dhaka 2009 only - Dec 31 24:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Dhaka 2009 only - Jun 19 23:00 1:00 -
{
'from_year': 2009,
'to_year': 2009,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 19,
'at_seconds': 82800,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Dhaka 2009 only - Dec 31 24:00 0 -
{
'from_year': 2009,
'to_year': 2009,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 31,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Dhaka = {
'name': 'Dhaka',
'rules': ZONE_RULES_Dhaka
}
#---------------------------------------------------------------------------
# Policy name: E_Eur
# Rule count: 5
#---------------------------------------------------------------------------
ZONE_RULES_E_Eur = [
# Anchor: Rule E-Eur 1979 1995 - Sep lastSun 0:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule E-Eur 1977 1980 - Apr Sun>=1 0:00 1:00 S
{
'from_year': 1977,
'to_year': 1980,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule E-Eur 1979 1995 - Sep lastSun 0:00 0 -
{
'from_year': 1979,
'to_year': 1995,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule E-Eur 1981 max - Mar lastSun 0:00 1:00 S
{
'from_year': 1981,
'to_year': 9999,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule E-Eur 1996 max - Oct lastSun 0:00 0 -
{
'from_year': 1996,
'to_year': 9999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_E_Eur = {
'name': 'E_Eur',
'rules': ZONE_RULES_E_Eur
}
#---------------------------------------------------------------------------
# Policy name: E_EurAsia
# Rule count: 4
#---------------------------------------------------------------------------
ZONE_RULES_E_EurAsia = [
# Anchor: Rule E-EurAsia 1979 1995 - Sep lastSun 0:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule E-EurAsia 1981 max - Mar lastSun 0:00 1:00 -
{
'from_year': 1981,
'to_year': 9999,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule E-EurAsia 1979 1995 - Sep lastSun 0:00 0 -
{
'from_year': 1979,
'to_year': 1995,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule E-EurAsia 1996 max - Oct lastSun 0:00 0 -
{
'from_year': 1996,
'to_year': 9999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_E_EurAsia = {
'name': 'E_EurAsia',
'rules': ZONE_RULES_E_EurAsia
}
#---------------------------------------------------------------------------
# Policy name: EU
# Rule count: 7
#---------------------------------------------------------------------------
ZONE_RULES_EU = [
# Anchor: Rule EU 1977 only - Sep lastSun 1:00u 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule EU 1977 1980 - Apr Sun>=1 1:00u 1:00 S
{
'from_year': 1977,
'to_year': 1980,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 3600,
'at_time_suffix': 'u',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule EU 1977 only - Sep lastSun 1:00u 0 -
{
'from_year': 1977,
'to_year': 1977,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule EU 1978 only - Oct 1 1:00u 0 -
{
'from_year': 1978,
'to_year': 1978,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 3600,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule EU 1979 1995 - Sep lastSun 1:00u 0 -
{
'from_year': 1979,
'to_year': 1995,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule EU 1981 max - Mar lastSun 1:00u 1:00 S
{
'from_year': 1981,
'to_year': 9999,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 'u',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule EU 1996 max - Oct lastSun 1:00u 0 -
{
'from_year': 1996,
'to_year': 9999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_EU = {
'name': 'EU',
'rules': ZONE_RULES_EU
}
#---------------------------------------------------------------------------
# Policy name: EUAsia
# Rule count: 4
#---------------------------------------------------------------------------
ZONE_RULES_EUAsia = [
# Anchor: Rule EUAsia 1979 1995 - Sep lastSun 1:00u 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule EUAsia 1981 max - Mar lastSun 1:00u 1:00 S
{
'from_year': 1981,
'to_year': 9999,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 'u',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule EUAsia 1979 1995 - Sep lastSun 1:00u 0 -
{
'from_year': 1979,
'to_year': 1995,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule EUAsia 1996 max - Oct lastSun 1:00u 0 -
{
'from_year': 1996,
'to_year': 9999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_EUAsia = {
'name': 'EUAsia',
'rules': ZONE_RULES_EUAsia
}
#---------------------------------------------------------------------------
# Policy name: Ecuador
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_Ecuador = [
# Anchor: Rule Ecuador 1993 only - Feb 5 0:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Ecuador 1992 only - Nov 28 0:00 1:00 -
{
'from_year': 1992,
'to_year': 1992,
'in_month': 11,
'on_day_of_week': 0,
'on_day_of_month': 28,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Ecuador 1993 only - Feb 5 0:00 0 -
{
'from_year': 1993,
'to_year': 1993,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 5,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Ecuador = {
'name': 'Ecuador',
'rules': ZONE_RULES_Ecuador
}
#---------------------------------------------------------------------------
# Policy name: Edm
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_Edm = [
# Rule Edm 1947 only - Sep lastSun 2:00 0 S
{
'from_year': 1947,
'to_year': 1947,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Edm 1972 1986 - Apr lastSun 2:00 1:00 D
{
'from_year': 1972,
'to_year': 1986,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Edm 1972 2006 - Oct lastSun 2:00 0 S
{
'from_year': 1972,
'to_year': 2006,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Edm = {
'name': 'Edm',
'rules': ZONE_RULES_Edm
}
#---------------------------------------------------------------------------
# Policy name: Egypt
# Rule count: 21
#---------------------------------------------------------------------------
ZONE_RULES_Egypt = [
# Rule Egypt 1959 1981 - May 1 1:00 1:00 S
{
'from_year': 1959,
'to_year': 1981,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Egypt 1959 1965 - Sep 30 3:00 0 -
{
'from_year': 1959,
'to_year': 1965,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Egypt 1966 1994 - Oct 1 3:00 0 -
{
'from_year': 1966,
'to_year': 1994,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Egypt 1982 only - Jul 25 1:00 1:00 S
{
'from_year': 1982,
'to_year': 1982,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 25,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Egypt 1983 only - Jul 12 1:00 1:00 S
{
'from_year': 1983,
'to_year': 1983,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 12,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Egypt 1984 1988 - May 1 1:00 1:00 S
{
'from_year': 1984,
'to_year': 1988,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Egypt 1989 only - May 6 1:00 1:00 S
{
'from_year': 1989,
'to_year': 1989,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 6,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Egypt 1990 1994 - May 1 1:00 1:00 S
{
'from_year': 1990,
'to_year': 1994,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Egypt 1995 2010 - Apr lastFri 0:00s 1:00 S
{
'from_year': 1995,
'to_year': 2010,
'in_month': 4,
'on_day_of_week': 5,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Egypt 1995 2005 - Sep lastThu 24:00 0 -
{
'from_year': 1995,
'to_year': 2005,
'in_month': 9,
'on_day_of_week': 4,
'on_day_of_month': 0,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Egypt 2006 only - Sep 21 24:00 0 -
{
'from_year': 2006,
'to_year': 2006,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Egypt 2007 only - Sep Thu>=1 24:00 0 -
{
'from_year': 2007,
'to_year': 2007,
'in_month': 9,
'on_day_of_week': 4,
'on_day_of_month': 1,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Egypt 2008 only - Aug lastThu 24:00 0 -
{
'from_year': 2008,
'to_year': 2008,
'in_month': 8,
'on_day_of_week': 4,
'on_day_of_month': 0,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Egypt 2009 only - Aug 20 24:00 0 -
{
'from_year': 2009,
'to_year': 2009,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Egypt 2010 only - Aug 10 24:00 0 -
{
'from_year': 2010,
'to_year': 2010,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 10,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Egypt 2010 only - Sep 9 24:00 1:00 S
{
'from_year': 2010,
'to_year': 2010,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 9,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Egypt 2010 only - Sep lastThu 24:00 0 -
{
'from_year': 2010,
'to_year': 2010,
'in_month': 9,
'on_day_of_week': 4,
'on_day_of_month': 0,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Egypt 2014 only - May 15 24:00 1:00 S
{
'from_year': 2014,
'to_year': 2014,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 15,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Egypt 2014 only - Jun 26 24:00 0 -
{
'from_year': 2014,
'to_year': 2014,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 26,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Egypt 2014 only - Jul 31 24:00 1:00 S
{
'from_year': 2014,
'to_year': 2014,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 31,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Egypt 2014 only - Sep lastThu 24:00 0 -
{
'from_year': 2014,
'to_year': 2014,
'in_month': 9,
'on_day_of_week': 4,
'on_day_of_month': 0,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Egypt = {
'name': 'Egypt',
'rules': ZONE_RULES_Egypt
}
#---------------------------------------------------------------------------
# Policy name: Eire
# Rule count: 7
#---------------------------------------------------------------------------
ZONE_RULES_Eire = [
# Rule Eire 1971 only - Oct 31 2:00u -1:00 -
{
'from_year': 1971,
'to_year': 1971,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 31,
'at_seconds': 7200,
'at_time_suffix': 'u',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Eire 1972 1980 - Mar Sun>=16 2:00u 0 -
{
'from_year': 1972,
'to_year': 1980,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 16,
'at_seconds': 7200,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule Eire 1972 1980 - Oct Sun>=23 2:00u -1:00 -
{
'from_year': 1972,
'to_year': 1980,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 23,
'at_seconds': 7200,
'at_time_suffix': 'u',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Eire 1981 max - Mar lastSun 1:00u 0 -
{
'from_year': 1981,
'to_year': 9999,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule Eire 1981 1989 - Oct Sun>=23 1:00u -1:00 -
{
'from_year': 1981,
'to_year': 1989,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 23,
'at_seconds': 3600,
'at_time_suffix': 'u',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Eire 1990 1995 - Oct Sun>=22 1:00u -1:00 -
{
'from_year': 1990,
'to_year': 1995,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 22,
'at_seconds': 3600,
'at_time_suffix': 'u',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Eire 1996 max - Oct lastSun 1:00u -1:00 -
{
'from_year': 1996,
'to_year': 9999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 'u',
'delta_seconds': -3600,
'letter': '-',
},
]
ZONE_POLICY_Eire = {
'name': 'Eire',
'rules': ZONE_RULES_Eire
}
#---------------------------------------------------------------------------
# Policy name: Falk
# Rule count: 8
#---------------------------------------------------------------------------
ZONE_RULES_Falk = [
# Rule Falk 1943 only - Jan 1 0:00 0 -
{
'from_year': 1943,
'to_year': 1943,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Falk 1983 only - Sep lastSun 0:00 1:00 -
{
'from_year': 1983,
'to_year': 1983,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Falk 1984 1985 - Apr lastSun 0:00 0 -
{
'from_year': 1984,
'to_year': 1985,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Falk 1984 only - Sep 16 0:00 1:00 -
{
'from_year': 1984,
'to_year': 1984,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 16,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Falk 1985 2000 - Sep Sun>=9 0:00 1:00 -
{
'from_year': 1985,
'to_year': 2000,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 9,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Falk 1986 2000 - Apr Sun>=16 0:00 0 -
{
'from_year': 1986,
'to_year': 2000,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 16,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Falk 2001 2010 - Apr Sun>=15 2:00 0 -
{
'from_year': 2001,
'to_year': 2010,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Falk 2001 2010 - Sep Sun>=1 2:00 1:00 -
{
'from_year': 2001,
'to_year': 2010,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
]
ZONE_POLICY_Falk = {
'name': 'Falk',
'rules': ZONE_RULES_Falk
}
#---------------------------------------------------------------------------
# Policy name: Fiji
# Rule count: 14
#---------------------------------------------------------------------------
ZONE_RULES_Fiji = [
# Anchor: Rule Fiji 1999 2000 - Feb lastSun 3:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Fiji 1998 1999 - Nov Sun>=1 2:00 1:00 -
{
'from_year': 1998,
'to_year': 1999,
'in_month': 11,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Fiji 1999 2000 - Feb lastSun 3:00 0 -
{
'from_year': 1999,
'to_year': 2000,
'in_month': 2,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Fiji 2009 only - Nov 29 2:00 1:00 -
{
'from_year': 2009,
'to_year': 2009,
'in_month': 11,
'on_day_of_week': 0,
'on_day_of_month': 29,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Fiji 2010 only - Mar lastSun 3:00 0 -
{
'from_year': 2010,
'to_year': 2010,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Fiji 2010 2013 - Oct Sun>=21 2:00 1:00 -
{
'from_year': 2010,
'to_year': 2013,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 21,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Fiji 2011 only - Mar Sun>=1 3:00 0 -
{
'from_year': 2011,
'to_year': 2011,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Fiji 2012 2013 - Jan Sun>=18 3:00 0 -
{
'from_year': 2012,
'to_year': 2013,
'in_month': 1,
'on_day_of_week': 7,
'on_day_of_month': 18,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Fiji 2014 only - Jan Sun>=18 2:00 0 -
{
'from_year': 2014,
'to_year': 2014,
'in_month': 1,
'on_day_of_week': 7,
'on_day_of_month': 18,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Fiji 2014 2018 - Nov Sun>=1 2:00 1:00 -
{
'from_year': 2014,
'to_year': 2018,
'in_month': 11,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Fiji 2015 max - Jan Sun>=12 3:00 0 -
{
'from_year': 2015,
'to_year': 9999,
'in_month': 1,
'on_day_of_week': 7,
'on_day_of_month': 12,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Fiji 2019 only - Nov Sun>=8 2:00 1:00 -
{
'from_year': 2019,
'to_year': 2019,
'in_month': 11,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Fiji 2020 only - Dec 20 2:00 1:00 -
{
'from_year': 2020,
'to_year': 2020,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Fiji 2021 max - Nov Sun>=8 2:00 1:00 -
{
'from_year': 2021,
'to_year': 9999,
'in_month': 11,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
]
ZONE_POLICY_Fiji = {
'name': 'Fiji',
'rules': ZONE_RULES_Fiji
}
#---------------------------------------------------------------------------
# Policy name: Finland
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_Finland = [
# Rule Finland 1942 only - Oct 4 1:00 0 -
{
'from_year': 1942,
'to_year': 1942,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 4,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Finland 1981 1982 - Mar lastSun 2:00 1:00 S
{
'from_year': 1981,
'to_year': 1982,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Finland 1981 1982 - Sep lastSun 3:00 0 -
{
'from_year': 1981,
'to_year': 1982,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Finland = {
'name': 'Finland',
'rules': ZONE_RULES_Finland
}
#---------------------------------------------------------------------------
# Policy name: France
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_France = [
# Rule France 1945 only - Sep 16 3:00 0 -
{
'from_year': 1945,
'to_year': 1945,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 16,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule France 1976 only - Mar 28 1:00 1:00 S
{
'from_year': 1976,
'to_year': 1976,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 28,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule France 1976 only - Sep 26 1:00 0 -
{
'from_year': 1976,
'to_year': 1976,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 26,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_France = {
'name': 'France',
'rules': ZONE_RULES_France
}
#---------------------------------------------------------------------------
# Policy name: GB_Eire
# Rule count: 6
#---------------------------------------------------------------------------
ZONE_RULES_GB_Eire = [
# Rule GB-Eire 1961 1968 - Oct Sun>=23 2:00s 0 GMT
{
'from_year': 1961,
'to_year': 1968,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 23,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'GMT',
},
# Rule GB-Eire 1972 1980 - Mar Sun>=16 2:00s 1:00 BST
{
'from_year': 1972,
'to_year': 1980,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 16,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'BST',
},
# Rule GB-Eire 1972 1980 - Oct Sun>=23 2:00s 0 GMT
{
'from_year': 1972,
'to_year': 1980,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 23,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'GMT',
},
# Rule GB-Eire 1981 1995 - Mar lastSun 1:00u 1:00 BST
{
'from_year': 1981,
'to_year': 1995,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 'u',
'delta_seconds': 3600,
'letter': 'BST',
},
# Rule GB-Eire 1981 1989 - Oct Sun>=23 1:00u 0 GMT
{
'from_year': 1981,
'to_year': 1989,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 23,
'at_seconds': 3600,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': 'GMT',
},
# Rule GB-Eire 1990 1995 - Oct Sun>=22 1:00u 0 GMT
{
'from_year': 1990,
'to_year': 1995,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 22,
'at_seconds': 3600,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': 'GMT',
},
]
ZONE_POLICY_GB_Eire = {
'name': 'GB_Eire',
'rules': ZONE_RULES_GB_Eire
}
#---------------------------------------------------------------------------
# Policy name: Germany
# Rule count: 1
#---------------------------------------------------------------------------
ZONE_RULES_Germany = [
# Rule Germany 1947 1949 - Oct Sun>=1 2:00s 0 -
{
'from_year': 1947,
'to_year': 1949,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Germany = {
'name': 'Germany',
'rules': ZONE_RULES_Germany
}
#---------------------------------------------------------------------------
# Policy name: Ghana
# Rule count: 1
#---------------------------------------------------------------------------
ZONE_RULES_Ghana = [
# Rule Ghana 1951 1956 - Jan 1 2:00 0 GMT
{
'from_year': 1951,
'to_year': 1956,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'GMT',
},
]
ZONE_POLICY_Ghana = {
'name': 'Ghana',
'rules': ZONE_RULES_Ghana
}
#---------------------------------------------------------------------------
# Policy name: Greece
# Rule count: 12
#---------------------------------------------------------------------------
ZONE_RULES_Greece = [
# Rule Greece 1952 only - Nov 2 0:00 0 -
{
'from_year': 1952,
'to_year': 1952,
'in_month': 11,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Greece 1975 only - Apr 12 0:00s 1:00 S
{
'from_year': 1975,
'to_year': 1975,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 12,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Greece 1975 only - Nov 26 0:00s 0 -
{
'from_year': 1975,
'to_year': 1975,
'in_month': 11,
'on_day_of_week': 0,
'on_day_of_month': 26,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Greece 1976 only - Apr 11 2:00s 1:00 S
{
'from_year': 1976,
'to_year': 1976,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 11,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Greece 1976 only - Oct 10 2:00s 0 -
{
'from_year': 1976,
'to_year': 1976,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 10,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Greece 1977 1978 - Apr Sun>=1 2:00s 1:00 S
{
'from_year': 1977,
'to_year': 1978,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Greece 1977 only - Sep 26 2:00s 0 -
{
'from_year': 1977,
'to_year': 1977,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 26,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Greece 1978 only - Sep 24 4:00 0 -
{
'from_year': 1978,
'to_year': 1978,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 24,
'at_seconds': 14400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Greece 1979 only - Apr 1 9:00 1:00 S
{
'from_year': 1979,
'to_year': 1979,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 32400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Greece 1979 only - Sep 29 2:00 0 -
{
'from_year': 1979,
'to_year': 1979,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 29,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Greece 1980 only - Apr 1 0:00 1:00 S
{
'from_year': 1980,
'to_year': 1980,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Greece 1980 only - Sep 28 0:00 0 -
{
'from_year': 1980,
'to_year': 1980,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 28,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Greece = {
'name': 'Greece',
'rules': ZONE_RULES_Greece
}
#---------------------------------------------------------------------------
# Policy name: Guam
# Rule count: 7
#---------------------------------------------------------------------------
ZONE_RULES_Guam = [
# Rule Guam 1970 1971 - Sep Sun>=1 2:00 0 S
{
'from_year': 1970,
'to_year': 1971,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Guam 1973 only - Dec 16 2:00 1:00 D
{
'from_year': 1973,
'to_year': 1973,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 16,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Guam 1974 only - Feb 24 2:00 0 S
{
'from_year': 1974,
'to_year': 1974,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 24,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Guam 1976 only - May 26 2:00 1:00 D
{
'from_year': 1976,
'to_year': 1976,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 26,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Guam 1976 only - Aug 22 2:01 0 S
{
'from_year': 1976,
'to_year': 1976,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 22,
'at_seconds': 7260,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Guam 1977 only - Apr 24 2:00 1:00 D
{
'from_year': 1977,
'to_year': 1977,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 24,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Guam 1977 only - Aug 28 2:00 0 S
{
'from_year': 1977,
'to_year': 1977,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 28,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Guam = {
'name': 'Guam',
'rules': ZONE_RULES_Guam
}
#---------------------------------------------------------------------------
# Policy name: Guat
# Rule count: 9
#---------------------------------------------------------------------------
ZONE_RULES_Guat = [
# Anchor: Rule Guat 1974 only - Feb 24 0:00 0 S
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Guat 1973 only - Nov 25 0:00 1:00 D
{
'from_year': 1973,
'to_year': 1973,
'in_month': 11,
'on_day_of_week': 0,
'on_day_of_month': 25,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Guat 1974 only - Feb 24 0:00 0 S
{
'from_year': 1974,
'to_year': 1974,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 24,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Guat 1983 only - May 21 0:00 1:00 D
{
'from_year': 1983,
'to_year': 1983,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Guat 1983 only - Sep 22 0:00 0 S
{
'from_year': 1983,
'to_year': 1983,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 22,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Guat 1991 only - Mar 23 0:00 1:00 D
{
'from_year': 1991,
'to_year': 1991,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 23,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Guat 1991 only - Sep 7 0:00 0 S
{
'from_year': 1991,
'to_year': 1991,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 7,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Guat 2006 only - Apr 30 0:00 1:00 D
{
'from_year': 2006,
'to_year': 2006,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Guat 2006 only - Oct 1 0:00 0 S
{
'from_year': 2006,
'to_year': 2006,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Guat = {
'name': 'Guat',
'rules': ZONE_RULES_Guat
}
#---------------------------------------------------------------------------
# Policy name: HK
# Rule count: 6
#---------------------------------------------------------------------------
ZONE_RULES_HK = [
# Rule HK 1953 1964 - Oct Sun>=31 3:30 0 -
{
'from_year': 1953,
'to_year': 1964,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 31,
'at_seconds': 12600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule HK 1965 1976 - Apr Sun>=16 3:30 1:00 S
{
'from_year': 1965,
'to_year': 1976,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 16,
'at_seconds': 12600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule HK 1965 1976 - Oct Sun>=16 3:30 0 -
{
'from_year': 1965,
'to_year': 1976,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 16,
'at_seconds': 12600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule HK 1973 only - Dec 30 3:30 1:00 S
{
'from_year': 1973,
'to_year': 1973,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 12600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule HK 1979 only - May 13 3:30 1:00 S
{
'from_year': 1979,
'to_year': 1979,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 13,
'at_seconds': 12600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule HK 1979 only - Oct 21 3:30 0 -
{
'from_year': 1979,
'to_year': 1979,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 12600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_HK = {
'name': 'HK',
'rules': ZONE_RULES_HK
}
#---------------------------------------------------------------------------
# Policy name: Haiti
# Rule count: 12
#---------------------------------------------------------------------------
ZONE_RULES_Haiti = [
# Anchor: Rule Haiti 1983 1987 - Oct lastSun 0:00 0 S
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Haiti 1983 only - May 8 0:00 1:00 D
{
'from_year': 1983,
'to_year': 1983,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 8,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Haiti 1984 1987 - Apr lastSun 0:00 1:00 D
{
'from_year': 1984,
'to_year': 1987,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Haiti 1983 1987 - Oct lastSun 0:00 0 S
{
'from_year': 1983,
'to_year': 1987,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Haiti 1988 1997 - Apr Sun>=1 1:00s 1:00 D
{
'from_year': 1988,
'to_year': 1997,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 3600,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Haiti 1988 1997 - Oct lastSun 1:00s 0 S
{
'from_year': 1988,
'to_year': 1997,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Haiti 2005 2006 - Apr Sun>=1 0:00 1:00 D
{
'from_year': 2005,
'to_year': 2006,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Haiti 2005 2006 - Oct lastSun 0:00 0 S
{
'from_year': 2005,
'to_year': 2006,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Haiti 2012 2015 - Mar Sun>=8 2:00 1:00 D
{
'from_year': 2012,
'to_year': 2015,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Haiti 2012 2015 - Nov Sun>=1 2:00 0 S
{
'from_year': 2012,
'to_year': 2015,
'in_month': 11,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Haiti 2017 max - Mar Sun>=8 2:00 1:00 D
{
'from_year': 2017,
'to_year': 9999,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Haiti 2017 max - Nov Sun>=1 2:00 0 S
{
'from_year': 2017,
'to_year': 9999,
'in_month': 11,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Haiti = {
'name': 'Haiti',
'rules': ZONE_RULES_Haiti
}
#---------------------------------------------------------------------------
# Policy name: Halifax
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_Halifax = [
# Rule Halifax 1956 1959 - Sep lastSun 2:00 0 S
{
'from_year': 1956,
'to_year': 1959,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Halifax 1962 1973 - Apr lastSun 2:00 1:00 D
{
'from_year': 1962,
'to_year': 1973,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Halifax 1962 1973 - Oct lastSun 2:00 0 S
{
'from_year': 1962,
'to_year': 1973,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Halifax = {
'name': 'Halifax',
'rules': ZONE_RULES_Halifax
}
#---------------------------------------------------------------------------
# Policy name: Holiday
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_Holiday = [
# Anchor: Rule Holiday 1993 1994 - Mar Sun>=1 2:00s 0 S
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Holiday 1992 1993 - Oct lastSun 2:00s 1:00 D
{
'from_year': 1992,
'to_year': 1993,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Holiday 1993 1994 - Mar Sun>=1 2:00s 0 S
{
'from_year': 1993,
'to_year': 1994,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Holiday = {
'name': 'Holiday',
'rules': ZONE_RULES_Holiday
}
#---------------------------------------------------------------------------
# Policy name: Hond
# Rule count: 5
#---------------------------------------------------------------------------
ZONE_RULES_Hond = [
# Anchor: Rule Hond 1987 1988 - Sep lastSun 0:00 0 S
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Hond 1987 1988 - May Sun>=1 0:00 1:00 D
{
'from_year': 1987,
'to_year': 1988,
'in_month': 5,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Hond 1987 1988 - Sep lastSun 0:00 0 S
{
'from_year': 1987,
'to_year': 1988,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Hond 2006 only - May Sun>=1 0:00 1:00 D
{
'from_year': 2006,
'to_year': 2006,
'in_month': 5,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Hond 2006 only - Aug Mon>=1 0:00 0 S
{
'from_year': 2006,
'to_year': 2006,
'in_month': 8,
'on_day_of_week': 1,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Hond = {
'name': 'Hond',
'rules': ZONE_RULES_Hond
}
#---------------------------------------------------------------------------
# Policy name: Hungary
# Rule count: 5
#---------------------------------------------------------------------------
ZONE_RULES_Hungary = [
# Rule Hungary 1956 1957 - Sep lastSun 3:00 0 -
{
'from_year': 1956,
'to_year': 1957,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Hungary 1980 only - Apr 6 0:00 1:00 S
{
'from_year': 1980,
'to_year': 1980,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 6,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Hungary 1980 only - Sep 28 1:00 0 -
{
'from_year': 1980,
'to_year': 1980,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 28,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Hungary 1981 1983 - Mar lastSun 0:00 1:00 S
{
'from_year': 1981,
'to_year': 1983,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Hungary 1981 1983 - Sep lastSun 1:00 0 -
{
'from_year': 1981,
'to_year': 1983,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Hungary = {
'name': 'Hungary',
'rules': ZONE_RULES_Hungary
}
#---------------------------------------------------------------------------
# Policy name: Iran
# Rule count: 64
#---------------------------------------------------------------------------
ZONE_RULES_Iran = [
# Anchor: Rule Iran 1978 only - Oct 20 24:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 1978 1980 - Mar 20 24:00 1:00 -
{
'from_year': 1978,
'to_year': 1980,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 1978 only - Oct 20 24:00 0 -
{
'from_year': 1978,
'to_year': 1978,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 1979 only - Sep 18 24:00 0 -
{
'from_year': 1979,
'to_year': 1979,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 18,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 1980 only - Sep 22 24:00 0 -
{
'from_year': 1980,
'to_year': 1980,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 22,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 1991 only - May 2 24:00 1:00 -
{
'from_year': 1991,
'to_year': 1991,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 1992 1995 - Mar 21 24:00 1:00 -
{
'from_year': 1992,
'to_year': 1995,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 1991 1995 - Sep 21 24:00 0 -
{
'from_year': 1991,
'to_year': 1995,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 1996 only - Mar 20 24:00 1:00 -
{
'from_year': 1996,
'to_year': 1996,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 1996 only - Sep 20 24:00 0 -
{
'from_year': 1996,
'to_year': 1996,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 1997 1999 - Mar 21 24:00 1:00 -
{
'from_year': 1997,
'to_year': 1999,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 1997 1999 - Sep 21 24:00 0 -
{
'from_year': 1997,
'to_year': 1999,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2000 only - Mar 20 24:00 1:00 -
{
'from_year': 2000,
'to_year': 2000,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2000 only - Sep 20 24:00 0 -
{
'from_year': 2000,
'to_year': 2000,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2001 2003 - Mar 21 24:00 1:00 -
{
'from_year': 2001,
'to_year': 2003,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2001 2003 - Sep 21 24:00 0 -
{
'from_year': 2001,
'to_year': 2003,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2004 only - Mar 20 24:00 1:00 -
{
'from_year': 2004,
'to_year': 2004,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2004 only - Sep 20 24:00 0 -
{
'from_year': 2004,
'to_year': 2004,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2005 only - Mar 21 24:00 1:00 -
{
'from_year': 2005,
'to_year': 2005,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2005 only - Sep 21 24:00 0 -
{
'from_year': 2005,
'to_year': 2005,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2008 only - Mar 20 24:00 1:00 -
{
'from_year': 2008,
'to_year': 2008,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2008 only - Sep 20 24:00 0 -
{
'from_year': 2008,
'to_year': 2008,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2009 2011 - Mar 21 24:00 1:00 -
{
'from_year': 2009,
'to_year': 2011,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2009 2011 - Sep 21 24:00 0 -
{
'from_year': 2009,
'to_year': 2011,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2012 only - Mar 20 24:00 1:00 -
{
'from_year': 2012,
'to_year': 2012,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2012 only - Sep 20 24:00 0 -
{
'from_year': 2012,
'to_year': 2012,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2013 2015 - Mar 21 24:00 1:00 -
{
'from_year': 2013,
'to_year': 2015,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2013 2015 - Sep 21 24:00 0 -
{
'from_year': 2013,
'to_year': 2015,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2016 only - Mar 20 24:00 1:00 -
{
'from_year': 2016,
'to_year': 2016,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2016 only - Sep 20 24:00 0 -
{
'from_year': 2016,
'to_year': 2016,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2017 2019 - Mar 21 24:00 1:00 -
{
'from_year': 2017,
'to_year': 2019,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2017 2019 - Sep 21 24:00 0 -
{
'from_year': 2017,
'to_year': 2019,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2020 only - Mar 20 24:00 1:00 -
{
'from_year': 2020,
'to_year': 2020,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2020 only - Sep 20 24:00 0 -
{
'from_year': 2020,
'to_year': 2020,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2021 2023 - Mar 21 24:00 1:00 -
{
'from_year': 2021,
'to_year': 2023,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2021 2023 - Sep 21 24:00 0 -
{
'from_year': 2021,
'to_year': 2023,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2024 only - Mar 20 24:00 1:00 -
{
'from_year': 2024,
'to_year': 2024,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2024 only - Sep 20 24:00 0 -
{
'from_year': 2024,
'to_year': 2024,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2025 2027 - Mar 21 24:00 1:00 -
{
'from_year': 2025,
'to_year': 2027,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2025 2027 - Sep 21 24:00 0 -
{
'from_year': 2025,
'to_year': 2027,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2028 2029 - Mar 20 24:00 1:00 -
{
'from_year': 2028,
'to_year': 2029,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2028 2029 - Sep 20 24:00 0 -
{
'from_year': 2028,
'to_year': 2029,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2030 2031 - Mar 21 24:00 1:00 -
{
'from_year': 2030,
'to_year': 2031,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2030 2031 - Sep 21 24:00 0 -
{
'from_year': 2030,
'to_year': 2031,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2032 2033 - Mar 20 24:00 1:00 -
{
'from_year': 2032,
'to_year': 2033,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2032 2033 - Sep 20 24:00 0 -
{
'from_year': 2032,
'to_year': 2033,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2034 2035 - Mar 21 24:00 1:00 -
{
'from_year': 2034,
'to_year': 2035,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2034 2035 - Sep 21 24:00 0 -
{
'from_year': 2034,
'to_year': 2035,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2036 2037 - Mar 20 24:00 1:00 -
{
'from_year': 2036,
'to_year': 2037,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2036 2037 - Sep 20 24:00 0 -
{
'from_year': 2036,
'to_year': 2037,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2038 2039 - Mar 21 24:00 1:00 -
{
'from_year': 2038,
'to_year': 2039,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2038 2039 - Sep 21 24:00 0 -
{
'from_year': 2038,
'to_year': 2039,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2040 2041 - Mar 20 24:00 1:00 -
{
'from_year': 2040,
'to_year': 2041,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2040 2041 - Sep 20 24:00 0 -
{
'from_year': 2040,
'to_year': 2041,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2042 2043 - Mar 21 24:00 1:00 -
{
'from_year': 2042,
'to_year': 2043,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2042 2043 - Sep 21 24:00 0 -
{
'from_year': 2042,
'to_year': 2043,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2044 2045 - Mar 20 24:00 1:00 -
{
'from_year': 2044,
'to_year': 2045,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2044 2045 - Sep 20 24:00 0 -
{
'from_year': 2044,
'to_year': 2045,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2046 2047 - Mar 21 24:00 1:00 -
{
'from_year': 2046,
'to_year': 2047,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2046 2047 - Sep 21 24:00 0 -
{
'from_year': 2046,
'to_year': 2047,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2048 2049 - Mar 20 24:00 1:00 -
{
'from_year': 2048,
'to_year': 2049,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2048 2049 - Sep 20 24:00 0 -
{
'from_year': 2048,
'to_year': 2049,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iran 2050 2051 - Mar 21 24:00 1:00 -
{
'from_year': 2050,
'to_year': 2051,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iran 2050 2051 - Sep 21 24:00 0 -
{
'from_year': 2050,
'to_year': 2051,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Iran = {
'name': 'Iran',
'rules': ZONE_RULES_Iran
}
#---------------------------------------------------------------------------
# Policy name: Iraq
# Rule count: 9
#---------------------------------------------------------------------------
ZONE_RULES_Iraq = [
# Anchor: Rule Iraq 1982 1984 - Oct 1 0:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iraq 1982 only - May 1 0:00 1:00 -
{
'from_year': 1982,
'to_year': 1982,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iraq 1982 1984 - Oct 1 0:00 0 -
{
'from_year': 1982,
'to_year': 1984,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iraq 1983 only - Mar 31 0:00 1:00 -
{
'from_year': 1983,
'to_year': 1983,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 31,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iraq 1984 1985 - Apr 1 0:00 1:00 -
{
'from_year': 1984,
'to_year': 1985,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iraq 1985 1990 - Sep lastSun 1:00s 0 -
{
'from_year': 1985,
'to_year': 1990,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Iraq 1986 1990 - Mar lastSun 1:00s 1:00 -
{
'from_year': 1986,
'to_year': 1990,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iraq 1991 2007 - Apr 1 3:00s 1:00 -
{
'from_year': 1991,
'to_year': 2007,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 10800,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Iraq 1991 2007 - Oct 1 3:00s 0 -
{
'from_year': 1991,
'to_year': 2007,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 10800,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Iraq = {
'name': 'Iraq',
'rules': ZONE_RULES_Iraq
}
#---------------------------------------------------------------------------
# Policy name: Italy
# Rule count: 10
#---------------------------------------------------------------------------
ZONE_RULES_Italy = [
# Rule Italy 1972 only - Oct 1 0:00s 0 -
{
'from_year': 1972,
'to_year': 1972,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Italy 1973 only - Jun 3 0:00s 1:00 S
{
'from_year': 1973,
'to_year': 1973,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 3,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Italy 1973 1974 - Sep lastSun 0:00s 0 -
{
'from_year': 1973,
'to_year': 1974,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Italy 1974 only - May 26 0:00s 1:00 S
{
'from_year': 1974,
'to_year': 1974,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 26,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Italy 1975 only - Jun 1 0:00s 1:00 S
{
'from_year': 1975,
'to_year': 1975,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Italy 1975 1977 - Sep lastSun 0:00s 0 -
{
'from_year': 1975,
'to_year': 1977,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Italy 1976 only - May 30 0:00s 1:00 S
{
'from_year': 1976,
'to_year': 1976,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Italy 1977 1979 - May Sun>=22 0:00s 1:00 S
{
'from_year': 1977,
'to_year': 1979,
'in_month': 5,
'on_day_of_week': 7,
'on_day_of_month': 22,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Italy 1978 only - Oct 1 0:00s 0 -
{
'from_year': 1978,
'to_year': 1978,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Italy 1979 only - Sep 30 0:00s 0 -
{
'from_year': 1979,
'to_year': 1979,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Italy = {
'name': 'Italy',
'rules': ZONE_RULES_Italy
}
#---------------------------------------------------------------------------
# Policy name: Japan
# Rule count: 1
#---------------------------------------------------------------------------
ZONE_RULES_Japan = [
# Rule Japan 1948 1951 - Sep Sat>=8 25:00 0 S
{
'from_year': 1948,
'to_year': 1951,
'in_month': 9,
'on_day_of_week': 6,
'on_day_of_month': 8,
'at_seconds': 90000,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Japan = {
'name': 'Japan',
'rules': ZONE_RULES_Japan
}
#---------------------------------------------------------------------------
# Policy name: Jordan
# Rule count: 32
#---------------------------------------------------------------------------
ZONE_RULES_Jordan = [
# Anchor: Rule Jordan 1973 1975 - Oct 1 0:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Jordan 1973 only - Jun 6 0:00 1:00 S
{
'from_year': 1973,
'to_year': 1973,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 6,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Jordan 1973 1975 - Oct 1 0:00 0 -
{
'from_year': 1973,
'to_year': 1975,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Jordan 1974 1977 - May 1 0:00 1:00 S
{
'from_year': 1974,
'to_year': 1977,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Jordan 1976 only - Nov 1 0:00 0 -
{
'from_year': 1976,
'to_year': 1976,
'in_month': 11,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Jordan 1977 only - Oct 1 0:00 0 -
{
'from_year': 1977,
'to_year': 1977,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Jordan 1978 only - Apr 30 0:00 1:00 S
{
'from_year': 1978,
'to_year': 1978,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Jordan 1978 only - Sep 30 0:00 0 -
{
'from_year': 1978,
'to_year': 1978,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Jordan 1985 only - Apr 1 0:00 1:00 S
{
'from_year': 1985,
'to_year': 1985,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Jordan 1985 only - Oct 1 0:00 0 -
{
'from_year': 1985,
'to_year': 1985,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Jordan 1986 1988 - Apr Fri>=1 0:00 1:00 S
{
'from_year': 1986,
'to_year': 1988,
'in_month': 4,
'on_day_of_week': 5,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Jordan 1986 1990 - Oct Fri>=1 0:00 0 -
{
'from_year': 1986,
'to_year': 1990,
'in_month': 10,
'on_day_of_week': 5,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Jordan 1989 only - May 8 0:00 1:00 S
{
'from_year': 1989,
'to_year': 1989,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 8,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Jordan 1990 only - Apr 27 0:00 1:00 S
{
'from_year': 1990,
'to_year': 1990,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 27,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Jordan 1991 only - Apr 17 0:00 1:00 S
{
'from_year': 1991,
'to_year': 1991,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 17,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Jordan 1991 only - Sep 27 0:00 0 -
{
'from_year': 1991,
'to_year': 1991,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 27,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Jordan 1992 only - Apr 10 0:00 1:00 S
{
'from_year': 1992,
'to_year': 1992,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 10,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Jordan 1992 1993 - Oct Fri>=1 0:00 0 -
{
'from_year': 1992,
'to_year': 1993,
'in_month': 10,
'on_day_of_week': 5,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Jordan 1993 1998 - Apr Fri>=1 0:00 1:00 S
{
'from_year': 1993,
'to_year': 1998,
'in_month': 4,
'on_day_of_week': 5,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Jordan 1994 only - Sep Fri>=15 0:00 0 -
{
'from_year': 1994,
'to_year': 1994,
'in_month': 9,
'on_day_of_week': 5,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Jordan 1995 1998 - Sep Fri>=15 0:00s 0 -
{
'from_year': 1995,
'to_year': 1998,
'in_month': 9,
'on_day_of_week': 5,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Jordan 1999 only - Jul 1 0:00s 1:00 S
{
'from_year': 1999,
'to_year': 1999,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Jordan 1999 2002 - Sep lastFri 0:00s 0 -
{
'from_year': 1999,
'to_year': 2002,
'in_month': 9,
'on_day_of_week': 5,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Jordan 2000 2001 - Mar lastThu 0:00s 1:00 S
{
'from_year': 2000,
'to_year': 2001,
'in_month': 3,
'on_day_of_week': 4,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Jordan 2002 2012 - Mar lastThu 24:00 1:00 S
{
'from_year': 2002,
'to_year': 2012,
'in_month': 3,
'on_day_of_week': 4,
'on_day_of_month': 0,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Jordan 2003 only - Oct 24 0:00s 0 -
{
'from_year': 2003,
'to_year': 2003,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 24,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Jordan 2004 only - Oct 15 0:00s 0 -
{
'from_year': 2004,
'to_year': 2004,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Jordan 2005 only - Sep lastFri 0:00s 0 -
{
'from_year': 2005,
'to_year': 2005,
'in_month': 9,
'on_day_of_week': 5,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Jordan 2006 2011 - Oct lastFri 0:00s 0 -
{
'from_year': 2006,
'to_year': 2011,
'in_month': 10,
'on_day_of_week': 5,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Jordan 2013 only - Dec 20 0:00 0 -
{
'from_year': 2013,
'to_year': 2013,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Jordan 2014 max - Mar lastThu 24:00 1:00 S
{
'from_year': 2014,
'to_year': 9999,
'in_month': 3,
'on_day_of_week': 4,
'on_day_of_month': 0,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Jordan 2014 max - Oct lastFri 0:00s 0 -
{
'from_year': 2014,
'to_year': 9999,
'in_month': 10,
'on_day_of_week': 5,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Jordan = {
'name': 'Jordan',
'rules': ZONE_RULES_Jordan
}
#---------------------------------------------------------------------------
# Policy name: Kyrgyz
# Rule count: 5
#---------------------------------------------------------------------------
ZONE_RULES_Kyrgyz = [
# Anchor: Rule Kyrgyz 1992 1996 - Sep lastSun 0:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Kyrgyz 1992 1996 - Apr Sun>=7 0:00s 1:00 -
{
'from_year': 1992,
'to_year': 1996,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 7,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Kyrgyz 1992 1996 - Sep lastSun 0:00 0 -
{
'from_year': 1992,
'to_year': 1996,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Kyrgyz 1997 2005 - Mar lastSun 2:30 1:00 -
{
'from_year': 1997,
'to_year': 2005,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 9000,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Kyrgyz 1997 2004 - Oct lastSun 2:30 0 -
{
'from_year': 1997,
'to_year': 2004,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 9000,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Kyrgyz = {
'name': 'Kyrgyz',
'rules': ZONE_RULES_Kyrgyz
}
#---------------------------------------------------------------------------
# Policy name: LH
# Rule count: 15
#---------------------------------------------------------------------------
ZONE_RULES_LH = [
# Anchor: Rule LH 1982 1985 - Mar Sun>=1 2:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule LH 1981 1984 - Oct lastSun 2:00 1:00 -
{
'from_year': 1981,
'to_year': 1984,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule LH 1982 1985 - Mar Sun>=1 2:00 0 -
{
'from_year': 1982,
'to_year': 1985,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule LH 1985 only - Oct lastSun 2:00 0:30 -
{
'from_year': 1985,
'to_year': 1985,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 1800,
'letter': '-',
},
# Rule LH 1986 1989 - Mar Sun>=15 2:00 0 -
{
'from_year': 1986,
'to_year': 1989,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule LH 1986 only - Oct 19 2:00 0:30 -
{
'from_year': 1986,
'to_year': 1986,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 19,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 1800,
'letter': '-',
},
# Rule LH 1987 1999 - Oct lastSun 2:00 0:30 -
{
'from_year': 1987,
'to_year': 1999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 1800,
'letter': '-',
},
# Rule LH 1990 1995 - Mar Sun>=1 2:00 0 -
{
'from_year': 1990,
'to_year': 1995,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule LH 1996 2005 - Mar lastSun 2:00 0 -
{
'from_year': 1996,
'to_year': 2005,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule LH 2000 only - Aug lastSun 2:00 0:30 -
{
'from_year': 2000,
'to_year': 2000,
'in_month': 8,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 1800,
'letter': '-',
},
# Rule LH 2001 2007 - Oct lastSun 2:00 0:30 -
{
'from_year': 2001,
'to_year': 2007,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 1800,
'letter': '-',
},
# Rule LH 2006 only - Apr Sun>=1 2:00 0 -
{
'from_year': 2006,
'to_year': 2006,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule LH 2007 only - Mar lastSun 2:00 0 -
{
'from_year': 2007,
'to_year': 2007,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule LH 2008 max - Apr Sun>=1 2:00 0 -
{
'from_year': 2008,
'to_year': 9999,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule LH 2008 max - Oct Sun>=1 2:00 0:30 -
{
'from_year': 2008,
'to_year': 9999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 1800,
'letter': '-',
},
]
ZONE_POLICY_LH = {
'name': 'LH',
'rules': ZONE_RULES_LH
}
#---------------------------------------------------------------------------
# Policy name: Latvia
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_Latvia = [
# Anchor: Rule Latvia 1989 1996 - Sep lastSun 2:00s 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Latvia 1989 1996 - Mar lastSun 2:00s 1:00 S
{
'from_year': 1989,
'to_year': 1996,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Latvia 1989 1996 - Sep lastSun 2:00s 0 -
{
'from_year': 1989,
'to_year': 1996,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Latvia = {
'name': 'Latvia',
'rules': ZONE_RULES_Latvia
}
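The `at_time_suffix` field distinguishes how `at_seconds` is to be interpreted. A minimal sketch, assuming the usual tzdb meaning of the suffixes (`'w'` wall-clock, `'s'` standard time, `'u'` UTC; this function is illustrative and not part of the generated file):

```python
def transition_utc_seconds(at_seconds, suffix, std_offset, dst_offset):
    """Convert a rule's at_seconds (seconds since local midnight) to seconds
    since UTC midnight, given the zone's standard offset and the DST offset
    in effect just before the transition (both in seconds)."""
    if suffix == 'w':
        # Wall clock: subtract the full current offset (standard + DST).
        return at_seconds - std_offset - dst_offset
    if suffix == 's':
        # Standard time: DST in effect is ignored.
        return at_seconds - std_offset
    return at_seconds  # 'u': already UTC

# Latvia's "Sep lastSun 2:00s" at EET (+2:00) while 1:00 of DST is active:
print(transition_utc_seconds(7200, 's', 7200, 3600))  # 0, i.e. midnight UTC
```

This is why the Latvia rules above fire at the same UTC instant regardless of whether DST is currently in force: the `'s'` suffix pins the transition to standard time.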
#---------------------------------------------------------------------------
# Policy name: Lebanon
# Rule count: 14
#---------------------------------------------------------------------------
ZONE_RULES_Lebanon = [
# Rule Lebanon 1972 only - Jun 22 0:00 1:00 S
{
'from_year': 1972,
'to_year': 1972,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 22,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Lebanon 1972 1977 - Oct 1 0:00 0 -
{
'from_year': 1972,
'to_year': 1977,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Lebanon 1973 1977 - May 1 0:00 1:00 S
{
'from_year': 1973,
'to_year': 1977,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Lebanon 1978 only - Apr 30 0:00 1:00 S
{
'from_year': 1978,
'to_year': 1978,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Lebanon 1978 only - Sep 30 0:00 0 -
{
'from_year': 1978,
'to_year': 1978,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Lebanon 1984 1987 - May 1 0:00 1:00 S
{
'from_year': 1984,
'to_year': 1987,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Lebanon 1984 1991 - Oct 16 0:00 0 -
{
'from_year': 1984,
'to_year': 1991,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 16,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Lebanon 1988 only - Jun 1 0:00 1:00 S
{
'from_year': 1988,
'to_year': 1988,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Lebanon 1989 only - May 10 0:00 1:00 S
{
'from_year': 1989,
'to_year': 1989,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 10,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Lebanon 1990 1992 - May 1 0:00 1:00 S
{
'from_year': 1990,
'to_year': 1992,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Lebanon 1992 only - Oct 4 0:00 0 -
{
'from_year': 1992,
'to_year': 1992,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 4,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Lebanon 1993 max - Mar lastSun 0:00 1:00 S
{
'from_year': 1993,
'to_year': 9999,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Lebanon 1993 1998 - Sep lastSun 0:00 0 -
{
'from_year': 1993,
'to_year': 1998,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Lebanon 1999 max - Oct lastSun 0:00 0 -
{
'from_year': 1999,
'to_year': 9999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Lebanon = {
'name': 'Lebanon',
'rules': ZONE_RULES_Lebanon
}
#---------------------------------------------------------------------------
# Policy name: Libya
# Rule count: 12
#---------------------------------------------------------------------------
ZONE_RULES_Libya = [
# Rule Libya 1956 only - Jan 1 0:00 0 -
{
'from_year': 1956,
'to_year': 1956,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Libya 1982 1984 - Apr 1 0:00 1:00 S
{
'from_year': 1982,
'to_year': 1984,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Libya 1982 1985 - Oct 1 0:00 0 -
{
'from_year': 1982,
'to_year': 1985,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Libya 1985 only - Apr 6 0:00 1:00 S
{
'from_year': 1985,
'to_year': 1985,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 6,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Libya 1986 only - Apr 4 0:00 1:00 S
{
'from_year': 1986,
'to_year': 1986,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 4,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Libya 1986 only - Oct 3 0:00 0 -
{
'from_year': 1986,
'to_year': 1986,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 3,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Libya 1987 1989 - Apr 1 0:00 1:00 S
{
'from_year': 1987,
'to_year': 1989,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Libya 1987 1989 - Oct 1 0:00 0 -
{
'from_year': 1987,
'to_year': 1989,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Libya 1997 only - Apr 4 0:00 1:00 S
{
'from_year': 1997,
'to_year': 1997,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 4,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Libya 1997 only - Oct 4 0:00 0 -
{
'from_year': 1997,
'to_year': 1997,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 4,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Libya 2013 only - Mar lastFri 1:00 1:00 S
{
'from_year': 2013,
'to_year': 2013,
'in_month': 3,
'on_day_of_week': 5,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Libya 2013 only - Oct lastFri 2:00 0 -
{
'from_year': 2013,
'to_year': 2013,
'in_month': 10,
'on_day_of_week': 5,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Libya = {
'name': 'Libya',
'rules': ZONE_RULES_Libya
}
#---------------------------------------------------------------------------
# Policy name: Macau
# Rule count: 7
#---------------------------------------------------------------------------
ZONE_RULES_Macau = [
# Rule Macau 1965 1973 - Apr Sun>=16 03:30 1:00 D
{
'from_year': 1965,
'to_year': 1973,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 16,
'at_seconds': 12600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Macau 1965 1966 - Oct Sun>=16 02:30 0 S
{
'from_year': 1965,
'to_year': 1966,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 16,
'at_seconds': 9000,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Macau 1967 1976 - Oct Sun>=16 03:30 0 S
{
'from_year': 1967,
'to_year': 1976,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 16,
'at_seconds': 12600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Macau 1973 only - Dec 30 03:30 1:00 D
{
'from_year': 1973,
'to_year': 1973,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 12600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Macau 1975 1976 - Apr Sun>=16 03:30 1:00 D
{
'from_year': 1975,
'to_year': 1976,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 16,
'at_seconds': 12600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Macau 1979 only - May 13 03:30 1:00 D
{
'from_year': 1979,
'to_year': 1979,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 13,
'at_seconds': 12600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Macau 1979 only - Oct Sun>=16 03:30 0 S
{
'from_year': 1979,
'to_year': 1979,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 16,
'at_seconds': 12600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Macau = {
'name': 'Macau',
'rules': ZONE_RULES_Macau
}
#---------------------------------------------------------------------------
# Policy name: Malta
# Rule count: 8
#---------------------------------------------------------------------------
ZONE_RULES_Malta = [
# Anchor: Rule Malta 1973 only - Sep 29 0:00s 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Malta 1973 only - Mar 31 0:00s 1:00 S
{
'from_year': 1973,
'to_year': 1973,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 31,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Malta 1973 only - Sep 29 0:00s 0 -
{
'from_year': 1973,
'to_year': 1973,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 29,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Malta 1974 only - Apr 21 0:00s 1:00 S
{
'from_year': 1974,
'to_year': 1974,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Malta 1974 only - Sep 16 0:00s 0 -
{
'from_year': 1974,
'to_year': 1974,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 16,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Malta 1975 1979 - Apr Sun>=15 2:00 1:00 S
{
'from_year': 1975,
'to_year': 1979,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Malta 1975 1980 - Sep Sun>=15 2:00 0 -
{
'from_year': 1975,
'to_year': 1980,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Malta 1980 only - Mar 31 2:00 1:00 S
{
'from_year': 1980,
'to_year': 1980,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 31,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
]
ZONE_POLICY_Malta = {
'name': 'Malta',
'rules': ZONE_RULES_Malta
}
#---------------------------------------------------------------------------
# Policy name: Mauritius
# Rule count: 5
#---------------------------------------------------------------------------
ZONE_RULES_Mauritius = [
# Anchor: Rule Mauritius 1983 only - Mar 21 0:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Mauritius 1982 only - Oct 10 0:00 1:00 -
{
'from_year': 1982,
'to_year': 1982,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 10,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Mauritius 1983 only - Mar 21 0:00 0 -
{
'from_year': 1983,
'to_year': 1983,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Mauritius 2008 only - Oct lastSun 2:00 1:00 -
{
'from_year': 2008,
'to_year': 2008,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Mauritius 2009 only - Mar lastSun 2:00 0 -
{
'from_year': 2009,
'to_year': 2009,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Mauritius = {
'name': 'Mauritius',
'rules': ZONE_RULES_Mauritius
}
#---------------------------------------------------------------------------
# Policy name: Mexico
# Rule count: 7
#---------------------------------------------------------------------------
ZONE_RULES_Mexico = [
# Rule Mexico 1950 only - Jul 30 0:00 0 S
{
'from_year': 1950,
'to_year': 1950,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Mexico 1996 2000 - Apr Sun>=1 2:00 1:00 D
{
'from_year': 1996,
'to_year': 2000,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Mexico 1996 2000 - Oct lastSun 2:00 0 S
{
'from_year': 1996,
'to_year': 2000,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Mexico 2001 only - May Sun>=1 2:00 1:00 D
{
'from_year': 2001,
'to_year': 2001,
'in_month': 5,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Mexico 2001 only - Sep lastSun 2:00 0 S
{
'from_year': 2001,
'to_year': 2001,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Mexico 2002 max - Apr Sun>=1 2:00 1:00 D
{
'from_year': 2002,
'to_year': 9999,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Mexico 2002 max - Oct lastSun 2:00 0 S
{
'from_year': 2002,
'to_year': 9999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Mexico = {
'name': 'Mexico',
'rules': ZONE_RULES_Mexico
}
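The `from_year` / `to_year` fields look like an inclusive range (with 9999 standing in for `max`), so the rules that can apply in a given year can be picked out with a simple filter. A hypothetical sketch, not part of the generated file; the sample data mirrors three of the `ZONE_RULES_Mexico` entries above, trimmed to the fields the filter touches:

```python
SAMPLE_RULES = [
    {'from_year': 1996, 'to_year': 2000, 'in_month': 4},
    {'from_year': 2001, 'to_year': 2001, 'in_month': 5},
    {'from_year': 2002, 'to_year': 9999, 'in_month': 4},
]

def candidate_rules(rules, year):
    """Return the rules whose [from_year, to_year] range covers `year`."""
    return [r for r in rules if r['from_year'] <= year <= r['to_year']]

# 2001 was the one year Mexico's spring transition moved to May:
print([r['in_month'] for r in candidate_rules(SAMPLE_RULES, 2001)])  # [5]
```

For any year from 2002 on, only the open-ended (`to_year` 9999) rules survive the filter, matching the `2002 max` lines in the comments.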
#---------------------------------------------------------------------------
# Policy name: Moldova
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_Moldova = [
# Anchor: Rule Moldova 1997 max - Oct lastSun 3:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Moldova 1997 max - Mar lastSun 2:00 1:00 S
{
'from_year': 1997,
'to_year': 9999,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Moldova 1997 max - Oct lastSun 3:00 0 -
{
'from_year': 1997,
'to_year': 9999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Moldova = {
'name': 'Moldova',
'rules': ZONE_RULES_Moldova
}
#---------------------------------------------------------------------------
# Policy name: Moncton
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_Moncton = [
# Rule Moncton 1957 1972 - Oct lastSun 2:00 0 S
{
'from_year': 1957,
'to_year': 1972,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Moncton 1993 2006 - Apr Sun>=1 0:01 1:00 D
{
'from_year': 1993,
'to_year': 2006,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 60,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Moncton 1993 2006 - Oct lastSun 0:01 0 S
{
'from_year': 1993,
'to_year': 2006,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 60,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Moncton = {
'name': 'Moncton',
'rules': ZONE_RULES_Moncton
}
#---------------------------------------------------------------------------
# Policy name: Mongol
# Rule count: 10
#---------------------------------------------------------------------------
ZONE_RULES_Mongol = [
# Anchor: Rule Mongol 1983 only - Oct 1 0:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Mongol 1983 1984 - Apr 1 0:00 1:00 -
{
'from_year': 1983,
'to_year': 1984,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Mongol 1983 only - Oct 1 0:00 0 -
{
'from_year': 1983,
'to_year': 1983,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Mongol 1985 1998 - Mar lastSun 0:00 1:00 -
{
'from_year': 1985,
'to_year': 1998,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Mongol 1984 1998 - Sep lastSun 0:00 0 -
{
'from_year': 1984,
'to_year': 1998,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Mongol 2001 only - Apr lastSat 2:00 1:00 -
{
'from_year': 2001,
'to_year': 2001,
'in_month': 4,
'on_day_of_week': 6,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Mongol 2001 2006 - Sep lastSat 2:00 0 -
{
'from_year': 2001,
'to_year': 2006,
'in_month': 9,
'on_day_of_week': 6,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Mongol 2002 2006 - Mar lastSat 2:00 1:00 -
{
'from_year': 2002,
'to_year': 2006,
'in_month': 3,
'on_day_of_week': 6,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Mongol 2015 2016 - Mar lastSat 2:00 1:00 -
{
'from_year': 2015,
'to_year': 2016,
'in_month': 3,
'on_day_of_week': 6,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Mongol 2015 2016 - Sep lastSat 0:00 0 -
{
'from_year': 2015,
'to_year': 2016,
'in_month': 9,
'on_day_of_week': 6,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Mongol = {
'name': 'Mongol',
'rules': ZONE_RULES_Mongol
}
#---------------------------------------------------------------------------
# Policy name: Morocco
# Rule count: 101
#---------------------------------------------------------------------------
ZONE_RULES_Morocco = [
# Rule Morocco 1967 only - Oct 1 0:00 0 -
{
'from_year': 1967,
'to_year': 1967,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 1974 only - Jun 24 0:00 1:00 -
{
'from_year': 1974,
'to_year': 1974,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 24,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Morocco 1974 only - Sep 1 0:00 0 -
{
'from_year': 1974,
'to_year': 1974,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 1976 1977 - May 1 0:00 1:00 -
{
'from_year': 1976,
'to_year': 1977,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Morocco 1976 only - Aug 1 0:00 0 -
{
'from_year': 1976,
'to_year': 1976,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 1977 only - Sep 28 0:00 0 -
{
'from_year': 1977,
'to_year': 1977,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 28,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 1978 only - Jun 1 0:00 1:00 -
{
'from_year': 1978,
'to_year': 1978,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Morocco 1978 only - Aug 4 0:00 0 -
{
'from_year': 1978,
'to_year': 1978,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 4,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2008 only - Jun 1 0:00 1:00 -
{
'from_year': 2008,
'to_year': 2008,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Morocco 2008 only - Sep 1 0:00 0 -
{
'from_year': 2008,
'to_year': 2008,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2009 only - Jun 1 0:00 1:00 -
{
'from_year': 2009,
'to_year': 2009,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Morocco 2009 only - Aug 21 0:00 0 -
{
'from_year': 2009,
'to_year': 2009,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2010 only - May 2 0:00 1:00 -
{
'from_year': 2010,
'to_year': 2010,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Morocco 2010 only - Aug 8 0:00 0 -
{
'from_year': 2010,
'to_year': 2010,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 8,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2011 only - Apr 3 0:00 1:00 -
{
'from_year': 2011,
'to_year': 2011,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 3,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Morocco 2011 only - Jul 31 0:00 0 -
{
'from_year': 2011,
'to_year': 2011,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 31,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2012 2013 - Apr lastSun 2:00 1:00 -
{
'from_year': 2012,
'to_year': 2013,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Morocco 2012 only - Jul 20 3:00 0 -
{
'from_year': 2012,
'to_year': 2012,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2012 only - Aug 20 2:00 1:00 -
{
'from_year': 2012,
'to_year': 2012,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Morocco 2012 only - Sep 30 3:00 0 -
{
'from_year': 2012,
'to_year': 2012,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2013 only - Jul 7 3:00 0 -
{
'from_year': 2013,
'to_year': 2013,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 7,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2013 only - Aug 10 2:00 1:00 -
{
'from_year': 2013,
'to_year': 2013,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 10,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Morocco 2013 2018 - Oct lastSun 3:00 0 -
{
'from_year': 2013,
'to_year': 2018,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2014 2018 - Mar lastSun 2:00 1:00 -
{
'from_year': 2014,
'to_year': 2018,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Morocco 2014 only - Jun 28 3:00 0 -
{
'from_year': 2014,
'to_year': 2014,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 28,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2014 only - Aug 2 2:00 1:00 -
{
'from_year': 2014,
'to_year': 2014,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Morocco 2015 only - Jun 14 3:00 0 -
{
'from_year': 2015,
'to_year': 2015,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 14,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2015 only - Jul 19 2:00 1:00 -
{
'from_year': 2015,
'to_year': 2015,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 19,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Morocco 2016 only - Jun 5 3:00 0 -
{
'from_year': 2016,
'to_year': 2016,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 5,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2016 only - Jul 10 2:00 1:00 -
{
'from_year': 2016,
'to_year': 2016,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 10,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Morocco 2017 only - May 21 3:00 0 -
{
'from_year': 2017,
'to_year': 2017,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2017 only - Jul 2 2:00 1:00 -
{
'from_year': 2017,
'to_year': 2017,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Morocco 2018 only - May 13 3:00 0 -
{
'from_year': 2018,
'to_year': 2018,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 13,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2018 only - Jun 17 2:00 1:00 -
{
'from_year': 2018,
'to_year': 2018,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 17,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Morocco 2019 only - May 5 3:00 -1:00 -
{
'from_year': 2019,
'to_year': 2019,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 5,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2019 only - Jun 9 2:00 0 -
{
'from_year': 2019,
'to_year': 2019,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 9,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2020 only - Apr 19 3:00 -1:00 -
{
'from_year': 2020,
'to_year': 2020,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 19,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2020 only - May 31 2:00 0 -
{
'from_year': 2020,
'to_year': 2020,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 31,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2021 only - Apr 11 3:00 -1:00 -
{
'from_year': 2021,
'to_year': 2021,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 11,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2021 only - May 16 2:00 0 -
{
'from_year': 2021,
'to_year': 2021,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 16,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2022 only - Mar 27 3:00 -1:00 -
{
'from_year': 2022,
'to_year': 2022,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 27,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2022 only - May 8 2:00 0 -
{
'from_year': 2022,
'to_year': 2022,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 8,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2023 only - Mar 19 3:00 -1:00 -
{
'from_year': 2023,
'to_year': 2023,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 19,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2023 only - Apr 30 2:00 0 -
{
'from_year': 2023,
'to_year': 2023,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2024 only - Mar 10 3:00 -1:00 -
{
'from_year': 2024,
'to_year': 2024,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 10,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2024 only - Apr 14 2:00 0 -
{
'from_year': 2024,
'to_year': 2024,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 14,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2025 only - Feb 23 3:00 -1:00 -
{
'from_year': 2025,
'to_year': 2025,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 23,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2025 only - Apr 6 2:00 0 -
{
'from_year': 2025,
'to_year': 2025,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 6,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2026 only - Feb 15 3:00 -1:00 -
{
'from_year': 2026,
'to_year': 2026,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 15,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2026 only - Mar 22 2:00 0 -
{
'from_year': 2026,
'to_year': 2026,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 22,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2027 only - Feb 7 3:00 -1:00 -
{
'from_year': 2027,
'to_year': 2027,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 7,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2027 only - Mar 14 2:00 0 -
{
'from_year': 2027,
'to_year': 2027,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 14,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2028 only - Jan 23 3:00 -1:00 -
{
'from_year': 2028,
'to_year': 2028,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 23,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2028 only - Mar 5 2:00 0 -
{
'from_year': 2028,
'to_year': 2028,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 5,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2029 only - Jan 14 3:00 -1:00 -
{
'from_year': 2029,
'to_year': 2029,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 14,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2029 only - Feb 18 2:00 0 -
{
'from_year': 2029,
'to_year': 2029,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 18,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2029 only - Dec 30 3:00 -1:00 -
{
'from_year': 2029,
'to_year': 2029,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2030 only - Feb 10 2:00 0 -
{
'from_year': 2030,
'to_year': 2030,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 10,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2030 only - Dec 22 3:00 -1:00 -
{
'from_year': 2030,
'to_year': 2030,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 22,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2031 only - Feb 2 2:00 0 -
{
'from_year': 2031,
'to_year': 2031,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2031 only - Dec 14 3:00 -1:00 -
{
'from_year': 2031,
'to_year': 2031,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 14,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2032 only - Jan 18 2:00 0 -
{
'from_year': 2032,
'to_year': 2032,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 18,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2032 only - Nov 28 3:00 -1:00 -
{
'from_year': 2032,
'to_year': 2032,
'in_month': 11,
'on_day_of_week': 0,
'on_day_of_month': 28,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2033 only - Jan 9 2:00 0 -
{
'from_year': 2033,
'to_year': 2033,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 9,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2033 only - Nov 20 3:00 -1:00 -
{
'from_year': 2033,
'to_year': 2033,
'in_month': 11,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2033 only - Dec 25 2:00 0 -
{
'from_year': 2033,
'to_year': 2033,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 25,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2034 only - Nov 5 3:00 -1:00 -
{
'from_year': 2034,
'to_year': 2034,
'in_month': 11,
'on_day_of_week': 0,
'on_day_of_month': 5,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2034 only - Dec 17 2:00 0 -
{
'from_year': 2034,
'to_year': 2034,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 17,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2035 only - Oct 28 3:00 -1:00 -
{
'from_year': 2035,
'to_year': 2035,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 28,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2035 only - Dec 9 2:00 0 -
{
'from_year': 2035,
'to_year': 2035,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 9,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2036 only - Oct 19 3:00 -1:00 -
{
'from_year': 2036,
'to_year': 2036,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 19,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2036 only - Nov 23 2:00 0 -
{
'from_year': 2036,
'to_year': 2036,
'in_month': 11,
'on_day_of_week': 0,
'on_day_of_month': 23,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2037 only - Oct 4 3:00 -1:00 -
{
'from_year': 2037,
'to_year': 2037,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 4,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2037 only - Nov 15 2:00 0 -
{
'from_year': 2037,
'to_year': 2037,
'in_month': 11,
'on_day_of_week': 0,
'on_day_of_month': 15,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2038 only - Sep 26 3:00 -1:00 -
{
'from_year': 2038,
'to_year': 2038,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 26,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2038 only - Nov 7 2:00 0 -
{
'from_year': 2038,
'to_year': 2038,
'in_month': 11,
'on_day_of_week': 0,
'on_day_of_month': 7,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2039 only - Sep 18 3:00 -1:00 -
{
'from_year': 2039,
'to_year': 2039,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 18,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2039 only - Oct 23 2:00 0 -
{
'from_year': 2039,
'to_year': 2039,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 23,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2040 only - Sep 2 3:00 -1:00 -
{
'from_year': 2040,
'to_year': 2040,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2040 only - Oct 14 2:00 0 -
{
'from_year': 2040,
'to_year': 2040,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 14,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2041 only - Aug 25 3:00 -1:00 -
{
'from_year': 2041,
'to_year': 2041,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 25,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2041 only - Sep 29 2:00 0 -
{
'from_year': 2041,
'to_year': 2041,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 29,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2042 only - Aug 10 3:00 -1:00 -
{
'from_year': 2042,
'to_year': 2042,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 10,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2042 only - Sep 21 2:00 0 -
{
'from_year': 2042,
'to_year': 2042,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2043 only - Aug 2 3:00 -1:00 -
{
'from_year': 2043,
'to_year': 2043,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2043 only - Sep 13 2:00 0 -
{
'from_year': 2043,
'to_year': 2043,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 13,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2044 only - Jul 24 3:00 -1:00 -
{
'from_year': 2044,
'to_year': 2044,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 24,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2044 only - Aug 28 2:00 0 -
{
'from_year': 2044,
'to_year': 2044,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 28,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2045 only - Jul 9 3:00 -1:00 -
{
'from_year': 2045,
'to_year': 2045,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 9,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2045 only - Aug 20 2:00 0 -
{
'from_year': 2045,
'to_year': 2045,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2046 only - Jul 1 3:00 -1:00 -
{
'from_year': 2046,
'to_year': 2046,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2046 only - Aug 12 2:00 0 -
{
'from_year': 2046,
'to_year': 2046,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 12,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2047 only - Jun 23 3:00 -1:00 -
{
'from_year': 2047,
'to_year': 2047,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 23,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2047 only - Jul 28 2:00 0 -
{
'from_year': 2047,
'to_year': 2047,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 28,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2048 only - Jun 7 3:00 -1:00 -
{
'from_year': 2048,
'to_year': 2048,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 7,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2048 only - Jul 19 2:00 0 -
{
'from_year': 2048,
'to_year': 2048,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 19,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2049 only - May 30 3:00 -1:00 -
{
'from_year': 2049,
'to_year': 2049,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2049 only - Jul 4 2:00 0 -
{
'from_year': 2049,
'to_year': 2049,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 4,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2050 only - May 15 3:00 -1:00 -
{
'from_year': 2050,
'to_year': 2050,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 15,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
# Rule Morocco 2050 only - Jun 26 2:00 0 -
{
'from_year': 2050,
'to_year': 2050,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 26,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Morocco 2051 only - May 7 3:00 -1:00 -
{
'from_year': 2051,
'to_year': 2051,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 7,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': '-',
},
]
ZONE_POLICY_Morocco = {
'name': 'Morocco',
'rules': ZONE_RULES_Morocco
}
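# Example (hand-written sketch, not generated data): a consumer can select the
# rules of a policy whose [from_year, to_year] span covers a given year. The
# synthetic anchor entries use from_year == to_year == 0, so they never match
# a real year and act as the default state before the first transition.
def candidate_rules(policy, year):
    """Return the rules of `policy` active during `year`."""
    return [
        rule for rule in policy['rules']
        if rule['from_year'] <= year <= rule['to_year']
    ]
# e.g. candidate_rules(ZONE_POLICY_Morocco, 2019) yields the two 2019-only
# rules above (May 5 and Jun 9).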
#---------------------------------------------------------------------------
# Policy name: NC
# Rule count: 5
#---------------------------------------------------------------------------
ZONE_RULES_NC = [
# Anchor: Rule NC 1978 1979 - Feb 27 0:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule NC 1977 1978 - Dec Sun>=1 0:00 1:00 -
{
'from_year': 1977,
'to_year': 1978,
'in_month': 12,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule NC 1978 1979 - Feb 27 0:00 0 -
{
'from_year': 1978,
'to_year': 1979,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 27,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule NC 1996 only - Dec 1 2:00s 1:00 -
{
'from_year': 1996,
'to_year': 1996,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': '-',
},
# Rule NC 1997 only - Mar 2 2:00s 0 -
{
'from_year': 1997,
'to_year': 1997,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_NC = {
'name': 'NC',
'rules': ZONE_RULES_NC
}
#---------------------------------------------------------------------------
# Policy name: NT_YK
# Rule count: 4
#---------------------------------------------------------------------------
ZONE_RULES_NT_YK = [
# Rule NT_YK 1965 only - Oct lastSun 2:00 0 S
{
'from_year': 1965,
'to_year': 1965,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule NT_YK 1980 1986 - Apr lastSun 2:00 1:00 D
{
'from_year': 1980,
'to_year': 1986,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule NT_YK 1980 2006 - Oct lastSun 2:00 0 S
{
'from_year': 1980,
'to_year': 2006,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule NT_YK 1987 2006 - Apr Sun>=1 2:00 1:00 D
{
'from_year': 1987,
'to_year': 2006,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
]
ZONE_POLICY_NT_YK = {
'name': 'NT_YK',
'rules': ZONE_RULES_NT_YK
}
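# Example (hand-written sketch, not generated data): decoding the
# (on_day_of_week, on_day_of_month) pair used by the rules above.
#   on_day_of_week == 0            -> exact day of month (on_day_of_month)
#   dow 1..7 (Mon..Sun), dom == 0  -> "last <weekday>" of the month
#   dow 1..7, dom > 0              -> "<weekday> on or after <dom>"
import calendar

def resolve_on_day(year, month, dow, dom):
    """Return the concrete day-of-month selected by a rule's day encoding."""
    if dow == 0:
        return dom
    last_day = calendar.monthrange(year, month)[1]
    start = last_day - 6 if dom == 0 else dom
    for day in range(start, min(start + 7, last_day + 1)):
        # calendar.weekday(): Mon=0..Sun=6; the rule encoding uses Mon=1..Sun=7
        if calendar.weekday(year, month, day) + 1 == dow:
            return day
    raise ValueError('no matching day in month')
# e.g. NT_YK "Oct lastSun" in 2006: resolve_on_day(2006, 10, 7, 0) == 29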
#---------------------------------------------------------------------------
# Policy name: NZ
# Rule count: 10
#---------------------------------------------------------------------------
ZONE_RULES_NZ = [
# Rule NZ 1946 only - Jan 1 0:00 0 S
{
'from_year': 1946,
'to_year': 1946,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule NZ 1974 only - Nov Sun>=1 2:00s 1:00 D
{
'from_year': 1974,
'to_year': 1974,
'in_month': 11,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule NZ 1975 only - Feb lastSun 2:00s 0 S
{
'from_year': 1975,
'to_year': 1975,
'in_month': 2,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule NZ 1975 1988 - Oct lastSun 2:00s 1:00 D
{
'from_year': 1975,
'to_year': 1988,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule NZ 1976 1989 - Mar Sun>=1 2:00s 0 S
{
'from_year': 1976,
'to_year': 1989,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule NZ 1989 only - Oct Sun>=8 2:00s 1:00 D
{
'from_year': 1989,
'to_year': 1989,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule NZ 1990 2006 - Oct Sun>=1 2:00s 1:00 D
{
'from_year': 1990,
'to_year': 2006,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule NZ 1990 2007 - Mar Sun>=15 2:00s 0 S
{
'from_year': 1990,
'to_year': 2007,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule NZ 2007 max - Sep lastSun 2:00s 1:00 D
{
'from_year': 2007,
'to_year': 9999,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule NZ 2008 max - Apr Sun>=1 2:00s 0 S
{
'from_year': 2008,
'to_year': 9999,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_NZ = {
'name': 'NZ',
'rules': ZONE_RULES_NZ
}
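# Example (hand-written sketch, not generated data): interpreting at_seconds
# together with at_time_suffix. A 'w' (wall) time includes both the standard
# offset and any DST offset in effect; an 's' (standard) time includes only
# the standard offset; 'u'/'g'/'z' are already UTC.
def at_time_to_utc_seconds(at_seconds, suffix, std_offset, dst_offset):
    """Convert a rule's local transition time to seconds past UTC midnight."""
    if suffix == 'w':
        return at_seconds - std_offset - dst_offset
    if suffix == 's':
        return at_seconds - std_offset
    return at_seconds  # 'u', 'g', 'z': already UTC
# e.g. NZ's "2:00s" with NZST = +12:00: 7200 - 12*3600 == -36000, i.e. the
# transition occurs at 14:00 UTC on the previous day.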
#---------------------------------------------------------------------------
# Policy name: Namibia
# Rule count: 4
#---------------------------------------------------------------------------
ZONE_RULES_Namibia = [
# Anchor: Rule Namibia 1994 2017 - Sep Sun>=1 2:00 0 CAT
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'CAT',
},
# Rule Namibia 1994 only - Mar 21 0:00 -1:00 WAT
{
'from_year': 1994,
'to_year': 1994,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': 'WAT',
},
# Rule Namibia 1994 2017 - Sep Sun>=1 2:00 0 CAT
{
'from_year': 1994,
'to_year': 2017,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'CAT',
},
# Rule Namibia 1995 2017 - Apr Sun>=1 2:00 -1:00 WAT
{
'from_year': 1995,
'to_year': 2017,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': -3600,
'letter': 'WAT',
},
]
ZONE_POLICY_Namibia = {
'name': 'Namibia',
'rules': ZONE_RULES_Namibia
}
#---------------------------------------------------------------------------
# Policy name: Neth
# Rule count: 1
#---------------------------------------------------------------------------
ZONE_RULES_Neth = [
# Rule Neth 1945 only - Sep 16 2:00s 0 -
{
'from_year': 1945,
'to_year': 1945,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 16,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Neth = {
'name': 'Neth',
'rules': ZONE_RULES_Neth
}
#---------------------------------------------------------------------------
# Policy name: Nic
# Rule count: 7
#---------------------------------------------------------------------------
ZONE_RULES_Nic = [
# Anchor: Rule Nic 1979 1980 - Jun Mon>=23 0:00 0 S
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Nic 1979 1980 - Mar Sun>=16 0:00 1:00 D
{
'from_year': 1979,
'to_year': 1980,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 16,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Nic 1979 1980 - Jun Mon>=23 0:00 0 S
{
'from_year': 1979,
'to_year': 1980,
'in_month': 6,
'on_day_of_week': 1,
'on_day_of_month': 23,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Nic 2005 only - Apr 10 0:00 1:00 D
{
'from_year': 2005,
'to_year': 2005,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 10,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Nic 2005 only - Oct Sun>=1 0:00 0 S
{
'from_year': 2005,
'to_year': 2005,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Nic 2006 only - Apr 30 2:00 1:00 D
{
'from_year': 2006,
'to_year': 2006,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Nic 2006 only - Oct Sun>=1 1:00 0 S
{
'from_year': 2006,
'to_year': 2006,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Nic = {
'name': 'Nic',
'rules': ZONE_RULES_Nic
}
#---------------------------------------------------------------------------
# Policy name: Norway
# Rule count: 1
#---------------------------------------------------------------------------
ZONE_RULES_Norway = [
# Rule Norway 1959 1965 - Sep Sun>=15 2:00s 0 -
{
'from_year': 1959,
'to_year': 1965,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Norway = {
'name': 'Norway',
'rules': ZONE_RULES_Norway
}
#---------------------------------------------------------------------------
# Policy name: PRC
# Rule count: 4
#---------------------------------------------------------------------------
ZONE_RULES_PRC = [
# Anchor: Rule PRC 1986 1991 - Sep Sun>=11 2:00 0 S
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule PRC 1986 only - May 4 2:00 1:00 D
{
'from_year': 1986,
'to_year': 1986,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 4,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule PRC 1986 1991 - Sep Sun>=11 2:00 0 S
{
'from_year': 1986,
'to_year': 1991,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 11,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule PRC 1987 1991 - Apr Sun>=11 2:00 1:00 D
{
'from_year': 1987,
'to_year': 1991,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 11,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
]
ZONE_POLICY_PRC = {
'name': 'PRC',
'rules': ZONE_RULES_PRC
}
#---------------------------------------------------------------------------
# Policy name: Pakistan
# Rule count: 6
#---------------------------------------------------------------------------
ZONE_RULES_Pakistan = [
# Anchor: Rule Pakistan 2002 only - Oct Sun>=2 0:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Pakistan 2002 only - Apr Sun>=2 0:00 1:00 S
{
'from_year': 2002,
'to_year': 2002,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 2,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Pakistan 2002 only - Oct Sun>=2 0:00 0 -
{
'from_year': 2002,
'to_year': 2002,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 2,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Pakistan 2008 only - Jun 1 0:00 1:00 S
{
'from_year': 2008,
'to_year': 2008,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Pakistan 2008 2009 - Nov 1 0:00 0 -
{
'from_year': 2008,
'to_year': 2009,
'in_month': 11,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Pakistan 2009 only - Apr 15 0:00 1:00 S
{
'from_year': 2009,
'to_year': 2009,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
]
ZONE_POLICY_Pakistan = {
'name': 'Pakistan',
'rules': ZONE_RULES_Pakistan
}
#---------------------------------------------------------------------------
# Policy name: Palestine
# Rule count: 29
#---------------------------------------------------------------------------
ZONE_RULES_Palestine = [
# Anchor: Rule Palestine 1999 2003 - Oct Fri>=15 0:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Palestine 1999 2005 - Apr Fri>=15 0:00 1:00 S
{
'from_year': 1999,
'to_year': 2005,
'in_month': 4,
'on_day_of_week': 5,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Palestine 1999 2003 - Oct Fri>=15 0:00 0 -
{
'from_year': 1999,
'to_year': 2003,
'in_month': 10,
'on_day_of_week': 5,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Palestine 2004 only - Oct 1 1:00 0 -
{
'from_year': 2004,
'to_year': 2004,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Palestine 2005 only - Oct 4 2:00 0 -
{
'from_year': 2005,
'to_year': 2005,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 4,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Palestine 2006 2007 - Apr 1 0:00 1:00 S
{
'from_year': 2006,
'to_year': 2007,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Palestine 2006 only - Sep 22 0:00 0 -
{
'from_year': 2006,
'to_year': 2006,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 22,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Palestine 2007 only - Sep 13 2:00 0 -
{
'from_year': 2007,
'to_year': 2007,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 13,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Palestine 2008 2009 - Mar lastFri 0:00 1:00 S
{
'from_year': 2008,
'to_year': 2009,
'in_month': 3,
'on_day_of_week': 5,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Palestine 2008 only - Sep 1 0:00 0 -
{
'from_year': 2008,
'to_year': 2008,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Palestine 2009 only - Sep 4 1:00 0 -
{
'from_year': 2009,
'to_year': 2009,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 4,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Palestine 2010 only - Mar 26 0:00 1:00 S
{
'from_year': 2010,
'to_year': 2010,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 26,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Palestine 2010 only - Aug 11 0:00 0 -
{
'from_year': 2010,
'to_year': 2010,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 11,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Palestine 2011 only - Apr 1 0:01 1:00 S
{
'from_year': 2011,
'to_year': 2011,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 60,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Palestine 2011 only - Aug 1 0:00 0 -
{
'from_year': 2011,
'to_year': 2011,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Palestine 2011 only - Aug 30 0:00 1:00 S
{
'from_year': 2011,
'to_year': 2011,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Palestine 2011 only - Sep 30 0:00 0 -
{
'from_year': 2011,
'to_year': 2011,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Palestine 2012 2014 - Mar lastThu 24:00 1:00 S
{
'from_year': 2012,
'to_year': 2014,
'in_month': 3,
'on_day_of_week': 4,
'on_day_of_month': 0,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Palestine 2012 only - Sep 21 1:00 0 -
{
'from_year': 2012,
'to_year': 2012,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Palestine 2013 only - Sep 27 0:00 0 -
{
'from_year': 2013,
'to_year': 2013,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 27,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Palestine 2014 only - Oct 24 0:00 0 -
{
'from_year': 2014,
'to_year': 2014,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 24,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Palestine 2015 only - Mar 28 0:00 1:00 S
{
'from_year': 2015,
'to_year': 2015,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 28,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Palestine 2015 only - Oct 23 1:00 0 -
{
'from_year': 2015,
'to_year': 2015,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 23,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Palestine 2016 2018 - Mar Sat>=24 1:00 1:00 S
{
'from_year': 2016,
'to_year': 2018,
'in_month': 3,
'on_day_of_week': 6,
'on_day_of_month': 24,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Palestine 2016 2018 - Oct Sat>=24 1:00 0 -
{
'from_year': 2016,
'to_year': 2018,
'in_month': 10,
'on_day_of_week': 6,
'on_day_of_month': 24,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Palestine 2019 only - Mar 29 0:00 1:00 S
{
'from_year': 2019,
'to_year': 2019,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 29,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Palestine 2019 only - Oct Sat>=24 0:00 0 -
{
'from_year': 2019,
'to_year': 2019,
'in_month': 10,
'on_day_of_week': 6,
'on_day_of_month': 24,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Palestine 2020 max - Mar Sat>=24 0:00 1:00 S
{
'from_year': 2020,
'to_year': 9999,
'in_month': 3,
'on_day_of_week': 6,
'on_day_of_month': 24,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Palestine 2020 max - Oct Sat>=24 1:00 0 -
{
'from_year': 2020,
'to_year': 9999,
'in_month': 10,
'on_day_of_week': 6,
'on_day_of_month': 24,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Palestine = {
'name': 'Palestine',
'rules': ZONE_RULES_Palestine
}
#---------------------------------------------------------------------------
# Policy name: Para
# Rule count: 23
#---------------------------------------------------------------------------
ZONE_RULES_Para = [
# Anchor: Rule Para 1975 1978 - Mar 1 0:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Para 1975 1988 - Oct 1 0:00 1:00 -
{
'from_year': 1975,
'to_year': 1988,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Para 1975 1978 - Mar 1 0:00 0 -
{
'from_year': 1975,
'to_year': 1978,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Para 1979 1991 - Apr 1 0:00 0 -
{
'from_year': 1979,
'to_year': 1991,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Para 1989 only - Oct 22 0:00 1:00 -
{
'from_year': 1989,
'to_year': 1989,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 22,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Para 1990 only - Oct 1 0:00 1:00 -
{
'from_year': 1990,
'to_year': 1990,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Para 1991 only - Oct 6 0:00 1:00 -
{
'from_year': 1991,
'to_year': 1991,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 6,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Para 1992 only - Mar 1 0:00 0 -
{
'from_year': 1992,
'to_year': 1992,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Para 1992 only - Oct 5 0:00 1:00 -
{
'from_year': 1992,
'to_year': 1992,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 5,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Para 1993 only - Mar 31 0:00 0 -
{
'from_year': 1993,
'to_year': 1993,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 31,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Para 1993 1995 - Oct 1 0:00 1:00 -
{
'from_year': 1993,
'to_year': 1995,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Para 1994 1995 - Feb lastSun 0:00 0 -
{
'from_year': 1994,
'to_year': 1995,
'in_month': 2,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Para 1996 only - Mar 1 0:00 0 -
{
'from_year': 1996,
'to_year': 1996,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Para 1996 2001 - Oct Sun>=1 0:00 1:00 -
{
'from_year': 1996,
'to_year': 2001,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Para 1997 only - Feb lastSun 0:00 0 -
{
'from_year': 1997,
'to_year': 1997,
'in_month': 2,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Para 1998 2001 - Mar Sun>=1 0:00 0 -
{
'from_year': 1998,
'to_year': 2001,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Para 2002 2004 - Apr Sun>=1 0:00 0 -
{
'from_year': 2002,
'to_year': 2004,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Para 2002 2003 - Sep Sun>=1 0:00 1:00 -
{
'from_year': 2002,
'to_year': 2003,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Para 2004 2009 - Oct Sun>=15 0:00 1:00 -
{
'from_year': 2004,
'to_year': 2009,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Para 2005 2009 - Mar Sun>=8 0:00 0 -
{
'from_year': 2005,
'to_year': 2009,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Para 2010 max - Oct Sun>=1 0:00 1:00 -
{
'from_year': 2010,
'to_year': 9999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Para 2010 2012 - Apr Sun>=8 0:00 0 -
{
'from_year': 2010,
'to_year': 2012,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Para 2013 max - Mar Sun>=22 0:00 0 -
{
'from_year': 2013,
'to_year': 9999,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 22,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Para = {
'name': 'Para',
'rules': ZONE_RULES_Para
}
#---------------------------------------------------------------------------
# Policy name: Peru
# Rule count: 7
#---------------------------------------------------------------------------
ZONE_RULES_Peru = [
# Rule Peru 1939 1940 - Mar Sun>=24 0:00 0 -
{
'from_year': 1939,
'to_year': 1940,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 24,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Peru 1986 1987 - Jan 1 0:00 1:00 -
{
'from_year': 1986,
'to_year': 1987,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Peru 1986 1987 - Apr 1 0:00 0 -
{
'from_year': 1986,
'to_year': 1987,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Peru 1990 only - Jan 1 0:00 1:00 -
{
'from_year': 1990,
'to_year': 1990,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Peru 1990 only - Apr 1 0:00 0 -
{
'from_year': 1990,
'to_year': 1990,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Peru 1994 only - Jan 1 0:00 1:00 -
{
'from_year': 1994,
'to_year': 1994,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Peru 1994 only - Apr 1 0:00 0 -
{
'from_year': 1994,
'to_year': 1994,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Peru = {
'name': 'Peru',
'rules': ZONE_RULES_Peru
}
#---------------------------------------------------------------------------
# Policy name: Phil
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_Phil = [
# Rule Phil 1954 only - Jul 1 0:00 0 S
{
'from_year': 1954,
'to_year': 1954,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Phil 1978 only - Mar 22 0:00 1:00 D
{
'from_year': 1978,
'to_year': 1978,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 22,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Phil 1978 only - Sep 21 0:00 0 S
{
'from_year': 1978,
'to_year': 1978,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Phil = {
'name': 'Phil',
'rules': ZONE_RULES_Phil
}
#---------------------------------------------------------------------------
# Policy name: Poland
# Rule count: 1
#---------------------------------------------------------------------------
ZONE_RULES_Poland = [
# Rule Poland 1962 1964 - Sep lastSun 1:00s 0 -
{
'from_year': 1962,
'to_year': 1964,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Poland = {
'name': 'Poland',
'rules': ZONE_RULES_Poland
}
#---------------------------------------------------------------------------
# Policy name: Port
# Rule count: 9
#---------------------------------------------------------------------------
ZONE_RULES_Port = [
# Rule Port 1951 1965 - Oct Sun>=1 2:00s 0 -
{
'from_year': 1951,
'to_year': 1965,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Port 1977 only - Mar 27 0:00s 1:00 S
{
'from_year': 1977,
'to_year': 1977,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 27,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Port 1977 only - Sep 25 0:00s 0 -
{
'from_year': 1977,
'to_year': 1977,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 25,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Port 1978 1979 - Apr Sun>=1 0:00s 1:00 S
{
'from_year': 1978,
'to_year': 1979,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Port 1978 only - Oct 1 0:00s 0 -
{
'from_year': 1978,
'to_year': 1978,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Port 1979 1982 - Sep lastSun 1:00s 0 -
{
'from_year': 1979,
'to_year': 1982,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Port 1980 only - Mar lastSun 0:00s 1:00 S
{
'from_year': 1980,
'to_year': 1980,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Port 1981 1982 - Mar lastSun 1:00s 1:00 S
{
'from_year': 1981,
'to_year': 1982,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Port 1983 only - Mar lastSun 2:00s 1:00 S
{
'from_year': 1983,
'to_year': 1983,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
]
ZONE_POLICY_Port = {
'name': 'Port',
'rules': ZONE_RULES_Port
}
#---------------------------------------------------------------------------
# Policy name: ROK
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_ROK = [
# Rule ROK 1957 1960 - Sep Sat>=17 24:00 0 S
{
'from_year': 1957,
'to_year': 1960,
'in_month': 9,
'on_day_of_week': 6,
'on_day_of_month': 17,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule ROK 1987 1988 - May Sun>=8 2:00 1:00 D
{
'from_year': 1987,
'to_year': 1988,
'in_month': 5,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule ROK 1987 1988 - Oct Sun>=8 3:00 0 S
{
'from_year': 1987,
'to_year': 1988,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_ROK = {
'name': 'ROK',
'rules': ZONE_RULES_ROK
}
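The ROK rule above (`Sep Sat>=17 24:00`) carries `'at_seconds': 86400`, i.e. the TZDB notation "24:00", which denotes midnight at the end of the named day rather than its start. A small sketch of folding such a value onto the following calendar day (`normalize_at` is a hypothetical helper, not part of this module):

```python
from datetime import date, timedelta


def normalize_at(year, month, day, at_seconds):
    """Fold an at_seconds value of 86400 ("24:00" in TZDB notation)
    into 00:00 of the following day.

    Returns (date, seconds_within_day); values below 86400 pass through.
    """
    d = date(year, month, day) + timedelta(days=at_seconds // 86400)
    return d, at_seconds % 86400
```

So a transition recorded as 1960-09-17 at 86400 seconds normalizes to 1960-09-18 at 0:00.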
#---------------------------------------------------------------------------
# Policy name: Romania
# Rule count: 7
#---------------------------------------------------------------------------
ZONE_RULES_Romania = [
# Rule Romania 1932 1939 - Oct Sun>=1 0:00s 0 -
{
'from_year': 1932,
'to_year': 1939,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Romania 1979 only - May 27 0:00 1:00 S
{
'from_year': 1979,
'to_year': 1979,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 27,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Romania 1979 only - Sep lastSun 0:00 0 -
{
'from_year': 1979,
'to_year': 1979,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Romania 1980 only - Apr 5 23:00 1:00 S
{
'from_year': 1980,
'to_year': 1980,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 5,
'at_seconds': 82800,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Romania 1980 only - Sep lastSun 1:00 0 -
{
'from_year': 1980,
'to_year': 1980,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Romania 1991 1993 - Mar lastSun 0:00s 1:00 S
{
'from_year': 1991,
'to_year': 1993,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Romania 1991 1993 - Sep lastSun 0:00s 0 -
{
'from_year': 1991,
'to_year': 1993,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Romania = {
'name': 'Romania',
'rules': ZONE_RULES_Romania
}
#---------------------------------------------------------------------------
# Policy name: Russia
# Rule count: 6
#---------------------------------------------------------------------------
ZONE_RULES_Russia = [
# Rule Russia 1921 only - Oct 1 0:00 0 -
{
'from_year': 1921,
'to_year': 1921,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Russia 1981 1984 - Apr 1 0:00 1:00 S
{
'from_year': 1981,
'to_year': 1984,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Russia 1981 1983 - Oct 1 0:00 0 -
{
'from_year': 1981,
'to_year': 1983,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Russia 1984 1995 - Sep lastSun 2:00s 0 -
{
'from_year': 1984,
'to_year': 1995,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Russia 1985 2010 - Mar lastSun 2:00s 1:00 S
{
'from_year': 1985,
'to_year': 2010,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Russia 1996 2010 - Oct lastSun 2:00s 0 -
{
'from_year': 1996,
'to_year': 2010,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Russia = {
'name': 'Russia',
'rules': ZONE_RULES_Russia
}
#---------------------------------------------------------------------------
# Policy name: RussiaAsia
# Rule count: 6
#---------------------------------------------------------------------------
ZONE_RULES_RussiaAsia = [
# Anchor: Rule RussiaAsia 1981 1983 - Oct 1 0:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule RussiaAsia 1981 1984 - Apr 1 0:00 1:00 -
{
'from_year': 1981,
'to_year': 1984,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule RussiaAsia 1981 1983 - Oct 1 0:00 0 -
{
'from_year': 1981,
'to_year': 1983,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule RussiaAsia 1984 1995 - Sep lastSun 2:00s 0 -
{
'from_year': 1984,
'to_year': 1995,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule RussiaAsia 1985 2010 - Mar lastSun 2:00s 1:00 -
{
'from_year': 1985,
'to_year': 2010,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': '-',
},
# Rule RussiaAsia 1996 2010 - Oct lastSun 2:00s 0 -
{
'from_year': 1996,
'to_year': 2010,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_RussiaAsia = {
'name': 'RussiaAsia',
'rules': ZONE_RULES_RussiaAsia
}
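Several rule lists, including `ZONE_RULES_RussiaAsia` above, begin with a record marked `# Anchor:` whose `from_year`, `to_year`, and `in_month` are 0/0/1. These appear to be synthetic entries supplying an initial letter and offset before the first dated rule takes effect. A sketch of year-based rule selection under that assumption (`rules_for_year` is a hypothetical helper; whether consumers treat anchors exactly this way is an assumption):

```python
def rules_for_year(rules, year):
    """Return the rule records that can fire in `year`.

    Records with from_year == to_year == 0 are assumed to be synthetic
    anchors (see the "# Anchor:" comments) that act as a fallback for
    years preceding the first dated rule.
    """
    matches = [r for r in rules
               if r['from_year'] != 0 and r['from_year'] <= year <= r['to_year']]
    if matches:
        return matches
    # No dated rule covers this year: fall back to the anchor record(s).
    return [r for r in rules if r['from_year'] == 0 and r['to_year'] == 0]
```

Applied to a list shaped like `ZONE_RULES_RussiaAsia`, a query for 1982 would return the dated 1981-1984 and 1981-1983 rules, while a query for 1970 would fall back to the anchor.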
#---------------------------------------------------------------------------
# Policy name: SA
# Rule count: 1
#---------------------------------------------------------------------------
ZONE_RULES_SA = [
# Rule SA 1943 1944 - Mar Sun>=15 2:00 0 -
{
'from_year': 1943,
'to_year': 1944,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_SA = {
'name': 'SA',
'rules': ZONE_RULES_SA
}
#---------------------------------------------------------------------------
# Policy name: Salv
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_Salv = [
# Anchor: Rule Salv 1987 1988 - Sep lastSun 0:00 0 S
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Salv 1987 1988 - May Sun>=1 0:00 1:00 D
{
'from_year': 1987,
'to_year': 1988,
'in_month': 5,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Salv 1987 1988 - Sep lastSun 0:00 0 S
{
'from_year': 1987,
'to_year': 1988,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Salv = {
'name': 'Salv',
'rules': ZONE_RULES_Salv
}
#---------------------------------------------------------------------------
# Policy name: SanLuis
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_SanLuis = [
# Anchor: Rule SanLuis 2008 2009 - Mar Sun>=8 0:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule SanLuis 2008 2009 - Mar Sun>=8 0:00 0 -
{
'from_year': 2008,
'to_year': 2009,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule SanLuis 2007 2008 - Oct Sun>=8 0:00 1:00 -
{
'from_year': 2007,
'to_year': 2008,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
]
ZONE_POLICY_SanLuis = {
'name': 'SanLuis',
'rules': ZONE_RULES_SanLuis
}
#---------------------------------------------------------------------------
# Policy name: Spain
# Rule count: 8
#---------------------------------------------------------------------------
ZONE_RULES_Spain = [
# Rule Spain 1949 only - Oct 2 1:00 0 -
{
'from_year': 1949,
'to_year': 1949,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Spain 1974 1975 - Apr Sat>=12 23:00 1:00 S
{
'from_year': 1974,
'to_year': 1975,
'in_month': 4,
'on_day_of_week': 6,
'on_day_of_month': 12,
'at_seconds': 82800,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Spain 1974 1975 - Oct Sun>=1 1:00 0 -
{
'from_year': 1974,
'to_year': 1975,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Spain 1976 only - Mar 27 23:00 1:00 S
{
'from_year': 1976,
'to_year': 1976,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 27,
'at_seconds': 82800,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Spain 1976 1977 - Sep lastSun 1:00 0 -
{
'from_year': 1976,
'to_year': 1977,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Spain 1977 only - Apr 2 23:00 1:00 S
{
'from_year': 1977,
'to_year': 1977,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 82800,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Spain 1978 only - Apr 2 2:00s 1:00 S
{
'from_year': 1978,
'to_year': 1978,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Spain 1978 only - Oct 1 2:00s 0 -
{
'from_year': 1978,
'to_year': 1978,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Spain = {
'name': 'Spain',
'rules': ZONE_RULES_Spain
}
#---------------------------------------------------------------------------
# Policy name: SpainAfrica
# Rule count: 8
#---------------------------------------------------------------------------
ZONE_RULES_SpainAfrica = [
# Rule SpainAfrica 1967 only - Oct 1 0:00 0 -
{
'from_year': 1967,
'to_year': 1967,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule SpainAfrica 1974 only - Jun 24 0:00 1:00 S
{
'from_year': 1974,
'to_year': 1974,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 24,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule SpainAfrica 1974 only - Sep 1 0:00 0 -
{
'from_year': 1974,
'to_year': 1974,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule SpainAfrica 1976 1977 - May 1 0:00 1:00 S
{
'from_year': 1976,
'to_year': 1977,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule SpainAfrica 1976 only - Aug 1 0:00 0 -
{
'from_year': 1976,
'to_year': 1976,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule SpainAfrica 1977 only - Sep 28 0:00 0 -
{
'from_year': 1977,
'to_year': 1977,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 28,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule SpainAfrica 1978 only - Jun 1 0:00 1:00 S
{
'from_year': 1978,
'to_year': 1978,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule SpainAfrica 1978 only - Aug 4 0:00 0 -
{
'from_year': 1978,
'to_year': 1978,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 4,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_SpainAfrica = {
'name': 'SpainAfrica',
'rules': ZONE_RULES_SpainAfrica
}
#---------------------------------------------------------------------------
# Policy name: StJohns
# Rule count: 9
#---------------------------------------------------------------------------
ZONE_RULES_StJohns = [
# Rule StJohns 1951 1986 - Apr lastSun 2:00 1:00 D
{
'from_year': 1951,
'to_year': 1986,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule StJohns 1951 1959 - Sep lastSun 2:00 0 S
{
'from_year': 1951,
'to_year': 1959,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule StJohns 1960 1986 - Oct lastSun 2:00 0 S
{
'from_year': 1960,
'to_year': 1986,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule StJohns 1987 only - Apr Sun>=1 0:01 1:00 D
{
'from_year': 1987,
'to_year': 1987,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 60,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule StJohns 1987 2006 - Oct lastSun 0:01 0 S
{
'from_year': 1987,
'to_year': 2006,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 60,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule StJohns 1988 only - Apr Sun>=1 0:01 2:00 DD
{
'from_year': 1988,
'to_year': 1988,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 60,
'at_time_suffix': 'w',
'delta_seconds': 7200,
'letter': 'DD',
},
# Rule StJohns 1989 2006 - Apr Sun>=1 0:01 1:00 D
{
'from_year': 1989,
'to_year': 2006,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 60,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule StJohns 2007 2011 - Mar Sun>=8 0:01 1:00 D
{
'from_year': 2007,
'to_year': 2011,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 60,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule StJohns 2007 2010 - Nov Sun>=1 0:01 0 S
{
'from_year': 2007,
'to_year': 2010,
'in_month': 11,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 60,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_StJohns = {
'name': 'StJohns',
'rules': ZONE_RULES_StJohns
}
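# The StJohns rules above exercise all three day-of-month encodings that
# appear in the generated comments: an exact day (on_day_of_week == 0),
# "lastSun" (on_day_of_week == 7, on_day_of_month == 0), and "Sun>=N"
# (on_day_of_week == 7, on_day_of_month == N). A minimal sketch of decoding
# that triple into a concrete day, assuming ISO weekday numbering
# (1=Mon .. 7=Sun); the helper name is hypothetical:

```python
import calendar
import datetime

def rule_transition_day(year, in_month, on_day_of_week, on_day_of_month):
    """Resolve the (on_day_of_week, on_day_of_month) pair to a day of month.

    Encoding assumed from the generated comments:
      * on_day_of_week == 0             -> exact day: on_day_of_month
      * on_day_of_week != 0, dom == 0   -> last <weekday> of the month
      * on_day_of_week != 0, dom > 0    -> first <weekday> on or after dom
    Spillover into the next month (e.g. "Sun>=31" in a 31-day month with no
    matching Sunday) is not handled by this sketch.
    """
    if on_day_of_week == 0:
        return on_day_of_month
    last_day = calendar.monthrange(year, in_month)[1]
    start = on_day_of_month if on_day_of_month > 0 else last_day - 6
    for day in range(start, min(start + 7, last_day + 1)):
        # datetime.date.isoweekday(): Monday=1 ... Sunday=7
        if datetime.date(year, in_month, day).isoweekday() == on_day_of_week:
            return day
    raise ValueError('no matching day in month')
```

# For example, the StJohns "Oct lastSun" rule resolves to Oct 26 in 1986,
# and "Mar Sun>=8" resolves to Mar 11 in 2007.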
#---------------------------------------------------------------------------
# Policy name: Sudan
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_Sudan = [
# Rule Sudan 1970 1985 - Oct 15 0:00 0 -
{
'from_year': 1970,
'to_year': 1985,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 15,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Sudan 1971 only - Apr 30 0:00 1:00 S
{
'from_year': 1971,
'to_year': 1971,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Sudan 1972 1985 - Apr lastSun 0:00 1:00 S
{
'from_year': 1972,
'to_year': 1985,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
]
ZONE_POLICY_Sudan = {
'name': 'Sudan',
'rules': ZONE_RULES_Sudan
}
#---------------------------------------------------------------------------
# Policy name: Swiss
# Rule count: 1
#---------------------------------------------------------------------------
ZONE_RULES_Swiss = [
# Rule Swiss 1941 1942 - Oct Mon>=1 2:00 0 -
{
'from_year': 1941,
'to_year': 1942,
'in_month': 10,
'on_day_of_week': 1,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Swiss = {
'name': 'Swiss',
'rules': ZONE_RULES_Swiss
}
#---------------------------------------------------------------------------
# Policy name: Syria
# Rule count: 33
#---------------------------------------------------------------------------
ZONE_RULES_Syria = [
# Rule Syria 1966 only - Apr 24 2:00 1:00 S
{
'from_year': 1966,
'to_year': 1966,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 24,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Syria 1966 1976 - Oct 1 2:00 0 -
{
'from_year': 1966,
'to_year': 1976,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Syria 1967 1978 - May 1 2:00 1:00 S
{
'from_year': 1967,
'to_year': 1978,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Syria 1977 1978 - Sep 1 2:00 0 -
{
'from_year': 1977,
'to_year': 1978,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Syria 1983 1984 - Apr 9 2:00 1:00 S
{
'from_year': 1983,
'to_year': 1984,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 9,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Syria 1983 1984 - Oct 1 2:00 0 -
{
'from_year': 1983,
'to_year': 1984,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Syria 1986 only - Feb 16 2:00 1:00 S
{
'from_year': 1986,
'to_year': 1986,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 16,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Syria 1986 only - Oct 9 2:00 0 -
{
'from_year': 1986,
'to_year': 1986,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 9,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Syria 1987 only - Mar 1 2:00 1:00 S
{
'from_year': 1987,
'to_year': 1987,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Syria 1987 1988 - Oct 31 2:00 0 -
{
'from_year': 1987,
'to_year': 1988,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 31,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Syria 1988 only - Mar 15 2:00 1:00 S
{
'from_year': 1988,
'to_year': 1988,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 15,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Syria 1989 only - Mar 31 2:00 1:00 S
{
'from_year': 1989,
'to_year': 1989,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 31,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Syria 1989 only - Oct 1 2:00 0 -
{
'from_year': 1989,
'to_year': 1989,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Syria 1990 only - Apr 1 2:00 1:00 S
{
'from_year': 1990,
'to_year': 1990,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Syria 1990 only - Sep 30 2:00 0 -
{
'from_year': 1990,
'to_year': 1990,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Syria 1991 only - Apr 1 0:00 1:00 S
{
'from_year': 1991,
'to_year': 1991,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Syria 1991 1992 - Oct 1 0:00 0 -
{
'from_year': 1991,
'to_year': 1992,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Syria 1992 only - Apr 8 0:00 1:00 S
{
'from_year': 1992,
'to_year': 1992,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 8,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Syria 1993 only - Mar 26 0:00 1:00 S
{
'from_year': 1993,
'to_year': 1993,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 26,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Syria 1993 only - Sep 25 0:00 0 -
{
'from_year': 1993,
'to_year': 1993,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 25,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Syria 1994 1996 - Apr 1 0:00 1:00 S
{
'from_year': 1994,
'to_year': 1996,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Syria 1994 2005 - Oct 1 0:00 0 -
{
'from_year': 1994,
'to_year': 2005,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Syria 1997 1998 - Mar lastMon 0:00 1:00 S
{
'from_year': 1997,
'to_year': 1998,
'in_month': 3,
'on_day_of_week': 1,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Syria 1999 2006 - Apr 1 0:00 1:00 S
{
'from_year': 1999,
'to_year': 2006,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Syria 2006 only - Sep 22 0:00 0 -
{
'from_year': 2006,
'to_year': 2006,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 22,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Syria 2007 only - Mar lastFri 0:00 1:00 S
{
'from_year': 2007,
'to_year': 2007,
'in_month': 3,
'on_day_of_week': 5,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Syria 2007 only - Nov Fri>=1 0:00 0 -
{
'from_year': 2007,
'to_year': 2007,
'in_month': 11,
'on_day_of_week': 5,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Syria 2008 only - Apr Fri>=1 0:00 1:00 S
{
'from_year': 2008,
'to_year': 2008,
'in_month': 4,
'on_day_of_week': 5,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Syria 2008 only - Nov 1 0:00 0 -
{
'from_year': 2008,
'to_year': 2008,
'in_month': 11,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Syria 2009 only - Mar lastFri 0:00 1:00 S
{
'from_year': 2009,
'to_year': 2009,
'in_month': 3,
'on_day_of_week': 5,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Syria 2010 2011 - Apr Fri>=1 0:00 1:00 S
{
'from_year': 2010,
'to_year': 2011,
'in_month': 4,
'on_day_of_week': 5,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Syria 2012 max - Mar lastFri 0:00 1:00 S
{
'from_year': 2012,
'to_year': 9999,
'in_month': 3,
'on_day_of_week': 5,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Syria 2009 max - Oct lastFri 0:00 0 -
{
'from_year': 2009,
'to_year': 9999,
'in_month': 10,
'on_day_of_week': 5,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Syria = {
'name': 'Syria',
'rules': ZONE_RULES_Syria
}
#---------------------------------------------------------------------------
# Policy name: Taiwan
# Rule count: 5
#---------------------------------------------------------------------------
ZONE_RULES_Taiwan = [
# Rule Taiwan 1955 1961 - Oct 1 0:00 0 S
{
'from_year': 1955,
'to_year': 1961,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Taiwan 1974 1975 - Apr 1 0:00 1:00 D
{
'from_year': 1974,
'to_year': 1975,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Taiwan 1974 1975 - Oct 1 0:00 0 S
{
'from_year': 1974,
'to_year': 1975,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Taiwan 1979 only - Jul 1 0:00 1:00 D
{
'from_year': 1979,
'to_year': 1979,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Taiwan 1979 only - Oct 1 0:00 0 S
{
'from_year': 1979,
'to_year': 1979,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Taiwan = {
'name': 'Taiwan',
'rules': ZONE_RULES_Taiwan
}
#---------------------------------------------------------------------------
# Policy name: Thule
# Rule count: 7
#---------------------------------------------------------------------------
ZONE_RULES_Thule = [
# Anchor: Rule Thule 1991 1992 - Sep lastSun 2:00 0 S
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Thule 1991 1992 - Mar lastSun 2:00 1:00 D
{
'from_year': 1991,
'to_year': 1992,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Thule 1991 1992 - Sep lastSun 2:00 0 S
{
'from_year': 1991,
'to_year': 1992,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Thule 1993 2006 - Apr Sun>=1 2:00 1:00 D
{
'from_year': 1993,
'to_year': 2006,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Thule 1993 2006 - Oct lastSun 2:00 0 S
{
'from_year': 1993,
'to_year': 2006,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Thule 2007 max - Mar Sun>=8 2:00 1:00 D
{
'from_year': 2007,
'to_year': 9999,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Thule 2007 max - Nov Sun>=1 2:00 0 S
{
'from_year': 2007,
'to_year': 9999,
'in_month': 11,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Thule = {
'name': 'Thule',
'rules': ZONE_RULES_Thule
}
#---------------------------------------------------------------------------
# Policy name: Tonga
# Rule count: 7
#---------------------------------------------------------------------------
ZONE_RULES_Tonga = [
# Anchor: Rule Tonga 2000 only - Mar 19 2:00s 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Tonga 1999 only - Oct 7 2:00s 1:00 -
{
'from_year': 1999,
'to_year': 1999,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 7,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Tonga 2000 only - Mar 19 2:00s 0 -
{
'from_year': 2000,
'to_year': 2000,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 19,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Tonga 2000 2001 - Nov Sun>=1 2:00 1:00 -
{
'from_year': 2000,
'to_year': 2001,
'in_month': 11,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Tonga 2001 2002 - Jan lastSun 2:00 0 -
{
'from_year': 2001,
'to_year': 2002,
'in_month': 1,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Tonga 2016 only - Nov Sun>=1 2:00 1:00 -
{
'from_year': 2016,
'to_year': 2016,
'in_month': 11,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Tonga 2017 only - Jan Sun>=15 3:00 0 -
{
'from_year': 2017,
'to_year': 2017,
'in_month': 1,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Tonga = {
'name': 'Tonga',
'rules': ZONE_RULES_Tonga
}
#---------------------------------------------------------------------------
# Policy name: Toronto
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_Toronto = [
# Rule Toronto 1950 1973 - Apr lastSun 2:00 1:00 D
{
'from_year': 1950,
'to_year': 1973,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Toronto 1951 1956 - Sep lastSun 2:00 0 S
{
'from_year': 1951,
'to_year': 1956,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Toronto 1957 1973 - Oct lastSun 2:00 0 S
{
'from_year': 1957,
'to_year': 1973,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Toronto = {
'name': 'Toronto',
'rules': ZONE_RULES_Toronto
}
#---------------------------------------------------------------------------
# Policy name: Troll
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_Troll = [
# Anchor: Rule Troll 2004 max - Oct lastSun 1:00u 0:00 +00
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '+00',
},
# Rule Troll 2005 max - Mar lastSun 1:00u 2:00 +02
{
'from_year': 2005,
'to_year': 9999,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 'u',
'delta_seconds': 7200,
'letter': '+02',
},
# Rule Troll 2004 max - Oct lastSun 1:00u 0:00 +00
{
'from_year': 2004,
'to_year': 9999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '+00',
},
]
ZONE_POLICY_Troll = {
'name': 'Troll',
'rules': ZONE_RULES_Troll
}
#---------------------------------------------------------------------------
# Policy name: Tunisia
# Rule count: 13
#---------------------------------------------------------------------------
ZONE_RULES_Tunisia = [
# Rule Tunisia 1945 only - Sep 16 0:00 0 -
{
'from_year': 1945,
'to_year': 1945,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 16,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Tunisia 1977 only - Apr 30 0:00s 1:00 S
{
'from_year': 1977,
'to_year': 1977,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Tunisia 1977 only - Sep 24 0:00s 0 -
{
'from_year': 1977,
'to_year': 1977,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 24,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Tunisia 1978 only - May 1 0:00s 1:00 S
{
'from_year': 1978,
'to_year': 1978,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Tunisia 1978 only - Oct 1 0:00s 0 -
{
'from_year': 1978,
'to_year': 1978,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Tunisia 1988 only - Jun 1 0:00s 1:00 S
{
'from_year': 1988,
'to_year': 1988,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Tunisia 1988 1990 - Sep lastSun 0:00s 0 -
{
'from_year': 1988,
'to_year': 1990,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Tunisia 1989 only - Mar 26 0:00s 1:00 S
{
'from_year': 1989,
'to_year': 1989,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 26,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Tunisia 1990 only - May 1 0:00s 1:00 S
{
'from_year': 1990,
'to_year': 1990,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Tunisia 2005 only - May 1 0:00s 1:00 S
{
'from_year': 2005,
'to_year': 2005,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Tunisia 2005 only - Sep 30 1:00s 0 -
{
'from_year': 2005,
'to_year': 2005,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 3600,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Tunisia 2006 2008 - Mar lastSun 2:00s 1:00 S
{
'from_year': 2006,
'to_year': 2008,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Tunisia 2006 2008 - Oct lastSun 2:00s 0 -
{
'from_year': 2006,
'to_year': 2008,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Tunisia = {
'name': 'Tunisia',
'rules': ZONE_RULES_Tunisia
}
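# The at_seconds/at_time_suffix pair in each rule encodes the tzdb AT column
# shown in the comments: 'w' (wall clock) is printed bare (e.g. "2:00"),
# while 's' (standard) and 'u' (UTC) appear explicitly (e.g. "0:00s",
# "1:00u"), as in the Tunisia rules above. A sketch of reproducing that
# string form; the helper name is hypothetical:

```python
def format_at_time(at_seconds, at_time_suffix):
    # Convert at_seconds back into the tzdb-style "h:mm" string used in the
    # generated comments; the 'w' (wall clock) suffix is printed bare there,
    # while 's' (standard) and 'u' (UTC) are shown explicitly.
    hours, rem = divmod(at_seconds, 3600)
    minutes = rem // 60
    suffix = '' if at_time_suffix == 'w' else at_time_suffix
    return '%d:%02d%s' % (hours, minutes, suffix)
```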
#---------------------------------------------------------------------------
# Policy name: Turkey
# Rule count: 18
#---------------------------------------------------------------------------
ZONE_RULES_Turkey = [
# Rule Turkey 1964 only - Oct 1 0:00 0 -
{
'from_year': 1964,
'to_year': 1964,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Turkey 1973 only - Jun 3 1:00 1:00 S
{
'from_year': 1973,
'to_year': 1973,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 3,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Turkey 1973 1976 - Oct Sun>=31 2:00 0 -
{
'from_year': 1973,
'to_year': 1976,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 31,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Turkey 1974 only - Mar 31 2:00 1:00 S
{
'from_year': 1974,
'to_year': 1974,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 31,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Turkey 1975 only - Mar 22 2:00 1:00 S
{
'from_year': 1975,
'to_year': 1975,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 22,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Turkey 1976 only - Mar 21 2:00 1:00 S
{
'from_year': 1976,
'to_year': 1976,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Turkey 1977 1978 - Apr Sun>=1 2:00 1:00 S
{
'from_year': 1977,
'to_year': 1978,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Turkey 1977 1978 - Oct Sun>=15 2:00 0 -
{
'from_year': 1977,
'to_year': 1978,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 15,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Turkey 1978 only - Jun 29 0:00 0 -
{
'from_year': 1978,
'to_year': 1978,
'in_month': 6,
'on_day_of_week': 0,
'on_day_of_month': 29,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Turkey 1983 only - Jul 31 2:00 1:00 S
{
'from_year': 1983,
'to_year': 1983,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 31,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Turkey 1983 only - Oct 2 2:00 0 -
{
'from_year': 1983,
'to_year': 1983,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Turkey 1985 only - Apr 20 1:00s 1:00 S
{
'from_year': 1985,
'to_year': 1985,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 3600,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Turkey 1985 only - Sep 28 1:00s 0 -
{
'from_year': 1985,
'to_year': 1985,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 28,
'at_seconds': 3600,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Turkey 1986 1993 - Mar lastSun 1:00s 1:00 S
{
'from_year': 1986,
'to_year': 1993,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Turkey 1986 1995 - Sep lastSun 1:00s 0 -
{
'from_year': 1986,
'to_year': 1995,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule Turkey 1994 only - Mar 20 1:00s 1:00 S
{
'from_year': 1994,
'to_year': 1994,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 3600,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Turkey 1995 2006 - Mar lastSun 1:00s 1:00 S
{
'from_year': 1995,
'to_year': 2006,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule Turkey 1996 2006 - Oct lastSun 1:00s 0 -
{
'from_year': 1996,
'to_year': 2006,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
]
ZONE_POLICY_Turkey = {
'name': 'Turkey',
'rules': ZONE_RULES_Turkey
}
#---------------------------------------------------------------------------
# Policy name: US
# Rule count: 9
#---------------------------------------------------------------------------
ZONE_RULES_US = [
# Rule US 1945 only - Sep 30 2:00 0 S
{
'from_year': 1945,
'to_year': 1945,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule US 1967 2006 - Oct lastSun 2:00 0 S
{
'from_year': 1967,
'to_year': 2006,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule US 1967 1973 - Apr lastSun 2:00 1:00 D
{
'from_year': 1967,
'to_year': 1973,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule US 1974 only - Jan 6 2:00 1:00 D
{
'from_year': 1974,
'to_year': 1974,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 6,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule US 1975 only - Feb lastSun 2:00 1:00 D
{
'from_year': 1975,
'to_year': 1975,
'in_month': 2,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule US 1976 1986 - Apr lastSun 2:00 1:00 D
{
'from_year': 1976,
'to_year': 1986,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule US 1987 2006 - Apr Sun>=1 2:00 1:00 D
{
'from_year': 1987,
'to_year': 2006,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule US 2007 max - Mar Sun>=8 2:00 1:00 D
{
'from_year': 2007,
'to_year': 9999,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule US 2007 max - Nov Sun>=1 2:00 0 S
{
'from_year': 2007,
'to_year': 9999,
'in_month': 11,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_US = {
'name': 'US',
'rules': ZONE_RULES_US
}
#---------------------------------------------------------------------------
# Policy name: Uruguay
# Rule count: 28
#---------------------------------------------------------------------------
ZONE_RULES_Uruguay = [
# Rule Uruguay 1972 only - Jul 16 0:00 0 -
{
'from_year': 1972,
'to_year': 1972,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 16,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Uruguay 1974 only - Jan 13 0:00 1:30 -
{
'from_year': 1974,
'to_year': 1974,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 13,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 5400,
'letter': '-',
},
# Rule Uruguay 1974 only - Mar 10 0:00 0:30 -
{
'from_year': 1974,
'to_year': 1974,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 10,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 1800,
'letter': '-',
},
# Rule Uruguay 1974 only - Sep 1 0:00 0 -
{
'from_year': 1974,
'to_year': 1974,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Uruguay 1974 only - Dec 22 0:00 1:00 -
{
'from_year': 1974,
'to_year': 1974,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 22,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Uruguay 1975 only - Mar 30 0:00 0 -
{
'from_year': 1975,
'to_year': 1975,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Uruguay 1976 only - Dec 19 0:00 1:00 -
{
'from_year': 1976,
'to_year': 1976,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 19,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Uruguay 1977 only - Mar 6 0:00 0 -
{
'from_year': 1977,
'to_year': 1977,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 6,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Uruguay 1977 only - Dec 4 0:00 1:00 -
{
'from_year': 1977,
'to_year': 1977,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 4,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Uruguay 1978 1979 - Mar Sun>=1 0:00 0 -
{
'from_year': 1978,
'to_year': 1979,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Uruguay 1978 only - Dec 17 0:00 1:00 -
{
'from_year': 1978,
'to_year': 1978,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 17,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Uruguay 1979 only - Apr 29 0:00 1:00 -
{
'from_year': 1979,
'to_year': 1979,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 29,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Uruguay 1980 only - Mar 16 0:00 0 -
{
'from_year': 1980,
'to_year': 1980,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 16,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Uruguay 1987 only - Dec 14 0:00 1:00 -
{
'from_year': 1987,
'to_year': 1987,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 14,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Uruguay 1988 only - Feb 28 0:00 0 -
{
'from_year': 1988,
'to_year': 1988,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 28,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Uruguay 1988 only - Dec 11 0:00 1:00 -
{
'from_year': 1988,
'to_year': 1988,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 11,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Uruguay 1989 only - Mar 5 0:00 0 -
{
'from_year': 1989,
'to_year': 1989,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 5,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Uruguay 1989 only - Oct 29 0:00 1:00 -
{
'from_year': 1989,
'to_year': 1989,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 29,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Uruguay 1990 only - Feb 25 0:00 0 -
{
'from_year': 1990,
'to_year': 1990,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 25,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Uruguay 1990 1991 - Oct Sun>=21 0:00 1:00 -
{
'from_year': 1990,
'to_year': 1991,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 21,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Uruguay 1991 1992 - Mar Sun>=1 0:00 0 -
{
'from_year': 1991,
'to_year': 1992,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Uruguay 1992 only - Oct 18 0:00 1:00 -
{
'from_year': 1992,
'to_year': 1992,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 18,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Uruguay 1993 only - Feb 28 0:00 0 -
{
'from_year': 1993,
'to_year': 1993,
'in_month': 2,
'on_day_of_week': 0,
'on_day_of_month': 28,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Uruguay 2004 only - Sep 19 0:00 1:00 -
{
'from_year': 2004,
'to_year': 2004,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 19,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Uruguay 2005 only - Mar 27 2:00 0 -
{
'from_year': 2005,
'to_year': 2005,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 27,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Uruguay 2005 only - Oct 9 2:00 1:00 -
{
'from_year': 2005,
'to_year': 2005,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 9,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Uruguay 2006 2015 - Mar Sun>=8 2:00 0 -
{
'from_year': 2006,
'to_year': 2015,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 8,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Uruguay 2006 2014 - Oct Sun>=1 2:00 1:00 -
{
'from_year': 2006,
'to_year': 2014,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
]
ZONE_POLICY_Uruguay = {
'name': 'Uruguay',
'rules': ZONE_RULES_Uruguay
}
#---------------------------------------------------------------------------
# Policy name: Vanc
# Rule count: 3
#---------------------------------------------------------------------------
ZONE_RULES_Vanc = [
# Rule Vanc 1946 1986 - Apr lastSun 2:00 1:00 D
{
'from_year': 1946,
'to_year': 1986,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Vanc 1947 1961 - Sep lastSun 2:00 0 S
{
'from_year': 1947,
'to_year': 1961,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Vanc 1962 2006 - Oct lastSun 2:00 0 S
{
'from_year': 1962,
'to_year': 2006,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Vanc = {
'name': 'Vanc',
'rules': ZONE_RULES_Vanc
}
#---------------------------------------------------------------------------
# Policy name: Vanuatu
# Rule count: 7
#---------------------------------------------------------------------------
ZONE_RULES_Vanuatu = [
# Anchor: Rule Vanuatu 1974 only - Mar 30 12:00u 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Vanuatu 1973 only - Dec 22 12:00u 1:00 -
{
'from_year': 1973,
'to_year': 1973,
'in_month': 12,
'on_day_of_week': 0,
'on_day_of_month': 22,
'at_seconds': 43200,
'at_time_suffix': 'u',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Vanuatu 1974 only - Mar 30 12:00u 0 -
{
'from_year': 1974,
'to_year': 1974,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 43200,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': '-',
},
# Rule Vanuatu 1983 1991 - Sep Sat>=22 24:00 1:00 -
{
'from_year': 1983,
'to_year': 1991,
'in_month': 9,
'on_day_of_week': 6,
'on_day_of_month': 22,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule Vanuatu 1984 1991 - Mar Sat>=22 24:00 0 -
{
'from_year': 1984,
'to_year': 1991,
'in_month': 3,
'on_day_of_week': 6,
'on_day_of_month': 22,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Vanuatu 1992 1993 - Jan Sat>=22 24:00 0 -
{
'from_year': 1992,
'to_year': 1993,
'in_month': 1,
'on_day_of_week': 6,
'on_day_of_month': 22,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule Vanuatu 1992 only - Oct Sat>=22 24:00 1:00 -
{
'from_year': 1992,
'to_year': 1992,
'in_month': 10,
'on_day_of_week': 6,
'on_day_of_month': 22,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
]
ZONE_POLICY_Vanuatu = {
'name': 'Vanuatu',
'rules': ZONE_RULES_Vanuatu
}
#---------------------------------------------------------------------------
# Policy name: W_Eur
# Rule count: 6
#---------------------------------------------------------------------------
ZONE_RULES_W_Eur = [
# Anchor: Rule W-Eur 1977 only - Sep lastSun 1:00s 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule W-Eur 1977 1980 - Apr Sun>=1 1:00s 1:00 S
{
'from_year': 1977,
'to_year': 1980,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 3600,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
# Rule W-Eur 1977 only - Sep lastSun 1:00s 0 -
{
'from_year': 1977,
'to_year': 1977,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule W-Eur 1978 only - Oct 1 1:00s 0 -
{
'from_year': 1978,
'to_year': 1978,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 3600,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule W-Eur 1979 1995 - Sep lastSun 1:00s 0 -
{
'from_year': 1979,
'to_year': 1995,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': '-',
},
# Rule W-Eur 1981 max - Mar lastSun 1:00s 1:00 S
{
'from_year': 1981,
'to_year': 9999,
'in_month': 3,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 3600,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'S',
},
]
ZONE_POLICY_W_Eur = {
'name': 'W_Eur',
'rules': ZONE_RULES_W_Eur
}
#---------------------------------------------------------------------------
# Policy name: WS
# Rule count: 6
#---------------------------------------------------------------------------
ZONE_RULES_WS = [
# Anchor: Rule WS 2011 only - Apr Sat>=1 4:00 0 -
{
'from_year': 0,
'to_year': 0,
'in_month': 1,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule WS 2010 only - Sep lastSun 0:00 1 -
{
'from_year': 2010,
'to_year': 2010,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule WS 2011 only - Apr Sat>=1 4:00 0 -
{
'from_year': 2011,
'to_year': 2011,
'in_month': 4,
'on_day_of_week': 6,
'on_day_of_month': 1,
'at_seconds': 14400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule WS 2011 only - Sep lastSat 3:00 1 -
{
'from_year': 2011,
'to_year': 2011,
'in_month': 9,
'on_day_of_week': 6,
'on_day_of_month': 0,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
# Rule WS 2012 max - Apr Sun>=1 4:00 0 -
{
'from_year': 2012,
'to_year': 9999,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 14400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': '-',
},
# Rule WS 2012 max - Sep lastSun 3:00 1 -
{
'from_year': 2012,
'to_year': 9999,
'in_month': 9,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 10800,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': '-',
},
]
ZONE_POLICY_WS = {
'name': 'WS',
'rules': ZONE_RULES_WS
}
#---------------------------------------------------------------------------
# Policy name: Winn
# Rule count: 4
#---------------------------------------------------------------------------
ZONE_RULES_Winn = [
# Rule Winn 1963 only - Sep 22 2:00 0 S
{
'from_year': 1963,
'to_year': 1963,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 22,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Winn 1966 1986 - Apr lastSun 2:00s 1:00 D
{
'from_year': 1966,
'to_year': 1986,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Winn 1966 2005 - Oct lastSun 2:00s 0 S
{
'from_year': 1966,
'to_year': 2005,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Winn 1987 2005 - Apr Sun>=1 2:00s 1:00 D
{
'from_year': 1987,
'to_year': 2005,
'in_month': 4,
'on_day_of_week': 7,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
]
ZONE_POLICY_Winn = {
'name': 'Winn',
'rules': ZONE_RULES_Winn
}
#---------------------------------------------------------------------------
# Policy name: Zion
# Rule count: 60
#---------------------------------------------------------------------------
ZONE_RULES_Zion = [
# Rule Zion 1957 only - Sep 21 24:00u 0 S
{
'from_year': 1957,
'to_year': 1957,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 21,
'at_seconds': 86400,
'at_time_suffix': 'u',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 1974 only - Jul 6 24:00 1:00 D
{
'from_year': 1974,
'to_year': 1974,
'in_month': 7,
'on_day_of_week': 0,
'on_day_of_month': 6,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 1974 only - Oct 12 24:00 0 S
{
'from_year': 1974,
'to_year': 1974,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 12,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 1975 only - Apr 19 24:00 1:00 D
{
'from_year': 1975,
'to_year': 1975,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 19,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 1975 only - Aug 30 24:00 0 S
{
'from_year': 1975,
'to_year': 1975,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 30,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 1980 only - Aug 2 24:00s 1:00 D
{
'from_year': 1980,
'to_year': 1980,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 86400,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 1980 only - Sep 13 24:00s 0 S
{
'from_year': 1980,
'to_year': 1980,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 13,
'at_seconds': 86400,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 1984 only - May 5 24:00s 1:00 D
{
'from_year': 1984,
'to_year': 1984,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 5,
'at_seconds': 86400,
'at_time_suffix': 's',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 1984 only - Aug 25 24:00s 0 S
{
'from_year': 1984,
'to_year': 1984,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 25,
'at_seconds': 86400,
'at_time_suffix': 's',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 1985 only - Apr 13 24:00 1:00 D
{
'from_year': 1985,
'to_year': 1985,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 13,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 1985 only - Aug 31 24:00 0 S
{
'from_year': 1985,
'to_year': 1985,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 31,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 1986 only - May 17 24:00 1:00 D
{
'from_year': 1986,
'to_year': 1986,
'in_month': 5,
'on_day_of_week': 0,
'on_day_of_month': 17,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 1986 only - Sep 6 24:00 0 S
{
'from_year': 1986,
'to_year': 1986,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 6,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 1987 only - Apr 14 24:00 1:00 D
{
'from_year': 1987,
'to_year': 1987,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 14,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 1987 only - Sep 12 24:00 0 S
{
'from_year': 1987,
'to_year': 1987,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 12,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 1988 only - Apr 9 24:00 1:00 D
{
'from_year': 1988,
'to_year': 1988,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 9,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 1988 only - Sep 3 24:00 0 S
{
'from_year': 1988,
'to_year': 1988,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 3,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 1989 only - Apr 29 24:00 1:00 D
{
'from_year': 1989,
'to_year': 1989,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 29,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 1989 only - Sep 2 24:00 0 S
{
'from_year': 1989,
'to_year': 1989,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 1990 only - Mar 24 24:00 1:00 D
{
'from_year': 1990,
'to_year': 1990,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 24,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 1990 only - Aug 25 24:00 0 S
{
'from_year': 1990,
'to_year': 1990,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 25,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 1991 only - Mar 23 24:00 1:00 D
{
'from_year': 1991,
'to_year': 1991,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 23,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 1991 only - Aug 31 24:00 0 S
{
'from_year': 1991,
'to_year': 1991,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 31,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 1992 only - Mar 28 24:00 1:00 D
{
'from_year': 1992,
'to_year': 1992,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 28,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 1992 only - Sep 5 24:00 0 S
{
'from_year': 1992,
'to_year': 1992,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 5,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 1993 only - Apr 2 0:00 1:00 D
{
'from_year': 1993,
'to_year': 1993,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 1993 only - Sep 5 0:00 0 S
{
'from_year': 1993,
'to_year': 1993,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 5,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 1994 only - Apr 1 0:00 1:00 D
{
'from_year': 1994,
'to_year': 1994,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 1994 only - Aug 28 0:00 0 S
{
'from_year': 1994,
'to_year': 1994,
'in_month': 8,
'on_day_of_week': 0,
'on_day_of_month': 28,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 1995 only - Mar 31 0:00 1:00 D
{
'from_year': 1995,
'to_year': 1995,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 31,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 1995 only - Sep 3 0:00 0 S
{
'from_year': 1995,
'to_year': 1995,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 3,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 1996 only - Mar 14 24:00 1:00 D
{
'from_year': 1996,
'to_year': 1996,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 14,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 1996 only - Sep 15 24:00 0 S
{
'from_year': 1996,
'to_year': 1996,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 15,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 1997 only - Mar 20 24:00 1:00 D
{
'from_year': 1997,
'to_year': 1997,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 1997 only - Sep 13 24:00 0 S
{
'from_year': 1997,
'to_year': 1997,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 13,
'at_seconds': 86400,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 1998 only - Mar 20 0:00 1:00 D
{
'from_year': 1998,
'to_year': 1998,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 20,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 1998 only - Sep 6 0:00 0 S
{
'from_year': 1998,
'to_year': 1998,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 6,
'at_seconds': 0,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 1999 only - Apr 2 2:00 1:00 D
{
'from_year': 1999,
'to_year': 1999,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 1999 only - Sep 3 2:00 0 S
{
'from_year': 1999,
'to_year': 1999,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 3,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 2000 only - Apr 14 2:00 1:00 D
{
'from_year': 2000,
'to_year': 2000,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 14,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 2000 only - Oct 6 1:00 0 S
{
'from_year': 2000,
'to_year': 2000,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 6,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 2001 only - Apr 9 1:00 1:00 D
{
'from_year': 2001,
'to_year': 2001,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 9,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 2001 only - Sep 24 1:00 0 S
{
'from_year': 2001,
'to_year': 2001,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 24,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 2002 only - Mar 29 1:00 1:00 D
{
'from_year': 2002,
'to_year': 2002,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 29,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 2002 only - Oct 7 1:00 0 S
{
'from_year': 2002,
'to_year': 2002,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 7,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 2003 only - Mar 28 1:00 1:00 D
{
'from_year': 2003,
'to_year': 2003,
'in_month': 3,
'on_day_of_week': 0,
'on_day_of_month': 28,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 2003 only - Oct 3 1:00 0 S
{
'from_year': 2003,
'to_year': 2003,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 3,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 2004 only - Apr 7 1:00 1:00 D
{
'from_year': 2004,
'to_year': 2004,
'in_month': 4,
'on_day_of_week': 0,
'on_day_of_month': 7,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 2004 only - Sep 22 1:00 0 S
{
'from_year': 2004,
'to_year': 2004,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 22,
'at_seconds': 3600,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 2005 2012 - Apr Fri<=1 2:00 1:00 D
{
'from_year': 2005,
'to_year': 2012,
'in_month': 4,
'on_day_of_week': 5,
'on_day_of_month': -1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 2005 only - Oct 9 2:00 0 S
{
'from_year': 2005,
'to_year': 2005,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 9,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 2006 only - Oct 1 2:00 0 S
{
'from_year': 2006,
'to_year': 2006,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 1,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 2007 only - Sep 16 2:00 0 S
{
'from_year': 2007,
'to_year': 2007,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 16,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 2008 only - Oct 5 2:00 0 S
{
'from_year': 2008,
'to_year': 2008,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 5,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 2009 only - Sep 27 2:00 0 S
{
'from_year': 2009,
'to_year': 2009,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 27,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 2010 only - Sep 12 2:00 0 S
{
'from_year': 2010,
'to_year': 2010,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 12,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 2011 only - Oct 2 2:00 0 S
{
'from_year': 2011,
'to_year': 2011,
'in_month': 10,
'on_day_of_week': 0,
'on_day_of_month': 2,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 2012 only - Sep 23 2:00 0 S
{
'from_year': 2012,
'to_year': 2012,
'in_month': 9,
'on_day_of_week': 0,
'on_day_of_month': 23,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
# Rule Zion 2013 max - Mar Fri>=23 2:00 1:00 D
{
'from_year': 2013,
'to_year': 9999,
'in_month': 3,
'on_day_of_week': 5,
'on_day_of_month': 23,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 3600,
'letter': 'D',
},
# Rule Zion 2013 max - Oct lastSun 2:00 0 S
{
'from_year': 2013,
'to_year': 9999,
'in_month': 10,
'on_day_of_week': 7,
'on_day_of_month': 0,
'at_seconds': 7200,
'at_time_suffix': 'w',
'delta_seconds': 0,
'letter': 'S',
},
]
ZONE_POLICY_Zion = {
'name': 'Zion',
'rules': ZONE_RULES_Zion
}
#---------------------------------------------------------------------------
ZONE_POLICY_MAP = {
'AN': ZONE_POLICY_AN,
'AQ': ZONE_POLICY_AQ,
'AS': ZONE_POLICY_AS,
'AT': ZONE_POLICY_AT,
'AV': ZONE_POLICY_AV,
'AW': ZONE_POLICY_AW,
'Albania': ZONE_POLICY_Albania,
'Algeria': ZONE_POLICY_Algeria,
'Arg': ZONE_POLICY_Arg,
'Armenia': ZONE_POLICY_Armenia,
'Aus': ZONE_POLICY_Aus,
'Austria': ZONE_POLICY_Austria,
'Azer': ZONE_POLICY_Azer,
'Bahamas': ZONE_POLICY_Bahamas,
'Barb': ZONE_POLICY_Barb,
'Belgium': ZONE_POLICY_Belgium,
'Belize': ZONE_POLICY_Belize,
'Bermuda': ZONE_POLICY_Bermuda,
'Brazil': ZONE_POLICY_Brazil,
'Bulg': ZONE_POLICY_Bulg,
'CO': ZONE_POLICY_CO,
'CR': ZONE_POLICY_CR,
'C_Eur': ZONE_POLICY_C_Eur,
'Canada': ZONE_POLICY_Canada,
'Chatham': ZONE_POLICY_Chatham,
'Chile': ZONE_POLICY_Chile,
'Cook': ZONE_POLICY_Cook,
'Cuba': ZONE_POLICY_Cuba,
'Cyprus': ZONE_POLICY_Cyprus,
'Czech': ZONE_POLICY_Czech,
'DR': ZONE_POLICY_DR,
'Denmark': ZONE_POLICY_Denmark,
'Dhaka': ZONE_POLICY_Dhaka,
'EU': ZONE_POLICY_EU,
'EUAsia': ZONE_POLICY_EUAsia,
'E_Eur': ZONE_POLICY_E_Eur,
'E_EurAsia': ZONE_POLICY_E_EurAsia,
'Ecuador': ZONE_POLICY_Ecuador,
'Edm': ZONE_POLICY_Edm,
'Egypt': ZONE_POLICY_Egypt,
'Eire': ZONE_POLICY_Eire,
'Falk': ZONE_POLICY_Falk,
'Fiji': ZONE_POLICY_Fiji,
'Finland': ZONE_POLICY_Finland,
'France': ZONE_POLICY_France,
'GB_Eire': ZONE_POLICY_GB_Eire,
'Germany': ZONE_POLICY_Germany,
'Ghana': ZONE_POLICY_Ghana,
'Greece': ZONE_POLICY_Greece,
'Guam': ZONE_POLICY_Guam,
'Guat': ZONE_POLICY_Guat,
'HK': ZONE_POLICY_HK,
'Haiti': ZONE_POLICY_Haiti,
'Halifax': ZONE_POLICY_Halifax,
'Holiday': ZONE_POLICY_Holiday,
'Hond': ZONE_POLICY_Hond,
'Hungary': ZONE_POLICY_Hungary,
'Iran': ZONE_POLICY_Iran,
'Iraq': ZONE_POLICY_Iraq,
'Italy': ZONE_POLICY_Italy,
'Japan': ZONE_POLICY_Japan,
'Jordan': ZONE_POLICY_Jordan,
'Kyrgyz': ZONE_POLICY_Kyrgyz,
'LH': ZONE_POLICY_LH,
'Latvia': ZONE_POLICY_Latvia,
'Lebanon': ZONE_POLICY_Lebanon,
'Libya': ZONE_POLICY_Libya,
'Macau': ZONE_POLICY_Macau,
'Malta': ZONE_POLICY_Malta,
'Mauritius': ZONE_POLICY_Mauritius,
'Mexico': ZONE_POLICY_Mexico,
'Moldova': ZONE_POLICY_Moldova,
'Moncton': ZONE_POLICY_Moncton,
'Mongol': ZONE_POLICY_Mongol,
'Morocco': ZONE_POLICY_Morocco,
'NC': ZONE_POLICY_NC,
'NT_YK': ZONE_POLICY_NT_YK,
'NZ': ZONE_POLICY_NZ,
'Namibia': ZONE_POLICY_Namibia,
'Neth': ZONE_POLICY_Neth,
'Nic': ZONE_POLICY_Nic,
'Norway': ZONE_POLICY_Norway,
'PRC': ZONE_POLICY_PRC,
'Pakistan': ZONE_POLICY_Pakistan,
'Palestine': ZONE_POLICY_Palestine,
'Para': ZONE_POLICY_Para,
'Peru': ZONE_POLICY_Peru,
'Phil': ZONE_POLICY_Phil,
'Poland': ZONE_POLICY_Poland,
'Port': ZONE_POLICY_Port,
'ROK': ZONE_POLICY_ROK,
'Romania': ZONE_POLICY_Romania,
'Russia': ZONE_POLICY_Russia,
'RussiaAsia': ZONE_POLICY_RussiaAsia,
'SA': ZONE_POLICY_SA,
'Salv': ZONE_POLICY_Salv,
'SanLuis': ZONE_POLICY_SanLuis,
'Spain': ZONE_POLICY_Spain,
'SpainAfrica': ZONE_POLICY_SpainAfrica,
'StJohns': ZONE_POLICY_StJohns,
'Sudan': ZONE_POLICY_Sudan,
'Swiss': ZONE_POLICY_Swiss,
'Syria': ZONE_POLICY_Syria,
'Taiwan': ZONE_POLICY_Taiwan,
'Thule': ZONE_POLICY_Thule,
'Tonga': ZONE_POLICY_Tonga,
'Toronto': ZONE_POLICY_Toronto,
'Troll': ZONE_POLICY_Troll,
'Tunisia': ZONE_POLICY_Tunisia,
'Turkey': ZONE_POLICY_Turkey,
'US': ZONE_POLICY_US,
'Uruguay': ZONE_POLICY_Uruguay,
'Vanc': ZONE_POLICY_Vanc,
'Vanuatu': ZONE_POLICY_Vanuatu,
'WS': ZONE_POLICY_WS,
'W_Eur': ZONE_POLICY_W_Eur,
'Winn': ZONE_POLICY_Winn,
'Zion': ZONE_POLICY_Zion,
}
#---------------------------------------------------------------------------
# The following zone policies are not supported in the current version of
# AceTime.
#
# numPolicies: 22
#
# CA (['unused'])
# Chicago (['unused'])
# Denver (['unused'])
# Detroit (['unused'])
# EgyptAsia (['unused'])
# Iceland (['unused'])
# Indianapolis (['unused'])
# Louisville (['unused'])
# Lux (['unused'])
# Marengo (['unused'])
# Menominee (['unused'])
# NBorneo (['unused'])
# NYC (['unused'])
# Perry (['unused'])
# Pike (['unused'])
# Pulaski (['unused'])
# Regina (['unused'])
# Shang (['unused'])
# SovietZone (['unused'])
# Starke (['unused'])
# Swift (['unused'])
# Vincennes (['unused'])
#---------------------------------------------------------------------------
# The following zone policies may have inaccuracies due to the following
# reasons:
#
# numPolicies: 10
#
# Belize (["LETTER 'CDT' not single character", "LETTER 'CST' not single character"])
# DR (["LETTER '-0430' not single character", "LETTER 'EST' not single character"])
# GB_Eire (["LETTER 'BST' not single character", "LETTER 'GMT' not single character"])
# Ghana (["LETTER 'GMT' not single character"])
# Guam (["AT '2:01' not on 15-minute boundary"])
# Moncton (["AT '0:01' not on 15-minute boundary"])
# Namibia (["LETTER 'CAT' not single character", "LETTER 'WAT' not single character"])
# Palestine (["AT '0:01' not on 15-minute boundary"])
# StJohns (["AT '0:01' not on 15-minute boundary", "LETTER 'DD' not single character"])
# Troll (["LETTER '+00' not single character", "LETTER '+02' not single character"])
# Source file: kinow_client/apis/actors_api.py
# Repository: kinow-io/kaemo-python-sdk (license: Apache-2.0)
"""
Server API
Reference for Server API (REST/Json)
OpenAPI spec version: 2.0.9
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import sys
import os
import re
# python 2 and python 3 compatibility library
from six import iteritems
from ..configuration import Configuration
from ..api_client import ApiClient
class ActorsApi(object):
"""
NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
config = Configuration()
if api_client:
self.api_client = api_client
else:
if not config.api_client:
config.api_client = ApiClient()
self.api_client = config.api_client
def attach_actor_to_category(self, category_id, actor_id, **kwargs):
"""
Attach actor to category
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.attach_actor_to_category(category_id, actor_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param int category_id: Category ID to fetch (required)
:param int actor_id: Actor ID to attach (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.attach_actor_to_category_with_http_info(category_id, actor_id, **kwargs)
else:
(data) = self.attach_actor_to_category_with_http_info(category_id, actor_id, **kwargs)
return data

    def attach_actor_to_category_with_http_info(self, category_id, actor_id, **kwargs):
        """
        Attach actor to category
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.attach_actor_to_category_with_http_info(category_id, actor_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int category_id: Category ID to fetch (required)
        :param int actor_id: Actor ID to attach (required)
        :return: None
            If the method is called asynchronously,
            returns the request thread.
        """
        all_params = ['category_id', 'actor_id']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method attach_actor_to_category" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'category_id' is set
        if ('category_id' not in params) or (params['category_id'] is None):
            raise ValueError("Missing the required parameter `category_id` when calling `attach_actor_to_category`")
        # verify the required parameter 'actor_id' is set
        if ('actor_id' not in params) or (params['actor_id'] is None):
            raise ValueError("Missing the required parameter `actor_id` when calling `attach_actor_to_category`")

        collection_formats = {}

        resource_path = '/categories/{category_id}/actors'.replace('{format}', 'json')
        path_params = {}
        if 'category_id' in params:
            path_params['category_id'] = params['category_id']

        query_params = {}

        header_params = {}

        form_params = []
        local_var_files = {}
        if 'actor_id' in params:
            form_params.append(('actor_id', params['actor_id']))
        self.api_client.set_default_header('Content-Type', 'application/x-www-form-urlencoded')

        body_params = None
        # Authentication setting
        auth_settings = ['ApiClientId', 'ApiClientSecret']

        return self.api_client.call_api(resource_path, 'POST',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type=None,
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)

    def attach_actor_to_product(self, product_id, actor_id, **kwargs):
        """
        Attach actor to product
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.attach_actor_to_product(product_id, actor_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int product_id: Product ID to fetch (required)
        :param int actor_id: Actor ID to attach (required)
        :return: None
            If the method is called asynchronously,
            returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.attach_actor_to_product_with_http_info(product_id, actor_id, **kwargs)
        else:
            (data) = self.attach_actor_to_product_with_http_info(product_id, actor_id, **kwargs)
            return data

    def attach_actor_to_product_with_http_info(self, product_id, actor_id, **kwargs):
        """
        Attach actor to product
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.attach_actor_to_product_with_http_info(product_id, actor_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int product_id: Product ID to fetch (required)
        :param int actor_id: Actor ID to attach (required)
        :return: None
            If the method is called asynchronously,
            returns the request thread.
        """
        all_params = ['product_id', 'actor_id']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method attach_actor_to_product" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'product_id' is set
        if ('product_id' not in params) or (params['product_id'] is None):
            raise ValueError("Missing the required parameter `product_id` when calling `attach_actor_to_product`")
        # verify the required parameter 'actor_id' is set
        if ('actor_id' not in params) or (params['actor_id'] is None):
            raise ValueError("Missing the required parameter `actor_id` when calling `attach_actor_to_product`")

        collection_formats = {}

        resource_path = '/products/{product_id}/actors'.replace('{format}', 'json')
        path_params = {}
        if 'product_id' in params:
            path_params['product_id'] = params['product_id']

        query_params = {}

        header_params = {}

        form_params = []
        local_var_files = {}
        if 'actor_id' in params:
            form_params.append(('actor_id', params['actor_id']))
        self.api_client.set_default_header('Content-Type', 'application/x-www-form-urlencoded')

        body_params = None
        # Authentication setting
        auth_settings = ['ApiClientId', 'ApiClientSecret']

        return self.api_client.call_api(resource_path, 'POST',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type=None,
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)

    def create_actor(self, body, **kwargs):
        """
        Create new actor
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.create_actor(body, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param CreateActorRequest body: Create an actor (required)
        :return: ActorResponse
            If the method is called asynchronously,
            returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.create_actor_with_http_info(body, **kwargs)
        else:
            (data) = self.create_actor_with_http_info(body, **kwargs)
            return data

    def create_actor_with_http_info(self, body, **kwargs):
        """
        Create new actor
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.create_actor_with_http_info(body, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param CreateActorRequest body: Create an actor (required)
        :return: ActorResponse
            If the method is called asynchronously,
            returns the request thread.
        """
        all_params = ['body']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method create_actor" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'body' is set
        if ('body' not in params) or (params['body'] is None):
            raise ValueError("Missing the required parameter `body` when calling `create_actor`")

        collection_formats = {}

        resource_path = '/actors'.replace('{format}', 'json')
        path_params = {}

        query_params = {}

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        self.api_client.set_default_header('Content-Type', 'application/json')

        # Authentication setting
        auth_settings = ['ApiClientId', 'ApiClientSecret']

        return self.api_client.call_api(resource_path, 'POST',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='ActorResponse',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)

    def delete_actor(self, actor_id, **kwargs):
        """
        Delete actor
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.delete_actor(actor_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int actor_id: (required)
        :return: None
            If the method is called asynchronously,
            returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.delete_actor_with_http_info(actor_id, **kwargs)
        else:
            (data) = self.delete_actor_with_http_info(actor_id, **kwargs)
            return data

    def delete_actor_with_http_info(self, actor_id, **kwargs):
        """
        Delete actor
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.delete_actor_with_http_info(actor_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int actor_id: (required)
        :return: None
            If the method is called asynchronously,
            returns the request thread.
        """
        all_params = ['actor_id']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method delete_actor" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'actor_id' is set
        if ('actor_id' not in params) or (params['actor_id'] is None):
            raise ValueError("Missing the required parameter `actor_id` when calling `delete_actor`")

        collection_formats = {}

        resource_path = '/actors/{actor_id}'.replace('{format}', 'json')
        path_params = {}
        if 'actor_id' in params:
            path_params['actor_id'] = params['actor_id']

        query_params = {}

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # Authentication setting
        auth_settings = ['ApiClientId', 'ApiClientSecret']

        return self.api_client.call_api(resource_path, 'DELETE',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type=None,
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)

    def detach_actor_from_category(self, category_id, actor_id, **kwargs):
        """
        Detach actor from category
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.detach_actor_from_category(category_id, actor_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int category_id: Category ID to fetch (required)
        :param int actor_id: Actor ID to detach (required)
        :return: None
            If the method is called asynchronously,
            returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.detach_actor_from_category_with_http_info(category_id, actor_id, **kwargs)
        else:
            (data) = self.detach_actor_from_category_with_http_info(category_id, actor_id, **kwargs)
            return data

    def detach_actor_from_category_with_http_info(self, category_id, actor_id, **kwargs):
        """
        Detach actor from category
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.detach_actor_from_category_with_http_info(category_id, actor_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int category_id: Category ID to fetch (required)
        :param int actor_id: Actor ID to detach (required)
        :return: None
            If the method is called asynchronously,
            returns the request thread.
        """
        all_params = ['category_id', 'actor_id']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method detach_actor_from_category" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'category_id' is set
        if ('category_id' not in params) or (params['category_id'] is None):
            raise ValueError("Missing the required parameter `category_id` when calling `detach_actor_from_category`")
        # verify the required parameter 'actor_id' is set
        if ('actor_id' not in params) or (params['actor_id'] is None):
            raise ValueError("Missing the required parameter `actor_id` when calling `detach_actor_from_category`")

        collection_formats = {}

        resource_path = '/categories/{category_id}/actors/{actor_id}'.replace('{format}', 'json')
        path_params = {}
        if 'category_id' in params:
            path_params['category_id'] = params['category_id']
        if 'actor_id' in params:
            path_params['actor_id'] = params['actor_id']

        query_params = {}

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # Authentication setting
        auth_settings = ['ApiClientId', 'ApiClientSecret']

        return self.api_client.call_api(resource_path, 'DELETE',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type=None,
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)

    def get_actor(self, actor_id, **kwargs):
        """
        Get actor
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_actor(actor_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int actor_id: Actor ID to fetch (required)
        :return: ActorResponse
            If the method is called asynchronously,
            returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.get_actor_with_http_info(actor_id, **kwargs)
        else:
            (data) = self.get_actor_with_http_info(actor_id, **kwargs)
            return data

    def get_actor_with_http_info(self, actor_id, **kwargs):
        """
        Get actor
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_actor_with_http_info(actor_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int actor_id: Actor ID to fetch (required)
        :return: ActorResponse
            If the method is called asynchronously,
            returns the request thread.
        """
        all_params = ['actor_id']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_actor" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'actor_id' is set
        if ('actor_id' not in params) or (params['actor_id'] is None):
            raise ValueError("Missing the required parameter `actor_id` when calling `get_actor`")

        collection_formats = {}

        resource_path = '/actors/{actor_id}'.replace('{format}', 'json')
        path_params = {}
        if 'actor_id' in params:
            path_params['actor_id'] = params['actor_id']

        query_params = {}

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # Authentication setting
        auth_settings = ['ApiClientId', 'ApiClientSecret']

        return self.api_client.call_api(resource_path, 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='ActorResponse',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)

    def get_actor_cover_image(self, actor_id, **kwargs):
        """
        Get cover image of an actor
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_actor_cover_image(actor_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int actor_id: Actor ID to fetch (required)
        :return: ImageResponse
            If the method is called asynchronously,
            returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.get_actor_cover_image_with_http_info(actor_id, **kwargs)
        else:
            (data) = self.get_actor_cover_image_with_http_info(actor_id, **kwargs)
            return data

    def get_actor_cover_image_with_http_info(self, actor_id, **kwargs):
        """
        Get cover image of an actor
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_actor_cover_image_with_http_info(actor_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int actor_id: Actor ID to fetch (required)
        :return: ImageResponse
            If the method is called asynchronously,
            returns the request thread.
        """
        all_params = ['actor_id']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_actor_cover_image" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'actor_id' is set
        if ('actor_id' not in params) or (params['actor_id'] is None):
            raise ValueError("Missing the required parameter `actor_id` when calling `get_actor_cover_image`")

        collection_formats = {}

        resource_path = '/actors/{actor_id}/cover'.replace('{format}', 'json')
        path_params = {}
        if 'actor_id' in params:
            path_params['actor_id'] = params['actor_id']

        query_params = {}

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # Authentication setting
        auth_settings = ['ApiClientId', 'ApiClientSecret']

        return self.api_client.call_api(resource_path, 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='ImageResponse',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)

    def get_actor_products(self, actor_id, **kwargs):
        """
        Get actor products
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_actor_products(actor_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int actor_id: Actor ID to fetch (required)
        :param int page:
        :param int per_page:
        :param str sort_by: Sort by this attribute (id by default)
        :param str sort_direction: Sorting direction (asc by default)
        :param str ip: Filter by user IP
        :param str features: ``` features[*][value]=string&features[*][operator]=strict&features[1][value]=string&features[1][operator]=strict _______________ { \"*\": { \"value\": \"string\", \"operator\": \"strict\" }, \"1\": { \"value\": \"string\", \"operator\": \"contains\" } } ``` Operator can be: strict, contains, between, in, gt (greater than), lt (lower than). To search on all features, you can pass * as featureId.
        :param str filters: ``` name[value]=string&name[operator]=contains&date_add[value]=string&date_add[operator]=lt _______________ { \"name\": { \"value\": \"string\", \"operator\": \"contains\" }, \"date_add\": { \"value\": \"string\", \"operator\": \"lt\" } } ``` Operator can be: strict, contains, between, in, gt (greater than), lt (lower than).
        :return: ActorProductListResponse
            If the method is called asynchronously,
            returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.get_actor_products_with_http_info(actor_id, **kwargs)
        else:
            (data) = self.get_actor_products_with_http_info(actor_id, **kwargs)
            return data

    def get_actor_products_with_http_info(self, actor_id, **kwargs):
        """
        Get actor products
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_actor_products_with_http_info(actor_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int actor_id: Actor ID to fetch (required)
        :param int page:
        :param int per_page:
        :param str sort_by: Sort by this attribute (id by default)
        :param str sort_direction: Sorting direction (asc by default)
        :param str ip: Filter by user IP
        :param str features: ``` features[*][value]=string&features[*][operator]=strict&features[1][value]=string&features[1][operator]=strict _______________ { \"*\": { \"value\": \"string\", \"operator\": \"strict\" }, \"1\": { \"value\": \"string\", \"operator\": \"contains\" } } ``` Operator can be: strict, contains, between, in, gt (greater than), lt (lower than). To search on all features, you can pass * as featureId.
        :param str filters: ``` name[value]=string&name[operator]=contains&date_add[value]=string&date_add[operator]=lt _______________ { \"name\": { \"value\": \"string\", \"operator\": \"contains\" }, \"date_add\": { \"value\": \"string\", \"operator\": \"lt\" } } ``` Operator can be: strict, contains, between, in, gt (greater than), lt (lower than).
        :return: ActorProductListResponse
            If the method is called asynchronously,
            returns the request thread.
        """
        all_params = ['actor_id', 'page', 'per_page', 'sort_by', 'sort_direction', 'ip', 'features', 'filters']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_actor_products" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'actor_id' is set
        if ('actor_id' not in params) or (params['actor_id'] is None):
            raise ValueError("Missing the required parameter `actor_id` when calling `get_actor_products`")

        collection_formats = {}

        resource_path = '/actors/{actor_id}/products'.replace('{format}', 'json')
        path_params = {}
        if 'actor_id' in params:
            path_params['actor_id'] = params['actor_id']

        query_params = {}
        if 'page' in params:
            query_params['page'] = params['page']
        if 'per_page' in params:
            query_params['per_page'] = params['per_page']
        if 'sort_by' in params:
            query_params['sort_by'] = params['sort_by']
        if 'sort_direction' in params:
            query_params['sort_direction'] = params['sort_direction']
        if 'ip' in params:
            query_params['ip'] = params['ip']
        if 'features' in params:
            query_params['features'] = params['features']
        if 'filters' in params:
            query_params['filters'] = params['filters']

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # Authentication setting
        auth_settings = ['ApiClientId', 'ApiClientSecret']

        return self.api_client.call_api(resource_path, 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='ActorProductListResponse',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)
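    # Filter syntax sketch (illustrative only; field names and values below
    # are hypothetical placeholders): `filters` and `features` are passed as
    # flat query-string fragments in the format documented above.
    #
    #     filters = ('name[value]=john&name[operator]=contains'
    #                '&date_add[value]=2017-01-01&date_add[operator]=gt')
    #     features = 'features[*][value]=blue&features[*][operator]=strict'
    #     products = api.get_actor_products(actor_id, filters=filters,
    #                                       features=features)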

    def get_actor_products_role(self, actor_id, **kwargs):
        """
        Get Products linked to Actor with their role
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_actor_products_role(actor_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int actor_id: Actor ID to fetch (required)
        :param int page:
        :param int per_page:
        :return: ActorProductRoleListResponse
            If the method is called asynchronously,
            returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.get_actor_products_role_with_http_info(actor_id, **kwargs)
        else:
            (data) = self.get_actor_products_role_with_http_info(actor_id, **kwargs)
            return data

    def get_actor_products_role_with_http_info(self, actor_id, **kwargs):
        """
        Get Products linked to Actor with their role
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_actor_products_role_with_http_info(actor_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int actor_id: Actor ID to fetch (required)
        :param int page:
        :param int per_page:
        :return: ActorProductRoleListResponse
            If the method is called asynchronously,
            returns the request thread.
        """
        all_params = ['actor_id', 'page', 'per_page']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_actor_products_role" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'actor_id' is set
        if ('actor_id' not in params) or (params['actor_id'] is None):
            raise ValueError("Missing the required parameter `actor_id` when calling `get_actor_products_role`")

        collection_formats = {}

        resource_path = '/actors/{actor_id}/products-role'.replace('{format}', 'json')
        path_params = {}
        if 'actor_id' in params:
            path_params['actor_id'] = params['actor_id']

        query_params = {}
        if 'page' in params:
            query_params['page'] = params['page']
        if 'per_page' in params:
            query_params['per_page'] = params['per_page']

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # Authentication setting
        auth_settings = ['ApiClientId', 'ApiClientSecret']

        return self.api_client.call_api(resource_path, 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='ActorProductRoleListResponse',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)

    def get_actors(self, **kwargs):
        """
        Get actors list
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_actors(callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int page:
        :param int per_page:
        :return: ActorListResponse
            If the method is called asynchronously,
            returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.get_actors_with_http_info(**kwargs)
        else:
            (data) = self.get_actors_with_http_info(**kwargs)
            return data

    def get_actors_with_http_info(self, **kwargs):
        """
        Get actors list
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_actors_with_http_info(callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int page:
        :param int per_page:
        :return: ActorListResponse
            If the method is called asynchronously,
            returns the request thread.
        """
        all_params = ['page', 'per_page']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_actors" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        resource_path = '/actors'.replace('{format}', 'json')
        path_params = {}

        query_params = {}
        if 'page' in params:
            query_params['page'] = params['page']
        if 'per_page' in params:
            query_params['per_page'] = params['per_page']

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # Authentication setting
        auth_settings = ['ApiClientId', 'ApiClientSecret']

        return self.api_client.call_api(resource_path, 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='ActorListResponse',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)

    def get_category_actors(self, category_id, **kwargs):
        """
        Get actors attached to category
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_category_actors(category_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int category_id: Category ID to fetch (required)
        :param int page:
        :param int per_page:
        :return: CategoryActorsListResponse
            If the method is called asynchronously,
            returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.get_category_actors_with_http_info(category_id, **kwargs)
        else:
            (data) = self.get_category_actors_with_http_info(category_id, **kwargs)
            return data

    def get_category_actors_with_http_info(self, category_id, **kwargs):
        """
        Get actors attached to category
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.

        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_category_actors_with_http_info(category_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int category_id: Category ID to fetch (required)
        :param int page:
        :param int per_page:
        :return: CategoryActorsListResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['category_id', 'page', 'per_page']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_category_actors" % key
                )
            params[key] = val
        del params['kwargs']

        # verify the required parameter 'category_id' is set
        if ('category_id' not in params) or (params['category_id'] is None):
            raise ValueError("Missing the required parameter `category_id` when calling `get_category_actors`")

        collection_formats = {}

        resource_path = '/categories/{category_id}/actors'.replace('{format}', 'json')
        path_params = {}
        if 'category_id' in params:
            path_params['category_id'] = params['category_id']

        query_params = {}
        if 'page' in params:
            query_params['page'] = params['page']
        if 'per_page' in params:
            query_params['per_page'] = params['per_page']

        header_params = {}
        form_params = []
        local_var_files = {}
        body_params = None

        # Authentication setting
        auth_settings = ['ApiClientId', 'ApiClientSecret']

        return self.api_client.call_api(resource_path, 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='CategoryActorsListResponse',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)
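Each `*_with_http_info` method starts by rejecting keyword arguments that are not in its `all_params` whitelist. That check can be distilled into a small standalone helper (the name `validate_kwargs` is illustrative, not part of the generated code):

```python
def validate_kwargs(method_name, all_params, kwargs):
    # Hypothetical re-creation of the generated parameter check: any
    # keyword not declared in all_params raises TypeError, otherwise the
    # accepted keywords are collected into a params dict.
    params = {}
    for key, val in kwargs.items():
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'"
                " to method %s" % (key, method_name))
        params[key] = val
    return params
```

This is why a typo such as `per_pages=10` fails fast with a `TypeError` instead of being silently dropped from the request.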

    def get_product_actors(self, product_id, **kwargs):
        """
        Get actors attached to product
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.

        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_product_actors(product_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int product_id: Product ID to fetch (required)
        :param int page:
        :param int per_page:
        :param str image_type:
        :return: ActorListResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.get_product_actors_with_http_info(product_id, **kwargs)
        else:
            data = self.get_product_actors_with_http_info(product_id, **kwargs)
            return data

    def get_product_actors_with_http_info(self, product_id, **kwargs):
        """
        Get actors attached to product
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.

        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_product_actors_with_http_info(product_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int product_id: Product ID to fetch (required)
        :param int page:
        :param int per_page:
        :param str image_type:
        :return: ActorListResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['product_id', 'page', 'per_page', 'image_type']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_product_actors" % key
                )
            params[key] = val
        del params['kwargs']

        # verify the required parameter 'product_id' is set
        if ('product_id' not in params) or (params['product_id'] is None):
            raise ValueError("Missing the required parameter `product_id` when calling `get_product_actors`")

        collection_formats = {}

        resource_path = '/products/{product_id}/actors'.replace('{format}', 'json')
        path_params = {}
        if 'product_id' in params:
            path_params['product_id'] = params['product_id']

        query_params = {}
        if 'page' in params:
            query_params['page'] = params['page']
        if 'per_page' in params:
            query_params['per_page'] = params['per_page']
        if 'image_type' in params:
            query_params['image_type'] = params['image_type']

        header_params = {}
        form_params = []
        local_var_files = {}
        body_params = None

        # Authentication setting
        auth_settings = ['ApiClientId', 'ApiClientSecret']

        return self.api_client.call_api(resource_path, 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='ActorListResponse',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)

    def get_product_actors_role(self, product_id, **kwargs):
        """
        Get Actors attached to Product with their role
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.

        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_product_actors_role(product_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int product_id: Product ID to fetch (required)
        :param int page:
        :param int per_page:
        :return: ActorRoleListResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.get_product_actors_role_with_http_info(product_id, **kwargs)
        else:
            data = self.get_product_actors_role_with_http_info(product_id, **kwargs)
            return data

    def get_product_actors_role_with_http_info(self, product_id, **kwargs):
        """
        Get Actors attached to Product with their role
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.

        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.get_product_actors_role_with_http_info(product_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int product_id: Product ID to fetch (required)
        :param int page:
        :param int per_page:
        :return: ActorRoleListResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['product_id', 'page', 'per_page']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method get_product_actors_role" % key
                )
            params[key] = val
        del params['kwargs']

        # verify the required parameter 'product_id' is set
        if ('product_id' not in params) or (params['product_id'] is None):
            raise ValueError("Missing the required parameter `product_id` when calling `get_product_actors_role`")

        collection_formats = {}

        resource_path = '/products/{product_id}/actors-role'.replace('{format}', 'json')
        path_params = {}
        if 'product_id' in params:
            path_params['product_id'] = params['product_id']

        query_params = {}
        if 'page' in params:
            query_params['page'] = params['page']
        if 'per_page' in params:
            query_params['per_page'] = params['per_page']

        header_params = {}
        form_params = []
        local_var_files = {}
        body_params = None

        # Authentication setting
        auth_settings = ['ApiClientId', 'ApiClientSecret']

        return self.api_client.call_api(resource_path, 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='ActorRoleListResponse',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)
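The request assembly in these GET methods follows one pattern: substitute path parameters into a URL template such as `/products/{product_id}/actors-role`, then collect the optional `page`/`per_page` keywords into query parameters. A self-contained sketch of that assembly (the helper `build_request` is hypothetical):

```python
def build_request(template, path_params, **query):
    # Hypothetical sketch of the URL/query assembly done by the
    # generated *_with_http_info methods: fill in {name} placeholders,
    # then keep only the query parameters that were actually supplied.
    path = template
    for name, value in path_params.items():
        path = path.replace('{%s}' % name, str(value))
    query_params = {k: v for k, v in query.items() if v is not None}
    return path, query_params
```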

    def update_actor(self, actor_id, body, **kwargs):
        """
        Update actor
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.

        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.update_actor(actor_id, body, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int actor_id: (required)
        :param UpdateActorRequest body: Actor settings (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.update_actor_with_http_info(actor_id, body, **kwargs)
        else:
            data = self.update_actor_with_http_info(actor_id, body, **kwargs)
            return data

    def update_actor_with_http_info(self, actor_id, body, **kwargs):
        """
        Update actor
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.

        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.update_actor_with_http_info(actor_id, body, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param int actor_id: (required)
        :param UpdateActorRequest body: Actor settings (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['actor_id', 'body']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method update_actor" % key
                )
            params[key] = val
        del params['kwargs']

        # verify the required parameter 'actor_id' is set
        if ('actor_id' not in params) or (params['actor_id'] is None):
            raise ValueError("Missing the required parameter `actor_id` when calling `update_actor`")
        # verify the required parameter 'body' is set
        if ('body' not in params) or (params['body'] is None):
            raise ValueError("Missing the required parameter `body` when calling `update_actor`")

        collection_formats = {}

        resource_path = '/actors/{actor_id}'.replace('{format}', 'json')
        path_params = {}
        if 'actor_id' in params:
            path_params['actor_id'] = params['actor_id']

        query_params = {}

        header_params = {}
        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
            self.api_client.set_default_header('Content-Type', 'application/json')

        # Authentication setting
        auth_settings = ['ApiClientId', 'ApiClientSecret']

        return self.api_client.call_api(resource_path, 'PUT',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type=None,
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)

    def upload_actor_cover(self, actor_id, **kwargs):
        """
        Upload actor cover
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.

        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.upload_actor_cover(actor_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param float actor_id: Actor ID to fetch (required)
        :param file file:
        :param str hash:
        :param str hash_algorithm: Hash algorithm to check the hash file (default value is: sha256)
        :return: ImageResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.upload_actor_cover_with_http_info(actor_id, **kwargs)
        else:
            data = self.upload_actor_cover_with_http_info(actor_id, **kwargs)
            return data

    def upload_actor_cover_with_http_info(self, actor_id, **kwargs):
        """
        Upload actor cover
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.

        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.upload_actor_cover_with_http_info(actor_id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param float actor_id: Actor ID to fetch (required)
        :param file file:
        :param str hash:
        :param str hash_algorithm: Hash algorithm to check the hash file (default value is: sha256)
        :return: ImageResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['actor_id', 'file', 'hash', 'hash_algorithm']
        all_params.append('callback')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method upload_actor_cover" % key
                )
            params[key] = val
        del params['kwargs']

        # verify the required parameter 'actor_id' is set
        if ('actor_id' not in params) or (params['actor_id'] is None):
            raise ValueError("Missing the required parameter `actor_id` when calling `upload_actor_cover`")

        collection_formats = {}

        resource_path = '/actors/{actor_id}/cover'.replace('{format}', 'json')
        path_params = {}
        if 'actor_id' in params:
            path_params['actor_id'] = params['actor_id']

        query_params = {}

        header_params = {}
        form_params = []
        local_var_files = {}
        if 'file' in params:
            local_var_files['file'] = params['file']
            self.api_client.set_default_header('Content-Type', 'application/x-www-form-urlencoded')
        if 'hash' in params:
            form_params.append(('hash', params['hash']))
            self.api_client.set_default_header('Content-Type', 'application/x-www-form-urlencoded')
        if 'hash_algorithm' in params:
            form_params.append(('hash_algorithm', params['hash_algorithm']))
            self.api_client.set_default_header('Content-Type', 'application/x-www-form-urlencoded')

        body_params = None

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['multipart/form-data'])

        # Authentication setting
        auth_settings = ['ApiClientId', 'ApiClientSecret']

        return self.api_client.call_api(resource_path, 'POST',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='ImageResponse',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        _preload_content=params.get('_preload_content', True),
                                        _request_timeout=params.get('_request_timeout'),
                                        collection_formats=collection_formats)
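`upload_actor_cover_with_http_info` sends the cover as multipart form data, with an optional `hash`/`hash_algorithm` pair that the server is documented to use for verifying the file (default `sha256`). The sketch below assembles the same form/file parameters and, as an extra client-side sanity check that the generated code itself does not perform, verifies the supplied hash locally before it would be sent; the helper name `build_upload_params` is an assumption for illustration:

```python
import hashlib

def build_upload_params(file_bytes, hash_value=None, hash_algorithm='sha256'):
    # Hedged sketch of the upload parameter assembly. The local digest
    # comparison is an added client-side check, not part of the
    # generated method (which defers hash validation to the server).
    form_params = []
    local_var_files = {'file': file_bytes}
    if hash_value is not None:
        digest = hashlib.new(hash_algorithm, file_bytes).hexdigest()
        if digest != hash_value:
            raise ValueError('file hash mismatch')
        form_params.append(('hash', hash_value))
        form_params.append(('hash_algorithm', hash_algorithm))
    return form_params, local_var_files
```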

# Source: core/domain/activity_jobs_one_off_test.py from the kevjumba/oppia
# repository (Apache-2.0 license).

# coding: utf-8
] | null | null | null | # coding: utf-8
#
# Copyright 2017 The Oppia Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS-IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Unit tests for core.domain.activity_jobs_one_off."""
from __future__ import absolute_import # pylint: disable=import-only-modules
from __future__ import unicode_literals # pylint: disable=import-only-modules
import ast
import datetime
from constants import constants
from core.domain import activity_jobs_one_off
from core.domain import collection_domain
from core.domain import collection_services
from core.domain import exp_domain
from core.domain import exp_fetchers
from core.domain import exp_services
from core.domain import rights_manager
from core.domain import search_services
from core.domain import user_services
from core.platform import models
from core.platform.taskqueue import gae_taskqueue_services as taskqueue_services
from core.tests import test_utils
import feconf
import python_utils
from google.appengine.ext import ndb
gae_search_services = models.Registry.import_search_services()
(collection_models, exp_models) = models.Registry.import_models(
    [models.NAMES.collection, models.NAMES.exploration])


class ActivityContributorsSummaryOneOffJobTests(test_utils.GenericTestBase):
    ONE_OFF_JOB_MANAGERS_FOR_TESTS = [
        activity_jobs_one_off.ActivityContributorsSummaryOneOffJob]

    EXP_ID = 'exp_id'
    COL_ID = 'col_id'
    USERNAME_A = 'usernamea'
    USERNAME_B = 'usernameb'
    EMAIL_A = 'emaila@example.com'
    EMAIL_B = 'emailb@example.com'

    def setUp(self):
        super(ActivityContributorsSummaryOneOffJobTests, self).setUp()
        self.signup(self.EMAIL_A, self.USERNAME_A)
        self.signup(self.EMAIL_B, self.USERNAME_B)
        self.user_a_id = self.get_user_id_from_email(self.EMAIL_A)
        self.user_b_id = self.get_user_id_from_email(self.EMAIL_B)

    def _run_one_off_job(self):
        """Runs the one-off MapReduce job."""
        job_id = (
            activity_jobs_one_off.ActivityContributorsSummaryOneOffJob
            .create_new())
        activity_jobs_one_off.ActivityContributorsSummaryOneOffJob.enqueue(
            job_id)
        self.assertEqual(
            self.count_jobs_in_taskqueue(
                taskqueue_services.QUEUE_NAME_ONE_OFF_JOBS), 1)
        self.process_and_flush_pending_tasks()
        stringified_output = (
            activity_jobs_one_off.ActivityContributorsSummaryOneOffJob
            .get_output(job_id))
        eval_output = [ast.literal_eval(stringified_item) for
                       stringified_item in stringified_output]
        return eval_output
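The `_run_one_off_job` helper relies on the fact that MapReduce job output arrives as stringified Python literals, which `ast.literal_eval` can safely turn back into lists without executing arbitrary code. In isolation, that parsing step looks like this (the function name `parse_job_output` is illustrative):

```python
import ast

def parse_job_output(stringified_output):
    # Each output item is a stringified Python literal such as
    # "['SUCCESS', 3]"; ast.literal_eval restores the original
    # list/number structure without the risks of eval().
    return [ast.literal_eval(item) for item in stringified_output]
```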
    def test_contributors_for_valid_nonrevert_contribution(self):
        # Let USER A make three commits.
        exploration = self.save_new_valid_exploration(
            self.EXP_ID, self.user_a_id)
        collection = self.save_new_valid_collection(self.COL_ID, self.user_a_id)

        exp_services.update_exploration(
            self.user_a_id, self.EXP_ID, [exp_domain.ExplorationChange({
                'cmd': 'edit_exploration_property',
                'property_name': 'title',
                'new_value': 'New Exploration Title'
            })], 'Changed title.')
        exp_services.update_exploration(
            self.user_a_id, self.EXP_ID, [exp_domain.ExplorationChange({
                'cmd': 'edit_exploration_property',
                'property_name': 'objective',
                'new_value': 'New Objective'
            })], 'Changed Objective.')
        collection_services.update_collection(
            self.user_a_id, self.COL_ID, [{
                'cmd': 'edit_collection_property',
                'property_name': 'title',
                'new_value': 'New Exploration Title'
            }], 'Changed title.')
        collection_services.update_collection(
            self.user_a_id, self.COL_ID, [{
                'cmd': 'edit_collection_property',
                'property_name': 'objective',
                'new_value': 'New Objective'
            }], 'Changed Objective.')

        output = self._run_one_off_job()
        self.assertEqual([['SUCCESS', 3]], output)

        exploration_summary = exp_fetchers.get_exploration_summary_by_id(
            exploration.id)
        self.assertEqual([self.user_a_id], exploration_summary.contributor_ids)
        self.assertEqual(
            {self.user_a_id: 3}, exploration_summary.contributors_summary)

        collection_summary = collection_services.get_collection_summary_by_id(
            collection.id)
        self.assertEqual([self.user_a_id], collection_summary.contributor_ids)
        self.assertEqual(
            {self.user_a_id: 3}, collection_summary.contributors_summary)
    def test_contributors_with_only_reverts_not_included(self):
        # Let USER A make three commits.
        exploration = self.save_new_valid_exploration(
            self.EXP_ID, self.user_a_id, title='Exploration Title 1')

        exp_services.update_exploration(
            self.user_a_id, self.EXP_ID, [exp_domain.ExplorationChange({
                'cmd': 'edit_exploration_property',
                'property_name': 'title',
                'new_value': 'New Exploration Title'
            })], 'Changed title.')
        exp_services.update_exploration(
            self.user_a_id, self.EXP_ID, [exp_domain.ExplorationChange({
                'cmd': 'edit_exploration_property',
                'property_name': 'objective',
                'new_value': 'New Objective'
            })], 'Changed Objective.')

        # Let the second user revert version 3 to version 2.
        exp_services.revert_exploration(self.user_b_id, self.EXP_ID, 3, 2)

        output = self._run_one_off_job()
        self.assertEqual([['SUCCESS', 1]], output)

        exploration_summary = exp_fetchers.get_exploration_summary_by_id(
            exploration.id)
        self.assertEqual([self.user_a_id], exploration_summary.contributor_ids)
        self.assertEqual(
            {self.user_a_id: 2}, exploration_summary.contributors_summary)
    def test_reverts_not_counted(self):
        # Let USER A make 3 non-revert commits.
        exploration = self.save_new_valid_exploration(
            self.EXP_ID, self.user_a_id, title='Exploration Title')

        exp_services.update_exploration(
            self.user_a_id, self.EXP_ID, [exp_domain.ExplorationChange({
                'cmd': 'edit_exploration_property',
                'property_name': 'title',
                'new_value': 'New Exploration Title'
            })], 'Changed title.')
        exp_services.update_exploration(
            self.user_a_id, self.EXP_ID, [exp_domain.ExplorationChange({
                'cmd': 'edit_exploration_property',
                'property_name': 'objective',
                'new_value': 'New Objective'
            })], 'Changed Objective.')

        # Let USER A revert version 3 to version 2.
        exp_services.revert_exploration(self.user_a_id, self.EXP_ID, 3, 2)

        output = self._run_one_off_job()
        self.assertEqual([['SUCCESS', 1]], output)

        # Check that USER A's number of contributions is equal to 2.
        exploration_summary = exp_fetchers.get_exploration_summary_by_id(
            exploration.id)
        self.assertEqual([self.user_a_id], exploration_summary.contributor_ids)
        self.assertEqual(
            {self.user_a_id: 2}, exploration_summary.contributors_summary)
    def test_nonhuman_committers_not_counted(self):
        # Create a commit with the system user id.
        exploration = self.save_new_valid_exploration(
            self.EXP_ID, feconf.SYSTEM_COMMITTER_ID, title='Original Title')
        collection = self.save_new_valid_collection(self.COL_ID, self.user_a_id)

        # Create commits with all the system user ids.
        for system_id in constants.SYSTEM_USER_IDS:
            exp_services.update_exploration(
                system_id, self.EXP_ID, [exp_domain.ExplorationChange({
                    'cmd': 'edit_exploration_property',
                    'property_name': 'title',
                    'new_value': 'Title changed by %s' % system_id
                })], 'Changed title.')
            collection_services.update_collection(
                system_id, self.COL_ID, [{
                    'cmd': 'edit_collection_property',
                    'property_name': 'title',
                    'new_value': 'New Exploration Title'
                }], 'Changed title.')

        output = self._run_one_off_job()
        self.assertEqual([['SUCCESS', 3]], output)

        # Check that no system id was added to the exploration's
        # contributor's summary.
        exploration_summary = exp_fetchers.get_exploration_summary_by_id(
            exploration.id)
        collection_summary = collection_services.get_collection_summary_by_id(
            collection.id)
        for system_id in constants.SYSTEM_USER_IDS:
            self.assertNotIn(
                system_id,
                exploration_summary.contributors_summary)
            self.assertNotIn(
                system_id,
                exploration_summary.contributor_ids)
            self.assertNotIn(
                system_id,
                collection_summary.contributors_summary)
            self.assertNotIn(
                system_id,
                collection_summary.contributor_ids)
    def test_deleted_exploration(self):
        self.save_new_valid_exploration(
            self.EXP_ID, self.user_a_id)
        exp_services.delete_exploration(feconf.SYSTEM_COMMITTER_ID, self.EXP_ID)
        self.process_and_flush_pending_tasks()

        output = self._run_one_off_job()
        self.assertEqual([], output)


class AuditContributorsOneOffJobTests(test_utils.GenericTestBase):

    USER_1_ID = 'user_1_id'
    USER_2_ID = 'user_2_id'
    USER_3_ID = 'user_3_id'
    USER_4_ID = 'user_4_id'

    def _run_one_off_job(self):
        """Runs the one-off MapReduce job."""
        job_id = activity_jobs_one_off.AuditContributorsOneOffJob.create_new()
        activity_jobs_one_off.AuditContributorsOneOffJob.enqueue(job_id)
        self.assertEqual(
            self.count_jobs_in_taskqueue(
                taskqueue_services.QUEUE_NAME_ONE_OFF_JOBS), 1)
        self.process_and_flush_pending_tasks()
        stringified_output = (
            activity_jobs_one_off.AuditContributorsOneOffJob.get_output(job_id))
        eval_output = [ast.literal_eval(stringified_item) for
                       stringified_item in stringified_output]
        for item in eval_output:
            if isinstance(item[1], list):
                item[1] = [ast.literal_eval(triple) for triple in item[1]]
        return eval_output
    def test_correct_models(self):
        exp_models.ExpSummaryModel(
            id='id_1',
            title='title',
            category='category',
            objective='objective',
            language_code='language_code',
            community_owned=False,
            contributor_ids=[self.USER_1_ID],
            contributors_summary={self.USER_1_ID: 4},
        ).put()
        collection_models.CollectionSummaryModel(
            id='id_1',
            title='title',
            category='category',
            objective='objective',
            language_code='language_code',
            community_owned=False,
            contributor_ids=[self.USER_2_ID],
            contributors_summary={self.USER_2_ID: 4},
        ).put()

        output = self._run_one_off_job()

        self.assertEqual(len(output), 1)
        self.assertEqual([['SUCCESS', 2]], output)
    def test_duplicate_ids_models(self):
        exp_models.ExpSummaryModel(
            id='id_1',
            title='title',
            category='category',
            objective='objective',
            language_code='language_code',
            community_owned=False,
            contributor_ids=[self.USER_1_ID, self.USER_1_ID],
            contributors_summary={self.USER_1_ID: 4},
        ).put()
        collection_models.CollectionSummaryModel(
            id='id_2',
            title='title',
            category='category',
            objective='objective',
            language_code='language_code',
            community_owned=False,
            contributor_ids=[self.USER_2_ID, self.USER_2_ID],
            contributors_summary={self.USER_2_ID: 4},
        ).put()

        output = self._run_one_off_job()

        self.assertEqual(len(output), 2)
        self.assertIn(['SUCCESS', 2], output)
        self.assertIn([
            'DUPLICATE_IDS', [
                ('id_1', [self.USER_1_ID, self.USER_1_ID], {self.USER_1_ID: 4}),
                ('id_2', [self.USER_2_ID, self.USER_2_ID], {self.USER_2_ID: 4})
            ]], output)
    def test_missing_in_summary_models(self):
        exp_models.ExpSummaryModel(
            id='id_1',
            title='title',
            category='category',
            objective='objective',
            language_code='language_code',
            community_owned=False,
            contributor_ids=[self.USER_1_ID, self.USER_2_ID],
            contributors_summary={self.USER_1_ID: 4},
        ).put()
        collection_models.CollectionSummaryModel(
            id='id_2',
            title='title',
            category='category',
            objective='objective',
            language_code='language_code',
            community_owned=False,
            contributor_ids=[self.USER_1_ID, self.USER_2_ID],
            contributors_summary={self.USER_2_ID: 4},
        ).put()

        output = self._run_one_off_job()

        self.assertEqual(len(output), 2)
        self.assertIn(['SUCCESS', 2], output)
        self.assertIn([
            'MISSING_IN_SUMMARY', [
                ('id_1', [self.USER_1_ID, self.USER_2_ID], {self.USER_1_ID: 4}),
                ('id_2', [self.USER_1_ID, self.USER_2_ID], {self.USER_2_ID: 4})
            ]], output)
    def test_missing_in_ids_models(self):
        exp_models.ExpSummaryModel(
            id='id_1',
            title='title',
            category='category',
            objective='objective',
            language_code='language_code',
            community_owned=False,
            contributor_ids=[self.USER_1_ID],
            contributors_summary={self.USER_1_ID: 2, self.USER_2_ID: 4},
        ).put()
        collection_models.CollectionSummaryModel(
            id='id_2',
            title='title',
            category='category',
            objective='objective',
            language_code='language_code',
            community_owned=False,
            contributor_ids=[self.USER_2_ID],
            contributors_summary={self.USER_1_ID: 1, self.USER_2_ID: 3},
        ).put()

        output = self._run_one_off_job()

        self.assertEqual(len(output), 2)
        self.assertIn(['SUCCESS', 2], output)
        self.assertIn([
            'MISSING_IN_IDS', [
                (
                    'id_1',
                    [self.USER_1_ID],
                    {self.USER_1_ID: 2, self.USER_2_ID: 4}
                ),
                (
                    'id_2',
                    [self.USER_2_ID],
                    {self.USER_1_ID: 1, self.USER_2_ID: 3}
                )
            ]], output)
    def test_combined_models(self):
        exp_models.ExpSummaryModel(
            id='id_1',
            title='title',
            category='category',
            objective='objective',
            language_code='language_code',
            community_owned=False,
            contributor_ids=[self.USER_1_ID, self.USER_1_ID, self.USER_2_ID],
            contributors_summary={self.USER_2_ID: 4},
        ).put()
        collection_models.CollectionSummaryModel(
            id='id_2',
            title='title',
            category='category',
            objective='objective',
            language_code='language_code',
            community_owned=False,
            contributor_ids=[self.USER_2_ID, self.USER_3_ID],
            contributors_summary={self.USER_1_ID: 4, self.USER_2_ID: 4},
        ).put()

        output = self._run_one_off_job()

        self.assertEqual(len(output), 4)
        self.assertIn(['SUCCESS', 2], output)
        self.assertIn([
            'DUPLICATE_IDS', [(
                'id_1',
                [self.USER_1_ID, self.USER_1_ID, self.USER_2_ID],
                {self.USER_2_ID: 4}
            )]], output)
        self.assertIn([
            'MISSING_IN_SUMMARY', [
                (
                    'id_1',
                    [self.USER_1_ID, self.USER_1_ID, self.USER_2_ID],
                    {self.USER_2_ID: 4}
                ),
                (
                    'id_2',
                    [self.USER_2_ID, self.USER_3_ID],
                    {self.USER_1_ID: 4, self.USER_2_ID: 4}
                )
            ]], output)
        self.assertIn([
            'MISSING_IN_IDS', [(
                'id_2',
                [self.USER_2_ID, self.USER_3_ID],
                {self.USER_1_ID: 4, self.USER_2_ID: 4}
            )]], output)
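Taken together, the expected outputs in these tests pin down the per-model checks the audit job performs: repeated entries in `contributor_ids` yield `DUPLICATE_IDS`, ids present in `contributor_ids` but absent from `contributors_summary` yield `MISSING_IN_SUMMARY`, and the reverse yields `MISSING_IN_IDS`. A self-contained distillation of that logic, inferred from the assertions above (the real job additionally emits an overall `['SUCCESS', n]` count over all models processed; `audit_contributors` is a hypothetical name):

```python
def audit_contributors(contributor_ids, contributors_summary):
    # Per-model issue labels inferred from the test expectations; a
    # clean model produces no labels (the real job counts it under
    # SUCCESS instead).
    issues = []
    if len(contributor_ids) != len(set(contributor_ids)):
        issues.append('DUPLICATE_IDS')
    if set(contributor_ids) - set(contributors_summary):
        issues.append('MISSING_IN_SUMMARY')
    if set(contributors_summary) - set(contributor_ids):
        issues.append('MISSING_IN_IDS')
    return issues
```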


class OneOffReindexActivitiesJobTests(test_utils.GenericTestBase):

    def setUp(self):
        super(OneOffReindexActivitiesJobTests, self).setUp()
        self.signup(self.OWNER_EMAIL, self.OWNER_USERNAME)
        self.owner_id = self.get_user_id_from_email(self.OWNER_EMAIL)
        self.owner = user_services.UserActionsInfo(self.owner_id)

        explorations = [exp_domain.Exploration.create_default_exploration(
            '%s' % i,
            title='title %d' % i,
            category='category%d' % i
        ) for i in python_utils.RANGE(3)]

        for exp in explorations:
            exp_services.save_new_exploration(self.owner_id, exp)
            rights_manager.publish_exploration(self.owner, exp.id)

        collections = [collection_domain.Collection.create_default_collection(
            '%s' % i,
            title='title %d' % i,
            category='category%d' % i
        ) for i in python_utils.RANGE(3, 6)]

        for collection in collections:
            collection_services.save_new_collection(self.owner_id, collection)
            rights_manager.publish_collection(self.owner, collection.id)

        self.process_and_flush_pending_tasks()
    def test_standard_operation(self):
        job_id = (
            activity_jobs_one_off.IndexAllActivitiesJobManager.create_new())
        activity_jobs_one_off.IndexAllActivitiesJobManager.enqueue(job_id)

        self.assertEqual(
            self.count_jobs_in_taskqueue(
                taskqueue_services.QUEUE_NAME_ONE_OFF_JOBS), 1)

        indexed_docs = []

        def mock_add_documents_to_index(docs, index):
            indexed_docs.extend(docs)
            self.assertIn(index, (
                search_services.SEARCH_INDEX_EXPLORATIONS,
                search_services.SEARCH_INDEX_COLLECTIONS))

        add_docs_swap = self.swap(
            gae_search_services, 'add_documents_to_index',
            mock_add_documents_to_index)

        with add_docs_swap:
            self.process_and_flush_pending_tasks()

        ids = [doc['id'] for doc in indexed_docs]
        titles = [doc['title'] for doc in indexed_docs]
        categories = [doc['category'] for doc in indexed_docs]

        for index in python_utils.RANGE(5):
            self.assertIn('%s' % index, ids)
            self.assertIn('title %d' % index, titles)
            self.assertIn('category%d' % index, categories)

        self.assertIsNone(
            activity_jobs_one_off.IndexAllActivitiesJobManager.reduce(
                'key', 'value'))
class MockCollectionCommitLogEntryModel(
collection_models.CollectionCommitLogEntryModel):
"""Mock CollectionCommitLogEntryModel so that it allows to set username."""
username = ndb.StringProperty(indexed=True, required=False)
class RemoveCommitUsernamesOneOffJobTests(test_utils.GenericTestBase):
USER_1_ID = 'user_1_id'
def _run_one_off_job(self):
"""Runs the one-off MapReduce job."""
job_id = (
activity_jobs_one_off.RemoveCommitUsernamesOneOffJob.create_new())
activity_jobs_one_off.RemoveCommitUsernamesOneOffJob.enqueue(job_id)
self.assertEqual(
self.count_jobs_in_taskqueue(
taskqueue_services.QUEUE_NAME_ONE_OFF_JOBS), 1)
self.process_and_flush_pending_tasks()
stringified_output = (
activity_jobs_one_off.RemoveCommitUsernamesOneOffJob
.get_output(job_id))
eval_output = [ast.literal_eval(stringified_item) for
stringified_item in stringified_output]
return eval_output
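The helper above parses each stringified MapReduce result with `ast.literal_eval`; a minimal stdlib-only sketch of that parsing step (the sample string mirrors the assertions in the tests below):

```python
import ast

# Job output arrives as stringified Python literals; literal_eval parses
# them safely (it accepts only literals, unlike eval).
stringified_output = ["['SUCCESS_REMOVED - MockCollectionCommitLogEntryModel', 1]"]
eval_output = [ast.literal_eval(item) for item in stringified_output]
```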
def test_one_commit_model_with_username(self):
with self.swap(
collection_models, 'CollectionCommitLogEntryModel',
MockCollectionCommitLogEntryModel
):
original_commit_model = (
collection_models.CollectionCommitLogEntryModel(
id='id',
user_id='committer_id',
username='username',
collection_id='col_id',
commit_type='create',
commit_message='Message',
commit_cmds=[],
version=1,
post_commit_status='public',
post_commit_community_owned=False,
post_commit_is_private=False
)
)
original_commit_model.put()
self.assertIsNotNone(original_commit_model.username)
self.assertIn('username', original_commit_model._values) # pylint: disable=protected-access
self.assertIn('username', original_commit_model._properties) # pylint: disable=protected-access
output = self._run_one_off_job()
self.assertItemsEqual(
[['SUCCESS_REMOVED - MockCollectionCommitLogEntryModel', 1]],
output)
migrated_commit_model = (
collection_models.CollectionCommitLogEntryModel.get_by_id('id'))
self.assertIsNone(migrated_commit_model.username)
self.assertNotIn('username', migrated_commit_model._values) # pylint: disable=protected-access
self.assertNotIn('username', migrated_commit_model._properties) # pylint: disable=protected-access
self.assertEqual(
original_commit_model.last_updated,
migrated_commit_model.last_updated)
def test_one_commit_model_without_username(self):
original_commit_model = (
collection_models.CollectionCommitLogEntryModel(
id='id',
user_id='committer_id',
collection_id='col_id',
commit_type='create',
commit_message='Message',
commit_cmds=[],
version=1,
post_commit_status='public',
post_commit_community_owned=False,
post_commit_is_private=False
)
)
original_commit_model.put()
self.assertNotIn('username', original_commit_model._values) # pylint: disable=protected-access
self.assertNotIn('username', original_commit_model._properties) # pylint: disable=protected-access
output = self._run_one_off_job()
self.assertItemsEqual(
[['SUCCESS_ALREADY_REMOVED - CollectionCommitLogEntryModel', 1]],
output)
migrated_commit_model = (
collection_models.CollectionCommitLogEntryModel.get_by_id('id'))
self.assertNotIn('username', migrated_commit_model._values) # pylint: disable=protected-access
self.assertNotIn('username', migrated_commit_model._properties) # pylint: disable=protected-access
self.assertEqual(
original_commit_model.last_updated,
migrated_commit_model.last_updated)
class FixCommitLastUpdatedOneOffJobTests(test_utils.GenericTestBase):
USER_1_ID = 'user_1_id'
def _run_one_off_job(self):
"""Runs the one-off MapReduce job."""
job_id = (
activity_jobs_one_off.FixCommitLastUpdatedOneOffJob.create_new())
activity_jobs_one_off.FixCommitLastUpdatedOneOffJob.enqueue(job_id)
self.assertEqual(
self.count_jobs_in_taskqueue(
taskqueue_services.QUEUE_NAME_ONE_OFF_JOBS), 1)
self.process_and_flush_pending_tasks()
stringified_output = (
activity_jobs_one_off.FixCommitLastUpdatedOneOffJob
.get_output(job_id))
eval_output = [ast.literal_eval(stringified_item) for
stringified_item in stringified_output]
return eval_output
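The fix-last-updated tests below seed commit models with fixed `created_on`/`last_updated` values built by `datetime.strptime`; a stdlib-only sketch of the timestamp format they rely on:

```python
import datetime

# Timestamp format used throughout these tests: naive UTC datetimes
# parsed from ISO-like strings with a literal trailing 'Z'.
FMT = '%Y-%m-%dT%H:%M:%SZ'

created_on = datetime.datetime.strptime('2020-06-18T22:00:00Z', FMT)
last_updated = datetime.datetime.strptime('2020-06-18T22:01:00Z', FMT)

# The job classifies models by comparing these datetimes directly.
delta = last_updated - created_on
```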
def test_fix_one_commit_when_last_updated_is_before_migration_time(self):
original_commit_model = (
collection_models.CollectionCommitLogEntryModel(
id='id',
user_id='committer_id',
collection_id='col_id',
commit_type='create',
commit_message='Message',
commit_cmds=[],
version=1,
post_commit_status='public',
post_commit_community_owned=False,
post_commit_is_private=False,
created_on=datetime.datetime.strptime(
'2020-06-18T22:00:00Z', '%Y-%m-%dT%H:%M:%SZ'),
last_updated=datetime.datetime.strptime(
'2020-06-18T22:01:00Z', '%Y-%m-%dT%H:%M:%SZ')
)
)
original_commit_model.put(update_last_updated_time=False)
output = self._run_one_off_job()
self.assertItemsEqual(
[['SUCCESS_NEWLY_CREATED - CollectionCommitLogEntryModel', 1]],
output)
migrated_commit_model = (
collection_models.CollectionCommitLogEntryModel.get_by_id('id'))
self.assertEqual(
original_commit_model.created_on,
migrated_commit_model.created_on)
self.assertEqual(
original_commit_model.last_updated,
migrated_commit_model.last_updated)
def test_fix_one_commit_when_last_updated_is_during_migration_time(self):
original_commit_model = (
collection_models.CollectionCommitLogEntryModel(
id='id',
user_id='committer_id',
collection_id='col_id',
commit_type='create',
commit_message='Message',
commit_cmds=[],
version=1,
post_commit_status='public',
post_commit_community_owned=False,
post_commit_is_private=False,
created_on=datetime.datetime.strptime(
'2019-06-29T01:00:00Z', '%Y-%m-%dT%H:%M:%SZ'),
last_updated=datetime.datetime.strptime(
'2020-06-29T11:00:00Z', '%Y-%m-%dT%H:%M:%SZ')
)
)
original_commit_model.put(update_last_updated_time=False)
output = self._run_one_off_job()
self.assertItemsEqual(
[['SUCCESS_FIXED - CollectionCommitLogEntryModel', 1]], output)
migrated_commit_model = (
collection_models.CollectionCommitLogEntryModel.get_by_id('id'))
self.assertEqual(
original_commit_model.created_on,
migrated_commit_model.created_on)
self.assertNotEqual(
original_commit_model.last_updated,
migrated_commit_model.last_updated)
self.assertEqual(
original_commit_model.created_on,
migrated_commit_model.last_updated)
def test_fix_one_commit_when_last_updated_is_during_test_migration_time(
self):
original_commit_model = (
collection_models.CollectionCommitLogEntryModel(
id='id',
user_id='committer_id',
collection_id='col_id',
commit_type='create',
commit_message='Message',
commit_cmds=[],
version=1,
post_commit_status='public',
post_commit_community_owned=False,
post_commit_is_private=False,
created_on=datetime.datetime.strptime(
'2019-06-09T01:00:00Z', '%Y-%m-%dT%H:%M:%SZ'),
last_updated=datetime.datetime.strptime(
'2020-06-13T11:00:00Z', '%Y-%m-%dT%H:%M:%SZ')
)
)
original_commit_model.put(update_last_updated_time=False)
output = self._run_one_off_job()
self.assertItemsEqual(
[['SUCCESS_TEST_SERVER_FIXED - CollectionCommitLogEntryModel', 1]],
output)
migrated_commit_model = (
collection_models.CollectionCommitLogEntryModel.get_by_id('id'))
self.assertEqual(
original_commit_model.created_on,
migrated_commit_model.created_on)
self.assertNotEqual(
original_commit_model.last_updated,
migrated_commit_model.last_updated)
self.assertEqual(
original_commit_model.created_on,
migrated_commit_model.last_updated)
def test_fix_one_commit_when_last_updated_is_after_migration_time(self):
original_commit_model = (
collection_models.CollectionCommitLogEntryModel(
id='id',
user_id='committer_id',
collection_id='col_id',
commit_type='create',
commit_message='Message',
commit_cmds=[],
version=1,
post_commit_status='public',
post_commit_community_owned=False,
post_commit_is_private=False,
created_on=datetime.datetime.strptime(
'2020-07-01T08:59:59Z', '%Y-%m-%dT%H:%M:%SZ'),
last_updated=datetime.datetime.strptime(
'2020-07-01T09:00:00Z', '%Y-%m-%dT%H:%M:%SZ')
)
)
original_commit_model.put(update_last_updated_time=False)
output = self._run_one_off_job()
self.assertItemsEqual(
[['SUCCESS_NEWLY_CREATED - CollectionCommitLogEntryModel', 1]],
output)
migrated_commit_model = (
collection_models.CollectionCommitLogEntryModel.get_by_id('id'))
self.assertEqual(
original_commit_model.created_on,
migrated_commit_model.created_on)
self.assertEqual(
original_commit_model.last_updated,
migrated_commit_model.last_updated)
def test_fix_multiple_commits_when_commits_are_created_by_admins(self):
original_commit_model_1 = (
collection_models.CollectionCommitLogEntryModel(
id='id1',
user_id=feconf.SYSTEM_COMMITTER_ID,
collection_id='col_id',
commit_type='create',
commit_message='Message',
commit_cmds=[],
version=1,
post_commit_status='public',
post_commit_community_owned=False,
post_commit_is_private=False,
created_on=datetime.datetime.strptime(
'2020-07-01T09:59:59Z', '%Y-%m-%dT%H:%M:%SZ'),
last_updated=datetime.datetime.strptime(
'2020-07-01T11:00:00Z', '%Y-%m-%dT%H:%M:%SZ')
)
)
original_commit_model_1.put(update_last_updated_time=False)
original_commit_model_2 = (
collection_models.CollectionCommitLogEntryModel(
id='id2',
user_id=feconf.MIGRATION_BOT_USER_ID,
collection_id='col_id',
commit_type='create',
commit_message='Message',
commit_cmds=[],
version=1,
post_commit_status='public',
post_commit_community_owned=False,
post_commit_is_private=False,
created_on=datetime.datetime.strptime(
'2020-07-01T09:59:59Z', '%Y-%m-%dT%H:%M:%SZ'),
last_updated=datetime.datetime.strptime(
'2020-07-01T11:00:00Z', '%Y-%m-%dT%H:%M:%SZ')
)
)
original_commit_model_2.put(update_last_updated_time=False)
output = self._run_one_off_job()
self.assertItemsEqual(
[['SUCCESS_ADMIN - CollectionCommitLogEntryModel', 2]], output)
migrated_commit_model_1 = (
collection_models.CollectionCommitLogEntryModel.get_by_id('id1'))
self.assertEqual(
original_commit_model_1.created_on,
migrated_commit_model_1.created_on)
self.assertEqual(
original_commit_model_1.last_updated,
migrated_commit_model_1.last_updated)
migrated_commit_model_2 = (
collection_models.CollectionCommitLogEntryModel.get_by_id('id2'))
self.assertEqual(
original_commit_model_2.created_on,
migrated_commit_model_2.created_on)
self.assertEqual(
original_commit_model_2.last_updated,
migrated_commit_model_2.last_updated)
def test_fix_multiple_commits_when_last_updated_is_wrong(self):
original_commit_model_1 = (
collection_models.CollectionCommitLogEntryModel(
id='id1',
user_id='committer_id',
collection_id='col_id',
commit_type='create',
commit_message='Message',
commit_cmds=[],
version=1,
post_commit_status='public',
post_commit_community_owned=False,
post_commit_is_private=False,
created_on=datetime.datetime.strptime(
'2020-07-01T09:59:59Z', '%Y-%m-%dT%H:%M:%SZ'),
last_updated=datetime.datetime.strptime(
'2020-07-01T09:00:00Z', '%Y-%m-%dT%H:%M:%SZ')
)
)
original_commit_model_1.put(update_last_updated_time=False)
original_commit_model_2 = (
collection_models.CollectionCommitLogEntryModel(
id='id2',
user_id='committer_id',
collection_id='col_id',
commit_type='create',
commit_message='Message',
commit_cmds=[],
version=1,
post_commit_status='public',
post_commit_community_owned=False,
post_commit_is_private=False,
created_on=datetime.datetime.strptime(
'2020-07-01T09:59:59Z', '%Y-%m-%dT%H:%M:%SZ'),
last_updated=datetime.datetime.strptime(
'2020-07-20T09:00:00Z', '%Y-%m-%dT%H:%M:%SZ')
)
)
original_commit_model_2.put(update_last_updated_time=False)
output = self._run_one_off_job()
self.assertItemsEqual(
[['FAILURE_INCORRECT - CollectionCommitLogEntryModel',
['id1', 'id2']]],
output)
migrated_commit_model_1 = (
collection_models.CollectionCommitLogEntryModel.get_by_id('id1'))
self.assertEqual(
original_commit_model_1.created_on,
migrated_commit_model_1.created_on)
self.assertEqual(
original_commit_model_1.last_updated,
migrated_commit_model_1.last_updated)
migrated_commit_model_2 = (
collection_models.CollectionCommitLogEntryModel.get_by_id('id2'))
self.assertEqual(
original_commit_model_2.created_on,
migrated_commit_model_2.created_on)
self.assertEqual(
original_commit_model_2.last_updated,
migrated_commit_model_2.last_updated)
| 39.407809 | 111 | 0.611163 | 3,904 | 36,334 | 5.314549 | 0.084529 | 0.034702 | 0.041209 | 0.016965 | 0.818922 | 0.760073 | 0.741083 | 0.731251 | 0.721323 | 0.712647 | 0 | 0.017687 | 0.296664 | 36,334 | 921 | 112 | 39.450597 | 0.794209 | 0.043816 | 0 | 0.724806 | 0 | 0 | 0.094299 | 0.017995 | 0 | 0 | 0 | 0 | 0.104651 | 1 | 0.033592 | false | 0 | 0.028424 | 0 | 0.093023 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# File: x-pack/metricbeat/tests/system/test_xpack_base.py
# From repo ContinuumLLC/beats (licenses: ECL-2.0, Apache-2.0)

import os
import xpack_metricbeat
import test_base
class Test(xpack_metricbeat.XPackTest, test_base.Test):
pass
# File: pyspedas/rbsp/__init__.py
# From repo pulupa/pyspedas (license: MIT)
from .load import load
def emfisis(trange=['2018-11-5', '2018-11-6'],
probe='a',
datatype='magnetometer',
level='l3',
cadence='4sec', # for EMFISIS mag data
coord='sm', # for EMFISIS mag
wavetype='waveform', # for EMFISIS waveform data
suffix='',
get_support_data=False,
varformat=None,
varnames=[],
downloadonly=False,
notplot=False,
no_update=False,
time_clip=False):
"""
This function loads data from the Electric and Magnetic Field Instrument Suite and Integrated Science (EMFISIS) instrument
For information on the EMFISIS data products, see:
https://emfisis.physics.uiowa.edu/data/level_descriptions
Parameters
----------
trange : list of str
time range of interest [starttime, endtime] with the format
['YYYY-MM-DD', 'YYYY-MM-DD'] or to specify more or less than a day
['YYYY-MM-DD/hh:mm:ss','YYYY-MM-DD/hh:mm:ss']
probe: str or list of str
Spacecraft probe name ('a' or 'b'); default: a
cadence: str
Data cadence (default: 4sec); other options: '1sec', 'hires'
coord: str
Data coordinate system (default: sm)
level: str
Data level; options: 'l1', 'l2', 'l3', 'l4'
datatype: str
Data type; valid options:
Level 1:
'magnetometer'
'hfr'
'housekeeping'
'sc-hk'
'spaceweather'
'wfr'
'wna'
Level 2:
'magnetometer'
'wfr'
'hfr'
'housekeeping'
Level 3:
'magnetometer'
Level 4:
'density'
'wna-survey'
wavetype: str
Type of level 2 waveform data; valid options:
For WFR data:
'waveform' (default)
'waveform-continuous-burst'
'spectral-matrix'
'spectral-matrix-diagonal'
'spectral-matrix-diagonal-merged'
For HFR data:
'waveform'
'spectra'
'spectra-burst'
'spectra-merged'
For descriptions of these data, see:
https://emfisis.physics.uiowa.edu/data/L2_products
suffix: str
The tplot variable names will be given this suffix. By default,
no suffix is added.
get_support_data: bool
Data with an attribute "VAR_TYPE" with a value of "support_data"
will be loaded into tplot. By default, only loads in data with a
"VAR_TYPE" attribute of "data".
varformat: str
The file variable formats to load into tplot. Wildcard character
"*" is accepted. By default, all variables are loaded in.
varnames: list of str
List of variable names to load (if not specified,
all data variables are loaded)
downloadonly: bool
Set this flag to download the CDF files, but not load them into
tplot variables
notplot: bool
Return the data in hash tables instead of creating tplot variables
no_update: bool
If set, only load data from your local cache
time_clip: bool
Time clip the variables to exactly the range specified in the trange keyword
Returns
----------
List of tplot variables created.
"""
return load(instrument='emfisis', wavetype=wavetype, trange=trange, probe=probe, datatype=datatype, level=level, cadence=cadence, coord=coord, suffix=suffix, get_support_data=get_support_data, varformat=varformat, varnames=varnames, downloadonly=downloadonly, notplot=notplot, time_clip=time_clip, no_update=no_update)
def rbspice(trange=['2018-11-5', '2018-11-6'],
probe='a',
datatype='tofxeh',
level='l3',
suffix='',
get_support_data=False,
varformat=None,
varnames=[],
downloadonly=False,
notplot=False,
no_update=False,
time_clip=False):
"""
This function loads data from the Radiation Belt Storm Probes Ion Composition Experiment (RBSPICE) instrument
Parameters
----------
trange : list of str
time range of interest [starttime, endtime] with the format
['YYYY-MM-DD', 'YYYY-MM-DD'] or to specify more or less than a day
['YYYY-MM-DD/hh:mm:ss','YYYY-MM-DD/hh:mm:ss']
probe: str or list of str
Spacecraft probe name ('a' or 'b'); default: a
datatype: str
Data type (default: tofxeh); Valid options:
suffix: str
The tplot variable names will be given this suffix. By default,
no suffix is added.
get_support_data: bool
Data with an attribute "VAR_TYPE" with a value of "support_data"
will be loaded into tplot. By default, only loads in data with a
"VAR_TYPE" attribute of "data".
varformat: str
The file variable formats to load into tplot. Wildcard character
"*" is accepted. By default, all variables are loaded in.
varnames: list of str
List of variable names to load (if not specified,
all data variables are loaded)
downloadonly: bool
Set this flag to download the CDF files, but not load them into
tplot variables
notplot: bool
Return the data in hash tables instead of creating tplot variables
no_update: bool
If set, only load data from your local cache
time_clip: bool
Time clip the variables to exactly the range specified in the trange keyword
Returns
----------
List of tplot variables created.
"""
return load(instrument='rbspice', trange=trange, probe=probe, datatype=datatype, level=level, suffix=suffix, get_support_data=get_support_data, varformat=varformat, varnames=varnames, downloadonly=downloadonly, notplot=notplot, time_clip=time_clip, no_update=no_update)
def efw(trange=['2015-11-5', '2015-11-6'],
probe='a',
datatype='spec',
level='l3',
suffix='',
get_support_data=False,
varformat=None,
varnames=[],
downloadonly=False,
notplot=False,
no_update=False,
time_clip=False):
"""
This function loads data from the Electric Field and Waves Suite (EFW)
Parameters
----------
trange : list of str
time range of interest [starttime, endtime] with the format
['YYYY-MM-DD', 'YYYY-MM-DD'] or to specify more or less than a day
['YYYY-MM-DD/hh:mm:ss','YYYY-MM-DD/hh:mm:ss']
probe: str or list of str
Spacecraft probe name ('a' or 'b'); default: a
datatype: str
Data type; Valid options:
suffix: str
The tplot variable names will be given this suffix. By default,
no suffix is added.
get_support_data: bool
Data with an attribute "VAR_TYPE" with a value of "support_data"
will be loaded into tplot. By default, only loads in data with a
"VAR_TYPE" attribute of "data".
varformat: str
The file variable formats to load into tplot. Wildcard character
"*" is accepted. By default, all variables are loaded in.
varnames: list of str
List of variable names to load (if not specified,
all data variables are loaded)
downloadonly: bool
Set this flag to download the CDF files, but not load them into
tplot variables
notplot: bool
Return the data in hash tables instead of creating tplot variables
no_update: bool
If set, only load data from your local cache
time_clip: bool
Time clip the variables to exactly the range specified in the trange keyword
Returns
----------
List of tplot variables created.
"""
return load(instrument='efw', trange=trange, probe=probe, datatype=datatype, level=level, suffix=suffix, get_support_data=get_support_data, varformat=varformat, varnames=varnames, downloadonly=downloadonly, notplot=notplot, time_clip=time_clip, no_update=no_update)
def mageis(trange=['2015-11-5', '2015-11-6'],
probe='a',
datatype='',
level='l3',
rel='rel04',
suffix='',
get_support_data=False,
varformat=None,
varnames=[],
downloadonly=False,
notplot=False,
no_update=False,
time_clip=False):
"""
This function loads data from the Energetic Particle, Composition, and Thermal Plasma Suite (ECT)
Parameters
----------
trange : list of str
time range of interest [starttime, endtime] with the format
['YYYY-MM-DD', 'YYYY-MM-DD'] or to specify more or less than a day
['YYYY-MM-DD/hh:mm:ss','YYYY-MM-DD/hh:mm:ss']
probe: str or list of str
Spacecraft probe name ('a' or 'b'); default: a
datatype: str
Data type; Valid options:
suffix: str
The tplot variable names will be given this suffix. By default,
no suffix is added.
get_support_data: bool
Data with an attribute "VAR_TYPE" with a value of "support_data"
will be loaded into tplot. By default, only loads in data with a
"VAR_TYPE" attribute of "data".
varformat: str
The file variable formats to load into tplot. Wildcard character
"*" is accepted. By default, all variables are loaded in.
varnames: list of str
List of variable names to load (if not specified,
all data variables are loaded)
downloadonly: bool
Set this flag to download the CDF files, but not load them into
tplot variables
notplot: bool
Return the data in hash tables instead of creating tplot variables
no_update: bool
If set, only load data from your local cache
time_clip: bool
Time clip the variables to exactly the range specified in the trange keyword
Returns
----------
List of tplot variables created.
"""
return load(instrument='mageis', rel=rel, trange=trange, probe=probe, datatype=datatype, level=level, suffix=suffix, get_support_data=get_support_data, varformat=varformat, varnames=varnames, downloadonly=downloadonly, notplot=notplot, time_clip=time_clip, no_update=no_update)
def hope(trange=['2015-11-5', '2015-11-6'],
probe='a',
datatype='moments',
level='l3',
rel='rel04',
suffix='',
get_support_data=False,
varformat=None,
varnames=[],
downloadonly=False,
notplot=False,
no_update=False,
time_clip=False):
"""
This function loads data from the Energetic Particle, Composition, and Thermal Plasma Suite (ECT)
Parameters
----------
trange : list of str
time range of interest [starttime, endtime] with the format
['YYYY-MM-DD', 'YYYY-MM-DD'] or to specify more or less than a day
['YYYY-MM-DD/hh:mm:ss','YYYY-MM-DD/hh:mm:ss']
probe: str or list of str
Spacecraft probe name ('a' or 'b'); default: a
datatype: str
Data type; Valid options:
suffix: str
The tplot variable names will be given this suffix. By default,
no suffix is added.
get_support_data: bool
Data with an attribute "VAR_TYPE" with a value of "support_data"
will be loaded into tplot. By default, only loads in data with a
"VAR_TYPE" attribute of "data".
varformat: str
The file variable formats to load into tplot. Wildcard character
"*" is accepted. By default, all variables are loaded in.
varnames: list of str
List of variable names to load (if not specified,
all data variables are loaded)
downloadonly: bool
Set this flag to download the CDF files, but not load them into
tplot variables
notplot: bool
Return the data in hash tables instead of creating tplot variables
no_update: bool
If set, only load data from your local cache
time_clip: bool
Time clip the variables to exactly the range specified in the trange keyword
Returns
----------
List of tplot variables created.
"""
return load(instrument='hope', rel=rel, trange=trange, probe=probe, datatype=datatype, level=level, suffix=suffix, get_support_data=get_support_data, varformat=varformat, varnames=varnames, downloadonly=downloadonly, notplot=notplot, time_clip=time_clip, no_update=no_update)
def rept(trange=['2015-11-5', '2015-11-6'],
probe='a',
datatype='',
level='l3',
rel='rel03',
suffix='',
get_support_data=False,
varformat=None,
varnames=[],
downloadonly=False,
notplot=False,
no_update=False,
time_clip=False):
"""
This function loads data from the Energetic Particle, Composition, and Thermal Plasma Suite (ECT)
Parameters
----------
trange : list of str
time range of interest [starttime, endtime] with the format
['YYYY-MM-DD', 'YYYY-MM-DD'] or to specify more or less than a day
['YYYY-MM-DD/hh:mm:ss','YYYY-MM-DD/hh:mm:ss']
probe: str or list of str
Spacecraft probe name ('a' or 'b'); default: a
datatype: str
Data type; Valid options:
suffix: str
The tplot variable names will be given this suffix. By default,
no suffix is added.
get_support_data: bool
Data with an attribute "VAR_TYPE" with a value of "support_data"
will be loaded into tplot. By default, only loads in data with a
"VAR_TYPE" attribute of "data".
varformat: str
The file variable formats to load into tplot. Wildcard character
"*" is accepted. By default, all variables are loaded in.
varnames: list of str
List of variable names to load (if not specified,
all data variables are loaded)
downloadonly: bool
Set this flag to download the CDF files, but not load them into
tplot variables
notplot: bool
Return the data in hash tables instead of creating tplot variables
no_update: bool
If set, only load data from your local cache
time_clip: bool
Time clip the variables to exactly the range specified in the trange keyword
Returns
----------
List of tplot variables created.
"""
return load(instrument='rept', rel=rel, trange=trange, probe=probe, datatype=datatype, level=level, suffix=suffix, get_support_data=get_support_data, varformat=varformat, varnames=varnames, downloadonly=downloadonly, notplot=notplot, time_clip=time_clip, no_update=no_update)
def rps(trange=['2015-11-5', '2015-11-6'],
probe='a',
datatype='rps-1min',
level='l2',
suffix='',
get_support_data=True,
varformat=None,
varnames=[],
downloadonly=False,
notplot=False,
no_update=False,
time_clip=False):
"""
This function loads data from the Relativistic Proton Spectrometer (RPS)
Parameters
----------
trange : list of str
time range of interest [starttime, endtime] with the format
['YYYY-MM-DD', 'YYYY-MM-DD'] or to specify more or less than a day
['YYYY-MM-DD/hh:mm:ss','YYYY-MM-DD/hh:mm:ss']
probe: str or list of str
Spacecraft probe name ('a' or 'b'); default: a
datatype: str
Data type; Valid options:
suffix: str
The tplot variable names will be given this suffix. By default,
no suffix is added.
get_support_data: bool
Data with an attribute "VAR_TYPE" with a value of "support_data"
will be loaded into tplot. By default, only loads in data with a
"VAR_TYPE" attribute of "data".
varformat: str
The file variable formats to load into tplot. Wildcard character
"*" is accepted. By default, all variables are loaded in.
varnames: list of str
List of variable names to load (if not specified,
all data variables are loaded)
downloadonly: bool
Set this flag to download the CDF files, but not load them into
tplot variables
notplot: bool
Return the data in hash tables instead of creating tplot variables
no_update: bool
If set, only load data from your local cache
time_clip: bool
Time clip the variables to exactly the range specified in the trange keyword
Returns
----------
List of tplot variables created.
"""
return load(instrument='rps', trange=trange, probe=probe, datatype=datatype, level=level, suffix=suffix, get_support_data=get_support_data, varformat=varformat, varnames=varnames, downloadonly=downloadonly, notplot=notplot, time_clip=time_clip, no_update=no_update)
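Every instrument function above is a thin wrapper that forwards its keyword arguments to the shared `load()` helper. A self-contained sketch of that pattern, using a hypothetical `fake_load` stand-in (the real loader downloads and parses CDF files):

```python
def fake_load(instrument=None, **kwargs):
    # Hypothetical stand-in for the module's load(): records the
    # requested options instead of fetching data.
    return dict(instrument=instrument, **kwargs)

def efw_sketch(trange=('2015-11-5', '2015-11-6'), probe='a',
               datatype='spec', level='l3'):
    # Each wrapper pins its instrument name and forwards the rest.
    return fake_load(instrument='efw', trange=trange, probe=probe,
                     datatype=datatype, level=level)

result = efw_sketch(probe='b')
```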
# File: tests/test_scheduler/test_tag_tagged.py
# From repo zhongtianxie/fm-orchestrator (license: MIT)

# -*- coding: utf-8 -*-
# SPDX-License-Identifier: MIT
from __future__ import absolute_import
import koji
import mock
from mock import patch
import pytest
import module_build_service.common.models
from module_build_service.scheduler.db_session import db_session
import module_build_service.scheduler.handlers.repos
import module_build_service.scheduler.handlers.tags
@pytest.mark.usefixtures("reuse_component_init_data")
class TestTagTagged:
@mock.patch("module_build_service.common.models.ModuleBuild.get_by_tag")
def test_no_matching_module(self, get_by_tag):
""" Test that when a tag msg hits us and we have no match,
that we do nothing gracefully.
"""
get_by_tag.return_value = None
module_build_service.scheduler.handlers.tags.tagged(
msg_id="no matches for this...",
tag_name="2016-some-nonexistent-build",
build_nvr="artifact-1.2-1")
def test_no_matching_artifact(self):
""" Test that when a tag msg hits us and we have no match,
that we do nothing gracefully.
"""
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c-build",
build_nvr="artifact-1.2-1",
)
@patch(
"module_build_service.builder.GenericBuilder.default_buildroot_groups",
return_value={"build": [], "srpm-build": []},
)
@patch("module_build_service.builder.KojiModuleBuilder.get_session")
@patch("module_build_service.builder.GenericBuilder.create_from_module")
def test_newrepo(self, create_builder, koji_get_session, dbg):
"""
Test that newRepo is called in the expected times.
"""
koji_session = mock.MagicMock()
koji_session.getTag = lambda tag_name: {"name": tag_name}
koji_session.getTaskInfo.return_value = {"state": koji.TASK_STATES["CLOSED"]}
koji_session.newRepo.return_value = 123456
koji_get_session.return_value = koji_session
builder = mock.MagicMock()
builder.koji_session = koji_session
builder.buildroot_ready.return_value = False
builder.module_build_tag = {
"name": "module-testmodule-master-20170219191323-c40c156c-build"
}
create_builder.return_value = builder
module_build = module_build_service.common.models.ModuleBuild.get_by_id(db_session, 3)
# Set previous components as COMPLETE and tagged.
module_build.batch = 1
for c in module_build.up_to_current_batch():
c.state = koji.BUILD_STATES["COMPLETE"]
c.tagged = True
c.tagged_in_final = True
module_build.batch = 2
for c in module_build.current_batch():
if c.package == "perl-Tangerine":
c.nvr = "perl-Tangerine-0.23-1.module+0+d027b723"
elif c.package == "perl-List-Compare":
c.nvr = "perl-List-Compare-0.53-5.module+0+d027b723"
c.state = koji.BUILD_STATES["COMPLETE"]
db_session.commit()
# Tag the first component to the buildroot.
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c-build",
build_nvr="perl-Tangerine-0.23-1.module+0+d027b723",
)
# Tag the first component to the final tag.
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c",
build_nvr="perl-Tangerine-0.23-1.module+0+d027b723",
)
# newRepo should not be called, because there are still components
# to tag.
assert not koji_session.newRepo.called
# Tag the second component to the buildroot.
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c-build",
build_nvr="perl-List-Compare-0.53-5.module+0+d027b723",
)
# newRepo should not be called, because the second component has not been
# tagged to the final tag yet.
assert not koji_session.newRepo.called
# Tag the second component to the final tag.
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c",
build_nvr="perl-List-Compare-0.53-5.module+0+d027b723",
)
# newRepo should be called now - all components have been tagged.
koji_session.newRepo.assert_called_once_with(
"module-testmodule-master-20170219191323-c40c156c-build")
# Refresh our module_build object.
db_session.refresh(module_build)
# The newRepo task_id should be stored in the database, so we can check
# its status later in the poller.
assert module_build.new_repo_task_id == 123456
@patch(
"module_build_service.builder.GenericBuilder.default_buildroot_groups",
return_value={"build": [], "srpm-build": []},
)
@patch("module_build_service.builder.KojiModuleBuilder.get_session")
@patch("module_build_service.builder.GenericBuilder.create_from_module")
def test_newrepo_still_building_components(
self, create_builder, koji_get_session, dbg
):
"""
Test that newRepo is not called while a component is still building.
"""
koji_session = mock.MagicMock()
koji_session.getTag = lambda tag_name: {"name": tag_name}
koji_session.getTaskInfo.return_value = {"state": koji.TASK_STATES["CLOSED"]}
koji_session.newRepo.return_value = 123456
koji_get_session.return_value = koji_session
builder = mock.MagicMock()
builder.koji_session = koji_session
builder.buildroot_ready.return_value = False
builder.module_build_tag = {
"name": "module-testmodule-master-20170219191323-c40c156c-build"
}
create_builder.return_value = builder
module_build = module_build_service.common.models.ModuleBuild.get_by_id(db_session, 3)
module_build.batch = 2
component = db_session.query(module_build_service.common.models.ComponentBuild).filter_by(
package="perl-Tangerine", module_id=module_build.id).one()
component.state = koji.BUILD_STATES["BUILDING"]
component.nvr = "perl-Tangerine-0.23-1.module+0+d027b723"
db_session.commit()
# Tag the perl-Tangerine component to the buildroot.
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c-build",
build_nvr="perl-Tangerine-0.23-1.module+0+d027b723",
)
# Tag the perl-Tangerine component to the final tag.
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c",
build_nvr="perl-Tangerine-0.23-1.module+0+d027b723",
)
# newRepo should not be called, because perl-Tangerine has not finished
# building yet.
assert not koji_session.newRepo.called
@patch(
"module_build_service.builder.GenericBuilder.default_buildroot_groups",
return_value={"build": [], "srpm-build": []},
)
@patch("module_build_service.builder.KojiModuleBuilder.get_session")
@patch("module_build_service.builder.GenericBuilder.create_from_module")
def test_newrepo_failed_components(self, create_builder, koji_get_session, dbg):
"""
Test that newRepo is called once all successfully built components are
tagged, even if some components failed.
"""
koji_session = mock.MagicMock()
koji_session.getTag = lambda tag_name: {"name": tag_name}
koji_session.getTaskInfo.return_value = {"state": koji.TASK_STATES["CLOSED"]}
koji_session.newRepo.return_value = 123456
koji_get_session.return_value = koji_session
builder = mock.MagicMock()
builder.koji_session = koji_session
builder.buildroot_ready.return_value = False
builder.module_build_tag = {
"name": "module-testmodule-master-20170219191323-c40c156c-build"
}
create_builder.return_value = builder
module_build = module_build_service.common.models.ModuleBuild.get_by_id(db_session, 3)
# Set previous components as COMPLETE and tagged.
module_build.batch = 1
for c in module_build.up_to_current_batch():
c.state = koji.BUILD_STATES["COMPLETE"]
c.tagged = True
c.tagged_in_final = True
module_build.batch = 2
component = db_session.query(module_build_service.common.models.ComponentBuild).filter_by(
package="perl-Tangerine", module_id=module_build.id).one()
component.state = koji.BUILD_STATES["FAILED"]
component.nvr = "perl-Tangerine-0.23-1.module+0+d027b723"
component = db_session.query(module_build_service.common.models.ComponentBuild).filter_by(
package="perl-List-Compare", module_id=module_build.id).one()
component.state = koji.BUILD_STATES["COMPLETE"]
component.nvr = "perl-List-Compare-0.53-5.module+0+d027b723"
db_session.commit()
# Tag the perl-List-Compare component to the buildroot.
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c-build",
build_nvr="perl-List-Compare-0.53-5.module+0+d027b723",
)
# Tag the perl-List-Compare component to the final tag.
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c",
build_nvr="perl-List-Compare-0.53-5.module+0+d027b723",
)
# newRepo should be called now - all successfully built
# components have been tagged.
koji_session.newRepo.assert_called_once_with(
"module-testmodule-master-20170219191323-c40c156c-build")
# Refresh our module_build object.
db_session.refresh(module_build)
# The newRepo task_id should be stored in the database, so we can check
# its status later in the poller.
assert module_build.new_repo_task_id == 123456
@patch(
"module_build_service.builder.GenericBuilder.default_buildroot_groups",
return_value={"build": [], "srpm-build": []},
)
@patch("module_build_service.builder.KojiModuleBuilder.get_session")
@patch("module_build_service.builder.GenericBuilder.create_from_module")
def test_newrepo_multiple_batches_tagged(
self, create_builder, koji_get_session, dbg
):
"""
Test that newRepo is called just once and only when all components
are tagged, even if we tag components from multiple batches at the
same time.
"""
koji_session = mock.MagicMock()
koji_session.getTag = lambda tag_name: {"name": tag_name}
koji_session.getTaskInfo.return_value = {"state": koji.TASK_STATES["CLOSED"]}
koji_session.newRepo.return_value = 123456
koji_get_session.return_value = koji_session
builder = mock.MagicMock()
builder.koji_session = koji_session
builder.buildroot_ready.return_value = False
builder.module_build_tag = {
"name": "module-testmodule-master-20170219191323-c40c156c-build"
}
create_builder.return_value = builder
module_build = module_build_service.common.models.ModuleBuild.get_by_id(db_session, 3)
module_build.batch = 2
mbm = module_build_service.common.models.ComponentBuild.from_component_name(
db_session, "module-build-macros", 3)
mbm.tagged = False
for c in module_build.current_batch():
if c.package == "perl-Tangerine":
c.nvr = "perl-Tangerine-0.23-1.module+0+d027b723"
elif c.package == "perl-List-Compare":
c.nvr = "perl-List-Compare-0.53-5.module+0+d027b723"
c.state = koji.BUILD_STATES["COMPLETE"]
db_session.commit()
# Tag the first component to the buildroot.
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c-build",
build_nvr="perl-Tangerine-0.23-1.module+0+d027b723",
)
# Tag the first component to the final tag.
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c",
build_nvr="perl-Tangerine-0.23-1.module+0+d027b723",
)
# newRepo should not be called, because there are still components
# to tag.
assert not koji_session.newRepo.called
# Tag the second component to the buildroot.
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c-build",
build_nvr="perl-List-Compare-0.53-5.module+0+d027b723",
)
# Tag the second component to the final tag.
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c",
build_nvr="perl-List-Compare-0.53-5.module+0+d027b723",
)
# newRepo should not be called, because there are still components
# to tag.
assert not koji_session.newRepo.called
# Tag the component from the first batch to the final tag.
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c",
build_nvr="module-build-macros-0.1-1.module+0+b0a1d1f7",
)
# Tag the component from the first batch to the buildroot.
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c-build",
build_nvr="module-build-macros-0.1-1.module+0+b0a1d1f7",
)
# newRepo should be called now - all components have been tagged.
koji_session.newRepo.assert_called_once_with(
"module-testmodule-master-20170219191323-c40c156c-build")
# Refresh our module_build object.
db_session.refresh(module_build)
# The newRepo task_id should be stored in the database, so we can check
# its status later in the poller.
assert module_build.new_repo_task_id == 123456
@patch(
"module_build_service.builder.GenericBuilder.default_buildroot_groups",
return_value={"build": [], "srpm-build": []},
)
@patch("module_build_service.builder.KojiModuleBuilder.get_session")
@patch("module_build_service.builder.GenericBuilder.create_from_module")
def test_newrepo_build_time_only(self, create_builder, koji_get_session, dbg):
"""
Test that component.build_time_only is respected by the tag handler.
"""
koji_session = mock.MagicMock()
koji_session.getTag = lambda tag_name: {"name": tag_name}
koji_session.getTaskInfo.return_value = {"state": koji.TASK_STATES["CLOSED"]}
koji_session.newRepo.return_value = 123456
koji_get_session.return_value = koji_session
builder = mock.MagicMock()
builder.koji_session = koji_session
builder.buildroot_ready.return_value = False
builder.module_build_tag = {
"name": "module-testmodule-master-20170219191323-c40c156c-build"
}
create_builder.return_value = builder
module_build = module_build_service.common.models.ModuleBuild.get_by_id(db_session, 3)
# Set previous components as COMPLETE and tagged.
module_build.batch = 1
for c in module_build.up_to_current_batch():
if c.package == "module-build-macros":
c.nvr = "module-build-macros-0.1-1.module+0+b0a1d1f7"
c.state = koji.BUILD_STATES["COMPLETE"]
c.tagged = True
c.tagged_in_final = True
module_build.batch = 2
component = db_session.query(module_build_service.common.models.ComponentBuild).filter_by(
package="perl-Tangerine", module_id=module_build.id).one()
component.state = koji.BUILD_STATES["COMPLETE"]
component.build_time_only = True
component.tagged = False
component.tagged_in_final = False
component.nvr = "perl-Tangerine-0.23-1.module+0+d027b723"
component = db_session.query(module_build_service.common.models.ComponentBuild).filter_by(
package="perl-List-Compare", module_id=module_build.id).one()
component.state = koji.BUILD_STATES["COMPLETE"]
component.nvr = "perl-List-Compare-0.53-5.module+0+d027b723"
db_session.commit()
# Tag the perl-Tangerine component to the buildroot.
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c-build",
build_nvr="perl-Tangerine-0.23-1.module+0+d027b723",
)
assert not koji_session.newRepo.called
# Tag the perl-List-Compare component to the buildroot.
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c-build",
build_nvr="perl-List-Compare-0.53-5.module+0+d027b723",
)
# Tag the perl-List-Compare component to the final tag.
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c",
build_nvr="perl-List-Compare-0.53-5.module+0+d027b723",
)
# newRepo should be called now - all successfully built
# components have been tagged.
koji_session.newRepo.assert_called_once_with(
"module-testmodule-master-20170219191323-c40c156c-build")
# Refresh our module_build object.
db_session.refresh(module_build)
# The newRepo task_id should be stored in the database, so we can check
# its status later in the poller.
assert module_build.new_repo_task_id == 123456
@pytest.mark.parametrize('task_state, expect_new_repo', (
(None, True), # Indicates a newRepo task has not been triggered yet.
(koji.TASK_STATES["CLOSED"], True),
(koji.TASK_STATES["CANCELED"], True),
(koji.TASK_STATES["FAILED"], True),
(koji.TASK_STATES["FREE"], False),
(koji.TASK_STATES["OPEN"], False),
(koji.TASK_STATES["ASSIGNED"], False),
))
@patch(
"module_build_service.builder.GenericBuilder.default_buildroot_groups",
return_value={"build": [], "srpm-build": []},
)
@patch("module_build_service.builder.KojiModuleBuilder.get_session")
@patch("module_build_service.builder.GenericBuilder.create_from_module")
def test_newrepo_not_duplicated(
self, create_builder, koji_get_session, dbg, task_state, expect_new_repo
):
"""
Test that newRepo is not called if a task is already in progress.
"""
koji_session = mock.MagicMock()
koji_session.getTag = lambda tag_name: {"name": tag_name}
koji_session.getTaskInfo.return_value = {"state": task_state}
koji_session.newRepo.return_value = 123456
koji_get_session.return_value = koji_session
builder = mock.MagicMock()
builder.koji_session = koji_session
builder.buildroot_ready.return_value = False
builder.module_build_tag = {
"name": "module-testmodule-master-20170219191323-c40c156c-build"
}
create_builder.return_value = builder
module_build = module_build_service.common.models.ModuleBuild.get_by_id(db_session, 3)
assert module_build
# Set previous components as COMPLETE and tagged.
module_build.batch = 1
for c in module_build.up_to_current_batch():
c.state = koji.BUILD_STATES["COMPLETE"]
c.tagged = True
c.tagged_in_final = True
module_build.batch = 2
for c in module_build.current_batch():
if c.package == "perl-Tangerine":
c.nvr = "perl-Tangerine-0.23-1.module+0+d027b723"
elif c.package == "perl-List-Compare":
c.nvr = "perl-List-Compare-0.53-5.module+0+d027b723"
c.state = koji.BUILD_STATES["COMPLETE"]
if task_state is not None:
module_build.new_repo_task_id = 123456
db_session.commit()
# Tag the first component to the buildroot.
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c-build",
build_nvr="perl-Tangerine-0.23-1.module+0+d027b723",
)
# Tag the first component to the final tag.
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c",
build_nvr="perl-Tangerine-0.23-1.module+0+d027b723",
)
# Tag the second component to the buildroot.
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c-build",
build_nvr="perl-List-Compare-0.53-5.module+0+d027b723",
)
# Tag the second component to the final tag.
module_build_service.scheduler.handlers.tags.tagged(
msg_id="id",
tag_name="module-testmodule-master-20170219191323-c40c156c",
build_nvr="perl-List-Compare-0.53-5.module+0+d027b723",
)
# All components are tagged, newRepo should be called if there are no active tasks.
if expect_new_repo:
koji_session.newRepo.assert_called_once_with(
"module-testmodule-master-20170219191323-c40c156c-build")
else:
assert not koji_session.newRepo.called
# Refresh our module_build object.
db_session.refresh(module_build)
# The newRepo task_id should be stored in the database, so we can check
# its status later in the poller.
assert module_build.new_repo_task_id == 123456
koji_session.newRepo.reset_mock()
# File: src/ml4logs/models/baselines/__init__.py (repo: LogAnalysisTeam/ml4logs, MIT)
from ml4logs.models.baselines.core import SequenceDataset, SeqModel
from ml4logs.models.baselines.seq2label import train_test_seq2label
from ml4logs.models.baselines.seq2seq import train_test_seq2seq
# File: nn_models.py (repo: mdong151/improving-inference-for-neural-image-compression, MIT)
import tensorflow.compat.v1 as tf
import tensorflow_compression as tfc
class AnalysisTransform(tf.keras.layers.Layer):
"""The analysis transform."""
def __init__(self, num_filters, *args, **kwargs):
self.num_filters = num_filters
super(AnalysisTransform, self).__init__(*args, **kwargs)
def build(self, input_shape):
self._layers = [
tfc.SignalConv2D(
self.num_filters, (5, 5), name="layer_0", corr=True, strides_down=2,
padding="same_zeros", use_bias=True,
activation=tfc.GDN(name="gdn_0")),
tfc.SignalConv2D(
self.num_filters, (5, 5), name="layer_1", corr=True, strides_down=2,
padding="same_zeros", use_bias=True,
activation=tfc.GDN(name="gdn_1")),
tfc.SignalConv2D(
self.num_filters, (5, 5), name="layer_2", corr=True, strides_down=2,
padding="same_zeros", use_bias=True,
activation=tfc.GDN(name="gdn_2")),
tfc.SignalConv2D(
self.num_filters, (5, 5), name="layer_3", corr=True, strides_down=2,
padding="same_zeros", use_bias=True,
activation=None),
]
super(AnalysisTransform, self).build(input_shape)
def call(self, tensor):
for layer in self._layers:
tensor = layer(tensor)
return tensor
class SynthesisTransform(tf.keras.layers.Layer):
"""The synthesis transform."""
def __init__(self, num_filters, *args, **kwargs):
self.num_filters = num_filters
super(SynthesisTransform, self).__init__(*args, **kwargs)
def build(self, input_shape):
self._layers = [
tfc.SignalConv2D(
self.num_filters, (5, 5), name="layer_0", corr=False, strides_up=2,
padding="same_zeros", use_bias=True,
activation=tfc.GDN(name="igdn_0", inverse=True)),
tfc.SignalConv2D(
self.num_filters, (5, 5), name="layer_1", corr=False, strides_up=2,
padding="same_zeros", use_bias=True,
activation=tfc.GDN(name="igdn_1", inverse=True)),
tfc.SignalConv2D(
self.num_filters, (5, 5), name="layer_2", corr=False, strides_up=2,
padding="same_zeros", use_bias=True,
activation=tfc.GDN(name="igdn_2", inverse=True)),
tfc.SignalConv2D(
3, (5, 5), name="layer_3", corr=False, strides_up=2,
padding="same_zeros", use_bias=True,
activation=None),
]
super(SynthesisTransform, self).build(input_shape)
def call(self, tensor):
for layer in self._layers:
tensor = layer(tensor)
return tensor
class HyperAnalysisTransform(tf.keras.layers.Layer):
"""The analysis transform for the entropy model parameters."""
def __init__(self, num_filters, num_output_filters=None, *args, **kwargs):
self.num_filters = num_filters
if num_output_filters is None: # default to the same
num_output_filters = num_filters
self.num_output_filters = num_output_filters
super(HyperAnalysisTransform, self).__init__(*args, **kwargs)
def build(self, input_shape):
self._layers = [
tfc.SignalConv2D(
self.num_filters, (3, 3), name="layer_0", corr=True, strides_down=1,
padding="same_zeros", use_bias=True,
activation=tf.nn.relu),
tfc.SignalConv2D(
self.num_filters, (5, 5), name="layer_1", corr=True, strides_down=2,
padding="same_zeros", use_bias=True,
activation=tf.nn.relu),
tfc.SignalConv2D(
self.num_output_filters, (5, 5), name="layer_2", corr=True, strides_down=2,
padding="same_zeros", use_bias=False,
activation=None)
]
super(HyperAnalysisTransform, self).build(input_shape)
def call(self, tensor):
for layer in self._layers:
tensor = layer(tensor)
return tensor
class HyperSynthesisTransform(tf.keras.layers.Layer):
"""The synthesis transform for the entropy model parameters."""
def __init__(self, num_filters, num_output_filters=None, *args, **kwargs):
self.num_filters = num_filters
if num_output_filters is None: # default to the same
num_output_filters = num_filters
self.num_output_filters = num_output_filters
super(HyperSynthesisTransform, self).__init__(*args, **kwargs)
def build(self, input_shape):
self._layers = [
tfc.SignalConv2D(
self.num_filters, (5, 5), name="layer_0", corr=False, strides_up=2,
padding="same_zeros", use_bias=True, kernel_parameterizer=None,
activation=tf.nn.relu),
tfc.SignalConv2D(
self.num_filters, (5, 5), name="layer_1", corr=False, strides_up=2,
padding="same_zeros", use_bias=True, kernel_parameterizer=None,
activation=tf.nn.relu),
tfc.SignalConv2D(
self.num_output_filters, (3, 3), name="layer_2", corr=False, strides_up=1,
padding="same_zeros", use_bias=True, kernel_parameterizer=None,
activation=None),
]
super(HyperSynthesisTransform, self).build(input_shape)
def call(self, tensor):
for layer in self._layers:
tensor = layer(tensor)
return tensor
# Architecture (mean-scale, no context model) based on Table 1 of https://papers.nips.cc/paper/8275-joint-autoregressive-and-hierarchical-priors-for-learned-image-compression.pdf
class MBT2018HyperSynthesisTransform(tf.keras.layers.Layer):
"""The synthesis transform for the entropy model parameters."""
def __init__(self, num_filters, num_output_filters=None, *args, **kwargs):
self.num_filters = num_filters
if num_output_filters is None: # default to the same
num_output_filters = num_filters
self.num_output_filters = num_output_filters
super().__init__(*args, **kwargs)
def build(self, input_shape):
self._layers = [
tfc.SignalConv2D(
self.num_filters, (5, 5), name="layer_0", corr=False, strides_up=2,
padding="same_zeros", use_bias=True, kernel_parameterizer=None,
activation=tf.nn.relu),
tfc.SignalConv2D(
int(self.num_filters * 1.5), (5, 5), name="layer_1", corr=False, strides_up=2,
padding="same_zeros", use_bias=True, kernel_parameterizer=None,
activation=tf.nn.relu),
tfc.SignalConv2D(
self.num_output_filters, (3, 3), name="layer_2", corr=False, strides_up=1,
padding="same_zeros", use_bias=True, kernel_parameterizer=None,
activation=None),
]
super().build(input_shape)
def call(self, tensor):
for layer in self._layers:
tensor = layer(tensor)
return tensor
# File: tests/test_threads.py (repo: AndreyAD1/forum, MIT)
import logging
import random
from faker import Faker
import requests
logger = logging.getLogger(__file__)
def test_create_thread():
fake = Faker()
user_info = {
'username': fake.first_name() + str(random.randint(1, 1000)),
'common_name': fake.name(),
'email': fake.email(),
'password': 'pass'
}
logger.info(f'Create the user: {user_info}')
response = requests.post(
'http://127.0.0.1:5000/api/v1/users/create',
json=user_info
)
logger.info(f'Receive response: {response.text}')
response = requests.post(
f'http://127.0.0.1:5000/api/v1/tokens',
auth=(user_info['username'], user_info['password'])
)
logger.info(f'Receive response: {response.text}')
token = response.json()['token']
forum_info = {
'name': fake.company() + str(random.randint(1, 1000)),
'short_name': fake.company_suffix() + str(random.randint(1, 1000))
}
headers = {'Authorization': f'Bearer {token}'}
response = requests.post(
f'http://127.0.0.1:5000/api/v1/forums/create',
headers=headers,
json=forum_info
)
logger.info(f'Receive response: {response.text}')
forum_id = response.json()['forum_id']
thread_info = {
'name': fake.company() + str(random.randint(1, 1000)),
'short_name': fake.company_suffix() + str(random.randint(1, 1000)),
'text': fake.text(),
'forum_id': forum_id
}
response = requests.post(
'http://127.0.0.1:5000/api/v1/threads/create',
json=thread_info,
headers=headers
)
logger.info(f'Receive response: {response.text}')
assert response.status_code == 201
response_json = response.json()
assert len(response_json) == 1
thread_id = response_json.get('thread_id')
assert thread_id
assert isinstance(thread_id, int)
def test_get_thread():
fake = Faker()
user_info = {
'username': fake.first_name() + str(random.randint(1, 1000)),
'common_name': fake.name(),
'email': fake.email(),
'password': 'pass'
}
logger.info(f'Create the user: {user_info}')
response = requests.post(
'http://127.0.0.1:5000/api/v1/users/create',
json=user_info
)
logger.info(f'Receive response: {response.text}')
user_id = response.json()['user_id']
response = requests.post(
f'http://127.0.0.1:5000/api/v1/tokens',
auth=(user_info['username'], user_info['password'])
)
logger.info(f'Receive response: {response.text}')
token = response.json()['token']
forum_info = {
'name': fake.company() + str(random.randint(1, 1000)),
'short_name': fake.company_suffix() + str(random.randint(1, 1000))
}
headers = {'Authorization': f'Bearer {token}'}
response = requests.post(
f'http://127.0.0.1:5000/api/v1/forums/create',
headers=headers,
json=forum_info
)
logger.info(f'Receive response: {response.text}')
forum_id = response.json()['forum_id']
thread_info = {
'name': fake.company() + str(random.randint(1, 1000)),
'short_name': fake.company_suffix() + str(random.randint(1, 1000)),
'text': fake.text(),
'forum_id': forum_id
}
response = requests.post(
'http://127.0.0.1:5000/api/v1/threads/create',
json=thread_info,
headers=headers
)
logger.info(f'Receive response: {response.text}')
thread_id = response.json()['thread_id']
response = requests.get(
f'http://127.0.0.1:5000/api/v1/threads/{thread_id}',
headers=headers
)
logger.info(f'Receive response: {response.text}')
assert response.status_code == 200
expected_thread = {'id': thread_id, 'creator_id': user_id, **thread_info}
returned_thread = response.json()
returned_thread.pop('creation_timestamp')
assert returned_thread == expected_thread
def test_delete_thread():
fake = Faker()
user_info = {
'username': fake.first_name() + str(random.randint(1, 1000)),
'common_name': fake.name(),
'email': fake.email(),
'password': 'pass'
}
logger.info(f'Create the user: {user_info}')
response = requests.post(
'http://127.0.0.1:5000/api/v1/users/create',
json=user_info
)
logger.info(f'Receive response: {response.text}')
response = requests.post(
f'http://127.0.0.1:5000/api/v1/tokens',
auth=(user_info['username'], user_info['password'])
)
logger.info(f'Receive response: {response.text}')
token = response.json()['token']
forum_info = {
'name': fake.company() + str(random.randint(1, 1000)),
'short_name': fake.company_suffix() + str(random.randint(1, 1000))
}
headers = {'Authorization': f'Bearer {token}'}
response = requests.post(
f'http://127.0.0.1:5000/api/v1/forums/create',
headers=headers,
json=forum_info
)
logger.info(f'Receive response: {response.text}')
forum_id = response.json()['forum_id']
thread_info = {
'name': fake.company() + str(random.randint(1, 1000)),
'short_name': fake.company_suffix() + str(random.randint(1, 1000)),
'text': fake.text(),
'forum_id': forum_id
}
response = requests.post(
'http://127.0.0.1:5000/api/v1/threads/create',
json=thread_info,
headers=headers
)
logger.info(f'Receive response: {response.text}')
thread_id = response.json()['thread_id']
response = requests.delete(
f'http://127.0.0.1:5000/api/v1/threads/{thread_id}',
headers=headers
)
logger.info(f'Receive response: {response.text}')
assert response.status_code == 200
response = requests.get(
f'http://127.0.0.1:5000/api/v1/threads/{thread_id}',
headers=headers
)
logger.info(f'Receive response: {response.text}')
assert response.status_code == 404
def test_restore_thread():
fake = Faker()
user_info = {
'username': fake.first_name() + str(random.randint(1, 1000)),
'common_name': fake.name(),
'email': fake.email(),
'password': 'pass'
}
logger.info(f'Create the user: {user_info}')
response = requests.post(
'http://127.0.0.1:5000/api/v1/users/create',
json=user_info
)
logger.info(f'Receive response: {response.text}')
user_id = response.json()['user_id']
response = requests.post(
f'http://127.0.0.1:5000/api/v1/tokens',
auth=(user_info['username'], user_info['password'])
)
logger.info(f'Receive response: {response.text}')
token = response.json()['token']
forum_info = {
'name': fake.company() + str(random.randint(1, 1000)),
'short_name': fake.company_suffix() + str(random.randint(1, 1000))
}
headers = {'Authorization': f'Bearer {token}'}
response = requests.post(
f'http://127.0.0.1:5000/api/v1/forums/create',
headers=headers,
json=forum_info
)
logger.info(f'Receive response: {response.text}')
forum_id = response.json()['forum_id']
thread_info = {
'name': fake.company() + str(random.randint(1, 1000)),
'short_name': fake.company_suffix() + str(random.randint(1, 1000)),
'text': fake.text(),
'forum_id': forum_id
}
response = requests.post(
'http://127.0.0.1:5000/api/v1/threads/create',
json=thread_info,
headers=headers
)
logger.info(f'Receive response: {response.text}')
thread_id = response.json()['thread_id']
response = requests.delete(
f'http://127.0.0.1:5000/api/v1/threads/{thread_id}',
headers=headers
)
logger.info(f'Receive response: {response.text}')
assert response.status_code == 200
response = requests.post(
'http://127.0.0.1:5000/api/v1/threads/restore',
json={'thread_id': thread_id},
headers=headers
)
logger.info(f'Receive response: {response.text}')
assert response.status_code == 201
response = requests.get(
f'http://127.0.0.1:5000/api/v1/threads/{thread_id}',
headers=headers
)
logger.info(f'Receive response: {response.text}')
assert response.status_code == 200
expected_thread = {'id': thread_id, 'creator_id': user_id, **thread_info}
returned_thread = response.json()
returned_thread.pop('creation_timestamp')
assert returned_thread == expected_thread
def test_delete_thread_with_posts():
fake = Faker()
user_info = {
'username': fake.first_name() + str(random.randint(1, 1000)),
'common_name': fake.name(),
'email': fake.email(),
'password': 'pass'
}
logger.info(f'Create the user: {user_info}')
response = requests.post(
'http://127.0.0.1:5000/api/v1/users/create',
json=user_info
)
logger.info(f'Receive response: {response.text}')
response = requests.post(
        'http://127.0.0.1:5000/api/v1/tokens',
auth=(user_info['username'], user_info['password'])
)
logger.info(f'Receive response: {response.text}')
token = response.json()['token']
forum_info = {
'name': fake.company() + str(random.randint(1, 1000)),
'short_name': fake.company_suffix() + str(random.randint(1, 1000))
}
headers = {'Authorization': f'Bearer {token}'}
response = requests.post(
        'http://127.0.0.1:5000/api/v1/forums/create',
headers=headers,
json=forum_info
)
logger.info(f'Receive response: {response.text}')
forum_id = response.json()['forum_id']
thread_info = {
'name': fake.company() + str(random.randint(1, 1000)),
'short_name': fake.company_suffix() + str(random.randint(1, 1000)),
'text': fake.text(),
'forum_id': forum_id
}
response = requests.post(
'http://127.0.0.1:5000/api/v1/threads/create',
json=thread_info,
headers=headers
)
logger.info(f'Receive response: {response.text}')
thread_id = response.json()['thread_id']
post_text = fake.text()
response = requests.post(
'http://127.0.0.1:5000/api/v1/posts/create',
json={'text': post_text, 'thread_id': thread_id},
headers=headers
)
logger.info(f'Receive response: {response.text}')
post_id = response.json().get('post_id')
response = requests.delete(
f'http://127.0.0.1:5000/api/v1/threads/{thread_id}',
headers=headers
)
logger.info(f'Receive response: {response.text}')
assert response.status_code == 200
response = requests.get(
f'http://127.0.0.1:5000/api/v1/posts/{post_id}',
headers=headers
)
logger.info(f'Receive response: {response.text}')
assert response.status_code == 404
def test_delete_post_and_restore_thread():
fake = Faker()
user_info = {
'username': fake.first_name() + str(random.randint(1, 1000)),
'common_name': fake.name(),
'email': fake.email(),
'password': 'pass'
}
logger.info(f'Create the user: {user_info}')
response = requests.post(
'http://127.0.0.1:5000/api/v1/users/create',
json=user_info
)
logger.info(f'Receive response: {response.text}')
response = requests.post(
        'http://127.0.0.1:5000/api/v1/tokens',
auth=(user_info['username'], user_info['password'])
)
logger.info(f'Receive response: {response.text}')
token = response.json()['token']
forum_info = {
'name': fake.company() + str(random.randint(1, 1000)),
'short_name': fake.company_suffix() + str(random.randint(1, 1000))
}
headers = {'Authorization': f'Bearer {token}'}
response = requests.post(
        'http://127.0.0.1:5000/api/v1/forums/create',
headers=headers,
json=forum_info
)
logger.info(f'Receive response: {response.text}')
forum_id = response.json()['forum_id']
thread_info = {
'name': fake.company() + str(random.randint(1, 1000)),
'short_name': fake.company_suffix() + str(random.randint(1, 1000)),
'text': fake.text(),
'forum_id': forum_id
}
response = requests.post(
'http://127.0.0.1:5000/api/v1/threads/create',
json=thread_info,
headers=headers
)
logger.info(f'Receive response: {response.text}')
thread_id = response.json()['thread_id']
post_text = fake.text()
response = requests.post(
'http://127.0.0.1:5000/api/v1/posts/create',
json={'text': post_text, 'thread_id': thread_id},
headers=headers
)
logger.info(f'Receive response: {response.text}')
post_id = response.json().get('post_id')
response = requests.delete(
f'http://127.0.0.1:5000/api/v1/posts/{post_id}',
headers=headers
)
logger.info(f'Receive response: {response.text}')
assert response.status_code == 200
response = requests.delete(
f'http://127.0.0.1:5000/api/v1/threads/{thread_id}',
headers=headers
)
logger.info(f'Receive response: {response.text}')
assert response.status_code == 200
response = requests.post(
'http://127.0.0.1:5000/api/v1/threads/restore',
json={'thread_id': thread_id},
headers=headers
)
logger.info(f'Receive response: {response.text}')
assert response.status_code == 201
response = requests.get(
f'http://127.0.0.1:5000/api/v1/posts/{post_id}',
headers=headers
)
logger.info(f'Receive response: {response.text}')
assert response.status_code == 404
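The three tests above repeat the same user → token → forum → thread setup verbatim. The payloads follow one simple pattern (a base string plus a random numeric suffix), which a few payload builders could factor out. The helper names below are hypothetical, and `random` stands in for Faker so the sketch is self-contained:

```python
import random

BASE_URL = 'http://127.0.0.1:5000/api/v1'  # same local service address the tests use


def unique_name(prefix):
    """Mirror the fake.company() + str(random.randint(1, 1000)) pattern used above."""
    return f'{prefix}{random.randint(1, 1000)}'


def build_user_info(base='user', password='pass'):
    """Payload shape sent to POST /users/create in the tests above."""
    return {
        'username': unique_name(base),
        'common_name': 'Test User',
        'email': f'{base}@example.com',
        'password': password,
    }


def build_forum_info():
    """Payload shape sent to POST /forums/create."""
    return {
        'name': unique_name('forum'),
        'short_name': unique_name('f'),
    }
```

Each test could then open with `requests.post(f'{BASE_URL}/users/create', json=build_user_info())` and keep its request/assert flow unchanged.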
# File: noiseplanet/db/__init__.py (repo: jks-liu/noiseplanet, license: Apache-2.0)
# -*- coding: utf-8 -*-
"""
Created on Wed Dec 18 22:16:19 2019
@author: Utilisateur
"""
from noiseplanet.db.connect import *
from noiseplanet.db.commit import *
# File: p2_continuous-control/unity_env_wrapper.py (repo: weicheng113/deep-reinforcement-learning, license: MIT)
class EnvSingleWrapper:
    """Gym-style wrapper around a single-agent Unity environment."""
def __init__(self, env, train_mode=False):
self.env = env
self.brain_name = env.brain_names[0]
self.train_mode = train_mode
brain = env.brains[self.brain_name]
self.action_size = brain.vector_action_space_size
state = self.reset()
self.state_size = state.shape[1]
self.num_agents = 1
def reset(self):
env_info = self.env.reset(train_mode=self.train_mode)[self.brain_name]
states = env_info.vector_observations
return states
def step(self, actions):
env_info = self.env.step(actions)[self.brain_name]
next_states = env_info.vector_observations
rewards = env_info.rewards
dones = env_info.local_done
return next_states, rewards, dones
class EnvMultipleWrapper:
    """Gym-style wrapper around a multi-agent Unity environment (per-agent lists)."""
def __init__(self, env, train_mode=False):
self.env = env
self.brain_name = env.brain_names[0]
self.train_mode = train_mode
brain = env.brains[self.brain_name]
self.action_size = brain.vector_action_space_size
env_info = self.env.reset(train_mode=self.train_mode)[self.brain_name]
states = env_info.vector_observations
self.state_size = states.shape[1]
self.num_agents = len(env_info.agents)
def reset(self):
env_info = self.env.reset(train_mode=self.train_mode)[self.brain_name]
states = env_info.vector_observations
return states
def step(self, actions):
env_info = self.env.step(actions)[self.brain_name]
next_states = env_info.vector_observations
rewards = env_info.rewards
dones = env_info.local_done
return next_states, rewards, dones
# File: package/tests/test_cp/test_openstack/test_domain/test_services/test_nova/test_nova_instance_service.py
# (repo: QualiSystems/OpenStack-Shell, license: ISC)
from unittest import TestCase
from mock import Mock
from cloudshell.cp.openstack.domain.services.nova.nova_instance_service import NovaInstanceService
import cloudshell.cp.openstack.domain.services.nova.nova_instance_service as test_nova_instance_service
from cloudshell.cp.openstack.common.driver_helper import CloudshellDriverHelper
from cloudshell.cp.openstack.models.exceptions import CommandCancellationException, InstanceErrorStateException
class TestNovaInstanceService(TestCase):
def setUp(self):
instance_waiter = Mock()
instance_waiter.wait = Mock()
instance_waiter.ACTIVE = 'ACTIVE'
self.instance_service = NovaInstanceService(instance_waiter=instance_waiter)
self.mock_logger = Mock()
self.openstack_session = Mock()
def test_instance_create_empty_openstack_session(self):
test_name = 'test'
result = self.instance_service.create_instance(openstack_session=None,
name=test_name,
reservation=Mock(),
cp_resource_model=Mock(),
deploy_req_model=Mock(),
cancellation_context=Mock(),
logger=self.mock_logger)
self.assertEqual(result, None)
def test_instance_create_success(self):
test_name = 'test'
CloudshellDriverHelper.get_uuid = Mock(return_value='1234')
test_uniq_name = 'test-1234'
mock_client2 = Mock()
test_nova_instance_service.novaclient.Client = Mock(return_value=mock_client2)
mock_image = Mock()
mock_flavor = Mock()
mock_client2.images.find = Mock(return_value=mock_image)
mock_client2.flavors.find = Mock(return_value=mock_flavor)
mock_deploy_req_model = Mock()
mock_deploy_req_model.affinity_group_id = ''
mock_deploy_req_model.availability_zone = 'test-avail-zone'
test_nova_instance_service.udev_rules_sh_str = 'test_userdata'
mock_cp_resource_model = Mock()
mock_cp_resource_model.qs_mgmt_os_net_uuid = '1234'
mock_cancellation_context = Mock()
mock_client2.servers = Mock()
mocked_inst = Mock()
mock_client2.servers.create = Mock(return_value=mocked_inst)
mock_qnet_dict = {'net-id': mock_cp_resource_model.qs_mgmt_os_net_uuid}
result = self.instance_service.create_instance(openstack_session=self.openstack_session,
name=test_name,
reservation=Mock(),
cp_resource_model=mock_cp_resource_model,
deploy_req_model=mock_deploy_req_model,
cancellation_context=mock_cancellation_context,
logger=self.mock_logger)
mock_client2.servers.create.assert_called_with(name=test_uniq_name,
image=mock_image,
flavor=mock_flavor,
availability_zone='test-avail-zone',
userdata='test_userdata',
nics=[mock_qnet_dict])
        self.assertEqual(result, mocked_inst)
self.instance_service.instance_waiter.wait.assert_called_with(mocked_inst,
state=self.instance_service.instance_waiter.ACTIVE,
cancellation_context=mock_cancellation_context,
logger=self.mock_logger)
def test_instance_create_cancellation_called(self):
test_name = 'test'
CloudshellDriverHelper.get_uuid = Mock(return_value='1234')
test_uniq_name = 'test-1234'
mock_client2 = Mock()
test_nova_instance_service.novaclient.Client = Mock(return_value=mock_client2)
mock_image = Mock()
mock_flavor = Mock()
mock_client2.images.find = Mock(return_value=mock_image)
mock_client2.flavors.find = Mock(return_value=mock_flavor)
mock_cp_resource_model = Mock()
mock_cp_resource_model.qs_mgmt_os_net_uuid = '1234'
mock_cancellation_context = Mock()
mock_client2.servers = Mock()
mocked_inst = Mock()
mock_client2.servers.create = Mock(return_value=mocked_inst)
mock_qnet_dict = {'net-id': mock_cp_resource_model.qs_mgmt_os_net_uuid}
self.instance_service.instance_waiter = Mock()
self.instance_service.instance_waiter.wait = Mock(side_effect=CommandCancellationException)
with self.assertRaises(CommandCancellationException):
result = self.instance_service.create_instance(openstack_session=self.openstack_session,
name=test_name,
reservation=Mock(),
cp_resource_model=mock_cp_resource_model,
deploy_req_model=Mock(),
cancellation_context=mock_cancellation_context,
logger=self.mock_logger)
mock_client2.servers.delete.assert_called_once_with(mocked_inst)
def test_instance_create_success_affinity_group(self):
test_name = 'test'
CloudshellDriverHelper.get_uuid = Mock(return_value='1234')
test_uniq_name = 'test-1234'
mock_client2 = Mock()
test_nova_instance_service.novaclient.Client = Mock(return_value=mock_client2)
mock_image = Mock()
mock_flavor = Mock()
mock_client2.images.find = Mock(return_value=mock_image)
mock_client2.flavors.find = Mock(return_value=mock_flavor)
mock_deploy_req_model = Mock()
mock_deploy_req_model.affinity_group_id = 'test_affinity_group_id'
mock_deploy_req_model.availability_zone = ''
mock_deploy_req_model.auto_udev = False
mock_cp_resource_model = Mock()
mock_cp_resource_model.qs_mgmt_os_net_uuid = '1234'
mock_cancellation_context = Mock()
mock_client2.servers = Mock()
mocked_inst = Mock()
mock_client2.servers.create = Mock(return_value=mocked_inst)
mock_qnet_dict = {'net-id': mock_cp_resource_model.qs_mgmt_os_net_uuid}
result = self.instance_service.create_instance(openstack_session=self.openstack_session,
name=test_name,
reservation=Mock(),
cp_resource_model=mock_cp_resource_model,
deploy_req_model=mock_deploy_req_model,
cancellation_context=mock_cancellation_context,
logger=self.mock_logger)
mock_client2.servers.create.assert_called_with(name=test_uniq_name,
image=mock_image,
flavor=mock_flavor,
nics=[mock_qnet_dict],
scheduler_hints={'group': 'test_affinity_group_id'})
        self.assertEqual(result, mocked_inst)
self.instance_service.instance_waiter.wait.assert_called_with(mocked_inst,
state=self.instance_service.instance_waiter.ACTIVE,
cancellation_context=mock_cancellation_context,
logger=self.mock_logger)
def test_instance_terminate_openstack_session_none(self):
with self.assertRaises(ValueError) as context:
self.instance_service.terminate_instance(openstack_session=None,
instance_id='1234',
logger=self.mock_logger)
self.assertTrue(context)
def test_instance_terminate_success(self):
mock_client2 = Mock()
test_nova_instance_service.novaclient.Client = Mock(return_value=mock_client2)
mock_instance = Mock()
test_instance_id = '1234-56'
test_floating_ip = '1.2.3.4'
self.instance_service.get_instance_from_instance_id = Mock(return_value=mock_instance)
self.instance_service.detach_and_delete_floating_ip = Mock()
self.instance_service.terminate_instance(openstack_session=self.openstack_session,
instance_id=test_instance_id,
logger=self.mock_logger)
mock_client2.servers.delete.assert_called_with(mock_instance)
def test_instance_power_off_success(self):
mock_client2 = Mock()
test_nova_instance_service.novaclient.Client = Mock(return_value=mock_client2)
mock_instance = Mock()
test_instance_id = '1234-56'
self.instance_service.get_instance_from_instance_id = Mock(return_value=mock_instance)
self.instance_service.instance_power_off(openstack_session=self.openstack_session,
instance_id=test_instance_id,
logger=self.mock_logger)
self.instance_service.get_instance_from_instance_id.assert_called_with(openstack_session=self.openstack_session,
instance_id=test_instance_id,
logger=self.mock_logger,
client=mock_client2)
self.instance_service.instance_waiter.wait.assert_called_with(instance=mock_instance,
state=self.instance_service.instance_waiter.SHUTOFF,
cancellation_context=None,
logger=self.mock_logger)
self.assertEqual(True, mock_instance.stop.called)
def test_instance_power_on_success(self):
mock_client2 = Mock()
test_nova_instance_service.novaclient.Client = Mock(return_value=mock_client2)
mock_instance = Mock()
test_instance_id = '1234-56'
self.instance_service.get_instance_from_instance_id = Mock(return_value=mock_instance)
self.instance_service.instance_power_on(openstack_session=self.openstack_session,
instance_id=test_instance_id,
logger=self.mock_logger)
self.instance_service.get_instance_from_instance_id.assert_called_with(openstack_session=self.openstack_session,
instance_id=test_instance_id,
logger=self.mock_logger,
client=mock_client2)
self.instance_service.instance_waiter.wait.assert_called_with(instance=mock_instance,
state=self.instance_service.instance_waiter.ACTIVE,
cancellation_context=None,
logger=self.mock_logger)
self.assertEqual(True, mock_instance.start.called)
def test_instance_power_on_no_instance(self):
"""
:return:
"""
mock_client2 = Mock()
test_nova_instance_service.novaclient.Client = Mock(return_value=mock_client2)
test_instance_id = 'test-id'
self.instance_service.get_instance_from_instance_id = Mock(return_value=None)
with self.assertRaises(ValueError) as context:
self.instance_service.instance_power_on(openstack_session=self.openstack_session,
instance_id=test_instance_id,
logger=self.mock_logger)
self.instance_service.get_instance_from_instance_id.assert_called_with(
openstack_session=self.openstack_session,
instance_id=test_instance_id,
logger=self.mock_logger,
client=mock_client2)
self.assertTrue(context)
def test_instance_power_off_no_instance(self):
"""
:return:
"""
mock_client2 = Mock()
test_nova_instance_service.novaclient.Client = Mock(return_value=mock_client2)
test_instance_id = 'test-id'
self.instance_service.get_instance_from_instance_id = Mock(return_value=None)
with self.assertRaises(ValueError) as context:
self.instance_service.instance_power_off(openstack_session=self.openstack_session,
instance_id=test_instance_id,
logger=self.mock_logger)
self.instance_service.get_instance_from_instance_id.assert_called_with(
openstack_session=self.openstack_session,
instance_id=test_instance_id,
logger=self.mock_logger,
client=mock_client2)
self.assertTrue(context)
def test_get_instance_from_instance_id(self):
mock_client2 = Mock()
test_nova_instance_service.novaclient.Client = Mock(return_value=mock_client2)
mock_instance = Mock()
mock_client2.servers.find = Mock(return_value=mock_instance)
test_instance_id = '1234'
result = self.instance_service.get_instance_from_instance_id(openstack_session=self.openstack_session,
instance_id=test_instance_id,
logger=self.mock_logger,
client=mock_client2)
self.assertEqual(result, mock_instance)
def test_get_instance_from_instance_id_not_found_on_nova(self):
"""Check that function will return None if instance with given id will not be found on the Nova server"""
mock_client = Mock()
test_nova_instance_service.novaclient.Client = Mock(return_value=mock_client)
mock_client.servers.find = Mock(side_effect=test_nova_instance_service.novaclient.exceptions.NotFound(""))
test_instance_id = '1234'
result = self.instance_service.get_instance_from_instance_id(openstack_session=self.openstack_session,
instance_id=test_instance_id,
logger=self.mock_logger,
client=mock_client)
self.assertEqual(result, None)
def test_get_instance_from_instance_id_reraise_exception(self):
"""Check that function will re-raise exception if such occurs during retrieving instance from Nova server"""
mock_client = Mock()
test_nova_instance_service.novaclient.Client = Mock(return_value=mock_client)
mock_client.servers.find = Mock(side_effect=Exception())
test_instance_id = '1234'
with self.assertRaises(Exception):
self.instance_service.get_instance_from_instance_id(openstack_session=self.openstack_session,
instance_id=test_instance_id,
logger=self.mock_logger,
client=mock_client)
def test_attach_nic_to_net_success(self):
"""
:return:
"""
import jsonpickle
mock_client = Mock()
test_nova_instance_service.novaclient.Client = Mock(return_value=mock_client)
mock_instance = Mock()
mock_iface_attach_result = Mock()
mock_instance.interface_attach = Mock(return_value=mock_iface_attach_result)
expected_test_mac = 'test_mac_address'
expected_port_id = 'test_port_id'
expected_ip_address = 'test_ip_address'
mock_result_dict = {'mac_addr': expected_test_mac,
'port_id': expected_port_id,
'fixed_ips': [{'ip_address': expected_ip_address}]}
mock_iface_attach_result.to_dict = Mock(return_value=mock_result_dict)
self.instance_service.get_instance_from_instance_id = Mock(return_value=mock_instance)
result = self.instance_service.attach_nic_to_net(openstack_session=self.openstack_session,
net_id='test_net_id',
instance_id='test_instance_id',
logger=self.mock_logger)
expected_result_dict = {'ip_address': expected_ip_address,
'port_id': expected_port_id,
'mac_address': expected_test_mac}
self.assertEqual(jsonpickle.loads(result), expected_result_dict)
def test_attach_nic_to_net_failure_no_instance(self):
mock_client = Mock()
test_nova_instance_service.novaclient.Client = Mock(return_value=mock_client)
self.instance_service.get_instance_from_instance_id = Mock(return_value=None)
result = self.instance_service.attach_nic_to_net(openstack_session=self.openstack_session,
net_id='test_net_id',
instance_id='test_instance_id',
logger=self.mock_logger)
self.assertEqual(result, None)
def test_attach_nic_to_net_failure_exception(self):
mock_client = Mock()
test_nova_instance_service.novaclient.Client = Mock(return_value=mock_client)
mock_instance = Mock()
mock_instance.interface_attach = Mock(side_effect=Exception)
with self.assertRaises(Exception) as context:
result = self.instance_service.attach_nic_to_net(openstack_session=self.openstack_session,
net_id='test_net_id',
instance_id='test_instance_id',
logger=self.mock_logger)
self.assertTrue(context)
def test_detach_nic_from_net_success(self):
mock_client = Mock()
test_nova_instance_service.novaclient.Client = Mock(return_value=mock_client)
mock_instance = Mock()
self.instance_service.get_instance_from_instance_id = Mock(return_value=mock_instance)
mock_iface_detach_result = Mock()
mock_instance.interface_detach = Mock(return_value=mock_iface_detach_result)
result = self.instance_service.detach_nic_from_instance(openstack_session=self.openstack_session,
instance_id='test_instance_id',
port_id='test_port_id',
logger=self.mock_logger)
mock_instance.interface_detach.assert_called_with('test_port_id')
self.assertEqual(result, True)
def test_detach_nic_from_net_failure(self):
mock_client = Mock()
test_nova_instance_service.novaclient.Client = Mock(return_value=mock_client)
mock_instance = Mock()
self.instance_service.get_instance_from_instance_id = Mock(return_value=mock_instance)
mock_instance.interface_detach = Mock(side_effect=Exception)
result = self.instance_service.detach_nic_from_instance(openstack_session=self.openstack_session,
instance_id='test_instance_id',
port_id='test_port_id',
logger=self.mock_logger)
self.assertEqual(result, False)
def test_attach_floating_ip(self):
mock_client = Mock()
test_nova_instance_service.novaclient.Client = Mock(return_value=mock_client)
test_external_nw_id = 'ext-net-id'
test_floating_ip = '4.3.2.1'
test_net_label = 'test-net'
mock_net_obj = Mock()
mock_net_obj.to_dict = Mock(return_value={'id': test_external_nw_id, 'label': test_net_label})
mock_client.networks.list = Mock(return_value=[mock_net_obj])
mock_floating_ip_obj = Mock()
mock_floating_ip_obj.ip = test_floating_ip
mock_client.floating_ips.create = Mock(return_value=mock_floating_ip_obj)
mock_instance = Mock()
mock_instance.add_floating_ip = Mock()
result = self.instance_service.attach_floating_ip(openstack_session=self.openstack_session,
instance=mock_instance,
floating_ip=test_floating_ip,
logger=self.mock_logger)
mock_instance.add_floating_ip.assert_called_with(test_floating_ip)
self.assertEqual(result, True)
def test_detach_floating_ip(self):
mock_client = Mock()
test_nova_instance_service.novaclient.Client = Mock(return_value=mock_client)
mock_floating_ip = '1.2.3.4'
mock_instance = Mock()
self.instance_service.get_instance_from_instance_id = Mock(return_value=mock_instance)
mock_instance.remove_floating_ip = Mock()
self.instance_service.detach_floating_ip(openstack_session=self.openstack_session,
instance=mock_instance,
floating_ip=mock_floating_ip,
logger=self.mock_logger)
mock_instance.remove_floating_ip.assert_called_with(mock_floating_ip)
def test_get_instance_mgmt_net_name_success(self):
mock_client = Mock()
test_nova_instance_service.novaclient.Client = Mock(return_value=mock_client)
test_net_id = 'test_net_id'
test_cp_resource_model = Mock()
test_cp_resource_model.qs_mgmt_os_net_uuid = test_net_id
mock_net_obj = Mock()
mock_net_obj.to_dict = Mock(return_value={'id': test_net_id, 'label': 'test_returned_net'})
mock_client.networks = Mock()
mock_client.networks.list = Mock(return_value=[mock_net_obj])
result = self.instance_service.get_instance_mgmt_network_name(instance=Mock(),
openstack_session=self.openstack_session,
cp_resource_model=test_cp_resource_model)
self.assertEqual(result, 'test_returned_net')
def test_get_instance_mgmt_net_name_fail(self):
mock_client = Mock()
test_nova_instance_service.novaclient.Client = Mock(return_value=mock_client)
test_net_id = 'test_net_id'
test_cp_resource_model = Mock()
test_cp_resource_model.qs_mgmt_os_net_uuid = test_net_id
test_net_id_1 = 'test_net_id_1'
mock_net_obj = Mock()
mock_net_obj.to_dict = Mock(return_value={'id': test_net_id_1, 'label': 'test_returned_net'})
mock_client.networks = Mock()
mock_client.networks.list = Mock(return_value=[mock_net_obj])
result = self.instance_service.get_instance_mgmt_network_name(instance=Mock(),
openstack_session=self.openstack_session,
cp_resource_model=test_cp_resource_model)
self.assertEqual(result, None)
def test_instance_create_error_state(self):
test_name = 'test'
CloudshellDriverHelper.get_uuid = Mock(return_value='1234')
test_uniq_name = 'test-1234'
mock_client2 = Mock()
test_nova_instance_service.novaclient.Client = Mock(return_value=mock_client2)
mock_image = Mock()
mock_flavor = Mock()
mock_client2.images.find = Mock(return_value=mock_image)
mock_client2.flavors.find = Mock(return_value=mock_flavor)
mock_cp_resource_model = Mock()
mock_cp_resource_model.qs_mgmt_os_net_uuid = '1234'
mock_cancellation_context = Mock()
mock_client2.servers = Mock()
mocked_inst = Mock()
mock_client2.servers.create = Mock(return_value=mocked_inst)
mock_qnet_dict = {'net-id': mock_cp_resource_model.qs_mgmt_os_net_uuid}
self.instance_service.instance_waiter = Mock()
self.instance_service.instance_waiter.wait = Mock(side_effect=InstanceErrorStateException)
with self.assertRaises(InstanceErrorStateException):
result = self.instance_service.create_instance(openstack_session=self.openstack_session,
name=test_name,
reservation=Mock(),
cp_resource_model=mock_cp_resource_model,
deploy_req_model=Mock(),
cancellation_context=mock_cancellation_context,
logger=self.mock_logger)
mock_client2.servers.delete.assert_called_once_with(mocked_inst)
# File: pyboto3/kms.py (repo: gehad-shaat/pyboto3, license: MIT)
'''
The MIT License (MIT)
Copyright (c) 2016 WavyCloud
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
'''
def can_paginate(operation_name=None):
"""
Check if an operation can be paginated.
:type operation_name: string
:param operation_name: The operation name. This is the same name\nas the method name on the client. For example, if the\nmethod name is create_foo, and you\'d normally invoke the\noperation as client.create_foo(**kwargs), if the\ncreate_foo operation can be paginated, you can use the\ncall client.get_paginator('create_foo').
"""
pass
def cancel_key_deletion(KeyId=None):
"""
Cancels the deletion of a customer master key (CMK). When this operation succeeds, the key state of the CMK is Disabled . To enable the CMK, use EnableKey . You cannot perform this operation on a CMK in a different AWS account.
For more information about scheduling and canceling deletion of a CMK, see Deleting Customer Master Keys in the AWS Key Management Service Developer Guide .
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
Examples
The following example cancels deletion of the specified CMK.
Expected Output:
:example: response = client.cancel_key_deletion(
KeyId='string'
)
:type KeyId: string
:param KeyId: [REQUIRED]\nThe unique identifier for the customer master key (CMK) for which to cancel deletion.\nSpecify the key ID or the Amazon Resource Name (ARN) of the CMK.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .\n
:rtype: dict
    Returns
    Response Syntax
    {
'KeyId': 'string'
}
Response Structure
(dict) --
KeyId (string) --The unique identifier of the master key for which deletion is canceled.
Exceptions
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
Examples
The following example cancels deletion of the specified CMK.
response = client.cancel_key_deletion(
# The identifier of the CMK whose deletion you are canceling. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
KeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
)
print(response)
Expected Output:
{
# The ARN of the CMK whose deletion you canceled.
'KeyId': 'arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab',
'ResponseMetadata': {
'...': '...',
},
}
:return: {
'KeyId': 'string'
}
:returns:
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
"""
pass
def connect_custom_key_store(CustomKeyStoreId=None):
"""
Connects or reconnects a custom key store to its associated AWS CloudHSM cluster.
The custom key store must be connected before you can create customer master keys (CMKs) in the key store or use the CMKs it contains. You can disconnect and reconnect a custom key store at any time.
To connect a custom key store, its associated AWS CloudHSM cluster must have at least one active HSM. To get the number of active HSMs in a cluster, use the DescribeClusters operation. To add HSMs to the cluster, use the CreateHsm operation. Also, the `kmsuser crypto user <https://docs.aws.amazon.com/kms/latest/developerguide/key-store-concepts.html#concept-kmsuser>`__ (CU) must not be logged into the cluster. This prevents AWS KMS from using this account to log in.
The connection process can take up to 20 minutes to complete. This operation starts the connection process, but it does not wait for it to complete. When it succeeds, this operation quickly returns an HTTP 200 response and a JSON object with no properties. However, this response does not indicate that the custom key store is connected. To get the connection state of the custom key store, use the DescribeCustomKeyStores operation.
During the connection process, AWS KMS finds the AWS CloudHSM cluster that is associated with the custom key store, creates the connection infrastructure, connects to the cluster, logs into the AWS CloudHSM client as the kmsuser CU, and rotates its password.
The ConnectCustomKeyStore operation might fail for various reasons. To find the reason, use the DescribeCustomKeyStores operation and see the ConnectionErrorCode in the response. For help interpreting the ConnectionErrorCode , see CustomKeyStoresListEntry .
To fix the failure, use the DisconnectCustomKeyStore operation to disconnect the custom key store, correct the error, use the UpdateCustomKeyStore operation if necessary, and then use ConnectCustomKeyStore again.
If you are having trouble connecting or disconnecting a custom key store, see Troubleshooting a Custom Key Store in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
:example: response = client.connect_custom_key_store(
CustomKeyStoreId='string'
)
:type CustomKeyStoreId: string
:param CustomKeyStoreId: [REQUIRED]\nEnter the key store ID of the custom key store that you want to connect. To find the ID of a custom key store, use the DescribeCustomKeyStores operation.\n
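Because ConnectCustomKeyStore returns before the connection completes, callers typically poll DescribeCustomKeyStores until the ConnectionState becomes CONNECTED. A minimal sketch of such a loop (the wait_for_connection helper and its defaults are hypothetical; only the state names come from the API):

```python
import time

def wait_for_connection(describe_state, timeout=1200, interval=30, sleep=time.sleep):
    # Poll until the custom key store reports CONNECTED. describe_state()
    # should return the ConnectionState string from DescribeCustomKeyStores
    # (e.g. 'CONNECTING', 'CONNECTED', 'FAILED').
    waited = 0
    while waited <= timeout:
        state = describe_state()
        if state == 'CONNECTED':
            return True
        if state == 'FAILED':
            raise RuntimeError('connection failed; check ConnectionErrorCode')
        sleep(interval)
        waited += interval
    raise TimeoutError('custom key store did not connect within timeout')

# Local demonstration with a canned sequence of states (no AWS calls).
states = iter(['CONNECTING', 'CONNECTED'])
print(wait_for_connection(lambda: next(states), sleep=lambda s: None))  # True
```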
:rtype: dict
Returns
Response Syntax
{}
Response Structure
(dict) --
Exceptions
KMS.Client.exceptions.CloudHsmClusterNotActiveException
KMS.Client.exceptions.CustomKeyStoreInvalidStateException
KMS.Client.exceptions.CustomKeyStoreNotFoundException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.CloudHsmClusterInvalidConfigurationException
:return: {}
:returns:
KMS.Client.exceptions.CloudHsmClusterNotActiveException
KMS.Client.exceptions.CustomKeyStoreInvalidStateException
KMS.Client.exceptions.CustomKeyStoreNotFoundException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.CloudHsmClusterInvalidConfigurationException
"""
pass
def create_alias(AliasName=None, TargetKeyId=None):
"""
Creates a display name for a customer managed customer master key (CMK). You can use an alias to identify a CMK in cryptographic operations, such as Encrypt and GenerateDataKey . You can change the CMK associated with the alias at any time.
Aliases are easier to remember than key IDs. They can also help to simplify your applications. For example, if you use an alias in your code, you can change the CMK your code uses by associating a given alias with a different CMK.
To run the same code in multiple AWS regions, use an alias in your code, such as alias/ApplicationKey . Then, in each AWS Region, create an alias/ApplicationKey alias that is associated with a CMK in that Region. When you run your code, it uses the alias/ApplicationKey CMK for that AWS Region without any Region-specific code.
This operation does not return a response. To get the alias that you created, use the ListAliases operation.
To use aliases successfully, be aware of the following information.
Because an alias is not a property of a CMK, you can delete and change the aliases of a CMK without affecting the CMK. Also, aliases do not appear in the response from the DescribeKey operation. To get the aliases and alias ARNs of CMKs in each AWS account and Region, use the ListAliases operation.
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
Examples
The following example creates an alias for the specified customer master key (CMK).
Expected Output:
:example: response = client.create_alias(
AliasName='string',
TargetKeyId='string'
)
:type AliasName: string
:param AliasName: [REQUIRED]\nSpecifies the alias name. This value must begin with alias/ followed by a name, such as alias/ExampleAlias . The alias name cannot begin with alias/aws/ . The alias/aws/ prefix is reserved for AWS managed CMKs.\n
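The naming rules above can be pre-checked locally. This is a hypothetical convenience helper, not part of the client, and AWS enforces additional restrictions server-side:

```python
def is_valid_alias_name(name):
    # Per the rules above: must begin with 'alias/', and must not use the
    # reserved 'alias/aws/' prefix. (Hypothetical check; AWS also enforces
    # length and character restrictions not validated here.)
    return name.startswith('alias/') and not name.startswith('alias/aws/')

print(is_valid_alias_name('alias/ExampleAlias'))  # True
print(is_valid_alias_name('alias/aws/reserved'))  # False
```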
:type TargetKeyId: string
:param TargetKeyId: [REQUIRED]\nIdentifies the CMK to which the alias refers. Specify the key ID or the Amazon Resource Name (ARN) of the CMK. You cannot specify another alias. For help finding the key ID and ARN, see Finding the Key ID and ARN in the AWS Key Management Service Developer Guide .\n
:return: response = client.create_alias(
# The alias to create. Aliases must begin with 'alias/'. Do not use aliases that begin with 'alias/aws' because they are reserved for use by AWS.
AliasName='alias/ExampleAlias',
# The identifier of the CMK whose alias you are creating. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
TargetKeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
)
print(response)
:returns:
AliasName (string) -- [REQUIRED]
Specifies the alias name. This value must begin with alias/ followed by a name, such as alias/ExampleAlias . The alias name cannot begin with alias/aws/ . The alias/aws/ prefix is reserved for AWS managed CMKs.
TargetKeyId (string) -- [REQUIRED]
Identifies the CMK to which the alias refers. Specify the key ID or the Amazon Resource Name (ARN) of the CMK. You cannot specify another alias. For help finding the key ID and ARN, see Finding the Key ID and ARN in the AWS Key Management Service Developer Guide .
"""
pass
def create_custom_key_store(CustomKeyStoreName=None, CloudHsmClusterId=None, TrustAnchorCertificate=None, KeyStorePassword=None):
"""
Creates a custom key store that is associated with an AWS CloudHSM cluster that you own and manage.
This operation is part of the Custom Key Store feature in AWS KMS, which combines the convenience and extensive integration of AWS KMS with the isolation and control of a single-tenant key store.
Before you create the custom key store, you must assemble the required elements, including an AWS CloudHSM cluster that fulfills the requirements for a custom key store. For details about the required elements, see Assemble the Prerequisites in the AWS Key Management Service Developer Guide .
When the operation completes successfully, it returns the ID of the new custom key store. Before you can use your new custom key store, you need to use the ConnectCustomKeyStore operation to connect the new key store to its AWS CloudHSM cluster. Even if you are not going to use your custom key store immediately, you might want to connect it to verify that all settings are correct and then disconnect it until you are ready to use it.
For help with failures, see Troubleshooting a Custom Key Store in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
:example: response = client.create_custom_key_store(
CustomKeyStoreName='string',
CloudHsmClusterId='string',
TrustAnchorCertificate='string',
KeyStorePassword='string'
)
:type CustomKeyStoreName: string
:param CustomKeyStoreName: [REQUIRED]\nSpecifies a friendly name for the custom key store. The name must be unique in your AWS account.\n
:type CloudHsmClusterId: string
:param CloudHsmClusterId: [REQUIRED]\nIdentifies the AWS CloudHSM cluster for the custom key store. Enter the cluster ID of any active AWS CloudHSM cluster that is not already associated with a custom key store. To find the cluster ID, use the DescribeClusters operation.\n
:type TrustAnchorCertificate: string
:param TrustAnchorCertificate: [REQUIRED]\nEnter the content of the trust anchor certificate for the cluster. This is the content of the customerCA.crt file that you created when you initialized the cluster .\n
:type KeyStorePassword: string
:param KeyStorePassword: [REQUIRED]\nEnter the password of the `kmsuser crypto user (CU) account <https://docs.aws.amazon.com/kms/latest/developerguide/key-store-concepts.html#concept-kmsuser>`__ in the specified AWS CloudHSM cluster. AWS KMS logs into the cluster as this user to manage key material on your behalf.\nThe password must be a string of 7 to 32 characters. Its value is case sensitive.\nThis parameter tells AWS KMS the kmsuser account password; it does not change the password in the AWS CloudHSM cluster.\n
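The 7-to-32-character rule above can be checked before calling the API; a hypothetical pre-flight helper:

```python
def is_valid_keystore_password(pw):
    # The docs above require a 7 to 32 character string (case sensitive).
    # Hypothetical local check; AWS KMS performs the real validation.
    return isinstance(pw, str) and 7 <= len(pw) <= 32

print(is_valid_keystore_password('kmsuser-pw'))  # True
print(is_valid_keystore_password('short'))       # False
```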
:rtype: dict
Returns
Response Syntax
{
'CustomKeyStoreId': 'string'
}
Response Structure
(dict) --
CustomKeyStoreId (string) --
A unique identifier for the new custom key store.
Exceptions
KMS.Client.exceptions.CloudHsmClusterInUseException
KMS.Client.exceptions.CustomKeyStoreNameInUseException
KMS.Client.exceptions.CloudHsmClusterNotFoundException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.CloudHsmClusterNotActiveException
KMS.Client.exceptions.IncorrectTrustAnchorException
KMS.Client.exceptions.CloudHsmClusterInvalidConfigurationException
:return: {
'CustomKeyStoreId': 'string'
}
:returns:
KMS.Client.exceptions.CloudHsmClusterInUseException
KMS.Client.exceptions.CustomKeyStoreNameInUseException
KMS.Client.exceptions.CloudHsmClusterNotFoundException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.CloudHsmClusterNotActiveException
KMS.Client.exceptions.IncorrectTrustAnchorException
KMS.Client.exceptions.CloudHsmClusterInvalidConfigurationException
"""
pass
def create_grant(KeyId=None, GranteePrincipal=None, RetiringPrincipal=None, Operations=None, Constraints=None, GrantTokens=None, Name=None):
"""
Adds a grant to a customer master key (CMK). The grant allows the grantee principal to use the CMK when the conditions specified in the grant are met. When setting permissions, grants are an alternative to key policies.
To create a grant that allows a cryptographic operation only when the request includes a particular encryption context , use the Constraints parameter. For details, see GrantConstraints .
You can create grants on symmetric and asymmetric CMKs. However, if the grant allows an operation that the CMK does not support, CreateGrant fails with a ValidationException .
For information about symmetric and asymmetric CMKs, see Using Symmetric and Asymmetric CMKs in the AWS Key Management Service Developer Guide .
To perform this operation on a CMK in a different AWS account, specify the key ARN in the value of the KeyId parameter. For more information about grants, see Grants in the * AWS Key Management Service Developer Guide * .
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
Examples
The following example creates a grant that allows the specified IAM role to encrypt data with the specified customer master key (CMK).
Expected Output:
:example: response = client.create_grant(
KeyId='string',
GranteePrincipal='string',
RetiringPrincipal='string',
Operations=[
'Decrypt'|'Encrypt'|'GenerateDataKey'|'GenerateDataKeyWithoutPlaintext'|'ReEncryptFrom'|'ReEncryptTo'|'Sign'|'Verify'|'GetPublicKey'|'CreateGrant'|'RetireGrant'|'DescribeKey'|'GenerateDataKeyPair'|'GenerateDataKeyPairWithoutPlaintext',
],
Constraints={
'EncryptionContextSubset': {
'string': 'string'
},
'EncryptionContextEquals': {
'string': 'string'
}
},
GrantTokens=[
'string',
],
Name='string'
)
:type KeyId: string
:param KeyId: [REQUIRED]\nThe unique identifier for the customer master key (CMK) that the grant applies to.\nSpecify the key ID or the Amazon Resource Name (ARN) of the CMK. To specify a CMK in a different AWS account, you must use the key ARN.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .\n
:type GranteePrincipal: string
:param GranteePrincipal: [REQUIRED]\nThe principal that is given permission to perform the operations that the grant permits.\nTo specify the principal, use the Amazon Resource Name (ARN) of an AWS principal. Valid AWS principals include AWS accounts (root), IAM users, IAM roles, federated users, and assumed role users. For examples of the ARN syntax to use for specifying a principal, see AWS Identity and Access Management (IAM) in the Example ARNs section of the AWS General Reference .\n
:type RetiringPrincipal: string
:param RetiringPrincipal: The principal that is given permission to retire the grant by using the RetireGrant operation.\nTo specify the principal, use the Amazon Resource Name (ARN) of an AWS principal. Valid AWS principals include AWS accounts (root), IAM users, federated users, and assumed role users. For examples of the ARN syntax to use for specifying a principal, see AWS Identity and Access Management (IAM) in the Example ARNs section of the AWS General Reference .\n
:type Operations: list
:param Operations: [REQUIRED]\nA list of operations that the grant permits.\n\n(string) --\n\n
:type Constraints: dict
:param Constraints: Allows a cryptographic operation only when the encryption context matches or includes the encryption context specified in this structure. For more information about encryption context, see Encryption Context in the * AWS Key Management Service Developer Guide * .\n\nEncryptionContextSubset (dict) --A list of key-value pairs that must be included in the encryption context of the cryptographic operation request. The grant allows the cryptographic operation only when the encryption context in the request includes the key-value pairs specified in this constraint, although it can include additional key-value pairs.\n\n(string) --\n(string) --\n\n\n\n\nEncryptionContextEquals (dict) --A list of key-value pairs that must match the encryption context in the cryptographic operation request. The grant allows the operation only when the encryption context in the request is the same as the encryption context specified in this constraint.\n\n(string) --\n(string) --\n\n\n\n\n\n
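The subset and equals semantics described above can be modeled locally. This is a hedged sketch of the documented matching rules, not the KMS implementation:

```python
def grant_allows(constraints, request_context):
    # Hypothetical local model of GrantConstraints evaluation:
    # EncryptionContextSubset requires the request context to include all
    # constraint pairs (extra pairs allowed); EncryptionContextEquals
    # requires an exact match of the whole context.
    subset = constraints.get('EncryptionContextSubset')
    if subset is not None and any(request_context.get(k) != v for k, v in subset.items()):
        return False
    equals = constraints.get('EncryptionContextEquals')
    if equals is not None and request_context != equals:
        return False
    return True

c = {'EncryptionContextSubset': {'Dept': 'IT'}}
print(grant_allows(c, {'Dept': 'IT', 'Purpose': 'Test'}))  # True
print(grant_allows(c, {'Purpose': 'Test'}))                # False
```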
:type GrantTokens: list
:param GrantTokens: A list of grant tokens.\nFor more information, see Grant Tokens in the AWS Key Management Service Developer Guide .\n\n(string) --\n\n
:type Name: string
:param Name: A friendly name for identifying the grant. Use this value to prevent the unintended creation of duplicate grants when retrying this request.\nWhen this value is absent, all CreateGrant requests result in a new grant with a unique GrantId even if all the supplied parameters are identical. This can result in unintended duplicates when you retry the CreateGrant request.\nWhen this value is present, you can retry a CreateGrant request with identical parameters; if the grant already exists, the original GrantId is returned without creating a new grant. Note that the returned grant token is unique with every CreateGrant request, even when a duplicate GrantId is returned. All grant tokens obtained in this way can be used interchangeably.\n
:rtype: dict
Returns
Response Syntax
{
'GrantToken': 'string',
'GrantId': 'string'
}
Response Structure
(dict) --
GrantToken (string) --
The grant token.
For more information, see Grant Tokens in the AWS Key Management Service Developer Guide .
GrantId (string) --
The unique identifier for the grant.
You can use the GrantId in a subsequent RetireGrant or RevokeGrant operation.
Exceptions
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.DisabledException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.InvalidGrantTokenException
KMS.Client.exceptions.LimitExceededException
KMS.Client.exceptions.KMSInvalidStateException
Examples
The following example creates a grant that allows the specified IAM role to encrypt data with the specified customer master key (CMK).
response = client.create_grant(
# The identity that is given permission to perform the operations specified in the grant.
GranteePrincipal='arn:aws:iam::111122223333:role/ExampleRole',
# The identifier of the CMK to which the grant applies. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
KeyId='arn:aws:kms:us-east-2:444455556666:key/1234abcd-12ab-34cd-56ef-1234567890ab',
# A list of operations that the grant allows.
Operations=[
'Encrypt',
'Decrypt',
],
)
print(response)
Expected Output:
{
# The unique identifier of the grant.
'GrantId': '0c237476b39f8bc44e45212e08498fbe3151305030726c0590dd8d3e9f3d6a60',
# The grant token.
'GrantToken': 'AQpAM2RhZTk1MGMyNTk2ZmZmMzEyYWVhOWViN2I1MWM4Mzc0MWFiYjc0ZDE1ODkyNGFlNTIzODZhMzgyZjBlNGY3NiKIAgEBAgB4Pa6VDCWW__MSrqnre1HIN0Grt00ViSSuUjhqOC8OT3YAAADfMIHcBgkqhkiG9w0BBwaggc4wgcsCAQAwgcUGCSqGSIb3DQEHATAeBglghkgBZQMEAS4wEQQMmqLyBTAegIn9XlK5AgEQgIGXZQjkBcl1dykDdqZBUQ6L1OfUivQy7JVYO2-ZJP7m6f1g8GzV47HX5phdtONAP7K_HQIflcgpkoCqd_fUnE114mSmiagWkbQ5sqAVV3ov-VeqgrvMe5ZFEWLMSluvBAqdjHEdMIkHMlhlj4ENZbzBfo9Wxk8b8SnwP4kc4gGivedzFXo-dwN8fxjjq_ZZ9JFOj2ijIbj5FyogDCN0drOfi8RORSEuCEmPvjFRMFAwcmwFkN2NPp89amA',
'ResponseMetadata': {
'...': '...',
},
}
:return: {
'GrantToken': 'string',
'GrantId': 'string'
}
:returns:
KeyId (string) -- [REQUIRED]
The unique identifier for the customer master key (CMK) that the grant applies to.
Specify the key ID or the Amazon Resource Name (ARN) of the CMK. To specify a CMK in a different AWS account, you must use the key ARN.
For example:
Key ID: 1234abcd-12ab-34cd-56ef-1234567890ab
Key ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab
To get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .
GranteePrincipal (string) -- [REQUIRED]
The principal that is given permission to perform the operations that the grant permits.
To specify the principal, use the Amazon Resource Name (ARN) of an AWS principal. Valid AWS principals include AWS accounts (root), IAM users, IAM roles, federated users, and assumed role users. For examples of the ARN syntax to use for specifying a principal, see AWS Identity and Access Management (IAM) in the Example ARNs section of the AWS General Reference .
RetiringPrincipal (string) -- The principal that is given permission to retire the grant by using the RetireGrant operation.
To specify the principal, use the Amazon Resource Name (ARN) of an AWS principal. Valid AWS principals include AWS accounts (root), IAM users, federated users, and assumed role users. For examples of the ARN syntax to use for specifying a principal, see AWS Identity and Access Management (IAM) in the Example ARNs section of the AWS General Reference .
Operations (list) -- [REQUIRED]
A list of operations that the grant permits.
(string) --
Constraints (dict) -- Allows a cryptographic operation only when the encryption context matches or includes the encryption context specified in this structure. For more information about encryption context, see Encryption Context in the * AWS Key Management Service Developer Guide * .
EncryptionContextSubset (dict) --A list of key-value pairs that must be included in the encryption context of the cryptographic operation request. The grant allows the cryptographic operation only when the encryption context in the request includes the key-value pairs specified in this constraint, although it can include additional key-value pairs.
(string) --
(string) --
EncryptionContextEquals (dict) --A list of key-value pairs that must match the encryption context in the cryptographic operation request. The grant allows the operation only when the encryption context in the request is the same as the encryption context specified in this constraint.
(string) --
(string) --
GrantTokens (list) -- A list of grant tokens.
For more information, see Grant Tokens in the AWS Key Management Service Developer Guide .
(string) --
Name (string) -- A friendly name for identifying the grant. Use this value to prevent the unintended creation of duplicate grants when retrying this request.
When this value is absent, all CreateGrant requests result in a new grant with a unique GrantId even if all the supplied parameters are identical. This can result in unintended duplicates when you retry the CreateGrant request.
When this value is present, you can retry a CreateGrant request with identical parameters; if the grant already exists, the original GrantId is returned without creating a new grant. Note that the returned grant token is unique with every CreateGrant request, even when a duplicate GrantId is returned. All grant tokens obtained in this way can be used interchangeably.
"""
pass
def create_key(Policy=None, Description=None, KeyUsage=None, CustomerMasterKeySpec=None, Origin=None, CustomKeyStoreId=None, BypassPolicyLockoutSafetyCheck=None, Tags=None):
"""
Creates a unique customer managed customer master key (CMK) in your AWS account and Region. You cannot use this operation to create a CMK in a different AWS account.
You can use the CreateKey operation to create symmetric or asymmetric CMKs.
For information about symmetric and asymmetric CMKs, see Using Symmetric and Asymmetric CMKs in the AWS Key Management Service Developer Guide .
To create different types of CMKs, use the following guidance:
To create an asymmetric CMK, use the CustomerMasterKeySpec parameter to specify the type of key material in the CMK. Then, use the KeyUsage parameter to determine whether the CMK will be used to encrypt and decrypt or sign and verify. You can\'t change these properties after the CMK is created.
When creating a symmetric CMK, you don\'t need to specify the CustomerMasterKeySpec or KeyUsage parameters. The default value for CustomerMasterKeySpec , SYMMETRIC_DEFAULT , and the default value for KeyUsage , ENCRYPT_DECRYPT , are the only valid values for symmetric CMKs.
To import your own key material, begin by creating a symmetric CMK with no key material. To do this, use the Origin parameter of CreateKey with a value of EXTERNAL . Next, use the GetParametersForImport operation to get a public key and import token, and use the public key to encrypt your key material. Then, use ImportKeyMaterial with your import token to import the key material. For step-by-step instructions, see Importing Key Material in the * AWS Key Management Service Developer Guide * . You cannot import the key material into an asymmetric CMK.
To create a symmetric CMK in a custom key store , use the CustomKeyStoreId parameter to specify the custom key store. You must also use the Origin parameter with a value of AWS_CLOUDHSM . The AWS CloudHSM cluster that is associated with the custom key store must have at least two active HSMs in different Availability Zones in the AWS Region.
You cannot create an asymmetric CMK in a custom key store. For information about custom key stores in AWS KMS see Using Custom Key Stores in the * AWS Key Management Service Developer Guide * .
See also: AWS API Documentation
Exceptions
Examples
The following example creates a CMK.
Expected Output:
:example: response = client.create_key(
Policy='string',
Description='string',
KeyUsage='SIGN_VERIFY'|'ENCRYPT_DECRYPT',
CustomerMasterKeySpec='RSA_2048'|'RSA_3072'|'RSA_4096'|'ECC_NIST_P256'|'ECC_NIST_P384'|'ECC_NIST_P521'|'ECC_SECG_P256K1'|'SYMMETRIC_DEFAULT',
Origin='AWS_KMS'|'EXTERNAL'|'AWS_CLOUDHSM',
CustomKeyStoreId='string',
BypassPolicyLockoutSafetyCheck=True|False,
Tags=[
{
'TagKey': 'string',
'TagValue': 'string'
},
]
)
:type Policy: string
:param Policy: The key policy to attach to the CMK.\nIf you provide a key policy, it must meet the following criteria:\n\nIf you don\'t set BypassPolicyLockoutSafetyCheck to true, the key policy must allow the principal that is making the CreateKey request to make a subsequent PutKeyPolicy request on the CMK. This reduces the risk that the CMK becomes unmanageable. For more information, refer to the scenario in the Default Key Policy section of the * AWS Key Management Service Developer Guide * .\nEach statement in the key policy must contain one or more principals. The principals in the key policy must exist and be visible to AWS KMS. When you create a new AWS principal (for example, an IAM user or role), you might need to enforce a delay before including the new principal in a key policy because the new principal might not be immediately visible to AWS KMS. For more information, see Changes that I make are not always immediately visible in the AWS Identity and Access Management User Guide .\n\nIf you do not provide a key policy, AWS KMS attaches a default key policy to the CMK. For more information, see Default Key Policy in the AWS Key Management Service Developer Guide .\nThe key policy size quota is 32 kilobytes (32768 bytes).\n
:type Description: string
:param Description: A description of the CMK.\nUse a description that helps you decide whether the CMK is appropriate for a task.\n
:type KeyUsage: string
:param KeyUsage: Determines the cryptographic operations for which you can use the CMK. The default value is ENCRYPT_DECRYPT . This parameter is required only for asymmetric CMKs. You can\'t change the KeyUsage value after the CMK is created.\nSelect only one valid value.\n\nFor symmetric CMKs, omit the parameter or specify ENCRYPT_DECRYPT .\nFor asymmetric CMKs with RSA key material, specify ENCRYPT_DECRYPT or SIGN_VERIFY .\nFor asymmetric CMKs with ECC key material, specify SIGN_VERIFY .\n\n
:type CustomerMasterKeySpec: string
:param CustomerMasterKeySpec: Specifies the type of CMK to create. The default value, SYMMETRIC_DEFAULT , creates a CMK with a 256-bit symmetric key for encryption and decryption. For help choosing a key spec for your CMK, see How to Choose Your CMK Configuration in the AWS Key Management Service Developer Guide .\nThe CustomerMasterKeySpec determines whether the CMK contains a symmetric key or an asymmetric key pair. It also determines the encryption algorithms or signing algorithms that the CMK supports. You can\'t change the CustomerMasterKeySpec after the CMK is created. To further restrict the algorithms that can be used with the CMK, use a condition key in its key policy or IAM policy. For more information, see kms:EncryptionAlgorithm or kms:SigningAlgorithm in the AWS Key Management Service Developer Guide .\n\nWarning\nAWS services that are integrated with AWS KMS use symmetric CMKs to protect your data. These services do not support asymmetric CMKs. For help determining whether a CMK is symmetric or asymmetric, see Identifying Symmetric and Asymmetric CMKs in the AWS Key Management Service Developer Guide .\n\nAWS KMS supports the following key specs for CMKs:\n\nSymmetric key (default)\nSYMMETRIC_DEFAULT (AES-256-GCM)\n\n\nAsymmetric RSA key pairs\nRSA_2048\nRSA_3072\nRSA_4096\n\n\nAsymmetric NIST-recommended elliptic curve key pairs\nECC_NIST_P256 (secp256r1)\nECC_NIST_P384 (secp384r1)\nECC_NIST_P521 (secp521r1)\n\n\nOther asymmetric elliptic curve key pairs\nECC_SECG_P256K1 (secp256k1), commonly used for cryptocurrencies.\n\n\n\n
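The guidance above maps each CustomerMasterKeySpec to its valid KeyUsage values; a hypothetical lookup table for local pre-validation might look like:

```python
# Valid KeyUsage values per CustomerMasterKeySpec, per the guidance above.
# (Hypothetical table for local validation only; KMS enforces the real rules.)
VALID_KEY_USAGE = {
    'SYMMETRIC_DEFAULT': {'ENCRYPT_DECRYPT'},
    'RSA_2048': {'ENCRYPT_DECRYPT', 'SIGN_VERIFY'},
    'RSA_3072': {'ENCRYPT_DECRYPT', 'SIGN_VERIFY'},
    'RSA_4096': {'ENCRYPT_DECRYPT', 'SIGN_VERIFY'},
    'ECC_NIST_P256': {'SIGN_VERIFY'},
    'ECC_NIST_P384': {'SIGN_VERIFY'},
    'ECC_NIST_P521': {'SIGN_VERIFY'},
    'ECC_SECG_P256K1': {'SIGN_VERIFY'},
}

def usage_is_valid(spec, usage):
    return usage in VALID_KEY_USAGE.get(spec, set())

print(usage_is_valid('ECC_NIST_P256', 'ENCRYPT_DECRYPT'))  # False
```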
:type Origin: string
:param Origin: The source of the key material for the CMK. You cannot change the origin after you create the CMK. The default is AWS_KMS , which means AWS KMS creates the key material.\nWhen the parameter value is EXTERNAL , AWS KMS creates a CMK without key material so that you can import key material from your existing key management infrastructure. For more information about importing key material into AWS KMS, see Importing Key Material in the AWS Key Management Service Developer Guide . This value is valid only for symmetric CMKs.\nWhen the parameter value is AWS_CLOUDHSM , AWS KMS creates the CMK in an AWS KMS custom key store and creates its key material in the associated AWS CloudHSM cluster. You must also use the CustomKeyStoreId parameter to identify the custom key store. This value is valid only for symmetric CMKs.\n
:type CustomKeyStoreId: string
:param CustomKeyStoreId: Creates the CMK in the specified custom key store and the key material in its associated AWS CloudHSM cluster. To create a CMK in a custom key store, you must also specify the Origin parameter with a value of AWS_CLOUDHSM . The AWS CloudHSM cluster that is associated with the custom key store must have at least two active HSMs, each in a different Availability Zone in the Region.\nThis parameter is valid only for symmetric CMKs. You cannot create an asymmetric CMK in a custom key store.\nTo find the ID of a custom key store, use the DescribeCustomKeyStores operation.\nThe response includes the custom key store ID and the ID of the AWS CloudHSM cluster.\nThis operation is part of the Custom Key Store feature in AWS KMS, which combines the convenience and extensive integration of AWS KMS with the isolation and control of a single-tenant key store.\n
:type BypassPolicyLockoutSafetyCheck: boolean
:param BypassPolicyLockoutSafetyCheck: A flag to indicate whether to bypass the key policy lockout safety check.\n\nWarning\nSetting this value to true increases the risk that the CMK becomes unmanageable. Do not set this value to true indiscriminately.\nFor more information, refer to the scenario in the Default Key Policy section in the * AWS Key Management Service Developer Guide * .\n\nUse this parameter only when you include a policy in the request and you intend to prevent the principal that is making the request from making a subsequent PutKeyPolicy request on the CMK.\nThe default value is false.\n
:type Tags: list
:param Tags: One or more tags. Each tag consists of a tag key and a tag value. Both the tag key and the tag value are required, but the tag value can be an empty (null) string.\nWhen you add tags to an AWS resource, AWS generates a cost allocation report with usage and costs aggregated by tags. For information about adding, changing, deleting and listing tags for CMKs, see Tagging Keys .\nUse this parameter to tag the CMK when it is created. To add tags to an existing CMK, use the TagResource operation.\n\n(dict) --A key-value pair. A tag consists of a tag key and a tag value. Tag keys and tag values are both required, but tag values can be empty (null) strings.\nFor information about the rules that apply to tag keys and tag values, see User-Defined Tag Restrictions in the AWS Billing and Cost Management User Guide .\n\nTagKey (string) -- [REQUIRED]The key of the tag.\n\nTagValue (string) -- [REQUIRED]The value of the tag.\n\n\n\n\n
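The Tags parameter takes a list of TagKey/TagValue dicts rather than a plain mapping; a hypothetical one-line converter:

```python
def tags_from_dict(d):
    # Convert a plain mapping like {'Project': 'Alpha'} into the
    # TagKey/TagValue list shape shown in the request syntax above.
    # (Hypothetical convenience helper, not part of the client.)
    return [{'TagKey': k, 'TagValue': v} for k, v in d.items()]

print(tags_from_dict({'Project': 'Alpha'}))  # [{'TagKey': 'Project', 'TagValue': 'Alpha'}]
```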
:rtype: dict
Returns
Response Syntax
{
'KeyMetadata': {
'AWSAccountId': 'string',
'KeyId': 'string',
'Arn': 'string',
'CreationDate': datetime(2015, 1, 1),
'Enabled': True|False,
'Description': 'string',
'KeyUsage': 'SIGN_VERIFY'|'ENCRYPT_DECRYPT',
'KeyState': 'Enabled'|'Disabled'|'PendingDeletion'|'PendingImport'|'Unavailable',
'DeletionDate': datetime(2015, 1, 1),
'ValidTo': datetime(2015, 1, 1),
'Origin': 'AWS_KMS'|'EXTERNAL'|'AWS_CLOUDHSM',
'CustomKeyStoreId': 'string',
'CloudHsmClusterId': 'string',
'ExpirationModel': 'KEY_MATERIAL_EXPIRES'|'KEY_MATERIAL_DOES_NOT_EXPIRE',
'KeyManager': 'AWS'|'CUSTOMER',
'CustomerMasterKeySpec': 'RSA_2048'|'RSA_3072'|'RSA_4096'|'ECC_NIST_P256'|'ECC_NIST_P384'|'ECC_NIST_P521'|'ECC_SECG_P256K1'|'SYMMETRIC_DEFAULT',
'EncryptionAlgorithms': [
'SYMMETRIC_DEFAULT'|'RSAES_OAEP_SHA_1'|'RSAES_OAEP_SHA_256',
],
'SigningAlgorithms': [
'RSASSA_PSS_SHA_256'|'RSASSA_PSS_SHA_384'|'RSASSA_PSS_SHA_512'|'RSASSA_PKCS1_V1_5_SHA_256'|'RSASSA_PKCS1_V1_5_SHA_384'|'RSASSA_PKCS1_V1_5_SHA_512'|'ECDSA_SHA_256'|'ECDSA_SHA_384'|'ECDSA_SHA_512',
]
}
}
Response Structure
(dict) --
KeyMetadata (dict) --
Metadata associated with the CMK.
AWSAccountId (string) --
The twelve-digit account ID of the AWS account that owns the CMK.
KeyId (string) --
The globally unique identifier for the CMK.
Arn (string) --
The Amazon Resource Name (ARN) of the CMK. For examples, see AWS Key Management Service (AWS KMS) in the Example ARNs section of the AWS General Reference .
CreationDate (datetime) --
The date and time when the CMK was created.
Enabled (boolean) --
Specifies whether the CMK is enabled. When KeyState is Enabled , this value is true; otherwise, it is false.
Description (string) --
The description of the CMK.
KeyUsage (string) --
The cryptographic operations for which you can use the CMK.
KeyState (string) --
The state of the CMK.
For more information about how key state affects the use of a CMK, see How Key State Affects the Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
DeletionDate (datetime) --
The date and time after which AWS KMS deletes the CMK. This value is present only when KeyState is PendingDeletion .
ValidTo (datetime) --
The time at which the imported key material expires. When the key material expires, AWS KMS deletes the key material and the CMK becomes unusable. This value is present only for CMKs whose Origin is EXTERNAL and whose ExpirationModel is KEY_MATERIAL_EXPIRES , otherwise this value is omitted.
Origin (string) --
The source of the CMK\'s key material. When this value is AWS_KMS , AWS KMS created the key material. When this value is EXTERNAL , the key material was imported from your existing key management infrastructure or the CMK lacks key material. When this value is AWS_CLOUDHSM , the key material was created in the AWS CloudHSM cluster associated with a custom key store.
CustomKeyStoreId (string) --
A unique identifier for the custom key store that contains the CMK. This value is present only when the CMK is created in a custom key store.
CloudHsmClusterId (string) --
The cluster ID of the AWS CloudHSM cluster that contains the key material for the CMK. When you create a CMK in a custom key store , AWS KMS creates the key material for the CMK in the associated AWS CloudHSM cluster. This value is present only when the CMK is created in a custom key store.
ExpirationModel (string) --
Specifies whether the CMK\'s key material expires. This value is present only when Origin is EXTERNAL , otherwise this value is omitted.
KeyManager (string) --
The manager of the CMK. CMKs in your AWS account are either customer managed or AWS managed. For more information about the difference, see Customer Master Keys in the AWS Key Management Service Developer Guide .
CustomerMasterKeySpec (string) --
Describes the type of key material in the CMK.
EncryptionAlgorithms (list) --
A list of encryption algorithms that the CMK supports. You cannot use the CMK with other encryption algorithms within AWS KMS.
This field appears only when the KeyUsage of the CMK is ENCRYPT_DECRYPT .
(string) --
SigningAlgorithms (list) --
A list of signing algorithms that the CMK supports. You cannot use the CMK with other signing algorithms within AWS KMS.
This field appears only when the KeyUsage of the CMK is SIGN_VERIFY .
(string) --
Exceptions
KMS.Client.exceptions.MalformedPolicyDocumentException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.UnsupportedOperationException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.LimitExceededException
KMS.Client.exceptions.TagException
KMS.Client.exceptions.CustomKeyStoreNotFoundException
KMS.Client.exceptions.CustomKeyStoreInvalidStateException
KMS.Client.exceptions.CloudHsmClusterInvalidConfigurationException
Examples
The following example creates a CMK.
response = client.create_key(
# One or more tags. Each tag consists of a tag key and a tag value.
Tags=[
{
'TagKey': 'CreatedBy',
'TagValue': 'ExampleUser',
},
],
)
print(response)
Expected Output:
{
# An object that contains information about the CMK created by this operation.
'KeyMetadata': {
'AWSAccountId': '111122223333',
'Arn': 'arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab',
'CreationDate': datetime(2017, 7, 5, 14, 4, 55),
'Description': '',
'Enabled': True,
'KeyId': '1234abcd-12ab-34cd-56ef-1234567890ab',
'KeyManager': 'CUSTOMER',
'KeyState': 'Enabled',
'KeyUsage': 'ENCRYPT_DECRYPT',
'Origin': 'AWS_KMS',
},
'ResponseMetadata': {
'...': '...',
},
}
:return: {
'KeyMetadata': {
'AWSAccountId': 'string',
'KeyId': 'string',
'Arn': 'string',
'CreationDate': datetime(2015, 1, 1),
'Enabled': True|False,
'Description': 'string',
'KeyUsage': 'SIGN_VERIFY'|'ENCRYPT_DECRYPT',
'KeyState': 'Enabled'|'Disabled'|'PendingDeletion'|'PendingImport'|'Unavailable',
'DeletionDate': datetime(2015, 1, 1),
'ValidTo': datetime(2015, 1, 1),
'Origin': 'AWS_KMS'|'EXTERNAL'|'AWS_CLOUDHSM',
'CustomKeyStoreId': 'string',
'CloudHsmClusterId': 'string',
'ExpirationModel': 'KEY_MATERIAL_EXPIRES'|'KEY_MATERIAL_DOES_NOT_EXPIRE',
'KeyManager': 'AWS'|'CUSTOMER',
'CustomerMasterKeySpec': 'RSA_2048'|'RSA_3072'|'RSA_4096'|'ECC_NIST_P256'|'ECC_NIST_P384'|'ECC_NIST_P521'|'ECC_SECG_P256K1'|'SYMMETRIC_DEFAULT',
'EncryptionAlgorithms': [
'SYMMETRIC_DEFAULT'|'RSAES_OAEP_SHA_1'|'RSAES_OAEP_SHA_256',
],
'SigningAlgorithms': [
'RSASSA_PSS_SHA_256'|'RSASSA_PSS_SHA_384'|'RSASSA_PSS_SHA_512'|'RSASSA_PKCS1_V1_5_SHA_256'|'RSASSA_PKCS1_V1_5_SHA_384'|'RSASSA_PKCS1_V1_5_SHA_512'|'ECDSA_SHA_256'|'ECDSA_SHA_384'|'ECDSA_SHA_512',
]
}
}
:returns:
Policy (string) -- The key policy to attach to the CMK.
If you provide a key policy, it must meet the following criteria:
If you don\'t set BypassPolicyLockoutSafetyCheck to true, the key policy must allow the principal that is making the CreateKey request to make a subsequent PutKeyPolicy request on the CMK. This reduces the risk that the CMK becomes unmanageable. For more information, refer to the scenario in the Default Key Policy section of the * AWS Key Management Service Developer Guide * .
Each statement in the key policy must contain one or more principals. The principals in the key policy must exist and be visible to AWS KMS. When you create a new AWS principal (for example, an IAM user or role), you might need to enforce a delay before including the new principal in a key policy because the new principal might not be immediately visible to AWS KMS. For more information, see Changes that I make are not always immediately visible in the AWS Identity and Access Management User Guide .
If you do not provide a key policy, AWS KMS attaches a default key policy to the CMK. For more information, see Default Key Policy in the AWS Key Management Service Developer Guide .
The key policy size quota is 32 kilobytes (32768 bytes).
Description (string) -- A description of the CMK.
Use a description that helps you decide whether the CMK is appropriate for a task.
KeyUsage (string) -- Determines the cryptographic operations for which you can use the CMK. The default value is ENCRYPT_DECRYPT . This parameter is required only for asymmetric CMKs. You can\'t change the KeyUsage value after the CMK is created.
Select only one valid value.
For symmetric CMKs, omit the parameter or specify ENCRYPT_DECRYPT .
For asymmetric CMKs with RSA key material, specify ENCRYPT_DECRYPT or SIGN_VERIFY .
For asymmetric CMKs with ECC key material, specify SIGN_VERIFY .
CustomerMasterKeySpec (string) -- Specifies the type of CMK to create. The default value, SYMMETRIC_DEFAULT , creates a CMK with a 256-bit symmetric key for encryption and decryption. For help choosing a key spec for your CMK, see How to Choose Your CMK Configuration in the AWS Key Management Service Developer Guide .
The CustomerMasterKeySpec determines whether the CMK contains a symmetric key or an asymmetric key pair. It also determines the encryption algorithms or signing algorithms that the CMK supports. You can\'t change the CustomerMasterKeySpec after the CMK is created. To further restrict the algorithms that can be used with the CMK, use a condition key in its key policy or IAM policy. For more information, see kms:EncryptionAlgorithm or kms:SigningAlgorithm in the AWS Key Management Service Developer Guide .
Warning
AWS services that are integrated with AWS KMS use symmetric CMKs to protect your data. These services do not support asymmetric CMKs. For help determining whether a CMK is symmetric or asymmetric, see Identifying Symmetric and Asymmetric CMKs in the AWS Key Management Service Developer Guide .
AWS KMS supports the following key specs for CMKs:
Symmetric key (default)
SYMMETRIC_DEFAULT (AES-256-GCM)
Asymmetric RSA key pairs
RSA_2048
RSA_3072
RSA_4096
Asymmetric NIST-recommended elliptic curve key pairs
ECC_NIST_P256 (secp256r1)
ECC_NIST_P384 (secp384r1)
ECC_NIST_P521 (secp521r1)
Other asymmetric elliptic curve key pairs
ECC_SECG_P256K1 (secp256k1), commonly used for cryptocurrencies.
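The valid KeyUsage values depend on the key spec listed above: RSA key pairs support either usage, ECC key pairs support only signing, and the symmetric default supports only encryption. As a rough client-side illustration of that mapping (this helper is hypothetical and not part of boto3):

```python
# Hypothetical helper (not part of boto3): maps each documented
# CustomerMasterKeySpec to the KeyUsage values it supports.
VALID_KEY_USAGE = {
    'SYMMETRIC_DEFAULT': {'ENCRYPT_DECRYPT'},
    'RSA_2048': {'ENCRYPT_DECRYPT', 'SIGN_VERIFY'},
    'RSA_3072': {'ENCRYPT_DECRYPT', 'SIGN_VERIFY'},
    'RSA_4096': {'ENCRYPT_DECRYPT', 'SIGN_VERIFY'},
    'ECC_NIST_P256': {'SIGN_VERIFY'},
    'ECC_NIST_P384': {'SIGN_VERIFY'},
    'ECC_NIST_P521': {'SIGN_VERIFY'},
    'ECC_SECG_P256K1': {'SIGN_VERIFY'},
}

def key_usage_is_valid(key_spec, key_usage):
    """Return True if KeyUsage is documented as valid for the key spec."""
    return key_usage in VALID_KEY_USAGE.get(key_spec, set())
```

Checking this locally before calling CreateKey avoids a round trip that would fail server-side, since neither value can be changed after the CMK is created.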
Origin (string) -- The source of the key material for the CMK. You cannot change the origin after you create the CMK. The default is AWS_KMS , which means AWS KMS creates the key material.
When the parameter value is EXTERNAL , AWS KMS creates a CMK without key material so that you can import key material from your existing key management infrastructure. For more information about importing key material into AWS KMS, see Importing Key Material in the AWS Key Management Service Developer Guide . This value is valid only for symmetric CMKs.
When the parameter value is AWS_CLOUDHSM , AWS KMS creates the CMK in an AWS KMS custom key store and creates its key material in the associated AWS CloudHSM cluster. You must also use the CustomKeyStoreId parameter to identify the custom key store. This value is valid only for symmetric CMKs.
CustomKeyStoreId (string) -- Creates the CMK in the specified custom key store and the key material in its associated AWS CloudHSM cluster. To create a CMK in a custom key store, you must also specify the Origin parameter with a value of AWS_CLOUDHSM . The AWS CloudHSM cluster that is associated with the custom key store must have at least two active HSMs, each in a different Availability Zone in the Region.
This parameter is valid only for symmetric CMKs. You cannot create an asymmetric CMK in a custom key store.
To find the ID of a custom key store, use the DescribeCustomKeyStores operation.
The response includes the custom key store ID and the ID of the AWS CloudHSM cluster.
This operation is part of the Custom Key Store feature in AWS KMS, which combines the convenience and extensive integration of AWS KMS with the isolation and control of a single-tenant key store.
BypassPolicyLockoutSafetyCheck (boolean) -- A flag to indicate whether to bypass the key policy lockout safety check.
Warning
Setting this value to true increases the risk that the CMK becomes unmanageable. Do not set this value to true indiscriminately.
For more information, refer to the scenario in the Default Key Policy section in the * AWS Key Management Service Developer Guide * .
Use this parameter only when you include a policy in the request and you intend to prevent the principal that is making the request from making a subsequent PutKeyPolicy request on the CMK.
The default value is false.
Tags (list) -- One or more tags. Each tag consists of a tag key and a tag value. Both the tag key and the tag value are required, but the tag value can be an empty (null) string.
When you add tags to an AWS resource, AWS generates a cost allocation report with usage and costs aggregated by tags. For information about adding, changing, deleting and listing tags for CMKs, see Tagging Keys .
Use this parameter to tag the CMK when it is created. To add tags to an existing CMK, use the TagResource operation.
(dict) --A key-value pair. A tag consists of a tag key and a tag value. Tag keys and tag values are both required, but tag values can be empty (null) strings.
For information about the rules that apply to tag keys and tag values, see User-Defined Tag Restrictions in the AWS Billing and Cost Management User Guide .
TagKey (string) -- [REQUIRED]The key of the tag.
TagValue (string) -- [REQUIRED]The value of the tag.
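Tags is a list of TagKey/TagValue dicts rather than a plain mapping. A small convenience sketch for building that shape (the helper name is illustrative, not part of boto3):

```python
def to_kms_tags(tags):
    """Convert a plain dict into the TagKey/TagValue list that CreateKey expects.

    Tag values may be empty strings, as the documentation above notes.
    """
    return [{'TagKey': key, 'TagValue': value} for key, value in tags.items()]
```

For example, to_kms_tags({'CreatedBy': 'ExampleUser'}) produces the same Tags value used in the example above.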
"""
pass
def decrypt(CiphertextBlob=None, EncryptionContext=None, GrantTokens=None, KeyId=None, EncryptionAlgorithm=None):
"""
Decrypts ciphertext that was encrypted by an AWS KMS customer master key (CMK) using any of the AWS KMS encryption operations (such as Encrypt or GenerateDataKey ).
You can use this operation to decrypt ciphertext that was encrypted under a symmetric or asymmetric CMK. When the CMK is asymmetric, you must specify the CMK and the encryption algorithm that was used to encrypt the ciphertext. For information about symmetric and asymmetric CMKs, see Using Symmetric and Asymmetric CMKs in the AWS Key Management Service Developer Guide .
The Decrypt operation also decrypts ciphertext that was encrypted outside of AWS KMS by the public key in an AWS KMS asymmetric CMK. However, it cannot decrypt ciphertext produced by other libraries, such as the AWS Encryption SDK or Amazon S3 client-side encryption . These libraries return a ciphertext format that is incompatible with AWS KMS.
If the ciphertext was encrypted under a symmetric CMK, you do not need to specify the CMK or the encryption algorithm. AWS KMS can get this information from metadata that it adds to the symmetric ciphertext blob. However, if you prefer, you can specify the KeyId to ensure that a particular CMK is used to decrypt the ciphertext. If you specify a different CMK than the one used to encrypt the ciphertext, the Decrypt operation fails.
Whenever possible, use key policies to give users permission to call the Decrypt operation on a particular CMK, instead of using IAM policies. Otherwise, you might create an IAM user policy that gives the user Decrypt permission on all CMKs. This user could decrypt ciphertext that was encrypted by CMKs in other accounts if the key policy for the cross-account CMK permits it. If you must use an IAM policy for Decrypt permissions, limit the user to particular CMKs or particular trusted accounts.
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
:example: response = client.decrypt(
CiphertextBlob=b'bytes',
EncryptionContext={
'string': 'string'
},
GrantTokens=[
'string',
],
KeyId='string',
EncryptionAlgorithm='SYMMETRIC_DEFAULT'|'RSAES_OAEP_SHA_1'|'RSAES_OAEP_SHA_256'
)
:type CiphertextBlob: bytes
:param CiphertextBlob: [REQUIRED]\nCiphertext to be decrypted. The blob includes metadata.\n
:type EncryptionContext: dict
:param EncryptionContext: Specifies the encryption context to use when decrypting the data. An encryption context is valid only for cryptographic operations with a symmetric CMK. The standard asymmetric encryption algorithms that AWS KMS uses do not support an encryption context.\nAn encryption context is a collection of non-secret key-value pairs that represents additional authenticated data. When you use an encryption context to encrypt data, you must specify the same (an exact case-sensitive match) encryption context to decrypt the data. An encryption context is optional when encrypting with a symmetric CMK, but it is highly recommended.\nFor more information, see Encryption Context in the AWS Key Management Service Developer Guide .\n\n(string) --\n(string) --\n\n\n\n
:type GrantTokens: list
:param GrantTokens: A list of grant tokens.\nFor more information, see Grant Tokens in the AWS Key Management Service Developer Guide .\n\n(string) --\n\n
:type KeyId: string
:param KeyId: Specifies the customer master key (CMK) that AWS KMS will use to decrypt the ciphertext. Enter a key ID of the CMK that was used to encrypt the ciphertext.\nIf you specify a KeyId value, the Decrypt operation succeeds only if the specified CMK was used to encrypt the ciphertext.\nThis parameter is required only when the ciphertext was encrypted under an asymmetric CMK. Otherwise, AWS KMS uses the metadata that it adds to the ciphertext blob to determine which CMK was used to encrypt the ciphertext. However, you can use this parameter to ensure that a particular CMK (of any kind) is used to decrypt the ciphertext.\nTo specify a CMK, use its key ID, Amazon Resource Name (ARN), alias name, or alias ARN. When using an alias name, prefix it with 'alias/' .\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\nAlias name: alias/ExampleAlias\nAlias ARN: arn:aws:kms:us-east-2:111122223333:alias/ExampleAlias\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey . To get the alias name and alias ARN, use ListAliases .\n
:type EncryptionAlgorithm: string
:param EncryptionAlgorithm: Specifies the encryption algorithm that will be used to decrypt the ciphertext. Specify the same algorithm that was used to encrypt the data. If you specify a different algorithm, the Decrypt operation fails.\nThis parameter is required only when the ciphertext was encrypted under an asymmetric CMK. The default value, SYMMETRIC_DEFAULT , represents the only supported algorithm that is valid for symmetric CMKs.\n
:rtype: dict
Returns
Response Syntax
{
'KeyId': 'string',
'Plaintext': b'bytes',
'EncryptionAlgorithm': 'SYMMETRIC_DEFAULT'|'RSAES_OAEP_SHA_1'|'RSAES_OAEP_SHA_256'
}
Response Structure
(dict) --
KeyId (string) --
The ARN of the customer master key that was used to perform the decryption.
Plaintext (bytes) --
Decrypted plaintext data. When you use the HTTP API or the AWS CLI, the value is Base64-encoded. Otherwise, it is not Base64-encoded.
EncryptionAlgorithm (string) --
The encryption algorithm that was used to decrypt the ciphertext.
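As the Plaintext field notes, the value is Base64-encoded when it arrives via the HTTP API or the AWS CLI, while boto3 returns raw bytes. If you are handling a CLI-style response by hand, decoding is a one-liner (the sample value below is illustrative):

```python
import base64

# A CLI-style response carries Plaintext as a Base64 string; boto3
# would return the raw bytes directly, with no decoding needed.
cli_plaintext = base64.b64encode(b'example secret').decode('ascii')
raw = base64.b64decode(cli_plaintext)
assert raw == b'example secret'
```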
Exceptions
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.DisabledException
KMS.Client.exceptions.InvalidCiphertextException
KMS.Client.exceptions.KeyUnavailableException
KMS.Client.exceptions.IncorrectKeyException
KMS.Client.exceptions.InvalidKeyUsageException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.InvalidGrantTokenException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
Examples
The following example decrypts data that was encrypted with a customer master key (CMK) in AWS KMS.
response = client.decrypt(
# The encrypted data (ciphertext).
CiphertextBlob='<binary data>',
)
print(response)
Expected Output:
{
# The Amazon Resource Name (ARN) of the CMK that was used to decrypt the data.
'KeyId': 'arn:aws:kms:us-west-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab',
# The decrypted (plaintext) data.
'Plaintext': '<binary data>',
'ResponseMetadata': {
'...': '...',
},
}
:return: {
'KeyId': 'string',
'Plaintext': b'bytes',
'EncryptionAlgorithm': 'SYMMETRIC_DEFAULT'|'RSAES_OAEP_SHA_1'|'RSAES_OAEP_SHA_256'
}
:returns:
CiphertextBlob (bytes) -- [REQUIRED]
Ciphertext to be decrypted. The blob includes metadata.
EncryptionContext (dict) -- Specifies the encryption context to use when decrypting the data. An encryption context is valid only for cryptographic operations with a symmetric CMK. The standard asymmetric encryption algorithms that AWS KMS uses do not support an encryption context.
An encryption context is a collection of non-secret key-value pairs that represents additional authenticated data. When you use an encryption context to encrypt data, you must specify the same (an exact case-sensitive match) encryption context to decrypt the data. An encryption context is optional when encrypting with a symmetric CMK, but it is highly recommended.
For more information, see Encryption Context in the AWS Key Management Service Developer Guide .
(string) --
(string) --
GrantTokens (list) -- A list of grant tokens.
For more information, see Grant Tokens in the AWS Key Management Service Developer Guide .
(string) --
KeyId (string) -- Specifies the customer master key (CMK) that AWS KMS will use to decrypt the ciphertext. Enter a key ID of the CMK that was used to encrypt the ciphertext.
If you specify a KeyId value, the Decrypt operation succeeds only if the specified CMK was used to encrypt the ciphertext.
This parameter is required only when the ciphertext was encrypted under an asymmetric CMK. Otherwise, AWS KMS uses the metadata that it adds to the ciphertext blob to determine which CMK was used to encrypt the ciphertext. However, you can use this parameter to ensure that a particular CMK (of any kind) is used to decrypt the ciphertext.
To specify a CMK, use its key ID, Amazon Resource Name (ARN), alias name, or alias ARN. When using an alias name, prefix it with "alias/" .
For example:
Key ID: 1234abcd-12ab-34cd-56ef-1234567890ab
Key ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab
Alias name: alias/ExampleAlias
Alias ARN: arn:aws:kms:us-east-2:111122223333:alias/ExampleAlias
To get the key ID and key ARN for a CMK, use ListKeys or DescribeKey . To get the alias name and alias ARN, use ListAliases .
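The four identifier formats listed above can be told apart mechanically by their prefixes. A hypothetical helper (for illustration only; boto3 accepts any of these formats directly in KeyId):

```python
def classify_key_identifier(identifier):
    """Classify a KMS key identifier into one of the four documented formats.

    Hypothetical helper; 'alias ARN' and 'key ARN' are distinguished by
    the resource segment of the ARN, the rest by their prefixes.
    """
    if identifier.startswith('arn:'):
        return 'alias ARN' if ':alias/' in identifier else 'key ARN'
    if identifier.startswith('alias/'):
        return 'alias name'
    return 'key ID'
```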
EncryptionAlgorithm (string) -- Specifies the encryption algorithm that will be used to decrypt the ciphertext. Specify the same algorithm that was used to encrypt the data. If you specify a different algorithm, the Decrypt operation fails.
This parameter is required only when the ciphertext was encrypted under an asymmetric CMK. The default value, SYMMETRIC_DEFAULT , represents the only supported algorithm that is valid for symmetric CMKs.
"""
pass
def delete_alias(AliasName=None):
"""
Deletes the specified alias. You cannot perform this operation on an alias in a different AWS account.
Because an alias is not a property of a CMK, you can delete and change the aliases of a CMK without affecting the CMK. Also, aliases do not appear in the response from the DescribeKey operation. To get the aliases of all CMKs, use the ListAliases operation.
Each CMK can have multiple aliases. To change the alias of a CMK, use DeleteAlias to delete the current alias and CreateAlias to create a new alias. To associate an existing alias with a different customer master key (CMK), call UpdateAlias .
See also: AWS API Documentation
Exceptions
Examples
The following example deletes the specified alias.
Expected Output:
:example: response = client.delete_alias(
AliasName='string'
)
:type AliasName: string
:param AliasName: [REQUIRED]\nThe alias to be deleted. The alias name must begin with alias/ followed by the alias name, such as alias/ExampleAlias .\n
:return: response = client.delete_alias(
# The alias to delete.
AliasName='alias/ExampleAlias',
)
print(response)
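Since AliasName must begin with the alias/ prefix, a client-side guard before calling delete_alias can fail fast with a clearer message (this validator is illustrative, not part of boto3):

```python
def require_alias_prefix(alias_name):
    """Raise ValueError unless the name has the documented 'alias/' prefix."""
    if not alias_name.startswith('alias/'):
        raise ValueError(
            "AliasName must begin with 'alias/', e.g. 'alias/ExampleAlias'"
        )
    return alias_name
```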
"""
pass
def delete_custom_key_store(CustomKeyStoreId=None):
"""
Deletes a custom key store . This operation does not delete the AWS CloudHSM cluster that is associated with the custom key store, or affect any users or keys in the cluster.
The custom key store that you delete cannot contain any AWS KMS customer master keys (CMKs) . Before deleting the key store, verify that you will never need to use any of the CMKs in the key store for any cryptographic operations. Then, use ScheduleKeyDeletion to delete the CMKs from the key store. When the scheduled waiting period expires, the ScheduleKeyDeletion operation deletes the CMKs. Then it makes a best effort to delete the key material from the associated cluster. However, you might need to manually delete the orphaned key material from the cluster and its backups.
After all CMKs are deleted from AWS KMS, use DisconnectCustomKeyStore to disconnect the key store from AWS KMS. Then, you can delete the custom key store.
Instead of deleting the custom key store, consider using DisconnectCustomKeyStore to disconnect it from AWS KMS. While the key store is disconnected, you cannot create or use the CMKs in the key store. But, you do not need to delete CMKs and you can reconnect a disconnected custom key store at any time.
If the operation succeeds, it returns a JSON object with no properties.
This operation is part of the Custom Key Store feature in AWS KMS, which combines the convenience and extensive integration of AWS KMS with the isolation and control of a single-tenant key store.
See also: AWS API Documentation
Exceptions
:example: response = client.delete_custom_key_store(
CustomKeyStoreId='string'
)
:type CustomKeyStoreId: string
:param CustomKeyStoreId: [REQUIRED]\nEnter the ID of the custom key store you want to delete. To find the ID of a custom key store, use the DescribeCustomKeyStores operation.\n
:rtype: dict
Returns
Response Syntax
{}
Response Structure
(dict) --
Exceptions
KMS.Client.exceptions.CustomKeyStoreHasCMKsException
KMS.Client.exceptions.CustomKeyStoreInvalidStateException
KMS.Client.exceptions.CustomKeyStoreNotFoundException
KMS.Client.exceptions.KMSInternalException
:return: {}
:returns:
KMS.Client.exceptions.CustomKeyStoreHasCMKsException
KMS.Client.exceptions.CustomKeyStoreInvalidStateException
KMS.Client.exceptions.CustomKeyStoreNotFoundException
KMS.Client.exceptions.KMSInternalException
"""
pass
def delete_imported_key_material(KeyId=None):
"""
Deletes key material that you previously imported. This operation makes the specified customer master key (CMK) unusable. For more information about importing key material into AWS KMS, see Importing Key Material in the AWS Key Management Service Developer Guide . You cannot perform this operation on a CMK in a different AWS account.
When the specified CMK is in the PendingDeletion state, this operation does not change the CMK\'s state. Otherwise, it changes the CMK\'s state to PendingImport .
After you delete key material, you can use ImportKeyMaterial to reimport the same key material into the CMK.
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
Examples
The following example deletes the imported key material from the specified customer master key (CMK).
Expected Output:
:example: response = client.delete_imported_key_material(
KeyId='string'
)
:type KeyId: string
:param KeyId: [REQUIRED]\nIdentifies the CMK from which you are deleting imported key material. The Origin of the CMK must be EXTERNAL .\nSpecify the key ID or the Amazon Resource Name (ARN) of the CMK.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .\n
:return: response = client.delete_imported_key_material(
# The identifier of the CMK whose imported key material you are deleting. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
KeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
)
print(response)
:returns:
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.UnsupportedOperationException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
"""
pass
def describe_custom_key_stores(CustomKeyStoreId=None, CustomKeyStoreName=None, Limit=None, Marker=None):
"""
Gets information about custom key stores in the account and region.
This operation is part of the Custom Key Store feature in AWS KMS, which combines the convenience and extensive integration of AWS KMS with the isolation and control of a single-tenant key store.
By default, this operation returns information about all custom key stores in the account and region. To get only information about a particular custom key store, use either the CustomKeyStoreName or CustomKeyStoreId parameter (but not both).
To determine whether the custom key store is connected to its AWS CloudHSM cluster, use the ConnectionState element in the response. If an attempt to connect the custom key store failed, the ConnectionState value is FAILED and the ConnectionErrorCode element in the response indicates the cause of the failure. For help interpreting the ConnectionErrorCode , see CustomKeyStoresListEntry .
Custom key stores have a DISCONNECTED connection state if the key store has never been connected or you use the DisconnectCustomKeyStore operation to disconnect it. If your custom key store state is CONNECTED but you are having trouble using it, make sure that its associated AWS CloudHSM cluster is active and contains the minimum number of HSMs required for the operation, if any.
For help repairing your custom key store, see the Troubleshooting Custom Key Stores topic in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
:example: response = client.describe_custom_key_stores(
CustomKeyStoreId='string',
CustomKeyStoreName='string',
Limit=123,
Marker='string'
)
:type CustomKeyStoreId: string
:param CustomKeyStoreId: Gets only information about the specified custom key store. Enter the key store ID.\nBy default, this operation gets information about all custom key stores in the account and region. To limit the output to a particular custom key store, you can use either the CustomKeyStoreId or CustomKeyStoreName parameter, but not both.\n
:type CustomKeyStoreName: string
:param CustomKeyStoreName: Gets only information about the specified custom key store. Enter the friendly name of the custom key store.\nBy default, this operation gets information about all custom key stores in the account and region. To limit the output to a particular custom key store, you can use either the CustomKeyStoreId or CustomKeyStoreName parameter, but not both.\n
:type Limit: integer
:param Limit: Use this parameter to specify the maximum number of items to return. When this value is present, AWS KMS does not return more than the specified number of items, but it might return fewer.
:type Marker: string
:param Marker: Use this parameter in a subsequent request after you receive a response with truncated results. Set it to the value of NextMarker from the truncated response you just received.
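Marker and NextMarker follow the standard KMS truncated-response pattern described above. A generic pagination loop, sketched against a stand-in fetch function (describe_custom_key_stores itself would need a live client and credentials):

```python
def collect_all_key_stores(fetch_page, limit=10):
    """Accumulate CustomKeyStores entries across truncated responses.

    fetch_page stands in for client.describe_custom_key_stores; it must
    accept Limit and an optional Marker keyword and return the documented
    dict with 'CustomKeyStores', 'Truncated', and (when truncated)
    'NextMarker'.
    """
    stores = []
    kwargs = {'Limit': limit}
    while True:
        page = fetch_page(**kwargs)
        stores.extend(page['CustomKeyStores'])
        if not page.get('Truncated'):
            return stores
        # Feed NextMarker back in as Marker for the next request.
        kwargs['Marker'] = page['NextMarker']
```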
:rtype: dict
Returns
Response Syntax
{
'CustomKeyStores': [
{
'CustomKeyStoreId': 'string',
'CustomKeyStoreName': 'string',
'CloudHsmClusterId': 'string',
'TrustAnchorCertificate': 'string',
'ConnectionState': 'CONNECTED'|'CONNECTING'|'FAILED'|'DISCONNECTED'|'DISCONNECTING',
'ConnectionErrorCode': 'INVALID_CREDENTIALS'|'CLUSTER_NOT_FOUND'|'NETWORK_ERRORS'|'INTERNAL_ERROR'|'INSUFFICIENT_CLOUDHSM_HSMS'|'USER_LOCKED_OUT'|'USER_NOT_FOUND'|'USER_LOGGED_IN'|'SUBNET_NOT_FOUND',
'CreationDate': datetime(2015, 1, 1)
},
],
'NextMarker': 'string',
'Truncated': True|False
}
Response Structure
(dict) --
CustomKeyStores (list) --
Contains metadata about each custom key store.
(dict) --
Contains information about each custom key store in the custom key store list.
CustomKeyStoreId (string) --
A unique identifier for the custom key store.
CustomKeyStoreName (string) --
The user-specified friendly name for the custom key store.
CloudHsmClusterId (string) --
A unique identifier for the AWS CloudHSM cluster that is associated with the custom key store.
TrustAnchorCertificate (string) --
The trust anchor certificate of the associated AWS CloudHSM cluster. When you initialize the cluster , you create this certificate and save it in the customerCA.crt file.
ConnectionState (string) --
Indicates whether the custom key store is connected to its AWS CloudHSM cluster.
You can create and use CMKs in your custom key stores only when its connection state is CONNECTED .
The value is DISCONNECTED if the key store has never been connected or you use the DisconnectCustomKeyStore operation to disconnect it. If the value is CONNECTED but you are having trouble using the custom key store, make sure that its associated AWS CloudHSM cluster is active and contains at least one active HSM.
A value of FAILED indicates that an attempt to connect was unsuccessful. The ConnectionErrorCode field in the response indicates the cause of the failure. For help resolving a connection failure, see Troubleshooting a Custom Key Store in the AWS Key Management Service Developer Guide .
ConnectionErrorCode (string) --
Describes the connection error. This field appears in the response only when the ConnectionState is FAILED . For help resolving these errors, see How to Fix a Connection Failure in AWS Key Management Service Developer Guide .
Valid values are:
CLUSTER_NOT_FOUND - AWS KMS cannot find the AWS CloudHSM cluster with the specified cluster ID.
INSUFFICIENT_CLOUDHSM_HSMS - The associated AWS CloudHSM cluster does not contain any active HSMs. To connect a custom key store to its AWS CloudHSM cluster, the cluster must contain at least one active HSM.
INTERNAL_ERROR - AWS KMS could not complete the request due to an internal error. Retry the request. For ConnectCustomKeyStore requests, disconnect the custom key store before trying to connect again.
INVALID_CREDENTIALS - AWS KMS does not have the correct password for the kmsuser crypto user in the AWS CloudHSM cluster. Before you can connect your custom key store to its AWS CloudHSM cluster, you must change the kmsuser account password and update the key store password value for the custom key store.
NETWORK_ERRORS - Network errors are preventing AWS KMS from connecting to the custom key store.
SUBNET_NOT_FOUND - A subnet in the AWS CloudHSM cluster configuration was deleted. If AWS KMS cannot find all of the subnets that were configured for the cluster when the custom key store was created, attempts to connect fail. To fix this error, create a cluster from a backup and associate it with your custom key store. This process includes selecting a VPC and subnets. For details, see How to Fix a Connection Failure in the AWS Key Management Service Developer Guide .
USER_LOCKED_OUT - The kmsuser CU account is locked out of the associated AWS CloudHSM cluster due to too many failed password attempts. Before you can connect your custom key store to its AWS CloudHSM cluster, you must change the kmsuser account password and update the key store password value for the custom key store.
USER_LOGGED_IN - The kmsuser CU account is logged into the associated AWS CloudHSM cluster. This prevents AWS KMS from rotating the kmsuser account password and logging into the cluster. Before you can connect your custom key store to its AWS CloudHSM cluster, you must log the kmsuser CU out of the cluster. If you changed the kmsuser password to log into the cluster, you must also update the key store password value for the custom key store. For help, see How to Log Out and Reconnect in the AWS Key Management Service Developer Guide .
USER_NOT_FOUND - AWS KMS cannot find a kmsuser CU account in the associated AWS CloudHSM cluster. Before you can connect your custom key store to its AWS CloudHSM cluster, you must create a kmsuser CU account in the cluster, and then update the key store password value for the custom key store.
CreationDate (datetime) --
The date and time when the custom key store was created.
NextMarker (string) --
When Truncated is true, this element is present and contains the value to use for the Marker parameter in a subsequent request.
Truncated (boolean) --
A flag that indicates whether there are more items in the list. When this value is true, the list in this response is truncated. To get more items, pass the value of the NextMarker element in this response to the Marker parameter in a subsequent request.
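The Truncated /NextMarker contract described above can be sketched as a small pagination helper. The helper name is ours, not part of boto3; it works with any KMS.Client-like object:

```python
# Illustrative helper (not part of boto3): collect all custom key stores by
# following the Truncated/NextMarker pagination contract described above.
def list_all_custom_key_stores(client, limit=10):
    stores = []
    kwargs = {'Limit': limit}
    while True:
        response = client.describe_custom_key_stores(**kwargs)
        stores.extend(response['CustomKeyStores'])
        if not response.get('Truncated'):
            return stores
        # Feed NextMarker back in as the Marker for the next request.
        kwargs['Marker'] = response['NextMarker']
```

Each iteration requests the next page until Truncated is false, at which point NextMarker is absent and the accumulated list is returned.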
Exceptions
KMS.Client.exceptions.CustomKeyStoreNotFoundException
KMS.Client.exceptions.KMSInternalException
:return: {
'CustomKeyStores': [
{
'CustomKeyStoreId': 'string',
'CustomKeyStoreName': 'string',
'CloudHsmClusterId': 'string',
'TrustAnchorCertificate': 'string',
'ConnectionState': 'CONNECTED'|'CONNECTING'|'FAILED'|'DISCONNECTED'|'DISCONNECTING',
'ConnectionErrorCode': 'INVALID_CREDENTIALS'|'CLUSTER_NOT_FOUND'|'NETWORK_ERRORS'|'INTERNAL_ERROR'|'INSUFFICIENT_CLOUDHSM_HSMS'|'USER_LOCKED_OUT'|'USER_NOT_FOUND'|'USER_LOGGED_IN'|'SUBNET_NOT_FOUND',
'CreationDate': datetime(2015, 1, 1)
},
],
'NextMarker': 'string',
'Truncated': True|False
}
:returns:
CLUSTER_NOT_FOUND - AWS KMS cannot find the AWS CloudHSM cluster with the specified cluster ID.
INSUFFICIENT_CLOUDHSM_HSMS - The associated AWS CloudHSM cluster does not contain any active HSMs. To connect a custom key store to its AWS CloudHSM cluster, the cluster must contain at least one active HSM.
INTERNAL_ERROR - AWS KMS could not complete the request due to an internal error. Retry the request. For ConnectCustomKeyStore requests, disconnect the custom key store before trying to connect again.
INVALID_CREDENTIALS - AWS KMS does not have the correct password for the kmsuser crypto user in the AWS CloudHSM cluster. Before you can connect your custom key store to its AWS CloudHSM cluster, you must change the kmsuser account password and update the key store password value for the custom key store.
NETWORK_ERRORS - Network errors are preventing AWS KMS from connecting to the custom key store.
SUBNET_NOT_FOUND - A subnet in the AWS CloudHSM cluster configuration was deleted. If AWS KMS cannot find all of the subnets that were configured for the cluster when the custom key store was created, attempts to connect fail. To fix this error, create a cluster from a backup and associate it with your custom key store. This process includes selecting a VPC and subnets. For details, see How to Fix a Connection Failure in the AWS Key Management Service Developer Guide .
USER_LOCKED_OUT - The kmsuser CU account is locked out of the associated AWS CloudHSM cluster due to too many failed password attempts. Before you can connect your custom key store to its AWS CloudHSM cluster, you must change the kmsuser account password and update the key store password value for the custom key store.
USER_LOGGED_IN - The kmsuser CU account is logged into the associated AWS CloudHSM cluster. This prevents AWS KMS from rotating the kmsuser account password and logging into the cluster. Before you can connect your custom key store to its AWS CloudHSM cluster, you must log the kmsuser CU out of the cluster. If you changed the kmsuser password to log into the cluster, you must also update the key store password value for the custom key store. For help, see How to Log Out and Reconnect in the AWS Key Management Service Developer Guide .
USER_NOT_FOUND - AWS KMS cannot find a kmsuser CU account in the associated AWS CloudHSM cluster. Before you can connect your custom key store to its AWS CloudHSM cluster, you must create a kmsuser CU account in the cluster, and then update the key store password value for the custom key store.
"""
pass
def describe_key(KeyId=None, GrantTokens=None):
"""
Provides detailed information about a customer master key (CMK). You can run DescribeKey on a customer managed CMK or an AWS managed CMK .
This detailed information includes the key ARN, creation date (and deletion date, if applicable), the key state, and the origin and expiration date (if any) of the key material. For CMKs in custom key stores, it includes information about the custom key store, such as the key store ID and the AWS CloudHSM cluster ID. It includes fields, like KeySpec , that help you distinguish symmetric from asymmetric CMKs. It also provides information that is particularly important to asymmetric CMKs, such as the key usage (encryption or signing) and the encryption algorithms or signing algorithms that the CMK supports.
If you call the DescribeKey operation on a predefined AWS alias , that is, an AWS alias with no key ID, AWS KMS creates an AWS managed CMK . Then, it associates the alias with the new CMK, and returns the KeyId and Arn of the new CMK in the response.
To perform this operation on a CMK in a different AWS account, specify the key ARN or alias ARN in the value of the KeyId parameter.
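As a sketch of how the response fields distinguish key types, a small helper (the name summarize_cmk is ours, not part of boto3) can classify a CMK from the KeyMetadata dict that DescribeKey returns: symmetric CMKs report a CustomerMasterKeySpec of SYMMETRIC_DEFAULT, everything else is asymmetric.

```python
# Illustrative helper: classify a CMK from the KeyMetadata dict returned by
# DescribeKey. SYMMETRIC_DEFAULT marks a symmetric CMK; RSA_* and ECC_* specs
# mark asymmetric CMKs.
def summarize_cmk(key_metadata):
    spec = key_metadata.get('CustomerMasterKeySpec', 'SYMMETRIC_DEFAULT')
    kind = 'symmetric' if spec == 'SYMMETRIC_DEFAULT' else 'asymmetric'
    return {
        'kind': kind,
        'usage': key_metadata.get('KeyUsage'),
        'state': key_metadata.get('KeyState'),
    }
```

With a real client this would be called as summarize_cmk(client.describe_key(KeyId='alias/ExampleAlias')['KeyMetadata']).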
See also: AWS API Documentation
Exceptions
Examples
The following example returns information (metadata) about the specified CMK.
Expected Output:
:example: response = client.describe_key(
KeyId='string',
GrantTokens=[
'string',
]
)
:type KeyId: string
:param KeyId: [REQUIRED]\nDescribes the specified customer master key (CMK).\nIf you specify a predefined AWS alias (an AWS alias with no key ID), KMS associates the alias with an AWS managed CMK and returns its KeyId and Arn in the response.\nTo specify a CMK, use its key ID, Amazon Resource Name (ARN), alias name, or alias ARN. When using an alias name, prefix it with 'alias/' . To specify a CMK in a different AWS account, you must use the key ARN or alias ARN.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\nAlias name: alias/ExampleAlias\nAlias ARN: arn:aws:kms:us-east-2:111122223333:alias/ExampleAlias\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey . To get the alias name and alias ARN, use ListAliases .\n
:type GrantTokens: list
:param GrantTokens: A list of grant tokens.\nFor more information, see Grant Tokens in the AWS Key Management Service Developer Guide .\n\n(string) --\n\n
:rtype: dict
Response Syntax
{
'KeyMetadata': {
'AWSAccountId': 'string',
'KeyId': 'string',
'Arn': 'string',
'CreationDate': datetime(2015, 1, 1),
'Enabled': True|False,
'Description': 'string',
'KeyUsage': 'SIGN_VERIFY'|'ENCRYPT_DECRYPT',
'KeyState': 'Enabled'|'Disabled'|'PendingDeletion'|'PendingImport'|'Unavailable',
'DeletionDate': datetime(2015, 1, 1),
'ValidTo': datetime(2015, 1, 1),
'Origin': 'AWS_KMS'|'EXTERNAL'|'AWS_CLOUDHSM',
'CustomKeyStoreId': 'string',
'CloudHsmClusterId': 'string',
'ExpirationModel': 'KEY_MATERIAL_EXPIRES'|'KEY_MATERIAL_DOES_NOT_EXPIRE',
'KeyManager': 'AWS'|'CUSTOMER',
'CustomerMasterKeySpec': 'RSA_2048'|'RSA_3072'|'RSA_4096'|'ECC_NIST_P256'|'ECC_NIST_P384'|'ECC_NIST_P521'|'ECC_SECG_P256K1'|'SYMMETRIC_DEFAULT',
'EncryptionAlgorithms': [
'SYMMETRIC_DEFAULT'|'RSAES_OAEP_SHA_1'|'RSAES_OAEP_SHA_256',
],
'SigningAlgorithms': [
'RSASSA_PSS_SHA_256'|'RSASSA_PSS_SHA_384'|'RSASSA_PSS_SHA_512'|'RSASSA_PKCS1_V1_5_SHA_256'|'RSASSA_PKCS1_V1_5_SHA_384'|'RSASSA_PKCS1_V1_5_SHA_512'|'ECDSA_SHA_256'|'ECDSA_SHA_384'|'ECDSA_SHA_512',
]
}
}
Response Structure
(dict) --
KeyMetadata (dict) --
Metadata associated with the key.
AWSAccountId (string) --
The twelve-digit account ID of the AWS account that owns the CMK.
KeyId (string) --
The globally unique identifier for the CMK.
Arn (string) --
The Amazon Resource Name (ARN) of the CMK. For examples, see AWS Key Management Service (AWS KMS) in the Example ARNs section of the AWS General Reference .
CreationDate (datetime) --
The date and time when the CMK was created.
Enabled (boolean) --
Specifies whether the CMK is enabled. When KeyState is Enabled this value is true, otherwise it is false.
Description (string) --
The description of the CMK.
KeyUsage (string) --
The cryptographic operations for which you can use the CMK.
KeyState (string) --
The state of the CMK.
For more information about how key state affects the use of a CMK, see How Key State Affects the Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
DeletionDate (datetime) --
The date and time after which AWS KMS deletes the CMK. This value is present only when KeyState is PendingDeletion .
ValidTo (datetime) --
The time at which the imported key material expires. When the key material expires, AWS KMS deletes the key material and the CMK becomes unusable. This value is present only for CMKs whose Origin is EXTERNAL and whose ExpirationModel is KEY_MATERIAL_EXPIRES , otherwise this value is omitted.
Origin (string) --
The source of the CMK\'s key material. When this value is AWS_KMS , AWS KMS created the key material. When this value is EXTERNAL , the key material was imported from your existing key management infrastructure or the CMK lacks key material. When this value is AWS_CLOUDHSM , the key material was created in the AWS CloudHSM cluster associated with a custom key store.
CustomKeyStoreId (string) --
A unique identifier for the custom key store that contains the CMK. This value is present only when the CMK is created in a custom key store.
CloudHsmClusterId (string) --
The cluster ID of the AWS CloudHSM cluster that contains the key material for the CMK. When you create a CMK in a custom key store , AWS KMS creates the key material for the CMK in the associated AWS CloudHSM cluster. This value is present only when the CMK is created in a custom key store.
ExpirationModel (string) --
Specifies whether the CMK\'s key material expires. This value is present only when Origin is EXTERNAL , otherwise this value is omitted.
KeyManager (string) --
The manager of the CMK. CMKs in your AWS account are either customer managed or AWS managed. For more information about the difference, see Customer Master Keys in the AWS Key Management Service Developer Guide .
CustomerMasterKeySpec (string) --
Describes the type of key material in the CMK.
EncryptionAlgorithms (list) --
A list of encryption algorithms that the CMK supports. You cannot use the CMK with other encryption algorithms within AWS KMS.
This field appears only when the KeyUsage of the CMK is ENCRYPT_DECRYPT .
(string) --
SigningAlgorithms (list) --
A list of signing algorithms that the CMK supports. You cannot use the CMK with other signing algorithms within AWS KMS.
This field appears only when the KeyUsage of the CMK is SIGN_VERIFY .
(string) --
Exceptions
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.KMSInternalException
Examples
The following example returns information (metadata) about the specified CMK.
response = client.describe_key(
# The identifier of the CMK that you want information about. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
KeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
)
print(response)
Expected Output:
{
# An object that contains information about the specified CMK.
'KeyMetadata': {
'AWSAccountId': '111122223333',
'Arn': 'arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab',
'CreationDate': datetime(2017, 7, 5, 14, 4, 55),
'Description': '',
'Enabled': True,
'KeyId': '1234abcd-12ab-34cd-56ef-1234567890ab',
'KeyManager': 'CUSTOMER',
'KeyState': 'Enabled',
'KeyUsage': 'ENCRYPT_DECRYPT',
'Origin': 'AWS_KMS',
},
'ResponseMetadata': {
'...': '...',
},
}
:return: {
'KeyMetadata': {
'AWSAccountId': 'string',
'KeyId': 'string',
'Arn': 'string',
'CreationDate': datetime(2015, 1, 1),
'Enabled': True|False,
'Description': 'string',
'KeyUsage': 'SIGN_VERIFY'|'ENCRYPT_DECRYPT',
'KeyState': 'Enabled'|'Disabled'|'PendingDeletion'|'PendingImport'|'Unavailable',
'DeletionDate': datetime(2015, 1, 1),
'ValidTo': datetime(2015, 1, 1),
'Origin': 'AWS_KMS'|'EXTERNAL'|'AWS_CLOUDHSM',
'CustomKeyStoreId': 'string',
'CloudHsmClusterId': 'string',
'ExpirationModel': 'KEY_MATERIAL_EXPIRES'|'KEY_MATERIAL_DOES_NOT_EXPIRE',
'KeyManager': 'AWS'|'CUSTOMER',
'CustomerMasterKeySpec': 'RSA_2048'|'RSA_3072'|'RSA_4096'|'ECC_NIST_P256'|'ECC_NIST_P384'|'ECC_NIST_P521'|'ECC_SECG_P256K1'|'SYMMETRIC_DEFAULT',
'EncryptionAlgorithms': [
'SYMMETRIC_DEFAULT'|'RSAES_OAEP_SHA_1'|'RSAES_OAEP_SHA_256',
],
'SigningAlgorithms': [
'RSASSA_PSS_SHA_256'|'RSASSA_PSS_SHA_384'|'RSASSA_PSS_SHA_512'|'RSASSA_PKCS1_V1_5_SHA_256'|'RSASSA_PKCS1_V1_5_SHA_384'|'RSASSA_PKCS1_V1_5_SHA_512'|'ECDSA_SHA_256'|'ECDSA_SHA_384'|'ECDSA_SHA_512',
]
}
}
:returns:
KeyId (string) -- [REQUIRED]
Describes the specified customer master key (CMK).
If you specify a predefined AWS alias (an AWS alias with no key ID), KMS associates the alias with an AWS managed CMK and returns its KeyId and Arn in the response.
To specify a CMK, use its key ID, Amazon Resource Name (ARN), alias name, or alias ARN. When using an alias name, prefix it with "alias/" . To specify a CMK in a different AWS account, you must use the key ARN or alias ARN.
For example:
Key ID: 1234abcd-12ab-34cd-56ef-1234567890ab
Key ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab
Alias name: alias/ExampleAlias
Alias ARN: arn:aws:kms:us-east-2:111122223333:alias/ExampleAlias
To get the key ID and key ARN for a CMK, use ListKeys or DescribeKey . To get the alias name and alias ARN, use ListAliases .
GrantTokens (list) -- A list of grant tokens.
For more information, see Grant Tokens in the AWS Key Management Service Developer Guide .
(string) --
"""
pass
def disable_key(KeyId=None):
"""
Sets the state of a customer master key (CMK) to disabled, thereby preventing its use for cryptographic operations. You cannot perform this operation on a CMK in a different AWS account.
For more information about how key state affects the use of a CMK, see How Key State Affects the Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
Examples
The following example disables the specified CMK.
Expected Output:
:example: response = client.disable_key(
KeyId='string'
)
:type KeyId: string
:param KeyId: [REQUIRED]\nA unique identifier for the customer master key (CMK).\nSpecify the key ID or the Amazon Resource Name (ARN) of the CMK.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .\n
:return: response = client.disable_key(
# The identifier of the CMK to disable. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
KeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
)
print(response)
:returns:
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
"""
pass
def disable_key_rotation(KeyId=None):
"""
Disables automatic rotation of the key material for the specified symmetric customer master key (CMK).
You cannot enable automatic rotation of asymmetric CMKs, CMKs with imported key material, or CMKs in a custom key store . You cannot perform this operation on a CMK in a different AWS account.
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
Examples
The following example disables automatic annual rotation of the key material for the specified CMK.
Expected Output:
:example: response = client.disable_key_rotation(
KeyId='string'
)
:type KeyId: string
:param KeyId: [REQUIRED]\nIdentifies a symmetric customer master key (CMK). You cannot enable automatic rotation of asymmetric CMKs , CMKs with imported key material , or CMKs in a custom key store .\nSpecify the key ID or the Amazon Resource Name (ARN) of the CMK.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .\n
:return: response = client.disable_key_rotation(
# The identifier of the CMK whose key material will no longer be rotated. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
KeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
)
print(response)
:returns:
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.DisabledException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
KMS.Client.exceptions.UnsupportedOperationException
"""
pass
def disconnect_custom_key_store(CustomKeyStoreId=None):
"""
Disconnects the custom key store from its associated AWS CloudHSM cluster. While a custom key store is disconnected, you can manage the custom key store and its customer master keys (CMKs), but you cannot create or use CMKs in the custom key store. You can reconnect the custom key store at any time.
To find the connection state of a custom key store, use the DescribeCustomKeyStores operation. To reconnect a custom key store, use the ConnectCustomKeyStore operation.
If the operation succeeds, it returns a JSON object with no properties.
This operation is part of the Custom Key Store feature in AWS KMS, which combines the convenience and extensive integration of AWS KMS with the isolation and control of a single-tenant key store.
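The disconnect-then-verify workflow described above can be sketched as follows. The helper name and polling approach are ours; it simply calls DisconnectCustomKeyStore and then checks ConnectionState via DescribeCustomKeyStores until the store reports DISCONNECTED:

```python
import time

# Illustrative helper (not part of boto3): disconnect a custom key store and
# poll DescribeCustomKeyStores until ConnectionState reaches DISCONNECTED.
def disconnect_and_wait(kms, store_id, attempts=10, delay=0):
    kms.disconnect_custom_key_store(CustomKeyStoreId=store_id)
    for _ in range(attempts):
        resp = kms.describe_custom_key_stores(CustomKeyStoreId=store_id)
        state = resp['CustomKeyStores'][0]['ConnectionState']
        if state == 'DISCONNECTED':
            return state
        time.sleep(delay)  # a real caller would pass a nonzero delay
    raise TimeoutError('key store %s did not disconnect' % store_id)
```

To reconnect later, call connect_custom_key_store with the same CustomKeyStoreId.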
See also: AWS API Documentation
Exceptions
:example: response = client.disconnect_custom_key_store(
CustomKeyStoreId='string'
)
:type CustomKeyStoreId: string
:param CustomKeyStoreId: [REQUIRED]\nEnter the ID of the custom key store you want to disconnect. To find the ID of a custom key store, use the DescribeCustomKeyStores operation.\n
:rtype: dict
Response Syntax
{}
Response Structure
(dict) --
Exceptions
KMS.Client.exceptions.CustomKeyStoreInvalidStateException
KMS.Client.exceptions.CustomKeyStoreNotFoundException
KMS.Client.exceptions.KMSInternalException
:return: {}
:returns:
KMS.Client.exceptions.CustomKeyStoreInvalidStateException
KMS.Client.exceptions.CustomKeyStoreNotFoundException
KMS.Client.exceptions.KMSInternalException
"""
pass
def enable_key(KeyId=None):
"""
Sets the key state of a customer master key (CMK) to enabled. This allows you to use the CMK for cryptographic operations. You cannot perform this operation on a CMK in a different AWS account.
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
Examples
The following example enables the specified CMK.
Expected Output:
:example: response = client.enable_key(
KeyId='string'
)
:type KeyId: string
:param KeyId: [REQUIRED]\nA unique identifier for the customer master key (CMK).\nSpecify the key ID or the Amazon Resource Name (ARN) of the CMK.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .\n
:return: response = client.enable_key(
# The identifier of the CMK to enable. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
KeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
)
print(response)
:returns:
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.LimitExceededException
KMS.Client.exceptions.KMSInvalidStateException
"""
pass
def enable_key_rotation(KeyId=None):
"""
Enables automatic rotation of the key material for the specified symmetric customer master key (CMK). You cannot perform this operation on a CMK in a different AWS account.
You cannot enable automatic rotation of asymmetric CMKs, CMKs with imported key material, or CMKs in a custom key store .
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
Examples
The following example enables automatic annual rotation of the key material for the specified CMK.
Expected Output:
:example: response = client.enable_key_rotation(
KeyId='string'
)
:type KeyId: string
:param KeyId: [REQUIRED]\nIdentifies a symmetric customer master key (CMK). You cannot enable automatic rotation of asymmetric CMKs, CMKs with imported key material, or CMKs in a custom key store .\nSpecify the key ID or the Amazon Resource Name (ARN) of the CMK.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .\n
:return: response = client.enable_key_rotation(
# The identifier of the CMK whose key material will be rotated annually. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
KeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
)
print(response)
:returns:
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.DisabledException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
KMS.Client.exceptions.UnsupportedOperationException
"""
pass
def encrypt(KeyId=None, Plaintext=None, EncryptionContext=None, GrantTokens=None, EncryptionAlgorithm=None):
"""
Encrypts plaintext into ciphertext by using a customer master key (CMK). The Encrypt operation has two primary use cases:
You can encrypt small amounts of arbitrary data, such as a personal identifier or database password, or other sensitive information.
You can use the Encrypt operation to move encrypted data from one AWS Region to another. In the first Region, generate a data key and use the plaintext key to encrypt the data. Then, in the new Region, call the Encrypt operation on the same plaintext data key. Now, you can safely move the encrypted data and encrypted data key to the new Region, and decrypt in the new Region when necessary.
You don\'t need to use the Encrypt operation to encrypt a data key. The GenerateDataKey and GenerateDataKeyPair operations return a plaintext data key and an encrypted copy of that data key.
When you encrypt data, you must specify a symmetric or asymmetric CMK to use in the encryption operation. The CMK must have a KeyUsage value of ENCRYPT_DECRYPT. To find the KeyUsage of a CMK, use the DescribeKey operation.
If you use a symmetric CMK, you can use an encryption context to add additional security to your encryption operation. If you specify an EncryptionContext when encrypting data, you must specify the same encryption context (a case-sensitive exact match) when decrypting the data. Otherwise, the request to decrypt fails with an InvalidCiphertextException . For more information, see Encryption Context in the AWS Key Management Service Developer Guide .
If you specify an asymmetric CMK, you must also specify the encryption algorithm. The algorithm must be compatible with the CMK type.
The maximum size of the data that you can encrypt varies with the type of CMK and the encryption algorithm that you choose.
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
To perform this operation on a CMK in a different AWS account, specify the key ARN or alias ARN in the value of the KeyId parameter.
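The encryption-context rule above can be sketched as a round trip. This is an illustrative helper, not part of boto3; it takes the client as a parameter and assumes a symmetric CMK you own. The key point is that the exact same EncryptionContext dict must be supplied on Decrypt:

```python
# Sketch: encrypt with an encryption context, then supply the exact same
# context (a case-sensitive exact match) on Decrypt. A different context
# makes Decrypt fail with InvalidCiphertextException.
def roundtrip(kms, key_id, plaintext, context):
    enc = kms.encrypt(KeyId=key_id, Plaintext=plaintext, EncryptionContext=context)
    dec = kms.decrypt(CiphertextBlob=enc['CiphertextBlob'], EncryptionContext=context)
    return dec['Plaintext']
```

With real credentials this would be called as roundtrip(boto3.client('kms'), 'alias/ExampleAlias', b'secret', {'purpose': 'example'}).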
See also: AWS API Documentation
Exceptions
Examples
The following example encrypts data with the specified customer master key (CMK).
Expected Output:
:example: response = client.encrypt(
KeyId='string',
Plaintext=b'bytes',
EncryptionContext={
'string': 'string'
},
GrantTokens=[
'string',
],
EncryptionAlgorithm='SYMMETRIC_DEFAULT'|'RSAES_OAEP_SHA_1'|'RSAES_OAEP_SHA_256'
)
:type KeyId: string
:param KeyId: [REQUIRED]\nA unique identifier for the customer master key (CMK).\nTo specify a CMK, use its key ID, Amazon Resource Name (ARN), alias name, or alias ARN. When using an alias name, prefix it with 'alias/' . To specify a CMK in a different AWS account, you must use the key ARN or alias ARN.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\nAlias name: alias/ExampleAlias\nAlias ARN: arn:aws:kms:us-east-2:111122223333:alias/ExampleAlias\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey . To get the alias name and alias ARN, use ListAliases .\n
:type Plaintext: bytes
:param Plaintext: [REQUIRED]\nData to be encrypted.\n
:type EncryptionContext: dict
:param EncryptionContext: Specifies the encryption context that will be used to encrypt the data. An encryption context is valid only for cryptographic operations with a symmetric CMK. The standard asymmetric encryption algorithms that AWS KMS uses do not support an encryption context.\nAn encryption context is a collection of non-secret key-value pairs that represents additional authenticated data. When you use an encryption context to encrypt data, you must specify the same (an exact case-sensitive match) encryption context to decrypt the data. An encryption context is optional when encrypting with a symmetric CMK, but it is highly recommended.\nFor more information, see Encryption Context in the AWS Key Management Service Developer Guide .\n\n(string) --\n(string) --\n\n\n\n
:type GrantTokens: list
:param GrantTokens: A list of grant tokens.\nFor more information, see Grant Tokens in the AWS Key Management Service Developer Guide .\n\n(string) --\n\n
:type EncryptionAlgorithm: string
:param EncryptionAlgorithm: Specifies the encryption algorithm that AWS KMS will use to encrypt the plaintext message. The algorithm must be compatible with the CMK that you specify.\nThis parameter is required only for asymmetric CMKs. The default value, SYMMETRIC_DEFAULT , is the algorithm used for symmetric CMKs. If you are using an asymmetric CMK, we recommend RSAES_OAEP_SHA_256.\n
:rtype: dict
Response Syntax
{
'CiphertextBlob': b'bytes',
'KeyId': 'string',
'EncryptionAlgorithm': 'SYMMETRIC_DEFAULT'|'RSAES_OAEP_SHA_1'|'RSAES_OAEP_SHA_256'
}
Response Structure
(dict) --
CiphertextBlob (bytes) --
The encrypted plaintext. When you use the HTTP API or the AWS CLI, the value is Base64-encoded. Otherwise, it is not Base64-encoded.
KeyId (string) --
The ID of the key used during encryption.
EncryptionAlgorithm (string) --
The encryption algorithm that was used to encrypt the plaintext.
Exceptions
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.DisabledException
KMS.Client.exceptions.KeyUnavailableException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.InvalidKeyUsageException
KMS.Client.exceptions.InvalidGrantTokenException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
Examples
The following example encrypts data with the specified customer master key (CMK).
response = client.encrypt(
# The identifier of the CMK to use for encryption. You can use the key ID or Amazon Resource Name (ARN) of the CMK, or the name or ARN of an alias that refers to the CMK.
KeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
# The data to encrypt.
Plaintext='<binary data>',
)
print(response)
Expected Output:
{
# The encrypted data (ciphertext).
'CiphertextBlob': '<binary data>',
# The ARN of the CMK that was used to encrypt the data.
'KeyId': 'arn:aws:kms:us-west-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab',
'ResponseMetadata': {
'...': '...',
},
}
:return: {
'CiphertextBlob': b'bytes',
'KeyId': 'string',
'EncryptionAlgorithm': 'SYMMETRIC_DEFAULT'|'RSAES_OAEP_SHA_1'|'RSAES_OAEP_SHA_256'
}
:returns:
The maximum size of the plaintext that you can encrypt depends on the type of CMK and the encryption algorithm that you choose:
Symmetric CMKs
SYMMETRIC_DEFAULT : 4096 bytes
RSA_2048
RSAES_OAEP_SHA_1 : 214 bytes
RSAES_OAEP_SHA_256 : 190 bytes
RSA_3072
RSAES_OAEP_SHA_1 : 342 bytes
RSAES_OAEP_SHA_256 : 318 bytes
RSA_4096
RSAES_OAEP_SHA_1 : 470 bytes
RSAES_OAEP_SHA_256 : 446 bytes
"""
pass
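The plaintext size limits in the table above can be checked client-side before calling Encrypt. Below is a minimal sketch; the helper name `check_plaintext_size` is illustrative, and the limits are taken directly from the table in this docstring:

```python
# Maximum plaintext sizes (in bytes) for Encrypt, keyed by
# (CMK key spec, encryption algorithm), per the table above.
_MAX_PLAINTEXT_BYTES = {
    ('SYMMETRIC_DEFAULT', 'SYMMETRIC_DEFAULT'): 4096,
    ('RSA_2048', 'RSAES_OAEP_SHA_1'): 214,
    ('RSA_2048', 'RSAES_OAEP_SHA_256'): 190,
    ('RSA_3072', 'RSAES_OAEP_SHA_1'): 342,
    ('RSA_3072', 'RSAES_OAEP_SHA_256'): 318,
    ('RSA_4096', 'RSAES_OAEP_SHA_1'): 470,
    ('RSA_4096', 'RSAES_OAEP_SHA_256'): 446,
}

def check_plaintext_size(plaintext, key_spec, algorithm='SYMMETRIC_DEFAULT'):
    """Raise ValueError if plaintext exceeds the Encrypt limit for the CMK."""
    limit = _MAX_PLAINTEXT_BYTES[(key_spec, algorithm)]
    if len(plaintext) > limit:
        raise ValueError(
            'plaintext is %d bytes; limit for %s with %s is %d bytes'
            % (len(plaintext), key_spec, algorithm, limit))
    return True
```

Checking locally avoids a round trip that would otherwise fail; data larger than these limits is normally encrypted with a data key from GenerateDataKey instead.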
def generate_data_key(KeyId=None, EncryptionContext=None, NumberOfBytes=None, KeySpec=None, GrantTokens=None):
"""
Generates a unique symmetric data key. This operation returns a plaintext copy of the data key and a copy that is encrypted under a customer master key (CMK) that you specify. You can use the plaintext key to encrypt your data outside of AWS KMS and store the encrypted data key with the encrypted data.
To generate a data key, specify the symmetric CMK that will be used to encrypt the data key. You cannot use an asymmetric CMK to generate data keys. To get the type of your CMK, use the DescribeKey operation.
You must also specify the length of the data key. Use either the KeySpec or NumberOfBytes parameters (but not both). For 128-bit and 256-bit data keys, use the KeySpec parameter.
If the operation succeeds, the plaintext copy of the data key is in the Plaintext field of the response, and the encrypted copy of the data key is in the CiphertextBlob field.
To get only an encrypted copy of the data key, use GenerateDataKeyWithoutPlaintext . To generate an asymmetric data key pair, use the GenerateDataKeyPair or GenerateDataKeyPairWithoutPlaintext operation. To get a cryptographically secure random byte string, use GenerateRandom .
You can use the optional encryption context to add additional security to the encryption operation. If you specify an EncryptionContext , you must specify the same encryption context (a case-sensitive exact match) when decrypting the encrypted data key. Otherwise, the request to decrypt fails with an InvalidCiphertextException. For more information, see Encryption Context in the AWS Key Management Service Developer Guide .
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
We recommend that you use the following pattern to encrypt data locally in your application:
Use the GenerateDataKey operation to get a data encryption key.
Use the plaintext data key (returned in the Plaintext field of the response) to encrypt data locally, then erase the plaintext data key from memory.
Store the encrypted data key (returned in the CiphertextBlob field of the response) alongside the locally encrypted data.
To decrypt data locally:
Use the Decrypt operation to decrypt the encrypted data key. The operation returns a plaintext copy of the data key.
Use the plaintext data key to decrypt data locally, then erase the plaintext data key from memory.
See also: AWS API Documentation
Exceptions
Examples
The following example generates a 256-bit symmetric data encryption key (data key) in two formats. One is the unencrypted (plaintext) data key, and the other is the data key encrypted with the specified customer master key (CMK).
Expected Output:
:example: response = client.generate_data_key(
KeyId='string',
EncryptionContext={
'string': 'string'
},
NumberOfBytes=123,
KeySpec='AES_256'|'AES_128',
GrantTokens=[
'string',
]
)
:type KeyId: string
:param KeyId: [REQUIRED]\nIdentifies the symmetric CMK that encrypts the data key.\nTo specify a CMK, use its key ID, Amazon Resource Name (ARN), alias name, or alias ARN. When using an alias name, prefix it with 'alias/' . To specify a CMK in a different AWS account, you must use the key ARN or alias ARN.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\nAlias name: alias/ExampleAlias\nAlias ARN: arn:aws:kms:us-east-2:111122223333:alias/ExampleAlias\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey . To get the alias name and alias ARN, use ListAliases .\n
:type EncryptionContext: dict
:param EncryptionContext: Specifies the encryption context that will be used when encrypting the data key.\nAn encryption context is a collection of non-secret key-value pairs that represents additional authenticated data. When you use an encryption context to encrypt data, you must specify the same (an exact case-sensitive match) encryption context to decrypt the data. An encryption context is optional when encrypting with a symmetric CMK, but it is highly recommended.\nFor more information, see Encryption Context in the AWS Key Management Service Developer Guide .\n\n(string) --\n(string) --\n\n\n\n
:type NumberOfBytes: integer
:param NumberOfBytes: Specifies the length of the data key in bytes. For example, use the value 64 to generate a 512-bit data key (64 bytes is 512 bits). For 128-bit (16-byte) and 256-bit (32-byte) data keys, use the KeySpec parameter.\nYou must specify either the KeySpec or the NumberOfBytes parameter (but not both) in every GenerateDataKey request.\n
:type KeySpec: string
:param KeySpec: Specifies the length of the data key. Use AES_128 to generate a 128-bit symmetric key, or AES_256 to generate a 256-bit symmetric key.\nYou must specify either the KeySpec or the NumberOfBytes parameter (but not both) in every GenerateDataKey request.\n
:type GrantTokens: list
:param GrantTokens: A list of grant tokens.\nFor more information, see Grant Tokens in the AWS Key Management Service Developer Guide .\n\n(string) --\n\n
:rtype: dict
Returns
Response Syntax
{
'CiphertextBlob': b'bytes',
'Plaintext': b'bytes',
'KeyId': 'string'
}
Response Structure
(dict) --
CiphertextBlob (bytes) --
The encrypted copy of the data key. When you use the HTTP API or the AWS CLI, the value is Base64-encoded. Otherwise, it is not Base64-encoded.
Plaintext (bytes) --
The plaintext data key. When you use the HTTP API or the AWS CLI, the value is Base64-encoded. Otherwise, it is not Base64-encoded. Use this data key to encrypt your data outside of KMS. Then, remove it from memory as soon as possible.
KeyId (string) --
The identifier of the CMK that encrypted the data key.
Exceptions
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.DisabledException
KMS.Client.exceptions.KeyUnavailableException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.InvalidKeyUsageException
KMS.Client.exceptions.InvalidGrantTokenException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
Examples
The following example generates a 256-bit symmetric data encryption key (data key) in two formats. One is the unencrypted (plaintext) data key, and the other is the data key encrypted with the specified customer master key (CMK).
response = client.generate_data_key(
# The identifier of the CMK to use to encrypt the data key. You can use the key ID or Amazon Resource Name (ARN) of the CMK, or the name or ARN of an alias that refers to the CMK.
KeyId='alias/ExampleAlias',
# Specifies the type of data key to return.
KeySpec='AES_256',
)
print(response)
Expected Output:
{
# The encrypted data key.
'CiphertextBlob': '<binary data>',
# The ARN of the CMK that was used to encrypt the data key.
'KeyId': 'arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab',
# The unencrypted (plaintext) data key.
'Plaintext': '<binary data>',
'ResponseMetadata': {
'...': '...',
},
}
:return: {
'CiphertextBlob': b'bytes',
'Plaintext': b'bytes',
'KeyId': 'string'
}
:returns:
Use the Decrypt operation to decrypt the encrypted data key. The operation returns a plaintext copy of the data key.
Use the plaintext data key to decrypt data locally, then erase the plaintext data key from memory.
"""
pass
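The encrypt-locally/decrypt-locally pattern described above can be sketched as a pair of helpers. The KMS client is passed in as a parameter so the sketch can be exercised without AWS credentials, and the XOR keystream is a deliberately insecure stand-in for a real local cipher; in practice you would use an authenticated cipher such as AES-GCM with the plaintext data key:

```python
import hashlib

def _xor_keystream(key, data):
    # Toy stand-in for a local symmetric cipher -- NOT secure.
    # A real application would use AES-GCM with the plaintext data key.
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, 'big')).digest())
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

def envelope_encrypt(kms, key_id, data):
    """Encrypt data locally; return (ciphertext, encrypted data key)."""
    resp = kms.generate_data_key(KeyId=key_id, KeySpec='AES_256')
    ciphertext = _xor_keystream(resp['Plaintext'], data)
    # Store the CiphertextBlob with the data; let the plaintext key
    # go out of scope as soon as possible.
    return ciphertext, resp['CiphertextBlob']

def envelope_decrypt(kms, encrypted_key, ciphertext):
    """Recover data by first decrypting the encrypted data key."""
    resp = kms.decrypt(CiphertextBlob=encrypted_key)
    return _xor_keystream(resp['Plaintext'], ciphertext)
```

With a real client, `kms = boto3.client('kms')` and `key_id` names a symmetric CMK; only the encrypted data key and the ciphertext need to be persisted.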
def generate_data_key_pair(EncryptionContext=None, KeyId=None, KeyPairSpec=None, GrantTokens=None):
"""
Generates a unique asymmetric data key pair. The GenerateDataKeyPair operation returns a plaintext public key, a plaintext private key, and a copy of the private key that is encrypted under the symmetric CMK you specify. You can use the data key pair to perform asymmetric cryptography outside of AWS KMS.
You can use the public key that GenerateDataKeyPair returns to encrypt data or verify a signature outside of AWS KMS. Then, store the encrypted private key with the data. When you are ready to decrypt data or sign a message, you can use the Decrypt operation to decrypt the encrypted private key.
To generate a data key pair, you must specify a symmetric customer master key (CMK) to encrypt the private key in a data key pair. You cannot use an asymmetric CMK. To get the type of your CMK, use the DescribeKey operation.
If you are using the data key pair to encrypt data, or for any operation where you don\'t immediately need a private key, consider using the GenerateDataKeyPairWithoutPlaintext operation. GenerateDataKeyPairWithoutPlaintext returns a plaintext public key and an encrypted private key, but omits the plaintext private key that you need only to decrypt ciphertext or sign a message. Later, when you need to decrypt the data or sign a message, use the Decrypt operation to decrypt the encrypted private key in the data key pair.
You can use the optional encryption context to add additional security to the encryption operation. If you specify an EncryptionContext , you must specify the same encryption context (a case-sensitive exact match) when decrypting the encrypted data key. Otherwise, the request to decrypt fails with an InvalidCiphertextException. For more information, see Encryption Context in the AWS Key Management Service Developer Guide .
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
:example: response = client.generate_data_key_pair(
EncryptionContext={
'string': 'string'
},
KeyId='string',
KeyPairSpec='RSA_2048'|'RSA_3072'|'RSA_4096'|'ECC_NIST_P256'|'ECC_NIST_P384'|'ECC_NIST_P521'|'ECC_SECG_P256K1',
GrantTokens=[
'string',
]
)
:type EncryptionContext: dict
:param EncryptionContext: Specifies the encryption context that will be used when encrypting the private key in the data key pair.\nAn encryption context is a collection of non-secret key-value pairs that represents additional authenticated data. When you use an encryption context to encrypt data, you must specify the same (an exact case-sensitive match) encryption context to decrypt the data. An encryption context is optional when encrypting with a symmetric CMK, but it is highly recommended.\nFor more information, see Encryption Context in the AWS Key Management Service Developer Guide .\n\n(string) --\n(string) --\n\n\n\n
:type KeyId: string
:param KeyId: [REQUIRED]\nSpecifies the symmetric CMK that encrypts the private key in the data key pair. You cannot specify an asymmetric CMK.\nTo specify a CMK, use its key ID, Amazon Resource Name (ARN), alias name, or alias ARN. When using an alias name, prefix it with 'alias/' . To specify a CMK in a different AWS account, you must use the key ARN or alias ARN.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\nAlias name: alias/ExampleAlias\nAlias ARN: arn:aws:kms:us-east-2:111122223333:alias/ExampleAlias\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey . To get the alias name and alias ARN, use ListAliases .\n
:type KeyPairSpec: string
:param KeyPairSpec: [REQUIRED]\nDetermines the type of data key pair that is generated.\nThe AWS KMS rule that restricts the use of asymmetric RSA CMKs to encrypt and decrypt or to sign and verify (but not both), and the rule that permits you to use ECC CMKs only to sign and verify, are not effective outside of AWS KMS.\n
:type GrantTokens: list
:param GrantTokens: A list of grant tokens.\nFor more information, see Grant Tokens in the AWS Key Management Service Developer Guide .\n\n(string) --\n\n
:rtype: dict
Returns
Response Syntax
{
'PrivateKeyCiphertextBlob': b'bytes',
'PrivateKeyPlaintext': b'bytes',
'PublicKey': b'bytes',
'KeyId': 'string',
'KeyPairSpec': 'RSA_2048'|'RSA_3072'|'RSA_4096'|'ECC_NIST_P256'|'ECC_NIST_P384'|'ECC_NIST_P521'|'ECC_SECG_P256K1'
}
Response Structure
(dict) --
PrivateKeyCiphertextBlob (bytes) --
The encrypted copy of the private key. When you use the HTTP API or the AWS CLI, the value is Base64-encoded. Otherwise, it is not Base64-encoded.
PrivateKeyPlaintext (bytes) --
The plaintext copy of the private key. When you use the HTTP API or the AWS CLI, the value is Base64-encoded. Otherwise, it is not Base64-encoded.
PublicKey (bytes) --
The public key (in plaintext).
KeyId (string) --
The identifier of the CMK that encrypted the private key.
KeyPairSpec (string) --
The type of data key pair that was generated.
Exceptions
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.DisabledException
KMS.Client.exceptions.KeyUnavailableException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.InvalidKeyUsageException
KMS.Client.exceptions.InvalidGrantTokenException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
:return: {
'PrivateKeyCiphertextBlob': b'bytes',
'PrivateKeyPlaintext': b'bytes',
'PublicKey': b'bytes',
'KeyId': 'string',
'KeyPairSpec': 'RSA_2048'|'RSA_3072'|'RSA_4096'|'ECC_NIST_P256'|'ECC_NIST_P384'|'ECC_NIST_P521'|'ECC_SECG_P256K1'
}
:returns:
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.DisabledException
KMS.Client.exceptions.KeyUnavailableException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.InvalidKeyUsageException
KMS.Client.exceptions.InvalidGrantTokenException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
"""
pass
def generate_data_key_pair_without_plaintext(EncryptionContext=None, KeyId=None, KeyPairSpec=None, GrantTokens=None):
"""
Generates a unique asymmetric data key pair. The GenerateDataKeyPairWithoutPlaintext operation returns a plaintext public key and a copy of the private key that is encrypted under the symmetric CMK you specify. Unlike GenerateDataKeyPair , this operation does not return a plaintext private key.
To generate a data key pair, you must specify a symmetric customer master key (CMK) to encrypt the private key in the data key pair. You cannot use an asymmetric CMK. To get the type of your CMK, use the KeySpec field in the DescribeKey response.
You can use the public key that GenerateDataKeyPairWithoutPlaintext returns to encrypt data or verify a signature outside of AWS KMS. Then, store the encrypted private key with the data. When you are ready to decrypt data or sign a message, you can use the Decrypt operation to decrypt the encrypted private key.
You can use the optional encryption context to add additional security to the encryption operation. If you specify an EncryptionContext , you must specify the same encryption context (a case-sensitive exact match) when decrypting the encrypted data key. Otherwise, the request to decrypt fails with an InvalidCiphertextException. For more information, see Encryption Context in the AWS Key Management Service Developer Guide .
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
:example: response = client.generate_data_key_pair_without_plaintext(
EncryptionContext={
'string': 'string'
},
KeyId='string',
KeyPairSpec='RSA_2048'|'RSA_3072'|'RSA_4096'|'ECC_NIST_P256'|'ECC_NIST_P384'|'ECC_NIST_P521'|'ECC_SECG_P256K1',
GrantTokens=[
'string',
]
)
:type EncryptionContext: dict
:param EncryptionContext: Specifies the encryption context that will be used when encrypting the private key in the data key pair.\nAn encryption context is a collection of non-secret key-value pairs that represents additional authenticated data. When you use an encryption context to encrypt data, you must specify the same (an exact case-sensitive match) encryption context to decrypt the data. An encryption context is optional when encrypting with a symmetric CMK, but it is highly recommended.\nFor more information, see Encryption Context in the AWS Key Management Service Developer Guide .\n\n(string) --\n(string) --\n\n\n\n
:type KeyId: string
:param KeyId: [REQUIRED]\nSpecifies the CMK that encrypts the private key in the data key pair. You must specify a symmetric CMK. You cannot use an asymmetric CMK. To get the type of your CMK, use the DescribeKey operation.\nTo specify a CMK, use its key ID, Amazon Resource Name (ARN), alias name, or alias ARN. When using an alias name, prefix it with 'alias/' .\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\nAlias name: alias/ExampleAlias\nAlias ARN: arn:aws:kms:us-east-2:111122223333:alias/ExampleAlias\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey . To get the alias name and alias ARN, use ListAliases .\n
:type KeyPairSpec: string
:param KeyPairSpec: [REQUIRED]\nDetermines the type of data key pair that is generated.\nThe AWS KMS rule that restricts the use of asymmetric RSA CMKs to encrypt and decrypt or to sign and verify (but not both), and the rule that permits you to use ECC CMKs only to sign and verify, are not effective outside of AWS KMS.\n
:type GrantTokens: list
:param GrantTokens: A list of grant tokens.\nFor more information, see Grant Tokens in the AWS Key Management Service Developer Guide .\n\n(string) --\n\n
:rtype: dict
Returns
Response Syntax
{
'PrivateKeyCiphertextBlob': b'bytes',
'PublicKey': b'bytes',
'KeyId': 'string',
'KeyPairSpec': 'RSA_2048'|'RSA_3072'|'RSA_4096'|'ECC_NIST_P256'|'ECC_NIST_P384'|'ECC_NIST_P521'|'ECC_SECG_P256K1'
}
Response Structure
(dict) --
PrivateKeyCiphertextBlob (bytes) --
The encrypted copy of the private key. When you use the HTTP API or the AWS CLI, the value is Base64-encoded. Otherwise, it is not Base64-encoded.
PublicKey (bytes) --
The public key (in plaintext).
KeyId (string) --
The identifier of the CMK that encrypted the private key in the data key pair.
KeyPairSpec (string) --
The type of data key pair that was generated.
Exceptions
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.DisabledException
KMS.Client.exceptions.KeyUnavailableException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.InvalidKeyUsageException
KMS.Client.exceptions.InvalidGrantTokenException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
:return: {
'PrivateKeyCiphertextBlob': b'bytes',
'PublicKey': b'bytes',
'KeyId': 'string',
'KeyPairSpec': 'RSA_2048'|'RSA_3072'|'RSA_4096'|'ECC_NIST_P256'|'ECC_NIST_P384'|'ECC_NIST_P521'|'ECC_SECG_P256K1'
}
:returns:
Key ID: 1234abcd-12ab-34cd-56ef-1234567890ab
Key ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab
Alias name: alias/ExampleAlias
Alias ARN: arn:aws:kms:us-east-2:111122223333:alias/ExampleAlias
"""
pass
def generate_data_key_without_plaintext(KeyId=None, EncryptionContext=None, KeySpec=None, NumberOfBytes=None, GrantTokens=None):
"""
Generates a unique symmetric data key. This operation returns a data key that is encrypted under a customer master key (CMK) that you specify. To request an asymmetric data key pair, use the GenerateDataKeyPair or GenerateDataKeyPairWithoutPlaintext operations.
This operation is useful for systems that need to encrypt data at some point, but not immediately, and in distributed systems with different levels of trust. For example, you might store encrypted data in containers. One component of your system creates new containers and stores an encrypted data key with each container. Then, a different component puts the data into the containers. That component first decrypts the data key, uses the plaintext data key to encrypt data, puts the encrypted data into the container, and then destroys the plaintext data key. In this system, the component that creates the containers never sees the plaintext data key.
To generate a data key, you must specify the symmetric customer master key (CMK) that is used to encrypt the data key. You cannot use an asymmetric CMK to generate a data key. To get the type of your CMK, use the DescribeKey operation.
If the operation succeeds, you will find the encrypted copy of the data key in the CiphertextBlob field.
You can use the optional encryption context to add additional security to the encryption operation. If you specify an EncryptionContext , you must specify the same encryption context (a case-sensitive exact match) when decrypting the encrypted data key. Otherwise, the request to decrypt fails with an InvalidCiphertextException. For more information, see Encryption Context in the AWS Key Management Service Developer Guide .
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
Examples
The following example generates an encrypted copy of a 256-bit symmetric data encryption key (data key). The data key is encrypted with the specified customer master key (CMK).
Expected Output:
:example: response = client.generate_data_key_without_plaintext(
KeyId='string',
EncryptionContext={
'string': 'string'
},
KeySpec='AES_256'|'AES_128',
NumberOfBytes=123,
GrantTokens=[
'string',
]
)
:type KeyId: string
:param KeyId: [REQUIRED]\nThe identifier of the symmetric customer master key (CMK) that encrypts the data key.\nTo specify a CMK, use its key ID, Amazon Resource Name (ARN), alias name, or alias ARN. When using an alias name, prefix it with 'alias/' . To specify a CMK in a different AWS account, you must use the key ARN or alias ARN.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\nAlias name: alias/ExampleAlias\nAlias ARN: arn:aws:kms:us-east-2:111122223333:alias/ExampleAlias\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey . To get the alias name and alias ARN, use ListAliases .\n
:type EncryptionContext: dict
:param EncryptionContext: Specifies the encryption context that will be used when encrypting the data key.\nAn encryption context is a collection of non-secret key-value pairs that represents additional authenticated data. When you use an encryption context to encrypt data, you must specify the same (an exact case-sensitive match) encryption context to decrypt the data. An encryption context is optional when encrypting with a symmetric CMK, but it is highly recommended.\nFor more information, see Encryption Context in the AWS Key Management Service Developer Guide .\n\n(string) --\n(string) --\n\n\n\n
:type KeySpec: string
:param KeySpec: The length of the data key. Use AES_128 to generate a 128-bit symmetric key, or AES_256 to generate a 256-bit symmetric key.
:type NumberOfBytes: integer
:param NumberOfBytes: The length of the data key in bytes. For example, use the value 64 to generate a 512-bit data key (64 bytes is 512 bits). For common key lengths (128-bit and 256-bit symmetric keys), we recommend that you use the KeySpec field instead of this one.
:type GrantTokens: list
:param GrantTokens: A list of grant tokens.\nFor more information, see Grant Tokens in the AWS Key Management Service Developer Guide .\n\n(string) --\n\n
:rtype: dict
Returns
Response Syntax
{
'CiphertextBlob': b'bytes',
'KeyId': 'string'
}
Response Structure
(dict) --
CiphertextBlob (bytes) --
The encrypted data key. When you use the HTTP API or the AWS CLI, the value is Base64-encoded. Otherwise, it is not Base64-encoded.
KeyId (string) --
The identifier of the CMK that encrypted the data key.
Exceptions
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.DisabledException
KMS.Client.exceptions.KeyUnavailableException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.InvalidKeyUsageException
KMS.Client.exceptions.InvalidGrantTokenException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
Examples
The following example generates an encrypted copy of a 256-bit symmetric data encryption key (data key). The data key is encrypted with the specified customer master key (CMK).
response = client.generate_data_key_without_plaintext(
# The identifier of the CMK to use to encrypt the data key. You can use the key ID or Amazon Resource Name (ARN) of the CMK, or the name or ARN of an alias that refers to the CMK.
KeyId='alias/ExampleAlias',
# Specifies the type of data key to return.
KeySpec='AES_256',
)
print(response)
Expected Output:
{
# The encrypted data key.
'CiphertextBlob': '<binary data>',
# The ARN of the CMK that was used to encrypt the data key.
'KeyId': 'arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab',
'ResponseMetadata': {
'...': '...',
},
}
:return: {
'CiphertextBlob': b'bytes',
'KeyId': 'string'
}
:returns:
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.DisabledException
KMS.Client.exceptions.KeyUnavailableException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.InvalidKeyUsageException
KMS.Client.exceptions.InvalidGrantTokenException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
"""
pass
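The container workflow described above separates key provisioning from key use. Below is a minimal sketch with an injected client so it runs without AWS; the function names (`create_container`, `put_item`) are illustrative, and the XOR cipher is a toy stand-in for a real cipher such as AES-GCM:

```python
def create_container(kms, key_id):
    # Trusted component: attaches an encrypted data key to a new container
    # without ever seeing the plaintext key.
    resp = kms.generate_data_key_without_plaintext(KeyId=key_id, KeySpec='AES_256')
    return {'encrypted_key': resp['CiphertextBlob'], 'items': []}

def put_item(kms, container, item):
    # Less-trusted component: decrypts the data key only while encrypting,
    # then lets the plaintext key go out of scope.
    # Toy XOR cipher (items up to 32 bytes) -- use AES-GCM in practice.
    key = kms.decrypt(CiphertextBlob=container['encrypted_key'])['Plaintext']
    container['items'].append(bytes(b ^ k for b, k in zip(item, key)))
```

The point of the split is visible in the code: `create_container` never handles plaintext key material, so a compromise of that component does not expose the data keys.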
def generate_presigned_url(ClientMethod=None, Params=None, ExpiresIn=None, HttpMethod=None):
"""
Generate a presigned url given a client, its method, and arguments
:type ClientMethod: string
:param ClientMethod: The client method to presign for
:type Params: dict
:param Params: The parameters normally passed to\nClientMethod.
:type ExpiresIn: int
:param ExpiresIn: The number of seconds the presigned url is valid\nfor. By default it expires in an hour (3600 seconds)
:type HttpMethod: string
:param HttpMethod: The http method to use on the generated url. By\ndefault, the http method is whatever is used in the method\'s model.
"""
pass
def generate_random(NumberOfBytes=None, CustomKeyStoreId=None):
"""
Returns a random byte string that is cryptographically secure.
By default, the random byte string is generated in AWS KMS. To generate the byte string in the AWS CloudHSM cluster that is associated with a custom key store , specify the custom key store ID.
For more information about entropy and random number generation, see the AWS Key Management Service Cryptographic Details whitepaper.
See also: AWS API Documentation
Exceptions
Examples
The following example uses AWS KMS to generate 32 bytes of random data.
Expected Output:
:example: response = client.generate_random(
NumberOfBytes=123,
CustomKeyStoreId='string'
)
:type NumberOfBytes: integer
:param NumberOfBytes: The length of the byte string.
:type CustomKeyStoreId: string
:param CustomKeyStoreId: Generates the random byte string in the AWS CloudHSM cluster that is associated with the specified custom key store . To find the ID of a custom key store, use the DescribeCustomKeyStores operation.
:rtype: dict
Returns
Response Syntax
{
'Plaintext': b'bytes'
}
Response Structure
(dict) --
Plaintext (bytes) --
The random byte string. When you use the HTTP API or the AWS CLI, the value is Base64-encoded. Otherwise, it is not Base64-encoded.
Exceptions
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.CustomKeyStoreNotFoundException
KMS.Client.exceptions.CustomKeyStoreInvalidStateException
Examples
The following example uses AWS KMS to generate 32 bytes of random data.
response = client.generate_random(
# The length of the random data, specified in number of bytes.
NumberOfBytes=32,
)
print(response)
Expected Output:
{
# The random data.
'Plaintext': '<binary data>',
'ResponseMetadata': {
'...': '...',
},
}
:return: {
'Plaintext': b'bytes'
}
:returns:
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.CustomKeyStoreNotFoundException
KMS.Client.exceptions.CustomKeyStoreInvalidStateException
"""
pass
def get_key_policy(KeyId=None, PolicyName=None):
"""
Gets a key policy attached to the specified customer master key (CMK). You cannot perform this operation on a CMK in a different AWS account.
See also: AWS API Documentation
Exceptions
Examples
The following example retrieves the key policy for the specified customer master key (CMK).
Expected Output:
:example: response = client.get_key_policy(
KeyId='string',
PolicyName='string'
)
:type KeyId: string
:param KeyId: [REQUIRED]\nA unique identifier for the customer master key (CMK).\nSpecify the key ID or the Amazon Resource Name (ARN) of the CMK.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .\n
:type PolicyName: string
:param PolicyName: [REQUIRED]\nSpecifies the name of the key policy. The only valid name is default . To get the names of key policies, use ListKeyPolicies .\n
:rtype: dict
Returns
Response Syntax
{
'Policy': 'string'
}
Response Structure
(dict) --
Policy (string) --
A key policy document in JSON format.
Exceptions
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
Examples
The following example retrieves the key policy for the specified customer master key (CMK).
response = client.get_key_policy(
# The identifier of the CMK whose key policy you want to retrieve. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
KeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
# The name of the key policy to retrieve.
PolicyName='default',
)
print(response)
Expected Output:
{
# The key policy document.
'Policy': '{\
"Version" : "2012-10-17",\
"Id" : "key-default-1",\
"Statement" : [ {\
"Sid" : "Enable IAM User Permissions",\
"Effect" : "Allow",\
"Principal" : {\
"AWS" : "arn:aws:iam::111122223333:root"\
},\
"Action" : "kms:*",\
"Resource" : "*"\
} ]\
}',
'ResponseMetadata': {
'...': '...',
},
}
:return: {
'Policy': 'string'
}
:returns:
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
"""
pass
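The Policy field in the response is a JSON document serialized as a string, so it can be inspected with the standard json module. A small sketch (the helper name is illustrative; the statement shape matches the sample policy in this docstring, though Action and Principal.AWS may be either a string or a list in general):

```python
import json

def principals_allowed_kms_all(policy_json):
    """Return principals granted 'kms:*' by an Allow statement in the policy."""
    policy = json.loads(policy_json)
    principals = []
    for stmt in policy.get('Statement', []):
        actions = stmt.get('Action', [])
        if isinstance(actions, str):
            actions = [actions]  # normalize string-or-list to a list
        if stmt.get('Effect') == 'Allow' and 'kms:*' in actions:
            aws = stmt.get('Principal', {}).get('AWS', [])
            principals.extend([aws] if isinstance(aws, str) else aws)
    return principals
```

Running this against the default key policy shown in the example above would report the account root principal.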
def get_key_rotation_status(KeyId=None):
"""
Gets a Boolean value that indicates whether automatic rotation of the key material is enabled for the specified customer master key (CMK).
You cannot enable automatic rotation of asymmetric CMKs, CMKs with imported key material, or CMKs in a custom key store . The key rotation status for these CMKs is always false .
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
To perform this operation on a CMK in a different AWS account, specify the key ARN in the value of the KeyId parameter.
See also: AWS API Documentation
Exceptions
Examples
The following example retrieves the status of automatic annual rotation of the key material for the specified CMK.
Expected Output:
:example: response = client.get_key_rotation_status(
KeyId='string'
)
:type KeyId: string
:param KeyId: [REQUIRED]\nA unique identifier for the customer master key (CMK).\nSpecify the key ID or the Amazon Resource Name (ARN) of the CMK. To specify a CMK in a different AWS account, you must use the key ARN.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .\n
:rtype: dict
ReturnsResponse Syntax
{
'KeyRotationEnabled': True|False
}
Response Structure
(dict) --
KeyRotationEnabled (boolean) -- A Boolean value that specifies whether key rotation is enabled.
Exceptions
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
KMS.Client.exceptions.UnsupportedOperationException
Examples
The following example retrieves the status of automatic annual rotation of the key material for the specified CMK.
response = client.get_key_rotation_status(
# The identifier of the CMK whose key material rotation status you want to retrieve. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
KeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
)
print(response)
Expected Output:
{
# A boolean that indicates the key material rotation status. Returns true when automatic annual rotation of the key material is enabled, or false when it is not.
'KeyRotationEnabled': True,
'ResponseMetadata': {
'...': '...',
},
}
:return: {
'KeyRotationEnabled': True|False
}
:returns:
Key ID: 1234abcd-12ab-34cd-56ef-1234567890ab
Key ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab
"""
pass
def get_paginator(operation_name=None):
"""
Create a paginator for an operation.
:type operation_name: string
:param operation_name: The operation name. This is the same name\nas the method name on the client. For example, if the\nmethod name is create_foo, and you\'d normally invoke the\noperation as client.create_foo(**kwargs), if the\ncreate_foo operation can be paginated, you can use the\ncall client.get_paginator('create_foo').
:rtype: L{botocore.paginate.Paginator}
ReturnsA paginator object.
"""
pass
def get_parameters_for_import(KeyId=None, WrappingAlgorithm=None, WrappingKeySpec=None):
"""
Returns the items you need to import key material into a symmetric, customer managed customer master key (CMK). For more information about importing key material into AWS KMS, see Importing Key Material in the AWS Key Management Service Developer Guide .
This operation returns a public key and an import token. Use the public key to encrypt the symmetric key material. Store the import token to send with a subsequent ImportKeyMaterial request.
You must specify the key ID of the symmetric CMK into which you will import key material. This CMK\'s Origin must be EXTERNAL . You must also specify the wrapping algorithm and type of wrapping key (public key) that you will use to encrypt the key material. You cannot perform this operation on an asymmetric CMK or on any CMK in a different AWS account.
To import key material, you must use the public key and import token from the same response. These items are valid for 24 hours. The expiration date and time appear in the GetParametersForImport response. You cannot use an expired token in an ImportKeyMaterial request. If your key and token expire, send another GetParametersForImport request.
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
Examples
The following example retrieves the public key and import token for the specified CMK.
Expected Output:
:example: response = client.get_parameters_for_import(
KeyId='string',
WrappingAlgorithm='RSAES_PKCS1_V1_5'|'RSAES_OAEP_SHA_1'|'RSAES_OAEP_SHA_256',
WrappingKeySpec='RSA_2048'
)
:type KeyId: string
:param KeyId: [REQUIRED]\nThe identifier of the symmetric CMK into which you will import key material. The Origin of the CMK must be EXTERNAL .\nSpecify the key ID or the Amazon Resource Name (ARN) of the CMK.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .\n
:type WrappingAlgorithm: string
:param WrappingAlgorithm: [REQUIRED]\nThe algorithm you will use to encrypt the key material before importing it with ImportKeyMaterial . For more information, see Encrypt the Key Material in the AWS Key Management Service Developer Guide .\n
:type WrappingKeySpec: string
:param WrappingKeySpec: [REQUIRED]\nThe type of wrapping key (public key) to return in the response. Only 2048-bit RSA public keys are supported.\n
:rtype: dict
ReturnsResponse Syntax
{
'KeyId': 'string',
'ImportToken': b'bytes',
'PublicKey': b'bytes',
'ParametersValidTo': datetime(2015, 1, 1)
}
Response Structure
(dict) --
KeyId (string) --
The identifier of the CMK to use in a subsequent ImportKeyMaterial request. This is the same CMK specified in the GetParametersForImport request.
ImportToken (bytes) --
The import token to send in a subsequent ImportKeyMaterial request.
PublicKey (bytes) --
The public key to use to encrypt the key material before importing it with ImportKeyMaterial .
ParametersValidTo (datetime) --
The time at which the import token and public key are no longer valid. After this time, you cannot use them to make an ImportKeyMaterial request and you must send another GetParametersForImport request to get new ones.
Exceptions
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.UnsupportedOperationException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
Examples
The following example retrieves the public key and import token for the specified CMK.
response = client.get_parameters_for_import(
# The identifier of the CMK for which to retrieve the public key and import token. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
KeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
# The algorithm that you will use to encrypt the key material before importing it.
WrappingAlgorithm='RSAES_OAEP_SHA_1',
# The type of wrapping key (public key) to return in the response.
WrappingKeySpec='RSA_2048',
)
print(response)
Expected Output:
{
# The import token to send with a subsequent ImportKeyMaterial request.
'ImportToken': '<binary data>',
# The ARN of the CMK for which you are retrieving the public key and import token. This is the same CMK specified in the request.
'KeyId': 'arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab',
# The time at which the import token and public key are no longer valid.
'ParametersValidTo': datetime(2016, 12, 1, 14, 52, 17),
# The public key to use to encrypt the key material before importing it.
'PublicKey': '<binary data>',
'ResponseMetadata': {
'...': '...',
},
}
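Because the public key and import token are only valid for 24 hours, it is worth checking ParametersValidTo before attempting the import rather than discovering the expiry via an ExpiredImportTokenException. A small sketch of that check (the timestamps below are illustrative, not from a live response):

```python
from datetime import datetime, timedelta, timezone

def import_params_still_valid(parameters_valid_to, now=None):
    """Return True if the import token / public key pair has not yet expired."""
    now = now or datetime.now(timezone.utc)
    return now < parameters_valid_to

# Illustrative values: parameters fetched an hour ago are still usable,
# parameters fetched 25 hours ago are not (the items are valid for 24 hours).
issued = datetime(2016, 12, 1, 14, 52, 17, tzinfo=timezone.utc)
valid_to = issued + timedelta(hours=24)

print(import_params_still_valid(valid_to, now=issued + timedelta(hours=1)))   # True
print(import_params_still_valid(valid_to, now=issued + timedelta(hours=25)))  # False
```

If the check fails, send another GetParametersForImport request to obtain a fresh key and token, as the description above notes.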
:return: {
'KeyId': 'string',
'ImportToken': b'bytes',
'PublicKey': b'bytes',
'ParametersValidTo': datetime(2015, 1, 1)
}
:returns:
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.UnsupportedOperationException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
"""
pass
def get_public_key(KeyId=None, GrantTokens=None):
"""
Returns the public key of an asymmetric CMK. Unlike the private key of an asymmetric CMK, which never leaves AWS KMS unencrypted, callers with kms:GetPublicKey permission can download the public key of an asymmetric CMK. You can share the public key to allow others to encrypt messages and verify signatures outside of AWS KMS. For information about symmetric and asymmetric CMKs, see Using Symmetric and Asymmetric CMKs in the AWS Key Management Service Developer Guide .
You do not need to download the public key. Instead, you can use the public key within AWS KMS by calling the Encrypt , ReEncrypt , or Verify operations with the identifier of an asymmetric CMK. When you use the public key within AWS KMS, you benefit from the authentication, authorization, and logging that are part of every AWS KMS operation. You also reduce the risk of encrypting data that cannot be decrypted. These features are not effective outside of AWS KMS. For details, see Special Considerations for Downloading Public Keys .
To help you use the public key safely outside of AWS KMS, GetPublicKey returns important information about the public key in the response, including its CustomerMasterKeySpec, its KeyUsage, and the encryption or signing algorithms that AWS KMS supports for the key.
Although AWS KMS cannot enforce these restrictions on external operations, it is crucial that you use this information to prevent the public key from being used improperly. For example, you can prevent a public signing key from being used to encrypt data, or prevent a public key from being used with an encryption algorithm that is not supported by AWS KMS. You can also avoid errors, such as using the wrong signing algorithm in a verification operation.
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
:example: response = client.get_public_key(
KeyId='string',
GrantTokens=[
'string',
]
)
:type KeyId: string
:param KeyId: [REQUIRED]\nIdentifies the asymmetric CMK that includes the public key.\nTo specify a CMK, use its key ID, Amazon Resource Name (ARN), alias name, or alias ARN. When using an alias name, prefix it with 'alias/' . To specify a CMK in a different AWS account, you must use the key ARN or alias ARN.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\nAlias name: alias/ExampleAlias\nAlias ARN: arn:aws:kms:us-east-2:111122223333:alias/ExampleAlias\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey . To get the alias name and alias ARN, use ListAliases .\n
:type GrantTokens: list
:param GrantTokens: A list of grant tokens.\nFor more information, see Grant Tokens in the AWS Key Management Service Developer Guide .\n\n(string) --\n\n
:rtype: dict
ReturnsResponse Syntax
{
'KeyId': 'string',
'PublicKey': b'bytes',
'CustomerMasterKeySpec': 'RSA_2048'|'RSA_3072'|'RSA_4096'|'ECC_NIST_P256'|'ECC_NIST_P384'|'ECC_NIST_P521'|'ECC_SECG_P256K1'|'SYMMETRIC_DEFAULT',
'KeyUsage': 'SIGN_VERIFY'|'ENCRYPT_DECRYPT',
'EncryptionAlgorithms': [
'SYMMETRIC_DEFAULT'|'RSAES_OAEP_SHA_1'|'RSAES_OAEP_SHA_256',
],
'SigningAlgorithms': [
'RSASSA_PSS_SHA_256'|'RSASSA_PSS_SHA_384'|'RSASSA_PSS_SHA_512'|'RSASSA_PKCS1_V1_5_SHA_256'|'RSASSA_PKCS1_V1_5_SHA_384'|'RSASSA_PKCS1_V1_5_SHA_512'|'ECDSA_SHA_256'|'ECDSA_SHA_384'|'ECDSA_SHA_512',
]
}
Response Structure
(dict) --
KeyId (string) --
The identifier of the asymmetric CMK from which the public key was downloaded.
PublicKey (bytes) --
The exported public key.
The value is a DER-encoded X.509 public key, also known as SubjectPublicKeyInfo (SPKI), as defined in RFC 5280 . When you use the HTTP API or the AWS CLI, the value is Base64-encoded. Otherwise, it is not Base64-encoded.
CustomerMasterKeySpec (string) --
The type of the public key that was downloaded.
KeyUsage (string) --
The permitted use of the public key. Valid values are ENCRYPT_DECRYPT or SIGN_VERIFY .
This information is critical. If a public key with SIGN_VERIFY key usage encrypts data outside of AWS KMS, the ciphertext cannot be decrypted.
EncryptionAlgorithms (list) --
The encryption algorithms that AWS KMS supports for this key.
This information is critical. If a public key encrypts data outside of AWS KMS by using an unsupported encryption algorithm, the ciphertext cannot be decrypted.
This field appears in the response only when the KeyUsage of the public key is ENCRYPT_DECRYPT .
(string) --
SigningAlgorithms (list) --
The signing algorithms that AWS KMS supports for this key.
This field appears in the response only when the KeyUsage of the public key is SIGN_VERIFY .
(string) --
Exceptions
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.DisabledException
KMS.Client.exceptions.KeyUnavailableException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.UnsupportedOperationException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.InvalidGrantTokenException
KMS.Client.exceptions.InvalidKeyUsageException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
:return: {
'KeyId': 'string',
'PublicKey': b'bytes',
'CustomerMasterKeySpec': 'RSA_2048'|'RSA_3072'|'RSA_4096'|'ECC_NIST_P256'|'ECC_NIST_P384'|'ECC_NIST_P521'|'ECC_SECG_P256K1'|'SYMMETRIC_DEFAULT',
'KeyUsage': 'SIGN_VERIFY'|'ENCRYPT_DECRYPT',
'EncryptionAlgorithms': [
'SYMMETRIC_DEFAULT'|'RSAES_OAEP_SHA_1'|'RSAES_OAEP_SHA_256',
],
'SigningAlgorithms': [
'RSASSA_PSS_SHA_256'|'RSASSA_PSS_SHA_384'|'RSASSA_PSS_SHA_512'|'RSASSA_PKCS1_V1_5_SHA_256'|'RSASSA_PKCS1_V1_5_SHA_384'|'RSASSA_PKCS1_V1_5_SHA_512'|'ECDSA_SHA_256'|'ECDSA_SHA_384'|'ECDSA_SHA_512',
]
}
:returns:
KeyId (string) -- [REQUIRED]
Identifies the asymmetric CMK that includes the public key.
To specify a CMK, use its key ID, Amazon Resource Name (ARN), alias name, or alias ARN. When using an alias name, prefix it with "alias/" . To specify a CMK in a different AWS account, you must use the key ARN or alias ARN.
For example:
Key ID: 1234abcd-12ab-34cd-56ef-1234567890ab
Key ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab
Alias name: alias/ExampleAlias
Alias ARN: arn:aws:kms:us-east-2:111122223333:alias/ExampleAlias
To get the key ID and key ARN for a CMK, use ListKeys or DescribeKey . To get the alias name and alias ARN, use ListAliases .
GrantTokens (list) -- A list of grant tokens.
For more information, see Grant Tokens in the AWS Key Management Service Developer Guide .
(string) --
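The PublicKey bytes are DER-encoded SubjectPublicKeyInfo (SPKI); when the value arrives Base64-encoded, as it does over the HTTP API or the AWS CLI, it must be decoded before use. A rough plausibility-check sketch using a fabricated byte string rather than a real key (DER SPKI always begins with a SEQUENCE tag, 0x30; this is not full ASN.1 parsing):

```python
import base64

def decode_spki(b64_text):
    """Base64-decode a public key and check it looks like DER SubjectPublicKeyInfo.

    Only a cheap plausibility check on the leading SEQUENCE tag (0x30),
    not real ASN.1 parsing.
    """
    der = base64.b64decode(b64_text)
    if not der or der[0] != 0x30:
        raise ValueError("not a DER-encoded SubjectPublicKeyInfo structure")
    return der

# Fabricated stand-in for a CLI-style Base64 'PublicKey' value.
fake_spki = bytes([0x30, 0x82, 0x01, 0x22]) + b"\x00" * 8
encoded = base64.b64encode(fake_spki).decode("ascii")
print(len(decode_spki(encoded)))  # 12
```

A real application would hand the decoded DER to a cryptography library for actual key loading; the boto3 binary response field is already raw bytes and needs no Base64 step.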
"""
pass
def get_waiter(waiter_name=None):
"""
Returns an object that can wait for some condition.
:type waiter_name: str
:param waiter_name: The name of the waiter to get. See the waiters\nsection of the service docs for a list of available waiters.
:rtype: botocore.waiter.Waiter
"""
pass
def import_key_material(KeyId=None, ImportToken=None, EncryptedKeyMaterial=None, ValidTo=None, ExpirationModel=None):
"""
Imports key material into an existing symmetric AWS KMS customer master key (CMK) that was created without key material. After you successfully import key material into a CMK, you can reimport the same key material into that CMK, but you cannot import different key material.
You cannot perform this operation on an asymmetric CMK or on any CMK in a different AWS account. For more information about creating CMKs with no key material and then importing key material, see Importing Key Material in the AWS Key Management Service Developer Guide .
Before using this operation, call GetParametersForImport . Its response includes a public key and an import token. Use the public key to encrypt the key material. Then, submit the import token from the same GetParametersForImport response.
When calling this operation, you must specify the following values:
When this operation is successful, the key state of the CMK changes from PendingImport to Enabled , and you can use the CMK.
If this operation fails, use the exception to help determine the problem. If the error is related to the key material, the import token, or the wrapping key, use GetParametersForImport to get a new public key and import token for the CMK and repeat the import procedure. For help, see How To Import Key Material in the AWS Key Management Service Developer Guide .
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
Examples
The following example imports key material into the specified CMK.
Expected Output:
:example: response = client.import_key_material(
KeyId='string',
ImportToken=b'bytes',
EncryptedKeyMaterial=b'bytes',
ValidTo=datetime(2015, 1, 1),
ExpirationModel='KEY_MATERIAL_EXPIRES'|'KEY_MATERIAL_DOES_NOT_EXPIRE'
)
:type KeyId: string
:param KeyId: [REQUIRED]\nThe identifier of the symmetric CMK that receives the imported key material. The CMK\'s Origin must be EXTERNAL . This must be the same CMK specified in the KeyID parameter of the corresponding GetParametersForImport request.\nSpecify the key ID or the Amazon Resource Name (ARN) of the CMK.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .\n
:type ImportToken: bytes
:param ImportToken: [REQUIRED]\nThe import token that you received in the response to a previous GetParametersForImport request. It must be from the same response that contained the public key that you used to encrypt the key material.\n
:type EncryptedKeyMaterial: bytes
:param EncryptedKeyMaterial: [REQUIRED]\nThe encrypted key material to import. The key material must be encrypted with the public wrapping key that GetParametersForImport returned, using the wrapping algorithm that you specified in the same GetParametersForImport request.\n
:type ValidTo: datetime
:param ValidTo: The time at which the imported key material expires. When the key material expires, AWS KMS deletes the key material and the CMK becomes unusable. You must omit this parameter when the ExpirationModel parameter is set to KEY_MATERIAL_DOES_NOT_EXPIRE . Otherwise it is required.
:type ExpirationModel: string
:param ExpirationModel: Specifies whether the key material expires. The default is KEY_MATERIAL_EXPIRES , in which case you must include the ValidTo parameter. When this parameter is set to KEY_MATERIAL_DOES_NOT_EXPIRE , you must omit the ValidTo parameter.
:rtype: dict
ReturnsResponse Syntax
{}
Response Structure
(dict) --
Exceptions
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.UnsupportedOperationException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
KMS.Client.exceptions.InvalidCiphertextException
KMS.Client.exceptions.IncorrectKeyMaterialException
KMS.Client.exceptions.ExpiredImportTokenException
KMS.Client.exceptions.InvalidImportTokenException
Examples
The following example imports key material into the specified CMK.
response = client.import_key_material(
# The encrypted key material to import.
EncryptedKeyMaterial='<binary data>',
# A value that specifies whether the key material expires.
ExpirationModel='KEY_MATERIAL_DOES_NOT_EXPIRE',
# The import token that you received in the response to a previous GetParametersForImport request.
ImportToken='<binary data>',
# The identifier of the CMK to import the key material into. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
KeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
)
print(response)
Expected Output:
{
'ResponseMetadata': {
'...': '...',
},
}
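ImportKeyMaterial rejects requests where ExpirationModel and ValidTo disagree: ValidTo is required when the model is KEY_MATERIAL_EXPIRES (the default) and must be omitted when it is KEY_MATERIAL_DOES_NOT_EXPIRE. A client-side pre-check of that pairing rule might look like this (a sketch, not part of boto3):

```python
from datetime import datetime

def check_expiration_args(expiration_model="KEY_MATERIAL_EXPIRES", valid_to=None):
    """Validate the ExpirationModel / ValidTo pairing before calling the API."""
    if expiration_model == "KEY_MATERIAL_EXPIRES" and valid_to is None:
        raise ValueError("ValidTo is required when the key material expires")
    if expiration_model == "KEY_MATERIAL_DOES_NOT_EXPIRE" and valid_to is not None:
        raise ValueError("omit ValidTo when the key material does not expire")

check_expiration_args("KEY_MATERIAL_DOES_NOT_EXPIRE")       # valid: no expiry, no ValidTo
check_expiration_args(valid_to=datetime(2025, 1, 1))        # valid: default model + ValidTo
```

Catching the mismatch locally saves a round trip; the service enforces the same rule.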
:return: {}
:returns:
KeyId (string) -- [REQUIRED]
The identifier of the symmetric CMK that receives the imported key material. The CMK\'s Origin must be EXTERNAL . This must be the same CMK specified in the KeyID parameter of the corresponding GetParametersForImport request.
Specify the key ID or the Amazon Resource Name (ARN) of the CMK.
For example:
Key ID: 1234abcd-12ab-34cd-56ef-1234567890ab
Key ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab
To get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .
ImportToken (bytes) -- [REQUIRED]
The import token that you received in the response to a previous GetParametersForImport request. It must be from the same response that contained the public key that you used to encrypt the key material.
EncryptedKeyMaterial (bytes) -- [REQUIRED]
The encrypted key material to import. The key material must be encrypted with the public wrapping key that GetParametersForImport returned, using the wrapping algorithm that you specified in the same GetParametersForImport request.
ValidTo (datetime) -- The time at which the imported key material expires. When the key material expires, AWS KMS deletes the key material and the CMK becomes unusable. You must omit this parameter when the ExpirationModel parameter is set to KEY_MATERIAL_DOES_NOT_EXPIRE . Otherwise it is required.
ExpirationModel (string) -- Specifies whether the key material expires. The default is KEY_MATERIAL_EXPIRES , in which case you must include the ValidTo parameter. When this parameter is set to KEY_MATERIAL_DOES_NOT_EXPIRE , you must omit the ValidTo parameter.
"""
pass
def list_aliases(KeyId=None, Limit=None, Marker=None):
"""
Gets a list of aliases in the caller\'s AWS account and region. You cannot list aliases in other accounts. For more information about aliases, see CreateAlias .
By default, the ListAliases command returns all aliases in the account and region. To get only the aliases that point to a particular customer master key (CMK), use the KeyId parameter.
The ListAliases response can include aliases that you created and associated with your customer managed CMKs, and aliases that AWS created and associated with AWS managed CMKs in your account. You can recognize AWS aliases because their names have the format aws/<service-name> , such as aws/dynamodb .
The response might also include aliases that have no TargetKeyId field. These are predefined aliases that AWS has created but has not yet associated with a CMK. Aliases that AWS creates in your account, including predefined aliases, do not count against your AWS KMS aliases quota .
See also: AWS API Documentation
Exceptions
Examples
The following example lists aliases.
Expected Output:
:example: response = client.list_aliases(
KeyId='string',
Limit=123,
Marker='string'
)
:type KeyId: string
:param KeyId: Lists only aliases that refer to the specified CMK. The value of this parameter can be the ID or Amazon Resource Name (ARN) of a CMK in the caller\'s account and region. You cannot use an alias name or alias ARN in this value.\nThis parameter is optional. If you omit it, ListAliases returns all aliases in the account and region.\n
:type Limit: integer
:param Limit: Use this parameter to specify the maximum number of items to return. When this value is present, AWS KMS does not return more than the specified number of items, but it might return fewer.\nThis value is optional. If you include a value, it must be between 1 and 100, inclusive. If you do not include a value, it defaults to 50.\n
:type Marker: string
:param Marker: Use this parameter in a subsequent request after you receive a response with truncated results. Set it to the value of NextMarker from the truncated response you just received.
:rtype: dict
ReturnsResponse Syntax
{
'Aliases': [
{
'AliasName': 'string',
'AliasArn': 'string',
'TargetKeyId': 'string'
},
],
'NextMarker': 'string',
'Truncated': True|False
}
Response Structure
(dict) --
Aliases (list) --
A list of aliases.
(dict) --
Contains information about an alias.
AliasName (string) --
String that contains the alias. This value begins with alias/ .
AliasArn (string) --
String that contains the key ARN.
TargetKeyId (string) --
String that contains the key identifier referred to by the alias.
NextMarker (string) --
When Truncated is true, this element is present and contains the value to use for the Marker parameter in a subsequent request.
Truncated (boolean) --
A flag that indicates whether there are more items in the list. When this value is true, the list in this response is truncated. To get more items, pass the value of the NextMarker element in this response to the Marker parameter in a subsequent request.
Exceptions
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.InvalidMarkerException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.NotFoundException
Examples
The following example lists aliases.
response = client.list_aliases(
)
print(response)
Expected Output:
{
# A list of aliases, including the key ID of the customer master key (CMK) that each alias refers to.
'Aliases': [
{
'AliasArn': 'arn:aws:kms:us-east-2:111122223333:alias/aws/acm',
'AliasName': 'alias/aws/acm',
'TargetKeyId': 'da03f6f7-d279-427a-9cae-de48d07e5b66',
},
{
'AliasArn': 'arn:aws:kms:us-east-2:111122223333:alias/aws/ebs',
'AliasName': 'alias/aws/ebs',
'TargetKeyId': '25a217e7-7170-4b8c-8bf6-045ea5f70e5b',
},
{
'AliasArn': 'arn:aws:kms:us-east-2:111122223333:alias/aws/rds',
'AliasName': 'alias/aws/rds',
'TargetKeyId': '7ec3104e-c3f2-4b5c-bf42-bfc4772c6685',
},
{
'AliasArn': 'arn:aws:kms:us-east-2:111122223333:alias/aws/redshift',
'AliasName': 'alias/aws/redshift',
'TargetKeyId': '08f7a25a-69e2-4fb5-8f10-393db27326fa',
},
{
'AliasArn': 'arn:aws:kms:us-east-2:111122223333:alias/aws/s3',
'AliasName': 'alias/aws/s3',
'TargetKeyId': 'd2b0f1a3-580d-4f79-b836-bc983be8cfa5',
},
{
'AliasArn': 'arn:aws:kms:us-east-2:111122223333:alias/example1',
'AliasName': 'alias/example1',
'TargetKeyId': '4da1e216-62d0-46c5-a7c0-5f3a3d2f8046',
},
{
'AliasArn': 'arn:aws:kms:us-east-2:111122223333:alias/example2',
'AliasName': 'alias/example2',
'TargetKeyId': 'f32fef59-2cc2-445b-8573-2d73328acbee',
},
{
'AliasArn': 'arn:aws:kms:us-east-2:111122223333:alias/example3',
'AliasName': 'alias/example3',
'TargetKeyId': '1374ef38-d34e-4d5f-b2c9-4e0daee38855',
},
],
# A boolean that indicates whether there are more items in the list. Returns true when there are more items, or false when there are not.
'Truncated': False,
'ResponseMetadata': {
'...': '...',
},
}
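The Truncated / NextMarker / Marker trio drives manual pagination: keep calling with Marker set to the previous NextMarker until Truncated comes back false. The loop is sketched below against a stand-in function instead of a real client (in practice client.get_paginator('list_aliases') does this for you):

```python
# Fake two-page ListAliases responses to illustrate the marker protocol.
_PAGES = {
    None: {"Aliases": [{"AliasName": "alias/example1"}],
           "Truncated": True, "NextMarker": "page-2"},
    "page-2": {"Aliases": [{"AliasName": "alias/example2"}],
               "Truncated": False},
}

def fake_list_aliases(Marker=None):
    """Stand-in for client.list_aliases; returns one canned page per marker."""
    return _PAGES[Marker]

def list_all_aliases(list_aliases):
    """Collect every alias by following NextMarker until Truncated is false."""
    aliases, marker = [], None
    while True:
        kwargs = {"Marker": marker} if marker else {}
        page = list_aliases(**kwargs)
        aliases.extend(page["Aliases"])
        if not page.get("Truncated"):
            return aliases
        marker = page["NextMarker"]

print([a["AliasName"] for a in list_all_aliases(fake_list_aliases)])
# ['alias/example1', 'alias/example2']
```

The same loop works for list_grants and the other truncating KMS list operations, since they share the Marker/NextMarker/Truncated shape.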
:return: {
'Aliases': [
{
'AliasName': 'string',
'AliasArn': 'string',
'TargetKeyId': 'string'
},
],
'NextMarker': 'string',
'Truncated': True|False
}
:returns:
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.InvalidMarkerException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.NotFoundException
"""
pass
def list_grants(Limit=None, Marker=None, KeyId=None):
"""
Gets a list of all grants for the specified customer master key (CMK).
To perform this operation on a CMK in a different AWS account, specify the key ARN in the value of the KeyId parameter.
See also: AWS API Documentation
Exceptions
Examples
The following example lists grants for the specified CMK.
Expected Output:
:example: response = client.list_grants(
Limit=123,
Marker='string',
KeyId='string'
)
:type Limit: integer
:param Limit: Use this parameter to specify the maximum number of items to return. When this value is present, AWS KMS does not return more than the specified number of items, but it might return fewer.\nThis value is optional. If you include a value, it must be between 1 and 100, inclusive. If you do not include a value, it defaults to 50.\n
:type Marker: string
:param Marker: Use this parameter in a subsequent request after you receive a response with truncated results. Set it to the value of NextMarker from the truncated response you just received.
:type KeyId: string
:param KeyId: [REQUIRED]\nA unique identifier for the customer master key (CMK).\nSpecify the key ID or the Amazon Resource Name (ARN) of the CMK. To specify a CMK in a different AWS account, you must use the key ARN.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .\n
:rtype: dict
ReturnsResponse Syntax
{
'Grants': [
{
'KeyId': 'string',
'GrantId': 'string',
'Name': 'string',
'CreationDate': datetime(2015, 1, 1),
'GranteePrincipal': 'string',
'RetiringPrincipal': 'string',
'IssuingAccount': 'string',
'Operations': [
'Decrypt'|'Encrypt'|'GenerateDataKey'|'GenerateDataKeyWithoutPlaintext'|'ReEncryptFrom'|'ReEncryptTo'|'Sign'|'Verify'|'GetPublicKey'|'CreateGrant'|'RetireGrant'|'DescribeKey'|'GenerateDataKeyPair'|'GenerateDataKeyPairWithoutPlaintext',
],
'Constraints': {
'EncryptionContextSubset': {
'string': 'string'
},
'EncryptionContextEquals': {
'string': 'string'
}
}
},
],
'NextMarker': 'string',
'Truncated': True|False
}
Response Structure
(dict) --
Grants (list) --
A list of grants.
(dict) --
Contains information about an entry in a list of grants.
KeyId (string) --
The unique identifier for the customer master key (CMK) to which the grant applies.
GrantId (string) --
The unique identifier for the grant.
Name (string) --
The friendly name that identifies the grant. If a name was provided in the CreateGrant request, that name is returned. Otherwise this value is null.
CreationDate (datetime) --
The date and time when the grant was created.
GranteePrincipal (string) --
The principal that receives the grant\'s permissions.
RetiringPrincipal (string) --
The principal that can retire the grant.
IssuingAccount (string) --
The AWS account under which the grant was issued.
Operations (list) --
The list of operations permitted by the grant.
(string) --
Constraints (dict) --
A list of key-value pairs that must be present in the encryption context of certain subsequent operations that the grant allows.
EncryptionContextSubset (dict) --
A list of key-value pairs that must be included in the encryption context of the cryptographic operation request. The grant allows the cryptographic operation only when the encryption context in the request includes the key-value pairs specified in this constraint, although it can include additional key-value pairs.
(string) --
(string) --
EncryptionContextEquals (dict) --
A list of key-value pairs that must match the encryption context in the cryptographic operation request. The grant allows the operation only when the encryption context in the request is the same as the encryption context specified in this constraint.
(string) --
(string) --
NextMarker (string) --
When Truncated is true, this element is present and contains the value to use for the Marker parameter in a subsequent request.
Truncated (boolean) --
A flag that indicates whether there are more items in the list. When this value is true, the list in this response is truncated. To get more items, pass the value of the NextMarker element in this response to the Marker parameter in a subsequent request.
Exceptions
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.InvalidMarkerException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
Examples
The following example lists grants for the specified CMK.
response = client.list_grants(
# The identifier of the CMK whose grants you want to list. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
KeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
)
print(response)
Expected Output:
{
# A list of grants.
'Grants': [
{
'CreationDate': datetime(2016, 10, 25, 14, 37, 41),
'GrantId': '91ad875e49b04a9d1f3bdeb84d821f9db6ea95e1098813f6d47f0c65fbe2a172',
'GranteePrincipal': 'acm.us-east-2.amazonaws.com',
'IssuingAccount': 'arn:aws:iam::111122223333:root',
'KeyId': 'arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab',
'Operations': [
'Encrypt',
'ReEncryptFrom',
'ReEncryptTo',
],
'RetiringPrincipal': 'acm.us-east-2.amazonaws.com',
},
{
'CreationDate': datetime(2016, 10, 25, 14, 37, 41),
'GrantId': 'a5d67d3e207a8fc1f4928749ee3e52eb0440493a8b9cf05bbfad91655b056200',
'GranteePrincipal': 'acm.us-east-2.amazonaws.com',
'IssuingAccount': 'arn:aws:iam::111122223333:root',
'KeyId': 'arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab',
'Operations': [
'ReEncryptFrom',
'ReEncryptTo',
],
'RetiringPrincipal': 'acm.us-east-2.amazonaws.com',
},
{
'CreationDate': datetime(2016, 10, 25, 14, 37, 41),
'GrantId': 'c541aaf05d90cb78846a73b346fc43e65be28b7163129488c738e0c9e0628f4f',
'GranteePrincipal': 'acm.us-east-2.amazonaws.com',
'IssuingAccount': 'arn:aws:iam::111122223333:root',
'KeyId': 'arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab',
'Operations': [
'Encrypt',
'ReEncryptFrom',
'ReEncryptTo',
],
'RetiringPrincipal': 'acm.us-east-2.amazonaws.com',
},
{
'CreationDate': datetime(2016, 10, 25, 14, 37, 41),
'GrantId': 'dd2052c67b4c76ee45caf1dc6a1e2d24e8dc744a51b36ae2f067dc540ce0105c',
'GranteePrincipal': 'acm.us-east-2.amazonaws.com',
'IssuingAccount': 'arn:aws:iam::111122223333:root',
'KeyId': 'arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab',
'Operations': [
'Encrypt',
'ReEncryptFrom',
'ReEncryptTo',
],
'RetiringPrincipal': 'acm.us-east-2.amazonaws.com',
},
],
# A boolean that indicates whether there are more items in the list. Returns true when there are more items, or false when there are not.
'Truncated': True,
'ResponseMetadata': {
'...': '...',
},
}
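The Truncated/NextMarker handshake described in the response structure can be driven in a loop. A minimal sketch (not part of the generated API; the helper name is ours): it accepts any object with a compatible list_grants method, such as a boto3 KMS client, and accumulates every page of grants.

```python
def list_all_grants(kms_client, key_id):
    """Collect every grant for a CMK by following NextMarker until
    Truncated is false. kms_client is assumed to be a boto3 KMS client
    (or anything exposing a compatible list_grants method)."""
    grants = []
    kwargs = {'KeyId': key_id}
    while True:
        page = kms_client.list_grants(**kwargs)
        grants.extend(page.get('Grants', []))
        if not page.get('Truncated'):
            return grants
        # Feed NextMarker back in as the Marker for the next request.
        kwargs['Marker'] = page['NextMarker']
```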
:return: {
'Grants': [
{
'KeyId': 'string',
'GrantId': 'string',
'Name': 'string',
'CreationDate': datetime(2015, 1, 1),
'GranteePrincipal': 'string',
'RetiringPrincipal': 'string',
'IssuingAccount': 'string',
'Operations': [
'Decrypt'|'Encrypt'|'GenerateDataKey'|'GenerateDataKeyWithoutPlaintext'|'ReEncryptFrom'|'ReEncryptTo'|'Sign'|'Verify'|'GetPublicKey'|'CreateGrant'|'RetireGrant'|'DescribeKey'|'GenerateDataKeyPair'|'GenerateDataKeyPairWithoutPlaintext',
],
'Constraints': {
'EncryptionContextSubset': {
'string': 'string'
},
'EncryptionContextEquals': {
'string': 'string'
}
}
},
],
'NextMarker': 'string',
'Truncated': True|False
}
:returns:
(string) --
"""
pass
def list_key_policies(KeyId=None, Limit=None, Marker=None):
"""
Gets the names of the key policies that are attached to a customer master key (CMK). This operation is designed to get policy names that you can use in a GetKeyPolicy operation. However, the only valid policy name is default . You cannot perform this operation on a CMK in a different AWS account.
See also: AWS API Documentation
Exceptions
Examples
The following example lists key policies for the specified CMK.
Expected Output:
:example: response = client.list_key_policies(
KeyId='string',
Limit=123,
Marker='string'
)
:type KeyId: string
:param KeyId: [REQUIRED]\nA unique identifier for the customer master key (CMK).\nSpecify the key ID or the Amazon Resource Name (ARN) of the CMK.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .\n
:type Limit: integer
:param Limit: Use this parameter to specify the maximum number of items to return. When this value is present, AWS KMS does not return more than the specified number of items, but it might return fewer.\nThis value is optional. If you include a value, it must be between 1 and 1000, inclusive. If you do not include a value, it defaults to 100.\nOnly one policy can be attached to a key.\n
:type Marker: string
:param Marker: Use this parameter in a subsequent request after you receive a response with truncated results. Set it to the value of NextMarker from the truncated response you just received.
:rtype: dict
Returns
Response Syntax
{
'PolicyNames': [
'string',
],
'NextMarker': 'string',
'Truncated': True|False
}
Response Structure
(dict) --
PolicyNames (list) --
A list of key policy names. The only valid value is default .
(string) --
NextMarker (string) --
When Truncated is true, this element is present and contains the value to use for the Marker parameter in a subsequent request.
Truncated (boolean) --
A flag that indicates whether there are more items in the list. When this value is true, the list in this response is truncated. To get more items, pass the value of the NextMarker element in this response to the Marker parameter in a subsequent request.
Exceptions
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
Examples
The following example lists key policies for the specified CMK.
response = client.list_key_policies(
# The identifier of the CMK whose key policies you want to list. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
KeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
)
print(response)
Expected Output:
{
# A list of key policy names.
'PolicyNames': [
'default',
],
# A boolean that indicates whether there are more items in the list. Returns true when there are more items, or false when there are not.
'Truncated': False,
'ResponseMetadata': {
'...': '...',
},
}
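Because the only valid policy name is default, the names returned here are normally passed straight to GetKeyPolicy. A hedged sketch (helper name is ours; kms_client is assumed to be a boto3 KMS client):

```python
def get_key_policies(kms_client, key_id):
    """Return {policy_name: policy_document} for a CMK. Since 'default'
    is currently the only policy name KMS supports, the result has at
    most one entry."""
    names = kms_client.list_key_policies(KeyId=key_id)['PolicyNames']
    return {
        name: kms_client.get_key_policy(KeyId=key_id, PolicyName=name)['Policy']
        for name in names
    }
```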
:return: {
'PolicyNames': [
'string',
],
'NextMarker': 'string',
'Truncated': True|False
}
:returns:
(string) --
"""
pass
def list_keys(Limit=None, Marker=None):
"""
Gets a list of all customer master keys (CMKs) in the caller\'s AWS account and Region.
See also: AWS API Documentation
Exceptions
Examples
The following example lists CMKs.
Expected Output:
:example: response = client.list_keys(
Limit=123,
Marker='string'
)
:type Limit: integer
:param Limit: Use this parameter to specify the maximum number of items to return. When this value is present, AWS KMS does not return more than the specified number of items, but it might return fewer.\nThis value is optional. If you include a value, it must be between 1 and 1000, inclusive. If you do not include a value, it defaults to 100.\n
:type Marker: string
:param Marker: Use this parameter in a subsequent request after you receive a response with truncated results. Set it to the value of NextMarker from the truncated response you just received.
:rtype: dict
Returns
Response Syntax
{
'Keys': [
{
'KeyId': 'string',
'KeyArn': 'string'
},
],
'NextMarker': 'string',
'Truncated': True|False
}
Response Structure
(dict) --
Keys (list) --
A list of customer master keys (CMKs).
(dict) --
Contains information about each entry in the key list.
KeyId (string) --
Unique identifier of the key.
KeyArn (string) --
ARN of the key.
NextMarker (string) --
When Truncated is true, this element is present and contains the value to use for the Marker parameter in a subsequent request.
Truncated (boolean) --
A flag that indicates whether there are more items in the list. When this value is true, the list in this response is truncated. To get more items, pass the value of the NextMarker element in this response to the Marker parameter in a subsequent request.
Exceptions
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.InvalidMarkerException
Examples
The following example lists CMKs.
response = client.list_keys(
)
print(response)
Expected Output:
{
# A list of CMKs, including the key ID and Amazon Resource Name (ARN) of each one.
'Keys': [
{
'KeyArn': 'arn:aws:kms:us-east-2:111122223333:key/0d990263-018e-4e65-a703-eff731de951e',
'KeyId': '0d990263-018e-4e65-a703-eff731de951e',
},
{
'KeyArn': 'arn:aws:kms:us-east-2:111122223333:key/144be297-0ae1-44ac-9c8f-93cd8c82f841',
'KeyId': '144be297-0ae1-44ac-9c8f-93cd8c82f841',
},
{
'KeyArn': 'arn:aws:kms:us-east-2:111122223333:key/21184251-b765-428e-b852-2c7353e72571',
'KeyId': '21184251-b765-428e-b852-2c7353e72571',
},
{
'KeyArn': 'arn:aws:kms:us-east-2:111122223333:key/214fe92f-5b03-4ae1-b350-db2a45dbe10c',
'KeyId': '214fe92f-5b03-4ae1-b350-db2a45dbe10c',
},
{
'KeyArn': 'arn:aws:kms:us-east-2:111122223333:key/339963f2-e523-49d3-af24-a0fe752aa458',
'KeyId': '339963f2-e523-49d3-af24-a0fe752aa458',
},
{
'KeyArn': 'arn:aws:kms:us-east-2:111122223333:key/b776a44b-df37-4438-9be4-a27494e4271a',
'KeyId': 'b776a44b-df37-4438-9be4-a27494e4271a',
},
{
'KeyArn': 'arn:aws:kms:us-east-2:111122223333:key/deaf6c9e-cf2c-46a6-bf6d-0b6d487cffbb',
'KeyId': 'deaf6c9e-cf2c-46a6-bf6d-0b6d487cffbb',
},
],
# A boolean that indicates whether there are more items in the list. Returns true when there are more items, or false when there are not.
'Truncated': False,
'ResponseMetadata': {
'...': '...',
},
}
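A common pattern is to collapse all pages of list_keys into a KeyId-to-KeyArn index. A minimal sketch (helper name is ours): it walks pagination by hand; with a real boto3 client, client.get_paginator('list_keys') does the same walking for you.

```python
def key_arn_index(kms_client):
    """Map every KeyId in the caller's account and Region to its
    KeyArn, walking all pages of list_keys. kms_client is assumed to
    be a boto3 KMS client or a compatible object."""
    index = {}
    kwargs = {}
    while True:
        page = kms_client.list_keys(**kwargs)
        for entry in page.get('Keys', []):
            index[entry['KeyId']] = entry['KeyArn']
        if not page.get('Truncated'):
            return index
        # Continue from where the previous page left off.
        kwargs['Marker'] = page['NextMarker']
```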
:return: {
'Keys': [
{
'KeyId': 'string',
'KeyArn': 'string'
},
],
'NextMarker': 'string',
'Truncated': True|False
}
:returns:
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.InvalidMarkerException
"""
pass
def list_resource_tags(KeyId=None, Limit=None, Marker=None):
"""
Returns a list of all tags for the specified customer master key (CMK).
You cannot perform this operation on a CMK in a different AWS account.
See also: AWS API Documentation
Exceptions
Examples
The following example lists tags for a CMK.
Expected Output:
:example: response = client.list_resource_tags(
KeyId='string',
Limit=123,
Marker='string'
)
:type KeyId: string
:param KeyId: [REQUIRED]\nA unique identifier for the customer master key (CMK).\nSpecify the key ID or the Amazon Resource Name (ARN) of the CMK.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .\n
:type Limit: integer
:param Limit: Use this parameter to specify the maximum number of items to return. When this value is present, AWS KMS does not return more than the specified number of items, but it might return fewer.\nThis value is optional. If you include a value, it must be between 1 and 50, inclusive. If you do not include a value, it defaults to 50.\n
:type Marker: string
:param Marker: Use this parameter in a subsequent request after you receive a response with truncated results. Set it to the value of NextMarker from the truncated response you just received.\nDo not attempt to construct this value. Use only the value of NextMarker from the truncated response you just received.\n
:rtype: dict
Returns
Response Syntax
{
'Tags': [
{
'TagKey': 'string',
'TagValue': 'string'
},
],
'NextMarker': 'string',
'Truncated': True|False
}
Response Structure
(dict) --
Tags (list) --
A list of tags. Each tag consists of a tag key and a tag value.
(dict) --
A key-value pair. A tag consists of a tag key and a tag value. Tag keys and tag values are both required, but tag values can be empty (null) strings.
For information about the rules that apply to tag keys and tag values, see User-Defined Tag Restrictions in the AWS Billing and Cost Management User Guide .
TagKey (string) --
The key of the tag.
TagValue (string) --
The value of the tag.
NextMarker (string) --
When Truncated is true, this element is present and contains the value to use for the Marker parameter in a subsequent request.
Do not assume or infer any information from this value.
Truncated (boolean) --
A flag that indicates whether there are more items in the list. When this value is true, the list in this response is truncated. To get more items, pass the value of the NextMarker element in this response to the Marker parameter in a subsequent request.
Exceptions
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.InvalidMarkerException
Examples
The following example lists tags for a CMK.
response = client.list_resource_tags(
# The identifier of the CMK whose tags you are listing. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
KeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
)
print(response)
Expected Output:
{
# A list of tags.
'Tags': [
{
'TagKey': 'CostCenter',
'TagValue': '87654',
},
{
'TagKey': 'CreatedBy',
'TagValue': 'ExampleUser',
},
{
'TagKey': 'Purpose',
'TagValue': 'Test',
},
],
# A boolean that indicates whether there are more items in the list. Returns true when there are more items, or false when there are not.
'Truncated': False,
'ResponseMetadata': {
'...': '...',
},
}
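The TagKey/TagValue record list is usually more convenient as a plain dict. A minimal sketch (helper name is ours; kms_client is assumed to be a boto3 KMS client) that flattens the tags while honoring pagination:

```python
def tags_as_dict(kms_client, key_id):
    """Flatten the TagKey/TagValue records returned by
    list_resource_tags into an ordinary dict, following NextMarker
    until Truncated is false."""
    tags = {}
    kwargs = {'KeyId': key_id}
    while True:
        page = kms_client.list_resource_tags(**kwargs)
        for tag in page.get('Tags', []):
            tags[tag['TagKey']] = tag['TagValue']
        if not page.get('Truncated'):
            return tags
        kwargs['Marker'] = page['NextMarker']
```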
:return: {
'Tags': [
{
'TagKey': 'string',
'TagValue': 'string'
},
],
'NextMarker': 'string',
'Truncated': True|False
}
:returns:
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.InvalidMarkerException
"""
pass
def list_retirable_grants(Limit=None, Marker=None, RetiringPrincipal=None):
"""
Returns a list of all grants for which the grant\'s RetiringPrincipal matches the one specified.
A typical use is to list all grants that you are able to retire. To retire a grant, use RetireGrant .
See also: AWS API Documentation
Exceptions
Examples
The following example lists the grants that the specified principal (identity) can retire.
Expected Output:
:example: response = client.list_retirable_grants(
Limit=123,
Marker='string',
RetiringPrincipal='string'
)
:type Limit: integer
:param Limit: Use this parameter to specify the maximum number of items to return. When this value is present, AWS KMS does not return more than the specified number of items, but it might return fewer.\nThis value is optional. If you include a value, it must be between 1 and 100, inclusive. If you do not include a value, it defaults to 50.\n
:type Marker: string
:param Marker: Use this parameter in a subsequent request after you receive a response with truncated results. Set it to the value of NextMarker from the truncated response you just received.
:type RetiringPrincipal: string
:param RetiringPrincipal: [REQUIRED]\nThe retiring principal for which to list grants.\nTo specify the retiring principal, use the Amazon Resource Name (ARN) of an AWS principal. Valid AWS principals include AWS accounts (root), IAM users, federated users, and assumed role users. For examples of the ARN syntax for specifying a principal, see AWS Identity and Access Management (IAM) in the Example ARNs section of the Amazon Web Services General Reference .\n
:rtype: dict
Returns
Response Syntax
{
'Grants': [
{
'KeyId': 'string',
'GrantId': 'string',
'Name': 'string',
'CreationDate': datetime(2015, 1, 1),
'GranteePrincipal': 'string',
'RetiringPrincipal': 'string',
'IssuingAccount': 'string',
'Operations': [
'Decrypt'|'Encrypt'|'GenerateDataKey'|'GenerateDataKeyWithoutPlaintext'|'ReEncryptFrom'|'ReEncryptTo'|'Sign'|'Verify'|'GetPublicKey'|'CreateGrant'|'RetireGrant'|'DescribeKey'|'GenerateDataKeyPair'|'GenerateDataKeyPairWithoutPlaintext',
],
'Constraints': {
'EncryptionContextSubset': {
'string': 'string'
},
'EncryptionContextEquals': {
'string': 'string'
}
}
},
],
'NextMarker': 'string',
'Truncated': True|False
}
Response Structure
(dict) --
Grants (list) --
A list of grants.
(dict) --
Contains information about an entry in a list of grants.
KeyId (string) --
The unique identifier for the customer master key (CMK) to which the grant applies.
GrantId (string) --
The unique identifier for the grant.
Name (string) --
The friendly name that identifies the grant. If a name was provided in the CreateGrant request, that name is returned. Otherwise this value is null.
CreationDate (datetime) --
The date and time when the grant was created.
GranteePrincipal (string) --
The principal that receives the grant\'s permissions.
RetiringPrincipal (string) --
The principal that can retire the grant.
IssuingAccount (string) --
The AWS account under which the grant was issued.
Operations (list) --
The list of operations permitted by the grant.
(string) --
Constraints (dict) --
A list of key-value pairs that must be present in the encryption context of certain subsequent operations that the grant allows.
EncryptionContextSubset (dict) --
A list of key-value pairs that must be included in the encryption context of the cryptographic operation request. The grant allows the cryptographic operation only when the encryption context in the request includes the key-value pairs specified in this constraint, although it can include additional key-value pairs.
(string) --
(string) --
EncryptionContextEquals (dict) --
A list of key-value pairs that must match the encryption context in the cryptographic operation request. The grant allows the operation only when the encryption context in the request is the same as the encryption context specified in this constraint.
(string) --
(string) --
NextMarker (string) --
When Truncated is true, this element is present and contains the value to use for the Marker parameter in a subsequent request.
Truncated (boolean) --
A flag that indicates whether there are more items in the list. When this value is true, the list in this response is truncated. To get more items, pass the value of the NextMarker element in this response to the Marker parameter in a subsequent request.
Exceptions
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.InvalidMarkerException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.KMSInternalException
Examples
The following example lists the grants that the specified principal (identity) can retire.
response = client.list_retirable_grants(
# The retiring principal whose grants you want to list. Use the Amazon Resource Name (ARN) of an AWS principal such as an AWS account (root), IAM user, federated user, or assumed role user.
RetiringPrincipal='arn:aws:iam::111122223333:role/ExampleRole',
)
print(response)
Expected Output:
{
# A list of grants that the specified principal can retire.
'Grants': [
{
'CreationDate': datetime(2016, 12, 7, 11, 9, 35),
'GrantId': '0c237476b39f8bc44e45212e08498fbe3151305030726c0590dd8d3e9f3d6a60',
'GranteePrincipal': 'arn:aws:iam::111122223333:role/ExampleRole',
'IssuingAccount': 'arn:aws:iam::444455556666:root',
'KeyId': 'arn:aws:kms:us-east-2:444455556666:key/1234abcd-12ab-34cd-56ef-1234567890ab',
'Operations': [
'Decrypt',
'Encrypt',
],
'RetiringPrincipal': 'arn:aws:iam::111122223333:role/ExampleRole',
},
],
# A boolean that indicates whether there are more items in the list. Returns true when there are more items, or false when there are not.
'Truncated': False,
'ResponseMetadata': {
'...': '...',
},
}
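As noted above, the typical use is to find and then retire grants via RetireGrant. A hedged sketch (helper name is ours; kms_client is assumed to be a boto3 KMS client) that retires each listed grant using the KeyId + GrantId form of retire_grant:

```python
def retire_my_grants(kms_client, retiring_principal):
    """List every grant whose RetiringPrincipal matches the given ARN
    and retire each one. Returns the GrantIds that were retired."""
    retired = []
    kwargs = {'RetiringPrincipal': retiring_principal}
    while True:
        page = kms_client.list_retirable_grants(**kwargs)
        for grant in page.get('Grants', []):
            kms_client.retire_grant(KeyId=grant['KeyId'],
                                    GrantId=grant['GrantId'])
            retired.append(grant['GrantId'])
        if not page.get('Truncated'):
            return retired
        kwargs['Marker'] = page['NextMarker']
```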
:return: {
'Grants': [
{
'KeyId': 'string',
'GrantId': 'string',
'Name': 'string',
'CreationDate': datetime(2015, 1, 1),
'GranteePrincipal': 'string',
'RetiringPrincipal': 'string',
'IssuingAccount': 'string',
'Operations': [
'Decrypt'|'Encrypt'|'GenerateDataKey'|'GenerateDataKeyWithoutPlaintext'|'ReEncryptFrom'|'ReEncryptTo'|'Sign'|'Verify'|'GetPublicKey'|'CreateGrant'|'RetireGrant'|'DescribeKey'|'GenerateDataKeyPair'|'GenerateDataKeyPairWithoutPlaintext',
],
'Constraints': {
'EncryptionContextSubset': {
'string': 'string'
},
'EncryptionContextEquals': {
'string': 'string'
}
}
},
],
'NextMarker': 'string',
'Truncated': True|False
}
:returns:
(string) --
"""
pass
def put_key_policy(KeyId=None, PolicyName=None, Policy=None, BypassPolicyLockoutSafetyCheck=None):
"""
Attaches a key policy to the specified customer master key (CMK). You cannot perform this operation on a CMK in a different AWS account.
For more information about key policies, see Key Policies in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
Examples
The following example attaches a key policy to the specified CMK.
Expected Output:
:example: response = client.put_key_policy(
KeyId='string',
PolicyName='string',
Policy='string',
BypassPolicyLockoutSafetyCheck=True|False
)
:type KeyId: string
:param KeyId: [REQUIRED]\nA unique identifier for the customer master key (CMK).\nSpecify the key ID or the Amazon Resource Name (ARN) of the CMK.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .\n
:type PolicyName: string
:param PolicyName: [REQUIRED]\nThe name of the key policy. The only valid value is default .\n
:type Policy: string
:param Policy: [REQUIRED]\nThe key policy to attach to the CMK.\nThe key policy must meet the following criteria:\n\nIf you don\'t set BypassPolicyLockoutSafetyCheck to true, the key policy must allow the principal that is making the PutKeyPolicy request to make a subsequent PutKeyPolicy request on the CMK. This reduces the risk that the CMK becomes unmanageable. For more information, refer to the scenario in the Default Key Policy section of the AWS Key Management Service Developer Guide .\nEach statement in the key policy must contain one or more principals. The principals in the key policy must exist and be visible to AWS KMS. When you create a new AWS principal (for example, an IAM user or role), you might need to enforce a delay before including the new principal in a key policy because the new principal might not be immediately visible to AWS KMS. For more information, see Changes that I make are not always immediately visible in the AWS Identity and Access Management User Guide .\n\nThe key policy cannot exceed 32 kilobytes (32768 bytes). For more information, see Resource Quotas in the AWS Key Management Service Developer Guide .\n
:type BypassPolicyLockoutSafetyCheck: boolean
:param BypassPolicyLockoutSafetyCheck: A flag to indicate whether to bypass the key policy lockout safety check.\n\nWarning\nSetting this value to true increases the risk that the CMK becomes unmanageable. Do not set this value to true indiscriminately.\nFor more information, refer to the scenario in the Default Key Policy section in the AWS Key Management Service Developer Guide .\n\nUse this parameter only when you intend to prevent the principal that is making the request from making a subsequent PutKeyPolicy request on the CMK.\nThe default value is false.\n
:return: response = client.put_key_policy(
# The identifier of the CMK to attach the key policy to. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
KeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
# The key policy document.
Policy='{\
"Version": "2012-10-17",\
"Id": "custom-policy-2016-12-07",\
"Statement": [\
{\
"Sid": "Enable IAM User Permissions",\
"Effect": "Allow",\
"Principal": {\
"AWS": "arn:aws:iam::111122223333:root"\
},\
"Action": "kms:*",\
"Resource": "*"\
},\
{\
"Sid": "Allow access for Key Administrators",\
"Effect": "Allow",\
"Principal": {\
"AWS": [\
"arn:aws:iam::111122223333:user/ExampleAdminUser",\
"arn:aws:iam::111122223333:role/ExampleAdminRole"\
]\
},\
"Action": [\
"kms:Create*",\
"kms:Describe*",\
"kms:Enable*",\
"kms:List*",\
"kms:Put*",\
"kms:Update*",\
"kms:Revoke*",\
"kms:Disable*",\
"kms:Get*",\
"kms:Delete*",\
"kms:ScheduleKeyDeletion",\
"kms:CancelKeyDeletion"\
],\
"Resource": "*"\
},\
{\
"Sid": "Allow use of the key",\
"Effect": "Allow",\
"Principal": {\
"AWS": "arn:aws:iam::111122223333:role/ExamplePowerUserRole"\
},\
"Action": [\
"kms:Encrypt",\
"kms:Decrypt",\
"kms:ReEncrypt*",\
"kms:GenerateDataKey*",\
"kms:DescribeKey"\
],\
"Resource": "*"\
},\
{\
"Sid": "Allow attachment of persistent resources",\
"Effect": "Allow",\
"Principal": {\
"AWS": "arn:aws:iam::111122223333:role/ExamplePowerUserRole"\
},\
"Action": [\
"kms:CreateGrant",\
"kms:ListGrants",\
"kms:RevokeGrant"\
],\
"Resource": "*",\
"Condition": {\
"Bool": {\
"kms:GrantIsForAWSResource": "true"\
}\
}\
}\
]\
}\
',
# The name of the key policy.
PolicyName='default',
)
print(response)
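Rather than hand-escaping a JSON string literal, the policy document can be built as a Python dict and serialized with json.dumps. A minimal sketch (the helper name and policy Id are ours): the single root statement keeps the account principal able to manage the CMK, which satisfies the policy lockout safety check without setting BypassPolicyLockoutSafetyCheck.

```python
import json

def build_key_policy(account_id):
    """Return a minimal key policy document as a JSON string. The one
    statement grants the account root principal full kms:* access, so
    the CMK stays manageable."""
    policy = {
        "Version": "2012-10-17",
        "Id": "custom-policy",
        "Statement": [
            {
                "Sid": "Enable IAM User Permissions",
                "Effect": "Allow",
                "Principal": {"AWS": "arn:aws:iam::%s:root" % account_id},
                "Action": "kms:*",
                "Resource": "*",
            }
        ],
    }
    return json.dumps(policy)
```

The result would be passed as put_key_policy(KeyId=..., PolicyName='default', Policy=build_key_policy('111122223333')).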
:returns:
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.MalformedPolicyDocumentException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.UnsupportedOperationException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.LimitExceededException
KMS.Client.exceptions.KMSInvalidStateException
"""
pass
def re_encrypt(CiphertextBlob=None, SourceEncryptionContext=None, SourceKeyId=None, DestinationKeyId=None, DestinationEncryptionContext=None, SourceEncryptionAlgorithm=None, DestinationEncryptionAlgorithm=None, GrantTokens=None):
"""
Decrypts ciphertext and then reencrypts it entirely within AWS KMS. You can use this operation to change the customer master key (CMK) under which data is encrypted, such as when you manually rotate a CMK or change the CMK that protects a ciphertext. You can also use it to reencrypt ciphertext under the same CMK, such as to change the encryption context of a ciphertext.
The ReEncrypt operation can decrypt ciphertext that was encrypted by using an AWS KMS CMK in an AWS KMS operation, such as Encrypt or GenerateDataKey . It can also decrypt ciphertext that was encrypted by using the public key of an asymmetric CMK outside of AWS KMS. However, it cannot decrypt ciphertext produced by other libraries, such as the AWS Encryption SDK or Amazon S3 client-side encryption . These libraries return a ciphertext format that is incompatible with AWS KMS.
When you use the ReEncrypt operation, you need to provide information for the decrypt operation and the subsequent encrypt operation.
Unlike other AWS KMS API operations, ReEncrypt callers must have two permissions:
To permit reencryption from or to a CMK, include the "kms:ReEncrypt*" permission in your key policy . This permission is automatically included in the key policy when you use the console to create a CMK. But you must include it manually when you create a CMK programmatically or when you use the PutKeyPolicy operation to set a key policy.
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
Examples
The following example reencrypts data with the specified CMK.
Expected Output:
:example: response = client.re_encrypt(
CiphertextBlob=b'bytes',
SourceEncryptionContext={
'string': 'string'
},
SourceKeyId='string',
DestinationKeyId='string',
DestinationEncryptionContext={
'string': 'string'
},
SourceEncryptionAlgorithm='SYMMETRIC_DEFAULT'|'RSAES_OAEP_SHA_1'|'RSAES_OAEP_SHA_256',
DestinationEncryptionAlgorithm='SYMMETRIC_DEFAULT'|'RSAES_OAEP_SHA_1'|'RSAES_OAEP_SHA_256',
GrantTokens=[
'string',
]
)
:type CiphertextBlob: bytes
:param CiphertextBlob: [REQUIRED]\nCiphertext of the data to reencrypt.\n
:type SourceEncryptionContext: dict
:param SourceEncryptionContext: Specifies the encryption context to use to decrypt the ciphertext. Enter the same encryption context that was used to encrypt the ciphertext.\nAn encryption context is a collection of non-secret key-value pairs that represents additional authenticated data. When you use an encryption context to encrypt data, you must specify the same (an exact case-sensitive match) encryption context to decrypt the data. An encryption context is optional when encrypting with a symmetric CMK, but it is highly recommended.\nFor more information, see Encryption Context in the AWS Key Management Service Developer Guide .\n\n(string) --\n(string) --\n\n\n\n
:type SourceKeyId: string
:param SourceKeyId: A unique identifier for the CMK that is used to decrypt the ciphertext before it reencrypts it using the destination CMK.\nThis parameter is required only when the ciphertext was encrypted under an asymmetric CMK. Otherwise, AWS KMS uses the metadata that it adds to the ciphertext blob to determine which CMK was used to encrypt the ciphertext. However, you can use this parameter to ensure that a particular CMK (of any kind) is used to decrypt the ciphertext before it is reencrypted.\nIf you specify a KeyId value, the decrypt part of the ReEncrypt operation succeeds only if the specified CMK was used to encrypt the ciphertext.\nTo specify a CMK, use its key ID, Amazon Resource Name (ARN), alias name, or alias ARN. When using an alias name, prefix it with 'alias/' .\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\nAlias name: alias/ExampleAlias\nAlias ARN: arn:aws:kms:us-east-2:111122223333:alias/ExampleAlias\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey . To get the alias name and alias ARN, use ListAliases .\n
:type DestinationKeyId: string
:param DestinationKeyId: [REQUIRED]\nA unique identifier for the CMK that is used to reencrypt the data. Specify a symmetric or asymmetric CMK with a KeyUsage value of ENCRYPT_DECRYPT . To find the KeyUsage value of a CMK, use the DescribeKey operation.\nTo specify a CMK, use its key ID, Amazon Resource Name (ARN), alias name, or alias ARN. When using an alias name, prefix it with 'alias/' . To specify a CMK in a different AWS account, you must use the key ARN or alias ARN.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\nAlias name: alias/ExampleAlias\nAlias ARN: arn:aws:kms:us-east-2:111122223333:alias/ExampleAlias\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey . To get the alias name and alias ARN, use ListAliases .\n
:type DestinationEncryptionContext: dict
:param DestinationEncryptionContext: Specifies the encryption context to use when reencrypting the data.\nA destination encryption context is valid only when the destination CMK is a symmetric CMK. The standard ciphertext format for asymmetric CMKs does not include fields for metadata.\nAn encryption context is a collection of non-secret key-value pairs that represents additional authenticated data. When you use an encryption context to encrypt data, you must specify the same (an exact case-sensitive match) encryption context to decrypt the data. An encryption context is optional when encrypting with a symmetric CMK, but it is highly recommended.\nFor more information, see Encryption Context in the AWS Key Management Service Developer Guide .\n\n(string) --\n(string) --\n\n\n\n
:type SourceEncryptionAlgorithm: string
:param SourceEncryptionAlgorithm: Specifies the encryption algorithm that AWS KMS will use to decrypt the ciphertext before it is reencrypted. The default value, SYMMETRIC_DEFAULT , represents the algorithm used for symmetric CMKs.\nSpecify the same algorithm that was used to encrypt the ciphertext. If you specify a different algorithm, the decrypt attempt fails.\nThis parameter is required only when the ciphertext was encrypted under an asymmetric CMK.\n
:type DestinationEncryptionAlgorithm: string
:param DestinationEncryptionAlgorithm: Specifies the encryption algorithm that AWS KMS will use to reencrypt the data after it has decrypted it. The default value, SYMMETRIC_DEFAULT , represents the encryption algorithm used for symmetric CMKs.\nThis parameter is required only when the destination CMK is an asymmetric CMK.\n
:type GrantTokens: list
:param GrantTokens: A list of grant tokens.\nFor more information, see Grant Tokens in the AWS Key Management Service Developer Guide .\n\n(string) --\n\n
:rtype: dict
Returns
Response Syntax
{
'CiphertextBlob': b'bytes',
'SourceKeyId': 'string',
'KeyId': 'string',
'SourceEncryptionAlgorithm': 'SYMMETRIC_DEFAULT'|'RSAES_OAEP_SHA_1'|'RSAES_OAEP_SHA_256',
'DestinationEncryptionAlgorithm': 'SYMMETRIC_DEFAULT'|'RSAES_OAEP_SHA_1'|'RSAES_OAEP_SHA_256'
}
Response Structure
(dict) --
CiphertextBlob (bytes) --
The reencrypted data. When you use the HTTP API or the AWS CLI, the value is Base64-encoded. Otherwise, it is not Base64-encoded.
SourceKeyId (string) --
Unique identifier of the CMK used to originally encrypt the data.
KeyId (string) --
Unique identifier of the CMK used to reencrypt the data.
SourceEncryptionAlgorithm (string) --
The encryption algorithm that was used to decrypt the ciphertext before it was reencrypted.
DestinationEncryptionAlgorithm (string) --
The encryption algorithm that was used to reencrypt the data.
Exceptions
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.DisabledException
KMS.Client.exceptions.InvalidCiphertextException
KMS.Client.exceptions.KeyUnavailableException
KMS.Client.exceptions.IncorrectKeyException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.InvalidKeyUsageException
KMS.Client.exceptions.InvalidGrantTokenException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
Examples
The following example reencrypts data with the specified CMK.
response = client.re_encrypt(
# The data to reencrypt.
CiphertextBlob='<binary data>',
# The identifier of the CMK to use to reencrypt the data. You can use the key ID or Amazon Resource Name (ARN) of the CMK, or the name or ARN of an alias that refers to the CMK.
DestinationKeyId='0987dcba-09fe-87dc-65ba-ab0987654321',
)
print(response)
Expected Output:
{
# The reencrypted data.
'CiphertextBlob': '<binary data>',
# The ARN of the CMK that was used to reencrypt the data.
'KeyId': 'arn:aws:kms:us-east-2:111122223333:key/0987dcba-09fe-87dc-65ba-ab0987654321',
# The ARN of the CMK that was used to originally encrypt the data.
'SourceKeyId': 'arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab',
'ResponseMetadata': {
'...': '...',
},
}
:return: {
'CiphertextBlob': b'bytes',
'SourceKeyId': 'string',
'KeyId': 'string',
'SourceEncryptionAlgorithm': 'SYMMETRIC_DEFAULT'|'RSAES_OAEP_SHA_1'|'RSAES_OAEP_SHA_256',
'DestinationEncryptionAlgorithm': 'SYMMETRIC_DEFAULT'|'RSAES_OAEP_SHA_1'|'RSAES_OAEP_SHA_256'
}
:returns:
kms:EncryptFrom permission on the source CMK
kms:EncryptTo permission on the destination CMK
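The optional parameters above compose into a single request. As a sketch (the helper name and key IDs are illustrative, not part of the API), the keyword arguments for `client.re_encrypt()` can be assembled like this, omitting `SourceKeyId` for symmetric ciphertext and attaching an encryption context only when the destination CMK is symmetric:

```python
# Hypothetical helper: assembles keyword arguments for client.re_encrypt().
# The key ID and encryption context below are illustrative placeholders.
def build_reencrypt_params(ciphertext_blob, destination_key_id,
                           source_key_id=None,
                           destination_algorithm='SYMMETRIC_DEFAULT',
                           destination_context=None):
    params = {
        'CiphertextBlob': ciphertext_blob,
        'DestinationKeyId': destination_key_id,
        'DestinationEncryptionAlgorithm': destination_algorithm,
    }
    # SourceKeyId is optional for symmetric ciphertext (KMS reads the key
    # from the ciphertext metadata) but required for asymmetric ciphertext.
    if source_key_id is not None:
        params['SourceKeyId'] = source_key_id
    # A destination encryption context is valid only when the destination
    # CMK is a symmetric CMK.
    if destination_context is not None:
        params['DestinationEncryptionContext'] = destination_context
    return params

params = build_reencrypt_params(
    ciphertext_blob=b'<binary data>',
    destination_key_id='0987dcba-09fe-87dc-65ba-ab0987654321',
    destination_context={'Department': 'Finance'},
)
# response = client.re_encrypt(**params)
```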
"""
pass
def retire_grant(GrantToken=None, KeyId=None, GrantId=None):
"""
Retires a grant. To clean up, you can retire a grant when you\'re done using it. You should revoke a grant when you intend to actively deny operations that depend on it. The following are permitted to call this API:
You must identify the grant to retire by its grant token or by a combination of the grant ID and the Amazon Resource Name (ARN) of the customer master key (CMK). A grant token is a unique variable-length base64-encoded string. A grant ID is a 64-character unique identifier of a grant. The CreateGrant operation returns both.
See also: AWS API Documentation
Exceptions
Examples
The following example retires a grant.
Expected Output:
:example: response = client.retire_grant(
GrantToken='string',
KeyId='string',
GrantId='string'
)
:type GrantToken: string
:param GrantToken: Token that identifies the grant to be retired.
:type KeyId: string
:param KeyId: The Amazon Resource Name (ARN) of the CMK associated with the grant.\nFor example: arn:aws:kms:us-east-2:444455556666:key/1234abcd-12ab-34cd-56ef-1234567890ab\n
:type GrantId: string
:param GrantId: Unique identifier of the grant to retire. The grant ID is returned in the response to a CreateGrant operation.\n\nGrant ID Example - 0123456789012345678901234567890123456789012345678901234567890123\n\n
:return: response = client.retire_grant(
# The identifier of the grant to retire.
GrantId='0c237476b39f8bc44e45212e08498fbe3151305030726c0590dd8d3e9f3d6a60',
# The Amazon Resource Name (ARN) of the customer master key (CMK) associated with the grant.
KeyId='arn:aws:kms:us-east-2:444455556666:key/1234abcd-12ab-34cd-56ef-1234567890ab',
)
print(response)
:returns:
GrantToken (string) -- Token that identifies the grant to be retired.
KeyId (string) -- The Amazon Resource Name (ARN) of the CMK associated with the grant.
For example: arn:aws:kms:us-east-2:444455556666:key/1234abcd-12ab-34cd-56ef-1234567890ab
GrantId (string) -- Unique identifier of the grant to retire. The grant ID is returned in the response to a CreateGrant operation.
Grant ID Example - 0123456789012345678901234567890123456789012345678901234567890123
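As described above, the grant must be identified either by its grant token alone, or by the key ARN and grant ID together. A minimal sketch of that rule (the validator function is hypothetical, not part of the boto3 API):

```python
# Hypothetical validator: RetireGrant identifies the grant either by a
# grant token alone, or by the key ARN plus the grant ID together.
def retire_grant_params(grant_token=None, key_id=None, grant_id=None):
    if grant_token is not None:
        return {'GrantToken': grant_token}
    if key_id is not None and grant_id is not None:
        return {'KeyId': key_id, 'GrantId': grant_id}
    raise ValueError('Specify GrantToken, or both KeyId and GrantId')

params = retire_grant_params(
    key_id='arn:aws:kms:us-east-2:444455556666:key/1234abcd-12ab-34cd-56ef-1234567890ab',
    grant_id='0c237476b39f8bc44e45212e08498fbe3151305030726c0590dd8d3e9f3d6a60',
)
# response = client.retire_grant(**params)
```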
"""
pass
def revoke_grant(KeyId=None, GrantId=None):
"""
Revokes the specified grant for the specified customer master key (CMK). You can revoke a grant to actively deny operations that depend on it.
To perform this operation on a CMK in a different AWS account, specify the key ARN in the value of the KeyId parameter.
See also: AWS API Documentation
Exceptions
Examples
The following example revokes a grant.
Expected Output:
:example: response = client.revoke_grant(
KeyId='string',
GrantId='string'
)
:type KeyId: string
:param KeyId: [REQUIRED]\nA unique identifier for the customer master key associated with the grant.\nSpecify the key ID or the Amazon Resource Name (ARN) of the CMK. To specify a CMK in a different AWS account, you must use the key ARN.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .\n
:type GrantId: string
:param GrantId: [REQUIRED]\nIdentifier of the grant to be revoked.\n
:return: response = client.revoke_grant(
# The identifier of the grant to revoke.
GrantId='0c237476b39f8bc44e45212e08498fbe3151305030726c0590dd8d3e9f3d6a60',
# The identifier of the customer master key (CMK) associated with the grant. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
KeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
)
print(response)
:returns:
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.InvalidGrantIdException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
"""
pass
def schedule_key_deletion(KeyId=None, PendingWindowInDays=None):
"""
Schedules the deletion of a customer master key (CMK). You may provide a waiting period, specified in days, before deletion occurs. If you do not provide a waiting period, the default period of 30 days is used. When this operation is successful, the key state of the CMK changes to PendingDeletion . Before the waiting period ends, you can use CancelKeyDeletion to cancel the deletion of the CMK. After the waiting period ends, AWS KMS deletes the CMK and all AWS KMS data associated with it, including all aliases that refer to it.
If you schedule deletion of a CMK from a custom key store , when the waiting period expires, ScheduleKeyDeletion deletes the CMK from AWS KMS. Then AWS KMS makes a best effort to delete the key material from the associated AWS CloudHSM cluster. However, you might need to manually delete the orphaned key material from the cluster and its backups.
You cannot perform this operation on a CMK in a different AWS account.
For more information about scheduling a CMK for deletion, see Deleting Customer Master Keys in the AWS Key Management Service Developer Guide .
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
Examples
The following example schedules the specified CMK for deletion.
Expected Output:
:example: response = client.schedule_key_deletion(
KeyId='string',
PendingWindowInDays=123
)
:type KeyId: string
:param KeyId: [REQUIRED]\nThe unique identifier of the customer master key (CMK) to delete.\nSpecify the key ID or the Amazon Resource Name (ARN) of the CMK.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .\n
:type PendingWindowInDays: integer
:param PendingWindowInDays: The waiting period, specified in number of days. After the waiting period ends, AWS KMS deletes the customer master key (CMK).\nThis value is optional. If you include a value, it must be between 7 and 30, inclusive. If you do not include a value, it defaults to 30.\n
:rtype: dict
ReturnsResponse Syntax
{
'KeyId': 'string',
'DeletionDate': datetime(2015, 1, 1)
}
Response Structure
(dict) --
KeyId (string) --
The unique identifier of the customer master key (CMK) for which deletion is scheduled.
DeletionDate (datetime) --
The date and time after which AWS KMS deletes the customer master key (CMK).
Exceptions
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
Examples
The following example schedules the specified CMK for deletion.
response = client.schedule_key_deletion(
# The identifier of the CMK to schedule for deletion. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
KeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
# The waiting period, specified in number of days. After the waiting period ends, AWS KMS deletes the CMK.
PendingWindowInDays=7,
)
print(response)
Expected Output:
{
# The date and time after which AWS KMS deletes the CMK.
'DeletionDate': datetime(2016, 12, 17, 16, 0, 0),
# The ARN of the CMK that is scheduled for deletion.
'KeyId': 'arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab',
'ResponseMetadata': {
'...': '...',
},
}
:return: {
'KeyId': 'string',
'DeletionDate': datetime(2015, 1, 1)
}
:returns:
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
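The waiting-period constraint above (between 7 and 30 days, defaulting to 30) can be sketched as a small pre-flight check; the helper function is hypothetical, and the expected deletion date is only an approximation of what KMS returns in `DeletionDate`:

```python
from datetime import datetime, timedelta

# Hypothetical helper: validates the waiting period before calling
# client.schedule_key_deletion(). KMS accepts 7-30 days; 30 is the default.
def deletion_params(key_id, pending_window_in_days=30):
    if not 7 <= pending_window_in_days <= 30:
        raise ValueError('PendingWindowInDays must be between 7 and 30')
    return {'KeyId': key_id, 'PendingWindowInDays': pending_window_in_days}

params = deletion_params('1234abcd-12ab-34cd-56ef-1234567890ab', 7)
# response = client.schedule_key_deletion(**params)
# The DeletionDate in the response is roughly now + the waiting period:
expected_deletion = datetime.utcnow() + timedelta(days=params['PendingWindowInDays'])
```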
"""
pass
def sign(KeyId=None, Message=None, MessageType=None, GrantTokens=None, SigningAlgorithm=None):
"""
Creates a digital signature for a message or message digest by using the private key in an asymmetric CMK. To verify the signature, use the Verify operation, or use the public key in the same asymmetric CMK outside of AWS KMS. For information about symmetric and asymmetric CMKs, see Using Symmetric and Asymmetric CMKs in the AWS Key Management Service Developer Guide .
Digital signatures are generated and verified by using an asymmetric key pair, such as an RSA or ECC pair that is represented by an asymmetric customer master key (CMK). The key owner (or an authorized user) uses their private key to sign a message. Anyone with the public key can verify that the message was signed with that particular private key and that the message hasn\'t changed since it was signed.
To use the Sign operation, provide the following information:
To verify the signature that this operation generates, use the Verify operation. Or use the GetPublicKey operation to download the public key and then use the public key to verify the signature outside of AWS KMS.
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
:example: response = client.sign(
KeyId='string',
Message=b'bytes',
MessageType='RAW'|'DIGEST',
GrantTokens=[
'string',
],
SigningAlgorithm='RSASSA_PSS_SHA_256'|'RSASSA_PSS_SHA_384'|'RSASSA_PSS_SHA_512'|'RSASSA_PKCS1_V1_5_SHA_256'|'RSASSA_PKCS1_V1_5_SHA_384'|'RSASSA_PKCS1_V1_5_SHA_512'|'ECDSA_SHA_256'|'ECDSA_SHA_384'|'ECDSA_SHA_512'
)
:type KeyId: string
:param KeyId: [REQUIRED]\nIdentifies an asymmetric CMK. AWS KMS uses the private key in the asymmetric CMK to sign the message. The KeyUsage type of the CMK must be SIGN_VERIFY . To find the KeyUsage of a CMK, use the DescribeKey operation.\nTo specify a CMK, use its key ID, Amazon Resource Name (ARN), alias name, or alias ARN. When using an alias name, prefix it with 'alias/' . To specify a CMK in a different AWS account, you must use the key ARN or alias ARN.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\nAlias name: alias/ExampleAlias\nAlias ARN: arn:aws:kms:us-east-2:111122223333:alias/ExampleAlias\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey . To get the alias name and alias ARN, use ListAliases .\n
:type Message: bytes
:param Message: [REQUIRED]\nSpecifies the message or message digest to sign. Messages can be 0-4096 bytes. To sign a larger message, provide the message digest.\nIf you provide a message, AWS KMS generates a hash digest of the message and then signs it.\n
:type MessageType: string
:param MessageType: Tells AWS KMS whether the value of the Message parameter is a message or message digest. The default value, RAW, indicates a message. To indicate a message digest, enter DIGEST .
:type GrantTokens: list
:param GrantTokens: A list of grant tokens.\nFor more information, see Grant Tokens in the AWS Key Management Service Developer Guide .\n\n(string) --\n\n
:type SigningAlgorithm: string
:param SigningAlgorithm: [REQUIRED]\nSpecifies the signing algorithm to use when signing the message.\nChoose an algorithm that is compatible with the type and size of the specified asymmetric CMK.\n
:rtype: dict
ReturnsResponse Syntax
{
'KeyId': 'string',
'Signature': b'bytes',
'SigningAlgorithm': 'RSASSA_PSS_SHA_256'|'RSASSA_PSS_SHA_384'|'RSASSA_PSS_SHA_512'|'RSASSA_PKCS1_V1_5_SHA_256'|'RSASSA_PKCS1_V1_5_SHA_384'|'RSASSA_PKCS1_V1_5_SHA_512'|'ECDSA_SHA_256'|'ECDSA_SHA_384'|'ECDSA_SHA_512'
}
Response Structure
(dict) --
KeyId (string) --
The Amazon Resource Name (ARN) of the asymmetric CMK that was used to sign the message.
Signature (bytes) --
The cryptographic signature that was generated for the message.
When used with the supported RSA signing algorithms, the encoding of this value is defined by PKCS #1 in RFC 8017 .
When used with the ECDSA_SHA_256 , ECDSA_SHA_384 , or ECDSA_SHA_512 signing algorithms, this value is a DER-encoded object as defined by ANS X9.62-2005 and RFC 3279 Section 2.2.3 . This is the most commonly used signature format and is appropriate for most uses.
When you use the HTTP API or the AWS CLI, the value is Base64-encoded. Otherwise, it is not Base64-encoded.
SigningAlgorithm (string) --
The signing algorithm that was used to sign the message.
Exceptions
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.DisabledException
KMS.Client.exceptions.KeyUnavailableException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.InvalidKeyUsageException
KMS.Client.exceptions.InvalidGrantTokenException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
:return: {
'KeyId': 'string',
'Signature': b'bytes',
'SigningAlgorithm': 'RSASSA_PSS_SHA_256'|'RSASSA_PSS_SHA_384'|'RSASSA_PSS_SHA_512'|'RSASSA_PKCS1_V1_5_SHA_256'|'RSASSA_PKCS1_V1_5_SHA_384'|'RSASSA_PKCS1_V1_5_SHA_512'|'ECDSA_SHA_256'|'ECDSA_SHA_384'|'ECDSA_SHA_512'
}
:returns:
KeyId (string) -- [REQUIRED]
Identifies an asymmetric CMK. AWS KMS uses the private key in the asymmetric CMK to sign the message. The KeyUsage type of the CMK must be SIGN_VERIFY . To find the KeyUsage of a CMK, use the DescribeKey operation.
To specify a CMK, use its key ID, Amazon Resource Name (ARN), alias name, or alias ARN. When using an alias name, prefix it with "alias/" . To specify a CMK in a different AWS account, you must use the key ARN or alias ARN.
For example:
Key ID: 1234abcd-12ab-34cd-56ef-1234567890ab
Key ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab
Alias name: alias/ExampleAlias
Alias ARN: arn:aws:kms:us-east-2:111122223333:alias/ExampleAlias
To get the key ID and key ARN for a CMK, use ListKeys or DescribeKey . To get the alias name and alias ARN, use ListAliases .
Message (bytes) -- [REQUIRED]
Specifies the message or message digest to sign. Messages can be 0-4096 bytes. To sign a larger message, provide the message digest.
If you provide a message, AWS KMS generates a hash digest of the message and then signs it.
MessageType (string) -- Tells AWS KMS whether the value of the Message parameter is a message or message digest. The default value, RAW, indicates a message. To indicate a message digest, enter DIGEST .
GrantTokens (list) -- A list of grant tokens.
For more information, see Grant Tokens in the AWS Key Management Service Developer Guide .
(string) --
SigningAlgorithm (string) -- [REQUIRED]
Specifies the signing algorithm to use when signing the message.
Choose an algorithm that is compatible with the type and size of the specified asymmetric CMK.
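Because Message is limited to 4096 bytes, a larger message must be hashed locally and passed as a digest with MessageType='DIGEST', as described above. A sketch using the standard-library hashlib (the alias name is a placeholder; the digest algorithm must match the signing algorithm, SHA-256 here for RSASSA_PSS_SHA_256):

```python
import hashlib

# To sign a message larger than 4096 bytes, hash it locally and pass the
# digest with MessageType='DIGEST'. SHA-256 matches RSASSA_PSS_SHA_256.
message = b'x' * 10000          # larger than the 4096-byte Sign limit
digest = hashlib.sha256(message).digest()

params = {
    'KeyId': 'alias/ExampleSigningKey',   # placeholder alias
    'Message': digest,
    'MessageType': 'DIGEST',
    'SigningAlgorithm': 'RSASSA_PSS_SHA_256',
}
# response = client.sign(**params)
# signature = response['Signature']
```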
"""
pass
def tag_resource(KeyId=None, Tags=None):
"""
Adds or edits tags for a customer master key (CMK). You cannot perform this operation on a CMK in a different AWS account.
Each tag consists of a tag key and a tag value. Tag keys and tag values are both required, but tag values can be empty (null) strings.
You can only use a tag key once for each CMK. If you use the tag key again, AWS KMS replaces the current tag value with the specified value.
For information about the rules that apply to tag keys and tag values, see User-Defined Tag Restrictions in the AWS Billing and Cost Management User Guide .
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
Examples
The following example tags a CMK.
Expected Output:
:example: response = client.tag_resource(
KeyId='string',
Tags=[
{
'TagKey': 'string',
'TagValue': 'string'
},
]
)
:type KeyId: string
:param KeyId: [REQUIRED]\nA unique identifier for the CMK you are tagging.\nSpecify the key ID or the Amazon Resource Name (ARN) of the CMK.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .\n
:type Tags: list
:param Tags: [REQUIRED]\nOne or more tags. Each tag consists of a tag key and a tag value.\n\n(dict) --A key-value pair. A tag consists of a tag key and a tag value. Tag keys and tag values are both required, but tag values can be empty (null) strings.\nFor information about the rules that apply to tag keys and tag values, see User-Defined Tag Restrictions in the AWS Billing and Cost Management User Guide .\n\nTagKey (string) -- [REQUIRED]The key of the tag.\n\nTagValue (string) -- [REQUIRED]The value of the tag.\n\n\n\n\n
:return: response = client.tag_resource(
# The identifier of the CMK you are tagging. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
KeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
# A list of tags.
Tags=[
{
'TagKey': 'Purpose',
'TagValue': 'Test',
},
],
)
print(response)
:returns:
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.KMSInvalidStateException
KMS.Client.exceptions.LimitExceededException
KMS.Client.exceptions.TagException
"""
pass
def untag_resource(KeyId=None, TagKeys=None):
"""
Removes the specified tags from the specified customer master key (CMK). You cannot perform this operation on a CMK in a different AWS account.
To remove a tag, specify the tag key. To change the tag value of an existing tag key, use TagResource .
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
Examples
The following example removes tags from a CMK.
Expected Output:
:example: response = client.untag_resource(
KeyId='string',
TagKeys=[
'string',
]
)
:type KeyId: string
:param KeyId: [REQUIRED]\nA unique identifier for the CMK from which you are removing tags.\nSpecify the key ID or the Amazon Resource Name (ARN) of the CMK.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .\n
:type TagKeys: list
:param TagKeys: [REQUIRED]\nOne or more tag keys. Specify only the tag keys, not the tag values.\n\n(string) --\n\n
:return: response = client.untag_resource(
# The identifier of the CMK whose tags you are removing.
KeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
# A list of tag keys. Provide only the tag keys, not the tag values.
TagKeys=[
'Purpose',
'CostCenter',
],
)
print(response)
:returns:
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.KMSInvalidStateException
KMS.Client.exceptions.TagException
"""
pass
def update_alias(AliasName=None, TargetKeyId=None):
"""
Associates an existing AWS KMS alias with a different customer master key (CMK). Each alias is associated with only one CMK at a time, although a CMK can have multiple aliases. The alias and the CMK must be in the same AWS account and region. You cannot perform this operation on an alias in a different AWS account.
The current and new CMK must be the same type (both symmetric or both asymmetric), and they must have the same key usage (ENCRYPT_DECRYPT or SIGN_VERIFY ). This restriction prevents errors in code that uses aliases. If you must assign an alias to a different type of CMK, use DeleteAlias to delete the old alias and CreateAlias to create a new alias.
You cannot use UpdateAlias to change an alias name. To change an alias name, use DeleteAlias to delete the old alias and CreateAlias to create a new alias.
Because an alias is not a property of a CMK, you can create, update, and delete the aliases of a CMK without affecting the CMK. Also, aliases do not appear in the response from the DescribeKey operation. To get the aliases of all CMKs in the account, use the ListAliases operation.
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
Examples
The following example updates the specified alias to refer to the specified customer master key (CMK).
Expected Output:
:example: response = client.update_alias(
AliasName='string',
TargetKeyId='string'
)
:type AliasName: string
:param AliasName: [REQUIRED]\nIdentifies the alias that is changing its CMK. This value must begin with alias/ followed by the alias name, such as alias/ExampleAlias . You cannot use UpdateAlias to change the alias name.\n
:type TargetKeyId: string
:param TargetKeyId: [REQUIRED]\nIdentifies the CMK to associate with the alias. When the update operation completes, the alias will point to this CMK.\nThe CMK must be in the same AWS account and Region as the alias. Also, the new target CMK must be the same type as the current target CMK (both symmetric or both asymmetric) and they must have the same key usage.\nSpecify the key ID or the Amazon Resource Name (ARN) of the CMK.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .\nTo verify that the alias is mapped to the correct CMK, use ListAliases .\n
:return: response = client.update_alias(
# The alias to update.
AliasName='alias/ExampleAlias',
# The identifier of the CMK that the alias will refer to after this operation succeeds. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
TargetKeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
)
print(response)
:returns:
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
"""
pass
def update_custom_key_store(CustomKeyStoreId=None, NewCustomKeyStoreName=None, KeyStorePassword=None, CloudHsmClusterId=None):
"""
Changes the properties of a custom key store. Use the CustomKeyStoreId parameter to identify the custom key store you want to edit. Use the remaining parameters to change the properties of the custom key store.
You can only update a custom key store that is disconnected. To disconnect the custom key store, use DisconnectCustomKeyStore . To reconnect the custom key store after the update completes, use ConnectCustomKeyStore . To find the connection state of a custom key store, use the DescribeCustomKeyStores operation.
Use the parameters of UpdateCustomKeyStore to edit your keystore settings.
If the operation succeeds, it returns a JSON object with no properties.
This operation is part of the Custom Key Store feature in AWS KMS, which combines the convenience and extensive integration of AWS KMS with the isolation and control of a single-tenant key store.
See also: AWS API Documentation
Exceptions
:example: response = client.update_custom_key_store(
CustomKeyStoreId='string',
NewCustomKeyStoreName='string',
KeyStorePassword='string',
CloudHsmClusterId='string'
)
:type CustomKeyStoreId: string
:param CustomKeyStoreId: [REQUIRED]\nIdentifies the custom key store that you want to update. Enter the ID of the custom key store. To find the ID of a custom key store, use the DescribeCustomKeyStores operation.\n
:type NewCustomKeyStoreName: string
:param NewCustomKeyStoreName: Changes the friendly name of the custom key store to the value that you specify. The custom key store name must be unique in the AWS account.
:type KeyStorePassword: string
:param KeyStorePassword: Enter the current password of the kmsuser crypto user (CU) in the AWS CloudHSM cluster that is associated with the custom key store.\nThis parameter tells AWS KMS the current password of the kmsuser crypto user (CU). It does not set or change the password of any users in the AWS CloudHSM cluster.\n
:type CloudHsmClusterId: string
:param CloudHsmClusterId: Associates the custom key store with a related AWS CloudHSM cluster.\nEnter the cluster ID of the cluster that you used to create the custom key store or a cluster that shares a backup history and has the same cluster certificate as the original cluster. You cannot use this parameter to associate a custom key store with an unrelated cluster. In addition, the replacement cluster must fulfill the requirements for a cluster associated with a custom key store. To view the cluster certificate of a cluster, use the DescribeClusters operation.\n
:rtype: dict
ReturnsResponse Syntax
{}
Response Structure
(dict) --
Exceptions
KMS.Client.exceptions.CustomKeyStoreNotFoundException
KMS.Client.exceptions.CustomKeyStoreNameInUseException
KMS.Client.exceptions.CloudHsmClusterNotFoundException
KMS.Client.exceptions.CloudHsmClusterNotRelatedException
KMS.Client.exceptions.CustomKeyStoreInvalidStateException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.CloudHsmClusterNotActiveException
KMS.Client.exceptions.CloudHsmClusterInvalidConfigurationException
:return: {}
:returns:
CustomKeyStoreId (string) -- [REQUIRED]
Identifies the custom key store that you want to update. Enter the ID of the custom key store. To find the ID of a custom key store, use the DescribeCustomKeyStores operation.
NewCustomKeyStoreName (string) -- Changes the friendly name of the custom key store to the value that you specify. The custom key store name must be unique in the AWS account.
KeyStorePassword (string) -- Enter the current password of the kmsuser crypto user (CU) in the AWS CloudHSM cluster that is associated with the custom key store.
This parameter tells AWS KMS the current password of the kmsuser crypto user (CU). It does not set or change the password of any users in the AWS CloudHSM cluster.
CloudHsmClusterId (string) -- Associates the custom key store with a related AWS CloudHSM cluster.
Enter the cluster ID of the cluster that you used to create the custom key store or a cluster that shares a backup history and has the same cluster certificate as the original cluster. You cannot use this parameter to associate a custom key store with an unrelated cluster. In addition, the replacement cluster must fulfill the requirements for a cluster associated with a custom key store. To view the cluster certificate of a cluster, use the DescribeClusters operation.
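Since only CustomKeyStoreId is required and the other three parameters each change one property, a request typically includes just the properties being updated. A minimal sketch (the helper function and key store ID are hypothetical):

```python
# Hypothetical helper: only CustomKeyStoreId is required; include just the
# properties you are changing. The key store must be disconnected first.
def update_key_store_params(custom_key_store_id, new_name=None,
                            key_store_password=None, cloud_hsm_cluster_id=None):
    params = {'CustomKeyStoreId': custom_key_store_id}
    if new_name is not None:
        params['NewCustomKeyStoreName'] = new_name
    if key_store_password is not None:
        params['KeyStorePassword'] = key_store_password
    if cloud_hsm_cluster_id is not None:
        params['CloudHsmClusterId'] = cloud_hsm_cluster_id
    return params

params = update_key_store_params('cks-1234567890abcdef0',
                                 new_name='ExampleKeyStore')
# response = client.update_custom_key_store(**params)  # returns {} on success
```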
"""
pass
def update_key_description(KeyId=None, Description=None):
"""
Updates the description of a customer master key (CMK). To see the description of a CMK, use DescribeKey .
You cannot perform this operation on a CMK in a different AWS account.
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
Examples
The following example updates the description of the specified CMK.
Expected Output:
:example: response = client.update_key_description(
KeyId='string',
Description='string'
)
:type KeyId: string
:param KeyId: [REQUIRED]\nA unique identifier for the customer master key (CMK).\nSpecify the key ID or the Amazon Resource Name (ARN) of the CMK.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey .\n
:type Description: string
:param Description: [REQUIRED]\nNew description for the CMK.\n
:return: response = client.update_key_description(
# The updated description.
Description='Example description that indicates the intended use of this CMK.',
# The identifier of the CMK whose description you are updating. You can use the key ID or the Amazon Resource Name (ARN) of the CMK.
KeyId='1234abcd-12ab-34cd-56ef-1234567890ab',
)
print(response)
:returns:
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.InvalidArnException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
"""
pass
def verify(KeyId=None, Message=None, MessageType=None, Signature=None, SigningAlgorithm=None, GrantTokens=None):
"""
Verifies a digital signature that was generated by the Sign operation.
Verification confirms that an authorized user signed the message with the specified CMK and signing algorithm, and the message hasn\'t changed since it was signed. If the signature is verified, the value of the SignatureValid field in the response is True . If the signature verification fails, the Verify operation fails with a KMSInvalidSignatureException exception.
A digital signature is generated by using the private key in an asymmetric CMK. The signature is verified by using the public key in the same asymmetric CMK. For information about symmetric and asymmetric CMKs, see Using Symmetric and Asymmetric CMKs in the AWS Key Management Service Developer Guide .
To verify a digital signature, you can use the Verify operation. Specify the same asymmetric CMK, message, and signing algorithm that were used to produce the signature.
You can also verify the digital signature by using the public key of the CMK outside of AWS KMS. Use the GetPublicKey operation to download the public key in the asymmetric CMK and then use the public key to verify the signature outside of AWS KMS. The advantage of using the Verify operation is that it is performed within AWS KMS. As a result, it\'s easy to call, the operation is performed within the FIPS boundary, it is logged in AWS CloudTrail, and you can use key policy and IAM policy to determine who is authorized to use the CMK to verify signatures.
The CMK that you use for this operation must be in a compatible key state. For details, see How Key State Affects Use of a Customer Master Key in the AWS Key Management Service Developer Guide .
See also: AWS API Documentation
Exceptions
:example: response = client.verify(
KeyId='string',
Message=b'bytes',
MessageType='RAW'|'DIGEST',
Signature=b'bytes',
SigningAlgorithm='RSASSA_PSS_SHA_256'|'RSASSA_PSS_SHA_384'|'RSASSA_PSS_SHA_512'|'RSASSA_PKCS1_V1_5_SHA_256'|'RSASSA_PKCS1_V1_5_SHA_384'|'RSASSA_PKCS1_V1_5_SHA_512'|'ECDSA_SHA_256'|'ECDSA_SHA_384'|'ECDSA_SHA_512',
GrantTokens=[
'string',
]
)
:type KeyId: string
:param KeyId: [REQUIRED]\nIdentifies the asymmetric CMK that will be used to verify the signature. This must be the same CMK that was used to generate the signature. If you specify a different CMK, the signature verification fails.\nTo specify a CMK, use its key ID, Amazon Resource Name (ARN), alias name, or alias ARN. When using an alias name, prefix it with 'alias/' . To specify a CMK in a different AWS account, you must use the key ARN or alias ARN.\nFor example:\n\nKey ID: 1234abcd-12ab-34cd-56ef-1234567890ab\nKey ARN: arn:aws:kms:us-east-2:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab\nAlias name: alias/ExampleAlias\nAlias ARN: arn:aws:kms:us-east-2:111122223333:alias/ExampleAlias\n\nTo get the key ID and key ARN for a CMK, use ListKeys or DescribeKey . To get the alias name and alias ARN, use ListAliases .\n
:type Message: bytes
:param Message: [REQUIRED]\nSpecifies the message that was signed. You can submit a raw message of up to 4096 bytes, or a hash digest of the message. If you submit a digest, use the MessageType parameter with a value of DIGEST .\nIf the message specified here is different from the message that was signed, the signature verification fails. A message and its hash digest are considered to be the same message.\n
:type MessageType: string
:param MessageType: Tells AWS KMS whether the value of the Message parameter is a message or message digest. The default value, RAW, indicates a message. To indicate a message digest, enter DIGEST .\n\nWarning\nUse the DIGEST value only when the value of the Message parameter is a message digest. If you use the DIGEST value with a raw message, the security of the verification operation can be compromised.\n\n
:type Signature: bytes
:param Signature: [REQUIRED]\nThe signature that the Sign operation generated.\n
:type SigningAlgorithm: string
:param SigningAlgorithm: [REQUIRED]\nThe signing algorithm that was used to sign the message. If you submit a different algorithm, the signature verification fails.\n
:type GrantTokens: list
:param GrantTokens: A list of grant tokens.\nFor more information, see Grant Tokens in the AWS Key Management Service Developer Guide .\n\n(string) --\n\n
:rtype: dict
Returns: Response Syntax
{
'KeyId': 'string',
'SignatureValid': True|False,
'SigningAlgorithm': 'RSASSA_PSS_SHA_256'|'RSASSA_PSS_SHA_384'|'RSASSA_PSS_SHA_512'|'RSASSA_PKCS1_V1_5_SHA_256'|'RSASSA_PKCS1_V1_5_SHA_384'|'RSASSA_PKCS1_V1_5_SHA_512'|'ECDSA_SHA_256'|'ECDSA_SHA_384'|'ECDSA_SHA_512'
}
Response Structure
(dict) --
KeyId (string) --
The unique identifier for the asymmetric CMK that was used to verify the signature.
SignatureValid (boolean) --
A Boolean value that indicates whether the signature was verified. A value of True indicates that the Signature was produced by signing the Message with the specified KeyID and SigningAlgorithm. If the signature is not verified, the Verify operation fails with a KMSInvalidSignatureException exception.
SigningAlgorithm (string) --
The signing algorithm that was used to verify the signature.
Exceptions
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.DisabledException
KMS.Client.exceptions.KeyUnavailableException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.InvalidKeyUsageException
KMS.Client.exceptions.InvalidGrantTokenException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
KMS.Client.exceptions.KMSInvalidSignatureException
:return: {
'KeyId': 'string',
'SignatureValid': True|False,
'SigningAlgorithm': 'RSASSA_PSS_SHA_256'|'RSASSA_PSS_SHA_384'|'RSASSA_PSS_SHA_512'|'RSASSA_PKCS1_V1_5_SHA_256'|'RSASSA_PKCS1_V1_5_SHA_384'|'RSASSA_PKCS1_V1_5_SHA_512'|'ECDSA_SHA_256'|'ECDSA_SHA_384'|'ECDSA_SHA_512'
}
:returns:
KMS.Client.exceptions.NotFoundException
KMS.Client.exceptions.DisabledException
KMS.Client.exceptions.KeyUnavailableException
KMS.Client.exceptions.DependencyTimeoutException
KMS.Client.exceptions.InvalidKeyUsageException
KMS.Client.exceptions.InvalidGrantTokenException
KMS.Client.exceptions.KMSInternalException
KMS.Client.exceptions.KMSInvalidStateException
KMS.Client.exceptions.KMSInvalidSignatureException
"""
pass
| 50.510664 | 1,572 | 0.735056 | 34,833 | 251,038 | 5.264261 | 0.039962 | 0.017326 | 0.036576 | 0.013961 | 0.825577 | 0.790664 | 0.7614 | 0.735824 | 0.717991 | 0.703038 | 0 | 0.033196 | 0.199018 | 251,038 | 4,969 | 1,573 | 50.520829 | 0.878743 | 0.977095 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0.54 | 0.03 | 0 | 0.53 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 11 |
58fb12e3eece69872f4aed6e714622f83dad2951 | 197 | py | Python | sql_gen/sql_gen/filter_loader.py | vecin2/em-dev-tools | b34e7a7d8dc8df301cfce2aced8ac38121cf5d56 | [
"MIT"
] | null | null | null | sql_gen/sql_gen/filter_loader.py | vecin2/em-dev-tools | b34e7a7d8dc8df301cfce2aced8ac38121cf5d56 | [
"MIT"
] | null | null | null | sql_gen/sql_gen/filter_loader.py | vecin2/em-dev-tools | b34e7a7d8dc8df301cfce2aced8ac38121cf5d56 | [
"MIT"
] | null | null | null | import importlib
def load_filters(env):
    get_template_filter = getattr(importlib.import_module("filters.description"), "get_template_filter")
    env.filters['description'] = get_template_filter()
| 32.833333 | 102 | 0.80203 | 24 | 197 | 6.25 | 0.5 | 0.22 | 0.34 | 0.386667 | 0.466667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081218 | 197 | 5 | 103 | 39.4 | 0.828729 | 0 | 0 | 0 | 0 | 0 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.5 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
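The loader above follows the usual Jinja2-style pattern: `get_template_filter()` returns a callable, which is registered under a name in the environment's `filters` mapping. A self-contained sketch of the same pattern (the `Env` class and filter body here are hypothetical stand-ins, not the real environment or filter):

```python
class Env:
    """Hypothetical stand-in for a Jinja2-style environment with a filters dict."""
    def __init__(self):
        self.filters = {}

def get_template_filter():
    # Returns the callable that the environment will invoke as a filter.
    def description(value):
        return 'description of %s' % value
    return description

env = Env()
env.filters['description'] = get_template_filter()
print(env.filters['description']('widget'))  # -> description of widget
```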
4508d4f242845b8a4c95bfb25d269ce37b698f22 | 136 | py | Python | content/utils.py | tedor/home-blog | 41e6cde964b9501864925f17d496ffea1fd0e770 | [
"BSD-3-Clause"
] | null | null | null | content/utils.py | tedor/home-blog | 41e6cde964b9501864925f17d496ffea1fd0e770 | [
"BSD-3-Clause"
] | null | null | null | content/utils.py | tedor/home-blog | 41e6cde964b9501864925f17d496ffea1fd0e770 | [
"BSD-3-Clause"
] | null | null | null | from settings import SITE_NAMES, SITE_ID
def get_site_name():
try:
return SITE_NAMES[SITE_ID]
    except (KeyError, IndexError):  # narrower than a bare except: SITE_ID may be absent from SITE_NAMES
return '' | 19.428571 | 40 | 0.654412 | 19 | 136 | 4.368421 | 0.631579 | 0.216867 | 0.313253 | 0.361446 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.272059 | 136 | 7 | 41 | 19.428571 | 0.838384 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | true | 0 | 0.166667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 7 |
4521de689ece1f4ddd0305f20de91f0adff5e887 | 27,118 | py | Python | tests/test_integration.py | git-afsantos/haroslaunch | 5c5826683a6979c2249da0969a85b8739c238914 | [
"MIT"
] | null | null | null | tests/test_integration.py | git-afsantos/haroslaunch | 5c5826683a6979c2249da0969a85b8739c238914 | [
"MIT"
] | null | null | null | tests/test_integration.py | git-afsantos/haroslaunch | 5c5826683a6979c2249da0969a85b8739c238914 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# SPDX-License-Identifier: MIT
# Copyright © 2021 André Santos
###############################################################################
# Imports
###############################################################################
from errno import EACCES
from pathlib import Path
try:
from xmlrpc.client import Binary
except ImportError:
from xmlrpclib import Binary
from haroslaunch.data_structs import STRING_TYPES
from haroslaunch.launch_interpreter import LaunchInterpreter
from haroslaunch.launch_xml_parser import parse_from_file
###############################################################################
# Mock ROS Interface
###############################################################################
class MockInterface(object):
def __init__(self):
self.ast_cache = {}
self.env = {}
@property
def ros_distro(self):
return 'melodic'
def get_environment_variable(self, name):
return self.env.get(name)
def get_package_path(self, name):
return Path(__file__).parent
def request_parse_tree(self, filepath):
if isinstance(filepath, STRING_TYPES):
filepath = Path(filepath)
assert isinstance(filepath, Path)
launch = Path(__file__).parent / 'launch'
if filepath.parent != launch:
raise ValueError(filepath)
ast = self.ast_cache.get(filepath)
if ast is None:
ast = parse_from_file(filepath) #!
self.ast_cache[filepath] = ast
return ast
def read_text_file(self, filepath):
if isinstance(filepath, STRING_TYPES):
filepath = Path(filepath)
assert isinstance(filepath, Path)
safe_dir = Path(__file__).parent
if not safe_dir in filepath.parents:
raise ValueError(filepath)
try:
return filepath.read_text()
except AttributeError: # Python 2
with open(str(filepath), 'r') as fh:
data = fh.read()
return data
def read_binary_file(self, filepath):
if isinstance(filepath, STRING_TYPES):
filepath = Path(filepath)
assert isinstance(filepath, Path)
safe_dir = Path(__file__).parent
if not safe_dir in filepath.parents:
raise ValueError(filepath)
try:
return Binary(filepath.read_bytes()).data
except AttributeError: # Python 2
with open(str(filepath), 'rb') as fh:
data = fh.read()
return Binary(data).data
def execute_command(self, cmd):
raise EnvironmentError(EACCES, cmd)
###############################################################################
# Test Kobuki Minimal
###############################################################################
def test_kobuki_minimal():
fp = Path(__file__).parent / 'launch' / 'kobuki_minimal.launch'
iface = MockInterface()
lfi = LaunchInterpreter(iface, include_absent=True)
lfi.interpret(fp)
assert not lfi.machines
assert not lfi.rosparam_cmds
assert len(lfi.nodes) == 3
assert len(lfi.parameters) == 34
# Node 0 -------------------------------
node = lfi.nodes[0]
assert not node.is_test_node
assert node.name.own == 'mobile_base_nodelet_manager'
assert node.name.full == '/mobile_base_nodelet_manager'
assert node.system is None
assert node.condition.is_true
assert node.traceability.filepath.endswith('/kobuki_minimal.launch')
assert node.traceability.line == 4
assert node.traceability.column == 3
assert node.package == 'nodelet'
assert node.executable == 'nodelet'
assert node.machine is None
assert node.is_required.value is False
assert node.respawns.value is False
assert node.respawn_delay.value == 0.0
assert node.args.value == 'manager'
assert node.output.value == 'log'
assert node.working_dir.value == 'ROS_HOME'
assert node.launch_prefix is None
assert len(node.remaps) == 0
assert len(node.environment) == 0
# Node 1 -------------------------------
node = lfi.nodes[1]
assert not node.is_test_node
assert node.name.own == 'mobile_base'
assert node.name.full == '/mobile_base'
assert node.system is None
assert node.condition.is_true
assert node.traceability.filepath.endswith('/kobuki_minimal.launch')
assert node.traceability.line == 5
assert node.traceability.column == 3
assert node.package == 'nodelet'
assert node.executable == 'nodelet'
assert node.machine is None
assert node.is_required.value is False
assert node.respawns.value is False
assert node.respawn_delay.value == 0.0
assert node.args.value == 'load kobuki_node/KobukiNodelet mobile_base_nodelet_manager'
assert node.output.value == 'log'
assert node.working_dir.value == 'ROS_HOME'
assert node.launch_prefix is None
assert len(node.remaps) == 2
assert node.remaps['/mobile_base/odom'].get_value() == '/odom'
assert node.remaps['/mobile_base/joint_states'].get_value() == '/joint_states'
assert len(node.environment) == 0
# Node 2 -------------------------------
node = lfi.nodes[2]
assert not node.is_test_node
assert node.name.own == 'diagnostic_aggregator'
assert node.name.full == '/diagnostic_aggregator'
assert node.system is None
assert node.condition.is_true
assert node.traceability.filepath.endswith('/kobuki_minimal.launch')
assert node.traceability.line == 12
assert node.traceability.column == 3
assert node.package == 'diagnostic_aggregator'
assert node.executable == 'aggregator_node'
assert node.machine is None
assert node.is_required.value is False
assert node.respawns.value is False
assert node.respawn_delay.value == 0.0
assert node.args.value == ''
assert node.output.value == 'log'
assert node.working_dir.value == 'ROS_HOME'
assert node.launch_prefix is None
assert len(node.remaps) == 0
assert len(node.environment) == 0
# Parameters ---------------------------
params = {
'/mobile_base/device_port': '/dev/kobuki',
'/mobile_base/wheel_left_joint_name': 'wheel_left_joint',
'/mobile_base/wheel_right_joint_name': 'wheel_right_joint',
'/mobile_base/battery_capacity': 16.5,
'/mobile_base/battery_low': 14.0,
'/mobile_base/battery_dangerous': 13.2,
'/mobile_base/cmd_vel_timeout': 0.6,
'/mobile_base/publish_tf': True,
'/mobile_base/use_imu_heading': True,
'/mobile_base/odom_frame': 'odom',
'/mobile_base/base_frame': 'base_footprint',
'/diagnostic_aggregator/pub_rate': 1.0,
'/diagnostic_aggregator/base_path': '',
'/diagnostic_aggregator/analyzers/power/type': 'diagnostic_aggregator/GenericAnalyzer',
'/diagnostic_aggregator/analyzers/power/path': 'Power System',
'/diagnostic_aggregator/analyzers/power/timeout': 5.0,
'/diagnostic_aggregator/analyzers/power/contains': ['Battery'],
'/diagnostic_aggregator/analyzers/power/remove_prefix': 'mobile_base_nodelet_manager',
'/diagnostic_aggregator/analyzers/kobuki/type': 'diagnostic_aggregator/GenericAnalyzer',
'/diagnostic_aggregator/analyzers/kobuki/path': 'Kobuki',
'/diagnostic_aggregator/analyzers/kobuki/timeout': 5.0,
'/diagnostic_aggregator/analyzers/kobuki/contains': ['Watchdog', 'Motor State'],
'/diagnostic_aggregator/analyzers/kobuki/remove_prefix': 'mobile_base_nodelet_manager',
'/diagnostic_aggregator/analyzers/sensors/type': 'diagnostic_aggregator/GenericAnalyzer',
'/diagnostic_aggregator/analyzers/sensors/path': 'Sensors',
'/diagnostic_aggregator/analyzers/sensors/timeout': 5.0,
'/diagnostic_aggregator/analyzers/sensors/contains': ['Cliff Sensor',
'Wall Sensor', 'Wheel Drop', 'Motor Current', 'Gyro Sensor'],
'/diagnostic_aggregator/analyzers/sensors/remove_prefix': 'mobile_base_nodelet_manager',
'/diagnostic_aggregator/analyzers/input_ports/type': 'diagnostic_aggregator/GenericAnalyzer',
'/diagnostic_aggregator/analyzers/input_ports/path': 'Input Ports',
'/diagnostic_aggregator/analyzers/input_ports/timeout': 5.0,
'/diagnostic_aggregator/analyzers/input_ports/contains': ['Digital Input', 'Analog Input'],
'/diagnostic_aggregator/analyzers/input_ports/remove_prefix': 'mobile_base_nodelet_manager',
}
for p in lfi.parameters:
assert p.namespace.full.startswith(('/mobile_base', '/diagnostic_aggregator'))
assert p.system is None
assert p.condition.is_true
assert p.traceability.filepath.endswith('/kobuki_minimal.launch')
assert p.traceability.line in (6, 7, 13)
assert p.traceability.column == 5
assert p.value.value == params[p.name.full]
###############################################################################
# Test Kobuki Safe Teleoperation
###############################################################################
def test_kobuki_safe_keyop():
fp = Path(__file__).parent / 'launch' / 'kobuki_safe_keyop.launch'
iface = MockInterface()
lfi = LaunchInterpreter(iface, include_absent=True)
lfi.interpret(fp)
assert not lfi.machines
assert not lfi.rosparam_cmds
assert len(lfi.nodes) == 4
assert len(lfi.parameters) == 12
# Node 0 -------------------------------
node = lfi.nodes[0]
assert not node.is_test_node
assert node.name.own == 'cmd_vel_mux'
assert node.name.full == '/cmd_vel_mux'
assert node.system is None
assert node.condition.is_true
assert node.traceability.filepath.endswith('/kobuki_safe_keyop.launch')
assert node.traceability.line == 5
assert node.traceability.column == 3
assert node.package == 'nodelet'
assert node.executable == 'nodelet'
assert node.machine is None
assert node.is_required.value is False
assert node.respawns.value is False
assert node.respawn_delay.value == 0.0
assert node.args.value == 'load yocs_cmd_vel_mux/CmdVelMuxNodelet mobile_base_nodelet_manager'
assert node.output.value == 'log'
assert node.working_dir.value == 'ROS_HOME'
assert node.launch_prefix is None
assert len(node.remaps) == 1
assert node.remaps['/cmd_vel_mux/output'].get_value() == '/mobile_base/commands/velocity'
assert len(node.environment) == 0
# Node 1 -------------------------------
node = lfi.nodes[1]
assert not node.is_test_node
assert node.name.own == 'kobuki_safety_controller'
assert node.name.full == '/kobuki_safety_controller'
assert node.system is None
assert node.condition.is_true
assert node.traceability.filepath.endswith('/kobuki_safe_keyop.launch')
assert node.traceability.line == 10
assert node.traceability.column == 3
assert node.package == 'nodelet'
assert node.executable == 'nodelet'
assert node.machine is None
assert node.is_required.value is False
assert node.respawns.value is False
assert node.respawn_delay.value == 0.0
assert node.args.value == 'load kobuki_safety_controller/SafetyControllerNodelet mobile_base_nodelet_manager'
assert node.output.value == 'log'
assert node.working_dir.value == 'ROS_HOME'
assert node.launch_prefix is None
assert len(node.remaps) == 4
assert node.remaps['/kobuki_safety_controller/cmd_vel'].get_value() == '/cmd_vel_mux/safety_controller'
assert node.remaps['/kobuki_safety_controller/events/bumper'].get_value() == '/mobile_base/events/bumper'
assert node.remaps['/kobuki_safety_controller/events/cliff'].get_value() == '/mobile_base/events/cliff'
assert node.remaps['/kobuki_safety_controller/events/wheel_drop'].get_value() == '/mobile_base/events/wheel_drop'
assert len(node.environment) == 0
# Node 2 -------------------------------
node = lfi.nodes[2]
assert not node.is_test_node
assert node.name.own == 'keyop_vel_smoother'
assert node.name.full == '/keyop_vel_smoother'
assert node.system is None
assert node.condition.is_true
assert node.traceability.filepath.endswith('/kobuki_safe_keyop.launch')
assert node.traceability.line == 17
assert node.traceability.column == 3
assert node.package == 'nodelet'
assert node.executable == 'nodelet'
assert node.machine is None
assert node.is_required.value is False
assert node.respawns.value is False
assert node.respawn_delay.value == 0.0
assert node.args.value == 'load yocs_velocity_smoother/VelocitySmootherNodelet mobile_base_nodelet_manager'
assert node.output.value == 'log'
assert node.working_dir.value == 'ROS_HOME'
assert node.launch_prefix is None
assert len(node.remaps) == 3
assert node.remaps['/keyop_vel_smoother/smooth_cmd_vel'].get_value() == '/cmd_vel_mux/keyboard_teleop'
assert node.remaps['/keyop_vel_smoother/odometry'].get_value() == '/odom'
assert node.remaps['/keyop_vel_smoother/robot_cmd_vel'].get_value() == '/mobile_base/commands/velocity'
assert len(node.environment) == 0
# Node 3 -------------------------------
node = lfi.nodes[3]
assert not node.is_test_node
assert node.name.own == 'keyop'
assert node.name.full == '/keyop'
assert node.system is None
assert node.condition.is_true
assert node.traceability.filepath.endswith('/kobuki_safe_keyop.launch')
assert node.traceability.line == 26
assert node.traceability.column == 3
assert node.package == 'kobuki_keyop'
assert node.executable == 'keyop'
assert node.machine is None
assert node.is_required.value is False
assert node.respawns.value is False
assert node.respawn_delay.value == 0.0
assert node.args.value == ''
assert node.output.value == 'screen'
assert node.working_dir.value == 'ROS_HOME'
assert node.launch_prefix is None
assert len(node.remaps) == 2
assert node.remaps['/keyop/motor_power'].get_value() == '/mobile_base/commands/motor_power'
assert node.remaps['/keyop/cmd_vel'].get_value() == '/keyop_vel_smoother/raw_cmd_vel'
assert len(node.environment) == 0
# Parameters ---------------------------
params = {
'/cmd_vel_mux/yaml_cfg_file': str(Path(__file__).parent / 'param' / 'keyop_mux.yaml'),
'/keyop_vel_smoother/speed_lim_v': 0.8,
'/keyop_vel_smoother/speed_lim_w': 5.4,
'/keyop_vel_smoother/accel_lim_v': 1.0,
'/keyop_vel_smoother/accel_lim_w': 7.0,
'/keyop_vel_smoother/frequency': 20.0,
'/keyop_vel_smoother/decel_factor': 1.0,
'/keyop/linear_vel_step': 0.05,
'/keyop/linear_vel_max': 1.5,
'/keyop/angular_vel_step': 0.33,
'/keyop/angular_vel_max': 6.6,
'/keyop/wait_for_connection_': True,
}
for p in lfi.parameters:
assert p.namespace.full.startswith(('/cmd_vel_mux', '/keyop'))
assert p.system is None
assert p.condition.is_true
assert p.traceability.filepath.endswith('/kobuki_safe_keyop.launch')
assert p.traceability.line in (6, 18, 29, 30, 31, 32, 33)
assert p.traceability.column == 5
assert p.value.value == params[p.name.full]
###############################################################################
# Test Kobuki Safe Teleoperation (Full)
###############################################################################
def test_kobuki_minimal_safe_keyop():
fp = Path(__file__).parent / 'launch' / 'kobuki_minimal.launch'
iface = MockInterface()
lfi = LaunchInterpreter(iface, include_absent=True)
lfi.interpret(fp)
fp = Path(__file__).parent / 'launch' / 'kobuki_safe_keyop.launch'
lfi.interpret(fp)
assert not lfi.machines
assert not lfi.rosparam_cmds
assert len(lfi.nodes) == 7
assert len(lfi.parameters) == 46
# Node 0 -------------------------------
node = lfi.nodes[0]
assert not node.is_test_node
assert node.name.own == 'mobile_base_nodelet_manager'
assert node.name.full == '/mobile_base_nodelet_manager'
assert node.system is None
assert node.condition.is_true
assert node.traceability.filepath.endswith('/kobuki_minimal.launch')
assert node.traceability.line == 4
assert node.traceability.column == 3
assert node.package == 'nodelet'
assert node.executable == 'nodelet'
assert node.machine is None
assert node.is_required.value is False
assert node.respawns.value is False
assert node.respawn_delay.value == 0.0
assert node.args.value == 'manager'
assert node.output.value == 'log'
assert node.working_dir.value == 'ROS_HOME'
assert node.launch_prefix is None
assert len(node.remaps) == 0
assert len(node.environment) == 0
# Node 1 -------------------------------
node = lfi.nodes[1]
assert not node.is_test_node
assert node.name.own == 'mobile_base'
assert node.name.full == '/mobile_base'
assert node.system is None
assert node.condition.is_true
assert node.traceability.filepath.endswith('/kobuki_minimal.launch')
assert node.traceability.line == 5
assert node.traceability.column == 3
assert node.package == 'nodelet'
assert node.executable == 'nodelet'
assert node.machine is None
assert node.is_required.value is False
assert node.respawns.value is False
assert node.respawn_delay.value == 0.0
assert node.args.value == 'load kobuki_node/KobukiNodelet mobile_base_nodelet_manager'
assert node.output.value == 'log'
assert node.working_dir.value == 'ROS_HOME'
assert node.launch_prefix is None
assert len(node.remaps) == 2
assert node.remaps['/mobile_base/odom'].get_value() == '/odom'
assert node.remaps['/mobile_base/joint_states'].get_value() == '/joint_states'
assert len(node.environment) == 0
# Node 2 -------------------------------
node = lfi.nodes[2]
assert not node.is_test_node
assert node.name.own == 'diagnostic_aggregator'
assert node.name.full == '/diagnostic_aggregator'
assert node.system is None
assert node.condition.is_true
assert node.traceability.filepath.endswith('/kobuki_minimal.launch')
assert node.traceability.line == 12
assert node.traceability.column == 3
assert node.package == 'diagnostic_aggregator'
assert node.executable == 'aggregator_node'
assert node.machine is None
assert node.is_required.value is False
assert node.respawns.value is False
assert node.respawn_delay.value == 0.0
assert node.args.value == ''
assert node.output.value == 'log'
assert node.working_dir.value == 'ROS_HOME'
assert node.launch_prefix is None
assert len(node.remaps) == 0
assert len(node.environment) == 0
# Node 3 -------------------------------
node = lfi.nodes[3]
assert not node.is_test_node
assert node.name.own == 'cmd_vel_mux'
assert node.name.full == '/cmd_vel_mux'
assert node.system is None
assert node.condition.is_true
assert node.traceability.filepath.endswith('/kobuki_safe_keyop.launch')
assert node.traceability.line == 5
assert node.traceability.column == 3
assert node.package == 'nodelet'
assert node.executable == 'nodelet'
assert node.machine is None
assert node.is_required.value is False
assert node.respawns.value is False
assert node.respawn_delay.value == 0.0
assert node.args.value == 'load yocs_cmd_vel_mux/CmdVelMuxNodelet mobile_base_nodelet_manager'
assert node.output.value == 'log'
assert node.working_dir.value == 'ROS_HOME'
assert node.launch_prefix is None
assert len(node.remaps) == 1
assert node.remaps['/cmd_vel_mux/output'].get_value() == '/mobile_base/commands/velocity'
assert len(node.environment) == 0
# Node 4 -------------------------------
node = lfi.nodes[4]
assert not node.is_test_node
assert node.name.own == 'kobuki_safety_controller'
assert node.name.full == '/kobuki_safety_controller'
assert node.system is None
assert node.condition.is_true
assert node.traceability.filepath.endswith('/kobuki_safe_keyop.launch')
assert node.traceability.line == 10
assert node.traceability.column == 3
assert node.package == 'nodelet'
assert node.executable == 'nodelet'
assert node.machine is None
assert node.is_required.value is False
assert node.respawns.value is False
assert node.respawn_delay.value == 0.0
assert node.args.value == 'load kobuki_safety_controller/SafetyControllerNodelet mobile_base_nodelet_manager'
assert node.output.value == 'log'
assert node.working_dir.value == 'ROS_HOME'
assert node.launch_prefix is None
assert len(node.remaps) == 4
assert node.remaps['/kobuki_safety_controller/cmd_vel'].get_value() == '/cmd_vel_mux/safety_controller'
assert node.remaps['/kobuki_safety_controller/events/bumper'].get_value() == '/mobile_base/events/bumper'
assert node.remaps['/kobuki_safety_controller/events/cliff'].get_value() == '/mobile_base/events/cliff'
assert node.remaps['/kobuki_safety_controller/events/wheel_drop'].get_value() == '/mobile_base/events/wheel_drop'
assert len(node.environment) == 0
# Node 5 -------------------------------
node = lfi.nodes[5]
assert not node.is_test_node
assert node.name.own == 'keyop_vel_smoother'
assert node.name.full == '/keyop_vel_smoother'
assert node.system is None
assert node.condition.is_true
assert node.traceability.filepath.endswith('/kobuki_safe_keyop.launch')
assert node.traceability.line == 17
assert node.traceability.column == 3
assert node.package == 'nodelet'
assert node.executable == 'nodelet'
assert node.machine is None
assert node.is_required.value is False
assert node.respawns.value is False
assert node.respawn_delay.value == 0.0
assert node.args.value == 'load yocs_velocity_smoother/VelocitySmootherNodelet mobile_base_nodelet_manager'
assert node.output.value == 'log'
assert node.working_dir.value == 'ROS_HOME'
assert node.launch_prefix is None
assert len(node.remaps) == 3
assert node.remaps['/keyop_vel_smoother/smooth_cmd_vel'].get_value() == '/cmd_vel_mux/keyboard_teleop'
assert node.remaps['/keyop_vel_smoother/odometry'].get_value() == '/odom'
assert node.remaps['/keyop_vel_smoother/robot_cmd_vel'].get_value() == '/mobile_base/commands/velocity'
assert len(node.environment) == 0
# Node 6 -------------------------------
node = lfi.nodes[6]
assert not node.is_test_node
assert node.name.own == 'keyop'
assert node.name.full == '/keyop'
assert node.system is None
assert node.condition.is_true
assert node.traceability.filepath.endswith('/kobuki_safe_keyop.launch')
assert node.traceability.line == 26
assert node.traceability.column == 3
assert node.package == 'kobuki_keyop'
assert node.executable == 'keyop'
assert node.machine is None
assert node.is_required.value is False
assert node.respawns.value is False
assert node.respawn_delay.value == 0.0
assert node.args.value == ''
assert node.output.value == 'screen'
assert node.working_dir.value == 'ROS_HOME'
assert node.launch_prefix is None
assert len(node.remaps) == 2
assert node.remaps['/keyop/motor_power'].get_value() == '/mobile_base/commands/motor_power'
assert node.remaps['/keyop/cmd_vel'].get_value() == '/keyop_vel_smoother/raw_cmd_vel'
assert len(node.environment) == 0
# Parameters ---------------------------
params = {
'/mobile_base/device_port': '/dev/kobuki',
'/mobile_base/wheel_left_joint_name': 'wheel_left_joint',
'/mobile_base/wheel_right_joint_name': 'wheel_right_joint',
'/mobile_base/battery_capacity': 16.5,
'/mobile_base/battery_low': 14.0,
'/mobile_base/battery_dangerous': 13.2,
'/mobile_base/cmd_vel_timeout': 0.6,
'/mobile_base/publish_tf': True,
'/mobile_base/use_imu_heading': True,
'/mobile_base/odom_frame': 'odom',
'/mobile_base/base_frame': 'base_footprint',
'/diagnostic_aggregator/pub_rate': 1.0,
'/diagnostic_aggregator/base_path': '',
'/diagnostic_aggregator/analyzers/power/type': 'diagnostic_aggregator/GenericAnalyzer',
'/diagnostic_aggregator/analyzers/power/path': 'Power System',
'/diagnostic_aggregator/analyzers/power/timeout': 5.0,
'/diagnostic_aggregator/analyzers/power/contains': ['Battery'],
'/diagnostic_aggregator/analyzers/power/remove_prefix': 'mobile_base_nodelet_manager',
'/diagnostic_aggregator/analyzers/kobuki/type': 'diagnostic_aggregator/GenericAnalyzer',
'/diagnostic_aggregator/analyzers/kobuki/path': 'Kobuki',
'/diagnostic_aggregator/analyzers/kobuki/timeout': 5.0,
'/diagnostic_aggregator/analyzers/kobuki/contains': ['Watchdog', 'Motor State'],
'/diagnostic_aggregator/analyzers/kobuki/remove_prefix': 'mobile_base_nodelet_manager',
'/diagnostic_aggregator/analyzers/sensors/type': 'diagnostic_aggregator/GenericAnalyzer',
'/diagnostic_aggregator/analyzers/sensors/path': 'Sensors',
'/diagnostic_aggregator/analyzers/sensors/timeout': 5.0,
'/diagnostic_aggregator/analyzers/sensors/contains': ['Cliff Sensor',
'Wall Sensor', 'Wheel Drop', 'Motor Current', 'Gyro Sensor'],
'/diagnostic_aggregator/analyzers/sensors/remove_prefix': 'mobile_base_nodelet_manager',
'/diagnostic_aggregator/analyzers/input_ports/type': 'diagnostic_aggregator/GenericAnalyzer',
'/diagnostic_aggregator/analyzers/input_ports/path': 'Input Ports',
'/diagnostic_aggregator/analyzers/input_ports/timeout': 5.0,
'/diagnostic_aggregator/analyzers/input_ports/contains': ['Digital Input', 'Analog Input'],
'/diagnostic_aggregator/analyzers/input_ports/remove_prefix': 'mobile_base_nodelet_manager',
'/cmd_vel_mux/yaml_cfg_file': str(Path(__file__).parent / 'param' / 'keyop_mux.yaml'),
'/keyop_vel_smoother/speed_lim_v': 0.8,
'/keyop_vel_smoother/speed_lim_w': 5.4,
'/keyop_vel_smoother/accel_lim_v': 1.0,
'/keyop_vel_smoother/accel_lim_w': 7.0,
'/keyop_vel_smoother/frequency': 20.0,
'/keyop_vel_smoother/decel_factor': 1.0,
'/keyop/linear_vel_step': 0.05,
'/keyop/linear_vel_max': 1.5,
'/keyop/angular_vel_step': 0.33,
'/keyop/angular_vel_max': 6.6,
'/keyop/wait_for_connection_': True,
}
for p in lfi.parameters:
assert p.namespace.full.startswith(('/mobile_base', '/cmd_vel_mux',
'/keyop', '/diagnostic_aggregator'))
assert p.system is None
assert p.condition.is_true
assert p.traceability.filepath.endswith(('/kobuki_minimal.launch',
'/kobuki_safe_keyop.launch'))
assert p.traceability.line in (6, 7, 13, 18, 29, 30, 31, 32, 33)
assert p.traceability.column == 5
assert p.value.value == params[p.name.full]
| 45.962712 | 117 | 0.66502 | 3,346 | 27,118 | 5.182905 | 0.075015 | 0.151078 | 0.031138 | 0.025833 | 0.937896 | 0.937896 | 0.93559 | 0.93559 | 0.929247 | 0.922789 | 0 | 0.011656 | 0.17741 | 27,118 | 589 | 118 | 46.040747 | 0.765724 | 0.032377 | 0 | 0.881356 | 0 | 0 | 0.301483 | 0.248535 | 0 | 0 | 0 | 0 | 0.640301 | 1 | 0.020716 | false | 0 | 0.015066 | 0.00565 | 0.052731 | 0.003766 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
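The `MockInterface` in the tests above illustrates a common pattern: inject a small system-access object so the launch interpreter can be exercised without a live ROS installation. A reduced sketch of the same dependency-injection idea (the `FakeInterface` name and in-memory dict are hypothetical, not part of haroslaunch):

```python
from pathlib import Path

class FakeInterface:
    """Hypothetical stand-in that serves file contents from an in-memory dict,
    mirroring the error behaviour of MockInterface.read_text_file above."""
    def __init__(self, files):
        self.files = {Path(k): v for k, v in files.items()}

    def read_text_file(self, filepath):
        try:
            return self.files[Path(filepath)]
        except KeyError:
            # Unknown paths are rejected, like paths outside the safe dir above.
            raise ValueError(filepath)

fs = FakeInterface({'/launch/a.launch': '<launch/>'})
assert fs.read_text_file('/launch/a.launch') == '<launch/>'
```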
from nose2.tools.decorators import with_teardown
from smb.SMBConnection import SMBConnection
from smb import smb_structs
from .util import getConnectionInfo
conn = None
def teardown_func():
    global conn
    # Guard against a connection that was never established (e.g. connect failed).
    if conn is not None:
        conn.close()
        conn = None
@with_teardown(teardown_func)
def test_NTLMv1_auth_SMB1():
global conn
smb_structs.SUPPORT_SMB2 = False
info = getConnectionInfo()
conn = SMBConnection(info['user'], info['password'], info['client_name'], info['server_name'], domain = info['domain'], use_ntlm_v2 = False)
assert conn.connect(info['server_ip'], info['server_port'])
@with_teardown(teardown_func)
def test_NTLMv2_auth_SMB1():
global conn
smb_structs.SUPPORT_SMB2 = False
info = getConnectionInfo()
conn = SMBConnection(info['user'], info['password'], info['client_name'], info['server_name'], domain = info['domain'], use_ntlm_v2 = True)
assert conn.connect(info['server_ip'], info['server_port'])
@with_teardown(teardown_func)
def test_NTLMv1_auth_SMB2():
global conn
smb_structs.SUPPORT_SMB2 = True
info = getConnectionInfo()
conn = SMBConnection(info['user'], info['password'], info['client_name'], info['server_name'], domain = info['domain'], use_ntlm_v2 = False)
assert conn.connect(info['server_ip'], info['server_port'])
@with_teardown(teardown_func)
def test_NTLMv2_auth_SMB2():
global conn
smb_structs.SUPPORT_SMB2 = True
info = getConnectionInfo()
conn = SMBConnection(info['user'], info['password'], info['client_name'], info['server_name'], domain = info['domain'], use_ntlm_v2 = True)
assert conn.connect(info['server_ip'], info['server_port'])
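The four tests above walk the NTLM version × SMB dialect matrix one function at a time. The same matrix can be generated programmatically — a sketch only; the labels are illustrative and no server is contacted:

```python
from itertools import product

def auth_matrix():
    """Yield (label, use_ntlm_v2, support_smb2) for every auth/dialect combo."""
    for use_ntlm_v2, support_smb2 in product((False, True), repeat=2):
        label = 'test_NTLMv%d_auth_SMB%d' % (2 if use_ntlm_v2 else 1,
                                             2 if support_smb2 else 1)
        yield label, use_ntlm_v2, support_smb2

combos = list(auth_matrix())  # four combinations, one per test function
```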
# coding: utf-8
"""
Ed-Fi Operational Data Store API
The Ed-Fi ODS / API enables applications to read and write education data stored in an Ed-Fi ODS through a secure REST interface. *** > *Note: Consumers of ODS / API information should sanitize all data for display and storage. The ODS / API provides reasonable safeguards against cross-site scripting attacks and other malicious content, but the platform does not and cannot guarantee that the data it contains is free of all potentially harmful content.* *** # noqa: E501
OpenAPI spec version: 3
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from swagger_client.api_client import ApiClient
class AssessmentsApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def delete_assessment_by_id(self, id, **kwargs): # noqa: E501
"""Deletes an existing resource using the resource identifier. # noqa: E501
The DELETE operation is used to delete an existing resource by identifier. If the resource doesn't exist, an error will result (the resource will not be found). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_assessment_by_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: A resource identifier that uniquely identifies the resource. (required)
:param str if_match: The ETag header value used to prevent the DELETE from removing a resource modified by another consumer.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.delete_assessment_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.delete_assessment_by_id_with_http_info(id, **kwargs) # noqa: E501
return data
def delete_assessment_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""Deletes an existing resource using the resource identifier. # noqa: E501
The DELETE operation is used to delete an existing resource by identifier. If the resource doesn't exist, an error will result (the resource will not be found). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_assessment_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: A resource identifier that uniquely identifies the resource. (required)
:param str if_match: The ETag header value used to prevent the DELETE from removing a resource modified by another consumer.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'if_match'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_assessment_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in params or
params['id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `id` when calling `delete_assessment_by_id`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
if 'if_match' in params:
header_params['If-Match'] = params['if_match'] # noqa: E501
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2_client_credentials'] # noqa: E501
return self.api_client.call_api(
'/ed-fi/assessments/{id}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
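Each generated method begins by validating its keyword arguments against an `all_params` whitelist before building the request. That pattern, reduced to a standalone sketch (the function name is illustrative):

```python
def check_kwargs(all_params, **kwargs):
    """Reject any keyword argument not in the whitelist, mirroring the
    validation loop at the top of every generated *_with_http_info method."""
    params = {}
    for key, val in kwargs.items():
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'" % key)
        params[key] = val
    return params
```

Only whitelisted arguments survive into `params`; anything else fails fast with a `TypeError` naming the offending key.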
def deletes_assessments(self, **kwargs): # noqa: E501
"""Retrieves deleted resources based on change version. # noqa: E501
The DELETES operation is used to retrieve deleted resources. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.deletes_assessments(async_req=True)
>>> result = thread.get()
:param async_req bool
:param int offset: Indicates how many items should be skipped before returning results.
:param int limit: Indicates the maximum number of items that should be returned in the results.
:param int min_change_version: Used in synchronization to set sequence minimum ChangeVersion
:param int max_change_version: Used in synchronization to set sequence maximum ChangeVersion
:return: list[EdFiAssessment]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.deletes_assessments_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.deletes_assessments_with_http_info(**kwargs) # noqa: E501
return data
def deletes_assessments_with_http_info(self, **kwargs): # noqa: E501
"""Retrieves deleted resources based on change version. # noqa: E501
The DELETES operation is used to retrieve deleted resources. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.deletes_assessments_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param int offset: Indicates how many items should be skipped before returning results.
:param int limit: Indicates the maximum number of items that should be returned in the results.
:param int min_change_version: Used in synchronization to set sequence minimum ChangeVersion
:param int max_change_version: Used in synchronization to set sequence maximum ChangeVersion
:return: list[EdFiAssessment]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['offset', 'limit', 'min_change_version', 'max_change_version'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method deletes_assessments" % key
)
params[key] = val
del params['kwargs']
if self.api_client.client_side_validation and ('limit' in params and params['limit'] > 500): # noqa: E501
raise ValueError("Invalid value for parameter `limit` when calling `deletes_assessments`, must be a value less than or equal to `500`") # noqa: E501
if self.api_client.client_side_validation and ('limit' in params and params['limit'] < 0): # noqa: E501
raise ValueError("Invalid value for parameter `limit` when calling `deletes_assessments`, must be a value greater than or equal to `0`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'limit' in params:
query_params.append(('limit', params['limit'])) # noqa: E501
if 'min_change_version' in params:
query_params.append(('minChangeVersion', params['min_change_version'])) # noqa: E501
if 'max_change_version' in params:
query_params.append(('maxChangeVersion', params['max_change_version'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2_client_credentials'] # noqa: E501
return self.api_client.call_api(
'/ed-fi/assessments/deletes', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[EdFiAssessment]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_assessments(self, **kwargs): # noqa: E501
"""Retrieves specific resources using the resource's property values (using the \"Get\" pattern). # noqa: E501
This GET operation provides access to resources using the \"Get\" search pattern. The values of any properties of the resource that are specified will be used to return all matching results (if it exists). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_assessments(async_req=True)
>>> result = thread.get()
:param async_req bool
:param int offset: Indicates how many items should be skipped before returning results.
:param int limit: Indicates the maximum number of items that should be returned in the results.
:param int min_change_version: Used in synchronization to set sequence minimum ChangeVersion
:param int max_change_version: Used in synchronization to set sequence maximum ChangeVersion
:param bool total_count: Indicates if the total number of items available should be returned in the 'Total-Count' header of the response. If set to false, 'Total-Count' header will not be provided.
:param str assessment_identifier: A unique number or alphanumeric code assigned to an assessment.
:param str namespace: Namespace for the Assessment.
:param int education_organization_id: The identifier assigned to an education organization.
:param str assessment_category_descriptor: The category of an assessment based on format and content. For example: Achievement test Advanced placement test Alternate assessment/grade-level standards Attitudinal test Cognitive and perceptual skills test ...
:param bool adaptive_assessment: Indicates that the assessment is adaptive.
:param str assessment_family: The AssessmentFamily this Assessment is a member of.
:param str assessment_form: Identifies the form of the assessment, for example a regular versus makeup form, multiple choice versus constructed response, etc.
:param str assessment_title: The title or name of the Assessment.
:param int assessment_version: The version identifier for the assessment.
:param str id:
:param float max_raw_score: The maximum raw score achievable across all assessment items that are correct and scored at the maximum.
:param str nomenclature: Reflects the specific nomenclature used for Assessment.
:param date revision_date: The month, day, and year that the conceptual design for the assessment was most recently revised substantially.
:return: list[EdFiAssessment]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_assessments_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.get_assessments_with_http_info(**kwargs) # noqa: E501
return data
def get_assessments_with_http_info(self, **kwargs): # noqa: E501
"""Retrieves specific resources using the resource's property values (using the \"Get\" pattern). # noqa: E501
This GET operation provides access to resources using the \"Get\" search pattern. The values of any properties of the resource that are specified will be used to return all matching results (if it exists). # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_assessments_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param int offset: Indicates how many items should be skipped before returning results.
:param int limit: Indicates the maximum number of items that should be returned in the results.
:param int min_change_version: Used in synchronization to set sequence minimum ChangeVersion
:param int max_change_version: Used in synchronization to set sequence maximum ChangeVersion
:param bool total_count: Indicates if the total number of items available should be returned in the 'Total-Count' header of the response. If set to false, 'Total-Count' header will not be provided.
:param str assessment_identifier: A unique number or alphanumeric code assigned to an assessment.
:param str namespace: Namespace for the Assessment.
:param int education_organization_id: The identifier assigned to an education organization.
:param str assessment_category_descriptor: The category of an assessment based on format and content. For example: Achievement test Advanced placement test Alternate assessment/grade-level standards Attitudinal test Cognitive and perceptual skills test ...
:param bool adaptive_assessment: Indicates that the assessment is adaptive.
:param str assessment_family: The AssessmentFamily this Assessment is a member of.
:param str assessment_form: Identifies the form of the assessment, for example a regular versus makeup form, multiple choice versus constructed response, etc.
:param str assessment_title: The title or name of the Assessment.
:param int assessment_version: The version identifier for the assessment.
:param str id:
:param float max_raw_score: The maximum raw score achievable across all assessment items that are correct and scored at the maximum.
:param str nomenclature: Reflects the specific nomenclature used for Assessment.
:param date revision_date: The month, day, and year that the conceptual design for the assessment was most recently revised substantially.
:return: list[EdFiAssessment]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['offset', 'limit', 'min_change_version', 'max_change_version', 'total_count', 'assessment_identifier', 'namespace', 'education_organization_id', 'assessment_category_descriptor', 'adaptive_assessment', 'assessment_family', 'assessment_form', 'assessment_title', 'assessment_version', 'id', 'max_raw_score', 'nomenclature', 'revision_date'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_assessments" % key
)
params[key] = val
del params['kwargs']
if self.api_client.client_side_validation and ('limit' in params and params['limit'] > 500): # noqa: E501
raise ValueError("Invalid value for parameter `limit` when calling `get_assessments`, must be a value less than or equal to `500`") # noqa: E501
if self.api_client.client_side_validation and ('limit' in params and params['limit'] < 0): # noqa: E501
raise ValueError("Invalid value for parameter `limit` when calling `get_assessments`, must be a value greater than or equal to `0`") # noqa: E501
if self.api_client.client_side_validation and ('assessment_identifier' in params and
len(params['assessment_identifier']) > 60):
raise ValueError("Invalid value for parameter `assessment_identifier` when calling `get_assessments`, length must be less than or equal to `60`") # noqa: E501
if self.api_client.client_side_validation and ('namespace' in params and
len(params['namespace']) > 255):
raise ValueError("Invalid value for parameter `namespace` when calling `get_assessments`, length must be less than or equal to `255`") # noqa: E501
if self.api_client.client_side_validation and ('assessment_category_descriptor' in params and
len(params['assessment_category_descriptor']) > 306):
raise ValueError("Invalid value for parameter `assessment_category_descriptor` when calling `get_assessments`, length must be less than or equal to `306`") # noqa: E501
if self.api_client.client_side_validation and ('assessment_family' in params and
len(params['assessment_family']) > 60):
raise ValueError("Invalid value for parameter `assessment_family` when calling `get_assessments`, length must be less than or equal to `60`") # noqa: E501
if self.api_client.client_side_validation and ('assessment_form' in params and
len(params['assessment_form']) > 60):
raise ValueError("Invalid value for parameter `assessment_form` when calling `get_assessments`, length must be less than or equal to `60`") # noqa: E501
if self.api_client.client_side_validation and ('assessment_title' in params and
len(params['assessment_title']) > 100):
raise ValueError("Invalid value for parameter `assessment_title` when calling `get_assessments`, length must be less than or equal to `100`") # noqa: E501
if self.api_client.client_side_validation and ('nomenclature' in params and
len(params['nomenclature']) > 35):
raise ValueError("Invalid value for parameter `nomenclature` when calling `get_assessments`, length must be less than or equal to `35`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'limit' in params:
query_params.append(('limit', params['limit'])) # noqa: E501
if 'min_change_version' in params:
query_params.append(('minChangeVersion', params['min_change_version'])) # noqa: E501
if 'max_change_version' in params:
query_params.append(('maxChangeVersion', params['max_change_version'])) # noqa: E501
if 'total_count' in params:
query_params.append(('totalCount', params['total_count'])) # noqa: E501
if 'assessment_identifier' in params:
query_params.append(('assessmentIdentifier', params['assessment_identifier'])) # noqa: E501
if 'namespace' in params:
query_params.append(('namespace', params['namespace'])) # noqa: E501
if 'education_organization_id' in params:
query_params.append(('educationOrganizationId', params['education_organization_id'])) # noqa: E501
if 'assessment_category_descriptor' in params:
query_params.append(('assessmentCategoryDescriptor', params['assessment_category_descriptor'])) # noqa: E501
if 'adaptive_assessment' in params:
query_params.append(('adaptiveAssessment', params['adaptive_assessment'])) # noqa: E501
if 'assessment_family' in params:
query_params.append(('assessmentFamily', params['assessment_family'])) # noqa: E501
if 'assessment_form' in params:
query_params.append(('assessmentForm', params['assessment_form'])) # noqa: E501
if 'assessment_title' in params:
query_params.append(('assessmentTitle', params['assessment_title'])) # noqa: E501
if 'assessment_version' in params:
query_params.append(('assessmentVersion', params['assessment_version'])) # noqa: E501
if 'id' in params:
query_params.append(('id', params['id'])) # noqa: E501
if 'max_raw_score' in params:
query_params.append(('maxRawScore', params['max_raw_score'])) # noqa: E501
if 'nomenclature' in params:
query_params.append(('nomenclature', params['nomenclature'])) # noqa: E501
if 'revision_date' in params:
query_params.append(('revisionDate', params['revision_date'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2_client_credentials'] # noqa: E501
return self.api_client.call_api(
'/ed-fi/assessments', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[EdFiAssessment]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
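The `limit` checks above enforce the 0–500 page-size window on the client side before any request is sent. As a standalone sketch (the bounds mirror the generated checks; the function name is illustrative):

```python
def validate_limit(limit, lo=0, hi=500):
    """Raise ValueError when a page-size limit falls outside [lo, hi]."""
    if limit > hi:
        raise ValueError(
            "Invalid value for parameter `limit`, must be a value "
            "less than or equal to `%d`" % hi)
    if limit < lo:
        raise ValueError(
            "Invalid value for parameter `limit`, must be a value "
            "greater than or equal to `%d`" % lo)
    return limit
```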
def get_assessments_by_id(self, id, **kwargs): # noqa: E501
"""Retrieves a specific resource using the resource's identifier (using the \"Get By Id\" pattern). # noqa: E501
This GET operation retrieves a resource by the specified resource identifier. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_assessments_by_id(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: A resource identifier that uniquely identifies the resource. (required)
:param str if_none_match: The previously returned ETag header value, used here to prevent the unnecessary data transfer of an unchanged resource.
:return: EdFiAssessment
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_assessments_by_id_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.get_assessments_by_id_with_http_info(id, **kwargs) # noqa: E501
return data
def get_assessments_by_id_with_http_info(self, id, **kwargs): # noqa: E501
"""Retrieves a specific resource using the resource's identifier (using the \"Get By Id\" pattern). # noqa: E501
This GET operation retrieves a resource by the specified resource identifier. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_assessments_by_id_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: A resource identifier that uniquely identifies the resource. (required)
:param str if_none_match: The previously returned ETag header value, used here to prevent the unnecessary data transfer of an unchanged resource.
:return: EdFiAssessment
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'if_none_match'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_assessments_by_id" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in params or
params['id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `id` when calling `get_assessments_by_id`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
if 'if_none_match' in params:
header_params['If-None-Match'] = params['if_none_match'] # noqa: E501
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2_client_credentials'] # noqa: E501
return self.api_client.call_api(
'/ed-fi/assessments/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EdFiAssessment', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def post_assessment(self, assessment, **kwargs): # noqa: E501
"""Creates or updates resources based on the natural key values of the supplied resource. # noqa: E501
The POST operation can be used to create or update resources. In database terms, this is often referred to as an \"upsert\" operation (insert + update). Clients should NOT include the resource \"id\" in the JSON body because it will result in an error (you must use a PUT operation to update a resource by \"id\"). The web service will identify whether the resource already exists based on the natural key values provided, and update or create the resource appropriately. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.post_assessment(assessment, async_req=True)
>>> result = thread.get()
:param async_req bool
:param EdFiAssessment assessment: The JSON representation of the \"assessment\" resource to be created or updated. (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.post_assessment_with_http_info(assessment, **kwargs) # noqa: E501
else:
(data) = self.post_assessment_with_http_info(assessment, **kwargs) # noqa: E501
return data
def post_assessment_with_http_info(self, assessment, **kwargs): # noqa: E501
"""Creates or updates resources based on the natural key values of the supplied resource. # noqa: E501
The POST operation can be used to create or update resources. In database terms, this is often referred to as an \"upsert\" operation (insert + update). Clients should NOT include the resource \"id\" in the JSON body because it will result in an error (you must use a PUT operation to update a resource by \"id\"). The web service will identify whether the resource already exists based on the natural key values provided, and update or create the resource appropriately. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.post_assessment_with_http_info(assessment, async_req=True)
>>> result = thread.get()
:param async_req bool
:param EdFiAssessment assessment: The JSON representation of the \"assessment\" resource to be created or updated. (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['assessment'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method post_assessment" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'assessment' is set
if self.api_client.client_side_validation and ('assessment' not in params or
params['assessment'] is None): # noqa: E501
raise ValueError("Missing the required parameter `assessment` when calling `post_assessment`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'assessment' in params:
body_params = params['assessment']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2_client_credentials'] # noqa: E501
return self.api_client.call_api(
'/ed-fi/assessments', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def put_assessment(self, id, assessment, **kwargs): # noqa: E501
"""Updates or creates a resource based on the resource identifier. # noqa: E501
The PUT operation is used to update or create a resource by identifier. If the resource doesn't exist, the resource will be created using that identifier. Additionally, natural key values cannot be changed using this operation, and will not be modified in the database. If the resource \"id\" is provided in the JSON body, it will be ignored as well. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.put_assessment(id, assessment, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: A resource identifier that uniquely identifies the resource. (required)
:param EdFiAssessment assessment: The JSON representation of the \"assessment\" resource to be created or updated. (required)
:param str if_match: The ETag header value used to prevent the PUT from updating a resource modified by another consumer.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.put_assessment_with_http_info(id, assessment, **kwargs) # noqa: E501
else:
(data) = self.put_assessment_with_http_info(id, assessment, **kwargs) # noqa: E501
return data
def put_assessment_with_http_info(self, id, assessment, **kwargs): # noqa: E501
"""Updates or creates a resource based on the resource identifier. # noqa: E501
The PUT operation is used to update or create a resource by identifier. If the resource doesn't exist, the resource will be created using that identifier. Additionally, natural key values cannot be changed using this operation, and will not be modified in the database. If the resource \"id\" is provided in the JSON body, it will be ignored as well. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.put_assessment_with_http_info(id, assessment, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str id: A resource identifier that uniquely identifies the resource. (required)
:param EdFiAssessment assessment: The JSON representation of the \"assessment\" resource to be created or updated. (required)
:param str if_match: The ETag header value used to prevent the PUT from updating a resource modified by another consumer.
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'assessment', 'if_match'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method put_assessment" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if self.api_client.client_side_validation and ('id' not in params or
params['id'] is None): # noqa: E501
raise ValueError("Missing the required parameter `id` when calling `put_assessment`") # noqa: E501
# verify the required parameter 'assessment' is set
if self.api_client.client_side_validation and ('assessment' not in params or
params['assessment'] is None): # noqa: E501
raise ValueError("Missing the required parameter `assessment` when calling `put_assessment`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
if 'if_match' in params:
header_params['If-Match'] = params['if_match'] # noqa: E501
form_params = []
local_var_files = {}
body_params = None
if 'assessment' in params:
body_params = params['assessment']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2_client_credentials'] # noqa: E501
return self.api_client.call_api(
'/ed-fi/assessments/{id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
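Both generated methods above share the same keyword-argument whitelist pattern: unknown kwargs raise `TypeError` and a missing required parameter raises `ValueError` before any HTTP call is made. A minimal, self-contained sketch of that pattern (the name `post_resource` and the returned dict are illustrative, not part of the generated client):

```python
def post_resource(body, **kwargs):
    """Accept only whitelisted optional kwargs, mirroring the generated client."""
    all_params = ['body', 'async_req', '_return_http_data_only',
                  '_preload_content', '_request_timeout']
    for key in kwargs:
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'"
                " to method post_resource" % key
            )
    if body is None:
        raise ValueError(
            "Missing the required parameter `body` when calling `post_resource`")
    # The real client would dispatch to api_client.call_api(...) here.
    return {'body': body, 'async_req': kwargs.get('async_req', False)}
```

Calling `post_resource({'id': 1}, async_req=True)` succeeds, while any kwarg outside the whitelist fails fast with `TypeError` rather than being silently ignored.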
# File: mAiLab006.py (repo: brianchiang-tw/Python, license: MIT)
# For image output to bmp file
import os
# For image operation
from library.image_tool_box import *
# For math/statistic operation
from library.math_tool_box import StatMaker
import math
### Prelude: Download and unpack the 4 test data files from the MNIST database.
# train-images-idx3-ubyte.gz: training set images (9912422 bytes)
# train-labels-idx1-ubyte.gz: training set labels (28881 bytes)
# t10k-images-idx3-ubyte.gz: test set images (1648877 bytes)
# t10k-labels-idx1-ubyte.gz: test set labels (4542 bytes)
# MNIST database
# http://yann.lecun.com/exdb/mnist/
# This step is already completed; the files are saved in sub-directory "./data_of_mAiLab003"
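`get_MNIST_image_header` used below comes from the local `library.image_tool_box` module. As a self-contained sketch, a parser for the standard IDX3 header (16 big-endian bytes: magic number 2051, image count, rows, columns) could look like this; the function name `read_idx3_header` is an illustrative assumption:

```python
import io
import struct

def read_idx3_header(fh):
    """Parse the 16-byte IDX3 header: magic, image count, rows, cols (big-endian)."""
    magic, count, rows, cols = struct.unpack('>IIII', fh.read(16))
    if magic != 2051:  # 0x00000803 marks an idx3-ubyte image file
        return -1
    return count, rows, cols

# Synthetic header for 5 images of 28x28, a stand-in for train-images.idx3-ubyte
fake = io.BytesIO(struct.pack('>IIII', 2051, 5, 28, 28))
print(read_idx3_header(fake))  # (5, 28, 28)
```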
file_name_of_MNIST_image = 'train-images.idx3-ubyte'
file_name_of_MNIST_label = 'train-labels.idx1-ubyte'
data_directory_path = 'data_of_mAiLab003/'
ouput_directory_path = './output_of_mAiLab006/'
path_of_MNIST_image = data_directory_path + file_name_of_MNIST_image
path_of_MNIST_label = data_directory_path + file_name_of_MNIST_label
# Create output directory for edge image
if not os.path.isdir(ouput_directory_path):
os.mkdir(ouput_directory_path)
with open(path_of_MNIST_image, 'rb') as file_handle:
# Read header of MNIST image file
header_return = get_MNIST_image_header(file_handle)
if -1 == header_return:
# Handle End-of-File, or exception
pass
else:
(img_height, img_width) = header_return
image_container = []
max_pooling_img_container = []
avg_pooling_img_container = []
for index in range(5):
image_return = read_one_MNIST_image(file_handle, img_height, img_width)
if -2 == image_return:
# Handle exception
print("Error occurs in index {:0>2d}".format( index ) )
break
else:
serial_number = index + 1
image_matrix = image_return
# Push original source image into image container
image_container.append(image_matrix)
### 1. Conduct and output pooling image, with 'Max' as well as 'Average' operator, on first 5 input source images.
# max pooling, with step_size = 2
max_pooling_img = array_pooling(image_matrix, step_size=2, pooling_mode="Max")
max_pooling_img_container.append( max_pooling_img )
# print to console
print("\n Max pooling image of input_#{:2d}".format( serial_number ) )
print_image_array( max_pooling_img )
# output and saved as BitMap file
save_to_bmp(max_pooling_img, ouput_directory_path+"image_"+str(serial_number)+"_max_pooling")
# average pooling, with step_size = 2
avg_pooling_img = array_pooling(image_matrix, step_size=2, pooling_mode="Average")
avg_pooling_img_container.append( avg_pooling_img )
# print to console
print("\n Average pooling image of input_#{:2d}".format( serial_number ) )
print_image_array( avg_pooling_img )
# output and saved as BitMap file
save_to_bmp(avg_pooling_img, ouput_directory_path+"image_"+str(serial_number)+"_average_pooling")
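`array_pooling` above is provided by the local library. A minimal self-contained sketch of non-overlapping max/average pooling with `step_size=2` (the name `pool2d` and integer averaging are assumptions, not the library's implementation) behaves like this on a small array:

```python
def pool2d(img, step_size=2, pooling_mode="Max"):
    """Downsample a 2D list by non-overlapping step_size x step_size windows."""
    op = max if pooling_mode == "Max" else (lambda w: sum(w) // len(w))
    out = []
    for r in range(0, len(img) - step_size + 1, step_size):
        row = []
        for c in range(0, len(img[0]) - step_size + 1, step_size):
            window = [img[r + i][c + j]
                      for i in range(step_size) for j in range(step_size)]
            row.append(op(window))
        out.append(row)
    return out

src = [[0, 16, 32, 48],
       [64, 80, 96, 112],
       [128, 144, 160, 176],
       [192, 208, 224, 240]]
print(pool2d(src, 2, "Max"))      # [[80, 112], [208, 240]]
print(pool2d(src, 2, "Average"))  # [[40, 72], [168, 200]]
```

Each 2x2 window of the 4x4 source collapses to one pixel, which is why the 28x28 MNIST images above come out as 14x14 grids.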
'''
Example output:
### 1. Conduct and output pooling image, with 'Max' as well as 'Average' operator, on first 5 input source images.
Max pooling image of input_# 1
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 12 12 88 AF FF F7 00 00
00 00 00 31 FD FD FD FD FD E1 FD C3 00 00
00 00 00 12 FD FD FD C6 F7 00 00 00 00 00
00 00 00 00 0E 9A FD 02 00 00 00 00 00 00
00 00 00 00 00 0B FD E1 6C 00 00 00 00 00
00 00 00 00 00 00 51 FD FD 96 00 00 00 00
00 00 00 00 00 00 00 10 FC FD 40 00 00 00
00 00 00 00 00 00 94 FD FD FD 02 00 00 00
00 00 00 00 42 FD FD FD FD 4E 00 00 00 00
00 00 AC FD FD FD FD 50 00 00 00 00 00 00
00 00 FD FD D4 84 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
Image is saved into './output_of_mAiLab006/image_1_max_pooling.bmp'.
Average pooling image of input_# 1
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 05 09 41 32 69 5D 00 00
00 00 00 0C 8B BC E8 FD FC 8F 9E 4A 00 00
00 00 00 04 B1 D8 F1 61 AB 00 00 00 00 00
00 00 00 00 03 49 C4 00 00 00 00 00 00 00
00 00 00 00 00 02 B3 71 1B 00 00 00 00 00
00 00 00 00 00 00 14 B5 DB 32 00 00 00 00
00 00 00 00 00 00 00 04 94 EB 10 00 00 00
00 00 00 00 00 00 2E A4 EB DF 00 00 00 00
00 00 00 00 16 97 F5 EF 86 13 00 00 00 00
00 00 38 A7 F4 FA 94 16 00 00 00 00 00 00
00 00 61 7E 56 25 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
Image is saved into './output_of_mAiLab006/image_1_average_pooling.bmp'.
Max pooling image of input_# 2
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 EE FD FC 00 00 00 00
00 00 00 00 00 0A E0 FD FC FC FD 00 00 00
00 00 00 00 00 EE FD FD FD BD FF 00 00 00
00 00 00 00 A5 FD FC 4B 79 00 FD A5 00 00
00 00 00 39 FC F0 1C 00 00 00 FD C3 00 00
00 00 00 F6 FD 00 00 00 00 00 FF C4 00 00
00 00 00 FC E6 00 00 00 07 FC FD 0C 00 00
00 00 00 FD E1 00 00 72 FD FC 00 00 00 00
00 00 00 FC FC E5 FC FD DF 38 00 00 00 00
00 00 00 C7 FC FD FC 91 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
Image is saved into './output_of_mAiLab006/image_2_max_pooling.bmp'.
Average pooling image of input_# 2
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 54 E5 AE 00 00 00 00
00 00 00 00 00 02 54 F6 EC CD 6D 00 00 00
00 00 00 00 00 71 FC CA F6 66 D2 00 00 00
00 00 00 00 3E F4 B4 15 23 00 FA 35 00 00
00 00 00 10 E9 5D 0B 00 00 00 FC 61 00 00
00 00 00 82 C9 00 00 00 00 00 FD 56 00 00
00 00 00 A8 77 00 00 00 01 83 B7 03 00 00
00 00 00 A9 5C 00 00 1C B0 92 00 00 00 00
00 00 00 A8 E0 82 BF E7 82 0E 00 00 00 00
00 00 00 3F DD FC A5 24 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
Image is saved into './output_of_mAiLab006/image_2_average_pooling.bmp'.
Max pooling image of input_# 3
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 E8 27 00 00
00 00 A3 00 00 00 00 00 00 02 D2 28 00 00
00 00 DE 00 00 00 00 00 00 B7 FE 00 00 00
00 78 FE 00 00 00 00 00 00 E7 FE 00 00 00
00 9F FE 00 00 00 00 0E B2 FE D8 00 00 00
00 9F FE CF FD FE F0 F3 EA FC 28 00 00 00
00 00 B1 B1 B1 62 00 00 A9 FE 00 00 00 00
00 00 00 00 00 00 00 00 A9 FE 00 00 00 00
00 00 00 00 00 00 00 00 A9 FF 00 00 00 00
00 00 00 00 00 00 00 00 A9 FF 00 00 00 00
00 00 00 00 00 00 00 00 60 FE 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
Image is saved into './output_of_mAiLab006/image_3_max_pooling.bmp'.
Average pooling image of input_# 3
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 4A 09 00 00
00 00 6C 00 00 00 00 00 00 00 A5 13 00 00
00 00 C0 00 00 00 00 00 00 34 C6 00 00 00
00 29 CE 00 00 00 00 00 00 71 94 00 00 00
00 4F AD 00 00 00 00 03 42 E5 50 00 00 00
00 4D CF 67 7E A7 B7 B3 6F F4 0A 00 00 00
00 00 4A 58 58 26 00 00 43 D8 00 00 00 00
00 00 00 00 00 00 00 00 54 9B 00 00 00 00
00 00 00 00 00 00 00 00 54 AE 00 00 00 00
00 00 00 00 00 00 00 00 54 CB 00 00 00 00
00 00 00 00 00 00 00 00 18 65 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
Image is saved into './output_of_mAiLab006/image_3_average_pooling.bmp'.
Max pooling image of input_# 4
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 FD FF 00 00 00
00 00 00 00 00 00 00 00 7F FB FD 00 00 00
00 00 00 00 00 00 00 3C FB FB 1F 00 00 00
00 00 00 00 00 00 00 FD FD BD 00 00 00 00
00 00 00 00 00 00 68 FD FB 00 00 00 00 00
00 00 00 00 00 20 FD FD 17 00 00 00 00 00
00 00 00 00 00 DD FB FB 00 00 00 00 00 00
00 00 00 00 00 FD FB 0C 00 00 00 00 00 00
00 00 00 00 E4 FF FD 00 00 00 00 00 00 00
00 00 00 00 FB FD 00 00 00 00 00 00 00 00
00 00 00 00 C1 FD 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
Image is saved into './output_of_mAiLab006/image_4_max_pooling.bmp'.
Average pooling image of input_# 4
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 5E 4F 00 00 00
00 00 00 00 00 00 00 00 37 F9 9D 00 00 00
00 00 00 00 00 00 00 0F C3 C9 09 00 00 00
00 00 00 00 00 00 00 6B F8 3F 00 00 00 00
00 00 00 00 00 00 22 F0 90 00 00 00 00 00
00 00 00 00 00 08 CE D6 05 00 00 00 00 00
00 00 00 00 00 69 FB 73 00 00 00 00 00 00
00 00 00 00 00 F7 C4 03 00 00 00 00 00 00
00 00 00 00 6C FC 6C 00 00 00 00 00 00 00
00 00 00 00 9D EC 00 00 00 00 00 00 00 00
00 00 00 00 36 76 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
Image is saved into './output_of_mAiLab006/image_4_average_pooling.bmp'.
Max pooling image of input_# 5
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 94 FD FD 94 37 00 00 00
00 00 00 00 04 F2 FC FD FC FD A8 00 00 00
00 00 00 00 FD FC B7 00 FC FC 15 00 00 00
00 00 00 E8 FD B0 24 FC FD 81 00 00 00 00
00 00 00 FC FD FC FC FD FB 00 00 00 00 00
00 00 00 37 FD D9 3E FF 8F 00 00 00 00 00
00 00 00 00 00 00 47 FD 15 00 00 00 00 00
00 00 00 00 00 00 6A FD 15 00 00 00 00 00
00 00 00 00 00 00 2D FF 38 00 00 00 00 00
00 00 00 00 00 00 00 FC FC 0B 00 00 00 00
00 00 00 00 00 00 00 0E FC 2A 00 00 00 00
Image is saved into './output_of_mAiLab006/image_5_max_pooling.bmp'.
Average pooling image of input_# 5
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 32 73 5B 3A 0D 00 00 00
00 00 00 00 01 60 E7 80 A4 FC 47 00 00 00
00 00 00 00 78 E1 34 00 CA CA 05 00 00 00
00 00 00 5D E8 2E 09 86 E5 23 00 00 00 00
00 00 00 82 C4 A1 E5 F6 8B 00 00 00 00 00
00 00 00 0D 7A 58 20 EE 3F 00 00 00 00 00
00 00 00 00 00 00 11 FC 0A 00 00 00 00 00
00 00 00 00 00 00 2C FC 0A 00 00 00 00 00
00 00 00 00 00 00 0B F4 13 00 00 00 00 00
00 00 00 00 00 00 00 88 A3 02 00 00 00 00
00 00 00 00 00 00 00 03 63 0A 00 00 00 00
'''
# File: hubcheck/pageobjects/po_supportticketsearch.py (repo: codedsk/hubcheck, license: MIT)
] | null | null | null | from hubcheck.pageobjects.po_generic_page import GenericPage
from hubcheck.pageobjects.basepageelement import Link
class SupportTicketSearchPage2011(GenericPage):
"""support ticket search"""
def __init__(self,browser,catalog):
super(SupportTicketSearchPage2011,self).__init__(browser,catalog)
self.path = "/support/tickets"
# load hub's classes
SupportTicketSearchPage2011_Locators = self.load_class('SupportTicketSearchPage2011_Locators')
TicketListSearchForm = self.load_class('TicketListSearchForm')
# update this object's locator
self.locators.update(SupportTicketSearchPage2011_Locators.locators)
# setup page object's components
self.stats = Link(self,{'base':'stats'})
self.newticket = Link(self,{'base':'newticket'})
self.ticketlistsearch = TicketListSearchForm(self,{'base':'ticketlistsearch'})
def filter_by_keyword(self,value):
return self.ticketlistsearch.filter_by_keyword(value)
def filter_by_dropdown(self,value):
return self.ticketlistsearch.filter_by_dropdown(value)
def ticket_rows_displayed(self):
return self.ticketlistsearch.ticket_rows_displayed()
def ticket_row_by_index(self,index):
return self.ticketlistsearch.ticket_row_by_index(index)
def goto_page_number(self,pagenumber):
return self.ticketlistsearch.goto_page_number(pagenumber)
def goto_page_relative(self,relation):
return self.ticketlistsearch.goto_page_relative(relation)
def get_pagination_counts(self):
return self.ticketlistsearch.get_pagination_counts()
def display_limit(self,limit=None):
return self.ticketlistsearch.display_limit(limit)
class SupportTicketSearchPage2011_Locators_Base(object):
"""locators for SupportTicketSearchPage2011 object"""
locators = {
'ticketlistsearch' : "css=[name='adminForm']",
'stats' : "css=.stats",
'newticket' : "css=.new-ticket",
}
class SupportTicketSearchPage2011_Locators_Base_2(object):
"""locators for SupportTicketSearchPage2011 object"""
locators = {
'ticketlistsearch' : "css=#ticketForm",
'stats' : "css=.stats",
'newticket' : "css=.new-ticket",
}
class SupportTicketSearchPage2012(GenericPage):
"""support ticket search"""
def __init__(self,browser,catalog):
super(SupportTicketSearchPage2012,self).__init__(browser,catalog)
self.path = "/support/tickets"
# load hub's classes
SupportTicketSearchPage2012_Locators = self.load_class('SupportTicketSearchPage2012_Locators')
TicketListSearchForm = self.load_class('TicketListSearchForm')
# update this object's locator
self.locators.update(SupportTicketSearchPage2012_Locators.locators)
# setup page object's components
self.stats = Link(self,{'base':'stats'})
self.newticket = Link(self,{'base':'newticket'})
self.ticketlistsearch = TicketListSearchForm(self,{'base':'ticketlistsearch'})
def filter_by_keyword(self,value):
return self.ticketlistsearch.filter_by_keyword(value)
def ticket_rows_displayed(self):
return self.ticketlistsearch.ticket_rows_displayed()
def ticket_row_by_index(self,index):
return self.ticketlistsearch.ticket_row_by_index(index)
def goto_page_number(self,pagenumber):
return self.ticketlistsearch.goto_page_number(pagenumber)
def goto_page_relative(self,relation):
return self.ticketlistsearch.goto_page_relative(relation)
def get_pagination_counts(self):
return self.ticketlistsearch.get_pagination_counts()
def display_limit(self,limit=None):
return self.ticketlistsearch.display_limit(limit)
class SupportTicketSearchPage2012_Locators_Base(object):
"""locators for SupportTicketSearchPage2012 object"""
locators = {
'ticketlistsearch' : "css=#main form",
'stats' : "css=.stats",
'newticket' : "css=.add",
}
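Each page object above merges a base locator dict with hub-specific overrides via `dict.update`. A minimal sketch of that merge behavior (the names `BasePage` and `build_locators` are illustrative, not part of hubcheck):

```python
class BasePage(object):
    locators = {'stats': 'css=.stats', 'newticket': 'css=.new-ticket'}

def build_locators(base, overrides):
    """Copy the base locator dict and apply overrides; later entries win."""
    merged = dict(base)
    merged.update(overrides)
    return merged

locs = build_locators(
    BasePage.locators,
    {'newticket': 'css=.add', 'ticketlistsearch': 'css=#main form'})
print(locs['newticket'])  # css=.add
```

Copying with `dict(base)` before updating keeps the shared class-level dict untouched, so one hub's overrides never leak into another page object.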
# File: hello-world/src/app.py (repo: kenichi-ogawa-1988/docker-build-symlink, license: MIT)
from .common.print_with_time import print_with_time
if __name__ == '__main__':
print_with_time('Hello world!')
# File: tests/suggestions/test_suggest_match_video_controller.py (repo: tervay/the-blue-alliance, license: MIT)
import unittest2
import webapp2
import webtest
from datetime import datetime
from google.appengine.ext import ndb
from google.appengine.ext import testbed
from webapp2_extras.routes import RedirectRoute
from consts.district_type import DistrictType
from consts.event_type import EventType
from controllers.suggestions.suggest_match_video_controller import SuggestMatchVideoController, \
SuggestMatchVideoPlaylistController
from models.account import Account
from models.event import Event
from models.match import Match
from models.suggestion import Suggestion
class TestSuggestMatchVideoController(unittest2.TestCase):
def setUp(self):
self.testbed = testbed.Testbed()
self.testbed.activate()
self.testbed.init_datastore_v3_stub()
self.testbed.init_memcache_stub()
self.testbed.init_user_stub()
ndb.get_context().clear_cache() # Prevent data from leaking between tests
app = webapp2.WSGIApplication([
RedirectRoute(r'/suggest/match/video', SuggestMatchVideoController, 'suggest-match-video', strict_slash=True),
], debug=True)
self.testapp = webtest.TestApp(app)
self.event = Event(
id="2016necmp",
name="New England District Championship",
event_type_enum=EventType.DISTRICT_CMP,
event_district_enum=DistrictType.NEW_ENGLAND,
short_name="New England",
event_short="necmp",
year=2016,
end_date=datetime(2016, 3, 27),
official=False,
city='Hartford',
state_prov='CT',
country='USA',
venue="Some Venue",
venue_address="Some Venue, Hartford, CT, USA",
timezone_id="America/New_York",
start_date=datetime(2016, 3, 24),
webcast_json="",
website="http://www.firstsv.org"
)
self.event.put()
self.match = Match(
id="2016necmp_f1m1",
event=ndb.Key(Event, "2016necmp"),
year=2016,
comp_level="f",
set_number=1,
match_number=1,
team_key_names=['frc846', 'frc2135', 'frc971', 'frc254', 'frc1678', 'frc973'],
time=datetime.fromtimestamp(1409527874),
time_string="4:31 PM",
youtube_videos=["JbwUzl3W9ug"],
tba_videos=[],
alliances_json='{\
"blue": {\
"score": 270,\
"teams": [\
"frc846",\
"frc2135",\
"frc971"]},\
"red": {\
"score": 310,\
"teams": [\
"frc254",\
"frc1678",\
"frc973"]}}',
score_breakdown_json='{\
"blue": {\
"auto": 70,\
"teleop_goal+foul": 40,\
"assist": 120,\
"truss+catch": 40\
},"red": {\
"auto": 70,\
"teleop_goal+foul": 50,\
"assist": 150,\
"truss+catch": 40}}'
)
self.match.put()
def tearDown(self):
self.testbed.deactivate()
def loginUser(self):
self.testbed.setup_env(
user_email="user@example.com",
user_id="123",
user_is_admin='0',
overwrite=True
)
Account.get_or_insert(
"123",
email="user@example.com",
registered=True
)
def getSuggestionForm(self, match_key):
response = self.testapp.get('/suggest/match/video?match_key={}'.format(match_key))
self.assertEqual(response.status_int, 200)
form = response.forms.get('suggest_match_video', None)
self.assertIsNotNone(form)
return form
def test_login_redirect(self):
response = self.testapp.get('/suggest/match/video?match_key=2016necmp_f1m1', status='3*')
response = response.follow(expect_errors=True)
self.assertTrue(response.request.path.startswith("/account/login_required"))
def test_no_params(self):
self.loginUser()
response = self.testapp.get('/suggest/match/video', status='3*')
response = response.follow(expect_errors=True)
self.assertEqual(response.request.path, '/')
def test_submit_empty_form(self):
self.loginUser()
form = self.getSuggestionForm('2016necmp_f1m1')
response = form.submit().follow()
self.assertEqual(response.status_int, 200)
request = response.request
self.assertEqual(request.GET.get('status'), 'bad_url')
def test_submit_match_video(self):
self.loginUser()
form = self.getSuggestionForm('2016necmp_f1m1')
form['youtube_url'] = "http://youtu.be/bHGyTjxbLz8"
response = form.submit().follow()
self.assertEqual(response.status_int, 200)
# Make sure the Suggestion gets created
suggestion_id = "media_2016_match_2016necmp_f1m1_youtube_bHGyTjxbLz8"
suggestion = Suggestion.get_by_id(suggestion_id)
self.assertIsNotNone(suggestion)
self.assertEqual(suggestion.review_state, Suggestion.REVIEW_PENDING)
# Ensure we show a success message on the page
request = response.request
self.assertEqual(request.GET.get('status'), 'success')
class TestSuggestMatchVideoPlaylistController(unittest2.TestCase):
def setUp(self):
self.testbed = testbed.Testbed()
self.testbed.activate()
self.testbed.init_datastore_v3_stub()
self.testbed.init_memcache_stub()
self.testbed.init_user_stub()
ndb.get_context().clear_cache() # Prevent data from leaking between tests
app = webapp2.WSGIApplication([
RedirectRoute(r'/suggest/event/video', SuggestMatchVideoPlaylistController, 'suggest-event-video', strict_slash=True),
], debug=True)
self.testapp = webtest.TestApp(app)
self.event = Event(
id="2016necmp",
name="New England District Championship",
event_type_enum=EventType.DISTRICT_CMP,
event_district_enum=DistrictType.NEW_ENGLAND,
short_name="New England",
event_short="necmp",
year=2016,
            end_date=datetime(2016, 3, 27),
official=False,
city='Hartford',
state_prov='CT',
country='USA',
venue="Some Venue",
venue_address="Some Venue, Hartford, CT, USA",
timezone_id="America/New_York",
            start_date=datetime(2016, 3, 24),
webcast_json="",
website="http://www.firstsv.org"
)
self.event.put()
self.match = Match(
id="2016necmp_f1m1",
event=ndb.Key(Event, "2016necmp"),
year=2016,
comp_level="f",
set_number=1,
match_number=1,
team_key_names=['frc846', 'frc2135', 'frc971', 'frc254', 'frc1678', 'frc973'],
time=datetime.fromtimestamp(1409527874),
time_string="4:31 PM",
youtube_videos=["JbwUzl3W9ug"],
tba_videos=[],
alliances_json='{\
"blue": {\
"score": 270,\
"teams": [\
"frc846",\
"frc2135",\
"frc971"]},\
"red": {\
"score": 310,\
"teams": [\
"frc254",\
"frc1678",\
"frc973"]}}',
score_breakdown_json='{\
"blue": {\
"auto": 70,\
"teleop_goal+foul": 40,\
"assist": 120,\
"truss+catch": 40\
},"red": {\
"auto": 70,\
"teleop_goal+foul": 50,\
"assist": 150,\
"truss+catch": 40}}'
)
self.match.put()
def tearDown(self):
self.testbed.deactivate()
def loginUser(self):
self.testbed.setup_env(
user_email="user@example.com",
user_id="123",
user_is_admin='0',
overwrite=True
)
Account.get_or_insert(
"123",
email="user@example.com",
registered=True
)
def getSuggestionForm(self, event_key):
response = self.testapp.get('/suggest/event/video?event_key={}'.format(event_key))
self.assertEqual(response.status_int, 200)
form = response.forms.get('event_videos', None)
self.assertIsNotNone(form)
return form
def test_login_redirect(self):
response = self.testapp.get('/suggest/event/video?event_key=2016necmp', status='3*')
response = response.follow(expect_errors=True)
self.assertTrue(response.request.path.startswith("/account/login_required"))
def test_no_params(self):
self.loginUser()
response = self.testapp.get('/suggest/event/video', status='3*')
response = response.follow(expect_errors=True)
self.assertEqual(response.request.path, '/')
def test_bad_event(self):
self.loginUser()
response = self.testapp.get('/suggest/event/video?event_key=2016foo', expect_errors=True)
self.assertEqual(response.status_int, 404)
def test_submit_empty_form(self):
self.loginUser()
form = self.getSuggestionForm('2016necmp')
response = form.submit().follow()
self.assertEqual(response.status_int, 200)
request = response.request
self.assertEqual(request.GET.get('num_added'), '0')
def test_submit_one_video(self):
self.loginUser()
response = self.testapp.post('/suggest/event/video?event_key=2016necmp', {
'num_videos': 1,
'video_id_0': '37F5tbrFqJQ',
'match_partial_0': 'f1m1'
}).follow()
self.assertEqual(response.status_int, 200)
request = response.request
self.assertEqual(request.GET.get('num_added'), '1')
suggestion_id = "media_2016_match_2016necmp_f1m1_youtube_37F5tbrFqJQ"
suggestion = Suggestion.get_by_id(suggestion_id)
self.assertIsNotNone(suggestion)
from mock import patch, Mock, call, ANY
import unittest
import time
from pitmRelay import pitmRelay
class TestPitmRelay(unittest.TestCase):
def setUp(self):
self.subject = pitmRelay(rpi=False)
self.subject.groot.log = Mock()
self.subject.lcdDisplay = Mock()
self.subject.gpio = Mock()
self.subject.zoneTemp = 20
    def test_callback_zone_temp_thread_when_idle(self):
# Setup
self.subject._mode = 'idle'
        cm = {}
# Action
self.subject.callback_zone_temp_thread(cm)
# Assert
self.assertEqual(self.subject.groot.log.call_count, 0)
    def test_callback_zone_temp_thread_when_ferm_valid_result_HEATING(self):
# Setup
self.subject._mode = 'ferm'
self.subject.cfg.fermProbe = 'ferm-probe'
self.subject.zoneTarget = 20
self.subject.zoneUpTarget = 2
self.subject.zoneDownTarget = 200
self.subject.fridgeHeat = True
self.subject._gpioFermHeat = True
self.subject.fridgeCool = False
self.subject._gpioFermCool = False
cm = {
'currentResult' : {
self.subject.cfg.fermProbe : {
'valid' : True,
'temperature' : 19.0
}
}
}
# Action
self.subject.callback_zone_temp_thread(cm)
# Assert
self.assertEqual(self.subject.groot.log.call_count, 1)
self.subject.groot.log.assert_called_once_with('Temp: 19.0 Target: 20(>2 <200) fridgeHeat: True/True fridgeCool: False/False (delay True) ', importance=0)
    def test_callback_zone_temp_thread_when_ferm_valid_result_COOLING_AND_REACHED(self):
# Setup
self.subject._mode = 'ferm'
self.subject.cfg.fermProbe = 'ferm-probe'
self.subject.fridgeCompressorDelay = 0
self.subject.zoneTarget = 18
self.subject.zoneUpTarget = 17.3
self.subject.zoneDownTarget = 17.7
self.subject._lastValidReading['ferm'] = time.time()
# We want two readings to test crossing from cooling to no cooling
cm = {
'currentResult' : {
self.subject.cfg.fermProbe : {
'valid' : True,
'temperature' : 19.0
}
}
}
self.subject.callback_zone_temp_thread(cm)
cm = {
'currentResult' : {
self.subject.cfg.fermProbe : {
'valid' : True,
'temperature' : 18.11
}
}
}
self.subject.fridgeCool = True
self.subject.callback_zone_temp_thread(cm)
self.subject._turn_cooling_off = Mock()
self.subject._turn_cooling_on = Mock()
# Action
self.subject._zone_ferm()
# Assert
self.assertEqual(self.subject._turn_cooling_off.call_count, 1)
self.assertEqual(self.subject._turn_cooling_on.call_count, 0)
    def test_callback_zone_temp_thread_when_ferm_valid_result_COOLING(self):
# Setup
self.subject._mode = 'ferm'
self.subject.cfg.fermProbe = 'ferm-probe'
self.subject.fridgeCompressorDelay = 0
self.subject.zoneTarget = 18
self.subject.zoneUpTarget = 17.3
self.subject.zoneDownTarget = 17.7
self.subject.fridgeHeat = False
self.subject._gpioFermHeat = False
self.subject.fridgeCool = True
self.subject._gpioFermCool = True
cm = {
'currentResult' : {
self.subject.cfg.fermProbe : {
'valid' : True,
'temperature' : 19.0
}
}
}
# Action
self.subject.callback_zone_temp_thread(cm)
# Assert
self.assertEqual(self.subject.groot.log.call_count, 1)
self.subject.groot.log.assert_called_once_with('Temp: 19.0 Target: 18(>17.3 <17.7) fridgeHeat: False/False fridgeCool: True/True (delay False) ', importance=0)
    def test_callback_zone_temp_thread_when_ferm_valid_result_TEMP_ERROR(self):
# Setup
self.subject._mode = 'ferm'
self.subject.cfg.fermProbe = 'ferm-probe'
self.subject.fridgeCompressorDelay = 0
self.subject.zoneTarget = 18
self.subject.fridgeHeat = False
self.subject._gpioFermHeat = False
self.subject.fridgeCool = True
self.subject._gpioFermCool = True
cm = {
'currentResult' : {
self.subject.cfg.fermProbe : {
'valid' : False,
}
}
}
# Action
self.subject.callback_zone_temp_thread(cm)
# Assert
self.assertEqual(self.subject.groot.log.call_count, 0)
self.subject.lcdDisplay.sendMessage.assert_called_once_with('Temp Result Error', 2)
    def test_callback_zone_temp_thread_when_ferm_update_targets(self):
HEAT_AT_TEMP = 10.1
COOL_AT_TEMP = 29.9
TARGET_TEMP = 20.0
# Setup
self.subject._mode = 'idle'
cm = {
'tempTargetFerm' : (HEAT_AT_TEMP, COOL_AT_TEMP, TARGET_TEMP)
}
# Action
self.subject.callback_zone_temp_thread(cm)
# Assert
self.assertEqual(self.subject.zoneUpTarget, HEAT_AT_TEMP)
self.assertEqual(self.subject.zoneDownTarget, COOL_AT_TEMP)
self.assertEqual(self.subject.zoneTarget, TARGET_TEMP)
self.assertEqual(self.subject.groot.log.call_count, 0)
    def test_callback_zone_temp_thread_when_ferm_update_targets_INVALID(self):
HEAT_AT_TEMP = 4.5
COOL_AT_TEMP = 29.9
TARGET_TEMP = 20.0
# Setup
self.subject._mode = 'idle'
cm = {
'tempTargetFerm' : (HEAT_AT_TEMP, COOL_AT_TEMP, TARGET_TEMP)
}
# Action
self.subject.callback_zone_temp_thread(cm)
# Assert
self.subject.groot.log.assert_called_once_with('Temp Target is invalid %s,%s,%s' % (cm['tempTargetFerm']), importance=2)
    def test_callback_zone_temp_thread_when_ferm_update_targets_INVALID_2(self):
HEAT_AT_TEMP = 10.4
COOL_AT_TEMP = 4.4
TARGET_TEMP = 20.0
# Setup
self.subject._mode = 'idle'
cm = {
'tempTargetFerm' : (HEAT_AT_TEMP, COOL_AT_TEMP, TARGET_TEMP)
}
# Action
self.subject.callback_zone_temp_thread(cm)
# Assert
self.subject.groot.log.assert_called_once_with('Temp Target is invalid %s,%s,%s' % (cm['tempTargetFerm']), importance=2)
    def test_callback_zone_temp_thread_when_ferm_update_targets_INVALID_3(self):
HEAT_AT_TEMP = 10.4
COOL_AT_TEMP = 24.4
TARGET_TEMP = 4
# Setup
self.subject._mode = 'idle'
cm = {
'tempTargetFerm' : (HEAT_AT_TEMP, COOL_AT_TEMP, TARGET_TEMP)
}
# Action
self.subject.callback_zone_temp_thread(cm)
# Assert
self.subject.groot.log.assert_called_once_with('Temp Target is invalid %s,%s,%s' % (cm['tempTargetFerm']), importance=2)
def test_zoneThread_mode_idle(self):
# Setup
self.subject._mode = 'idle'
self.subject._zone_idle_shutdown = Mock()
# Action
self.subject._do_zone_thread()
# Assert
self.subject._zone_idle_shutdown.assert_called_once()
def test_zoneThread_mode_shutdown(self):
# Setup
self.subject._mode = 'idle'
self.subject._zone_idle_shutdown = Mock()
# Action
self.subject._do_zone_thread()
# Assert
self.subject._zone_idle_shutdown.assert_called_once()
def test_zone_idle_shutdown(self):
# Action
self.subject._zone_idle_shutdown()
# Assert
calls = [
call("fermCool", 0),
call("recircfan", 0),
call("extractor", 0),
call("fermHeat", 0)
]
self.subject.gpio.output.assert_has_calls(calls)
self.assertEqual(self.subject._gpioFermCool, False)
self.assertEqual(self.subject._gpioFermHeat, False)
self.assertEqual(self.subject._gpiorecircfan, False)
self.assertEqual(self.subject._gpioExtractor, False)
self.assertEqual(self.subject.fridgeHeat, False)
self.assertEqual(self.subject.fridgeCool, False)
def test_zoneThread_mode_boil(self):
# Setup
self.subject._mode = 'boil'
self.subject._zone_boil = Mock()
# Action
self.subject._do_zone_thread()
# Assert
self.subject._zone_boil.assert_called_once()
def test_zone_boil(self):
# Action
self.subject._zone_boil()
# Assert
calls = [
call("fermHeat", 0),
call("fermCool", 0),
call("extractor", 1)
]
self.subject.gpio.output.assert_has_calls(calls)
self.assertEqual(self.subject._gpioFermCool, False)
self.assertEqual(self.subject._gpioFermHeat, False)
self.assertEqual(self.subject._gpioExtractor, True)
self.assertEqual(self.subject.fridgeHeat, False)
self.assertEqual(self.subject.fridgeCool, False)
def test_zoneThread_mode_ferm_first_time_around(self):
# Setup
self.subject._mode = 'ferm'
self.subject._zone_ferm = Mock()
# Action
self.subject._do_zone_thread()
# Assert
self.subject._zone_ferm.assert_called_once()
self.assertEqual(self.subject._lastValidReading['ferm'] > -1, True)
    def test_ferm_safety_check_for_missing_readings(self):
# Setup
self.subject._lastValidReading['ferm'] = time.time() - 500
# Action
return_value = self.subject._safety_check_for_missing_readings()
# Assert
self.subject.groot.log.assert_called_with("Critical: no valid readings for 100 seconds")
self.subject.gpio.output.assert_has_calls([
call('fermHeat', 0),
call('fermCool', 0),
call('recircfan', 0)
])
self.assertEqual(self.subject._gpioFermCool, False)
self.assertEqual(self.subject._gpioFermHeat, False)
self.assertEqual(self.subject.fridgeCompressorDelay, 300)
self.subject.lcdDisplay.sendMessage.assert_called_once_with('CRITICAL Temp Result Error', 2)
self.assertEqual(return_value, False)
    def test_ferm_safety_check_for_missing_readings_when_everything_is_awesome(self):
# Setup
self.subject._lastValidReading['ferm'] = time.time() - 5
# Action
return_value = self.subject._safety_check_for_missing_readings()
# Assert
        self.assertEqual(self.subject.gpio.output.call_count, 0)
self.assertEqual(return_value, True)
def test_zone_ferm(self):
# Setup
self.subject._safety_check_for_missing_readings = Mock()
self.subject._lastValidReading['ferm'] = time.time() - 5
# Action
self.subject._zone_ferm()
# Assert
self.subject._safety_check_for_missing_readings.assert_called_once()
def test_zone_ferm_with_missing_readings(self):
# Setup
self.subject._lastValidReading['ferm'] = time.time() - 500
# Action
self.subject._zone_ferm()
# Assert
self.subject._safety_check_for_missing_readings()
#TODO:
@patch("os.path.exists")
def test_zone_ferm_no_ferm_control_flag_present(self, pathexistsMock):
# Setup
pathexistsMock.return_value = True
self.subject._disable_ferm_control = Mock()
self.subject._safety_check_for_missing_readings = Mock()
# Action
self.subject._zone_ferm()
# Assert
self.subject._disable_ferm_control.assert_called_once()
@patch("os.path.exists")
def test_zone_ferm_no_ferm_control_flag_NOT_present(self, pathexistsMock):
# Setup
pathexistsMock.return_value = False
self.subject._disable_ferm_control = Mock()
self.subject._safety_check_for_missing_readings = Mock()
# Action
self.subject._zone_ferm()
# Assert
self.subject._disable_ferm_control.assert_not_called()
    def test_zone_ferm_disable_ferm_control(self):
# Setup
self.subject._turn_cooling_off = Mock()
self.subject._turn_heating_off = Mock()
# Action
self.subject._disable_ferm_control()
# Assert
self.subject._turn_cooling_off.assert_called_once()
self.subject._turn_heating_off.assert_called_once()
def test_safety_check_for_unrealistic_readings_79(self):
# Setup
self.subject.zoneTemp = 79
# Action
return_value = self.subject._safety_check_for_unrealistic_readings()
# Assert
self.assertEqual(return_value, False)
def test_safety_check_for_unrealistic_readings_2(self):
# Setup
self.subject.zoneTemp = 2
# Action
return_value = self.subject._safety_check_for_unrealistic_readings()
# Assert
self.assertEqual(return_value, False)
def test_safety_check_for_unrealistic_readings_20(self):
# Action
return_value = self.subject._safety_check_for_unrealistic_readings()
# Assert
self.assertEqual(return_value, True)
def test_zone_ferm_with_unrealistic_values_79(self):
# Setup
self.subject.zoneTemp = 79
self.subject._lastValidReading['ferm'] = time.time() - 50
# Action
return_value = self.subject._zone_ferm()
# Assert
# TODO
def test_zone_ferm_with_unrealistic_values_3(self):
# Setup
self.subject.zoneTemp = 3
self.subject._lastValidReading['ferm'] = time.time() - 50
# Action
return_value = self.subject._zone_ferm()
# Assert
# TODO
@patch("os.path.exists")
def test_is_heating_required_flag_present(self, pathExistsMock):
# Setup
pathExistsMock.return_value = True
# Action
return_value = self.subject._is_heating_required()
# Assert
self.assertEqual(return_value, False)
@patch("os.path.exists")
def test_is_heating_required_flag_present_temp_high(self, pathExistsMock):
# Setup
pathExistsMock.return_value = True
self.subject.zoneTemp = 5
self.subject.fridgeHeat = False
# Action
return_value = self.subject._is_heating_required()
# Assert
self.assertEqual(return_value, False)
@patch("os.path.exists")
def test_is_heating_required_flag_NOT_present_temp_high(self, pathExistsMock):
# Setup
pathExistsMock.return_value = False
self.subject.zoneTemp = 30
self.subject.zoneUpTarget = 20
self.subject.fridgeHeat = False
# Action
return_value = self.subject._is_heating_required()
# Assert
self.assertEqual(return_value, False)
@patch("os.path.exists")
def test_is_heating_required_flag_NOT_present_temp_low(self, pathExistsMock):
# Setup
pathExistsMock.return_value = False
self.subject.zoneTemp = 5
self.subject.zoneUpTarget = 20
self.subject.fridgeHeat = False
# Action
return_value = self.subject._is_heating_required()
# Assert
self.assertEqual(return_value, True)
@patch("os.path.exists")
def test_is_cooling_required_flag_present(self, pathExistsMock):
# Setup
pathExistsMock.return_value = True
# Action
return_value = self.subject._is_cooling_required()
# Assert
self.assertEqual(return_value, False)
@patch("os.path.exists")
def test_is_cooling_required_flag_present_temp_high(self, pathExistsMock):
# Setup
pathExistsMock.return_value = True
self.subject.zoneTemp = 59
self.subject.zoneDownTarget = 40
self.subject.fridgeCool = False
# Action
return_value = self.subject._is_cooling_required()
# Assert
self.assertEqual(return_value, False)
@patch("os.path.exists")
def test_is_cooling_required_flag_NOT_present_temp_high(self, pathExistsMock):
# Setup
pathExistsMock.return_value = False
self.subject.zoneTemp = 59
self.subject.zoneDownTarget = 40
self.subject.fridgeCool = False
# Action
return_value = self.subject._is_cooling_required()
# Assert
self.assertEqual(return_value, True)
@patch("os.path.exists")
def test_is_cooling_required_flag_NOT_present_temp_low(self, pathExistsMock):
# Setup
pathExistsMock.return_value = False
self.subject.zoneTemp = 20
self.subject.zoneDownTarget = 40
self.subject.fridgeCool = False
# Action
return_value = self.subject._is_cooling_required()
# Assert
self.assertEqual(return_value, False)
def test_turn_heating_on(self):
# Action
self.subject._turn_heating_on()
# Assert
self.subject.lcdDisplay.sendMessage.assert_called_once_with(" Heating", 2)
self.subject.gpio.output.assert_called_once_with('fermHeat', 1)
        self.assertEqual(self.subject.fermHeatActiveFor > -1, True)
def test_turn_cooling_on(self):
# Action
self.subject._turn_cooling_on()
# Assert
self.subject.lcdDisplay.sendMessage.assert_called_once_with(" Cooling", 2)
self.subject.gpio.output.assert_called_once_with('fermCool', 1)
        self.assertEqual(self.subject.fermCoolActiveFor > -1, True)
def test_turn_cooling_off_fridge_has_not_run_recently(self):
        # Setup
        self.subject.meterFermC = 10
        self.subject.fermCoolActiveFor = time.time() - 500
        self.subject.fridgeCompressorDelay = 2
# Action
self.subject._turn_cooling_off()
# Assert
self.subject.gpio.output.assert_called_once_with('fermCool', 0)
self.assertEqual(self.subject.fridgeCompressorDelay, 300)
self.assertEqual(self.subject.meterFermC > 500, True)
self.subject.groot.log.assert_called_once()
self.assertEqual(self.subject.fermCoolActiveFor, -1)
def test_turn_cooling_off_fridge_been_running(self):
        # Setup
self.subject.meterFermC = 10
self.subject.fermCoolActiveFor = time.time() - 500
self.subject.fridgeCompressorDelay = -43
# Action
self.subject._turn_cooling_off()
# Assert
self.subject.gpio.output.assert_called_once_with('fermCool', 0)
self.assertEqual(self.subject.fridgeCompressorDelay, 300)
self.assertEqual(self.subject.meterFermC > 500, True)
self.subject.groot.log.assert_called_once()
self.assertEqual(self.subject.fermCoolActiveFor, -1)
def test_turn_heating_off(self):
        # Setup
self.subject.meterFermH = 10
self.subject.fermHeatActiveFor = time.time() - 500
# Action
self.subject._turn_heating_off()
# Assert
self.subject.gpio.output.assert_called_once_with('fermHeat', 0)
self.assertEqual(self.subject.meterFermH > 500, True)
self.subject.groot.log.assert_called_once()
self.assertEqual(self.subject.fermHeatActiveFor, -1)
def test_run_recirc_fan_off(self):
# Action
self.subject._turn_recirc_fan_off()
# Assert
self.subject.gpio.output.assert_called_once_with('recircfan', 0)
def test_run_recirc_fan_on(self):
# Action
self.subject._turn_recirc_fan_on()
# Assert
self.subject.gpio.output.assert_called_once_with('recircfan', 1)
    def test_safety_check_for_compressor(self):
# Setup
        self.subject.fridgeCompressorDelay = 400
self.subject._turn_cooling_off = Mock()
# Action
return_value = self.subject._safety_check_will_starting_the_fridge_damage_the_compressor()
# Assert
self.assertEqual(return_value, True)
self.subject.lcdDisplay.sendMessage.assert_called_once()
self.subject._turn_cooling_off.assert_called_once()
    def test_safety_check_for_compressor_all_ok(self):
# Setup
        self.subject.fridgeCompressorDelay = 0
# Action
return_value = self.subject._safety_check_will_starting_the_fridge_damage_the_compressor()
# Assert
self.assertEqual(return_value, False)
self.subject.lcdDisplay.sendMessage.assert_not_called()
self.subject.gpio.output.assert_not_called()
    def test_safety_check_has_fridge_been_running_too_long(self):
# Setup
self.subject._turn_cooling_off = Mock()
self.subject.fermCoolActiveFor = time.time() - 2500
# Action
return_value = self.subject._safety_check_has_fridge_been_running_too_long_if_so_turn_off()
# Assert
self.assertEqual(return_value, True)
self.subject._turn_cooling_off.assert_called_once()
self.assertEqual(self.subject.fridgeCompressorDelay, 601)
    def test_safety_check_has_fridge_been_running_too_long_hardly_run_at_all_yet(self):
# Setup
self.subject._turn_cooling_off = Mock()
self.subject.fermCoolActiveFor = time.time() - 2
# Action
return_value = self.subject._safety_check_has_fridge_been_running_too_long_if_so_turn_off()
# Assert
self.assertEqual(return_value, False)
self.subject._turn_cooling_off.assert_not_called()
self.assertNotEqual(self.subject.fridgeCompressorDelay, 601)
def test_zone_ferm_heating_required(self):
# Setup
self.subject.zoneTemp = 19
self.subject.zoneTarget = 20
self.subject.zoneUpTarget = 19.5
self.subject._safety_check_for_missing_readings = Mock()
self.subject._safety_check_for_unrealistic_readings = Mock()
self.subject._turn_cooling_off = Mock()
self.subject._turn_heating_on = Mock()
self.subject._turn_recirc_fan_on = Mock()
# Action
self.subject._zone_ferm()
# Assert
self.subject._safety_check_for_missing_readings.assert_called_once()
self.subject._safety_check_for_unrealistic_readings.assert_called_once()
self.subject._turn_cooling_off.assert_called_once()
self.subject._turn_heating_on.assert_called_once()
self.subject._turn_recirc_fan_on.assert_called_once()
def test_zone_ferm_cooling_required(self):
# Setup
self.subject.zoneTemp = 21
self.subject.zoneTarget = 20
self.subject.zoneDownTarget = 20.5
self.subject.fermCoolActiveFor = -1
self.subject._safety_check_for_missing_readings = Mock()
self.subject._safety_check_for_unrealistic_readings = Mock()
self.subject._turn_heating_off = Mock()
self.subject._turn_cooling_on = Mock()
self.subject._turn_recirc_fan_on = Mock()
self.subject._safety_check_will_starting_the_fridge_damage_the_compressor = Mock()
self.subject._safety_check_will_starting_the_fridge_damage_the_compressor.return_value = False
self.subject._safety_check_has_fridge_been_running_too_long_if_so_turn_off = Mock()
self.subject._safety_check_has_fridge_been_running_too_long_if_so_turn_off.return_value = False
# Action
self.subject._zone_ferm()
# Assert
self.subject._safety_check_for_missing_readings.assert_called_once()
self.subject._safety_check_for_unrealistic_readings.assert_called_once()
self.subject._turn_cooling_on.assert_called_once()
self.subject._turn_heating_off.assert_called_once()
self.subject._turn_recirc_fan_on.assert_called_once()
    def test_zone_ferm_cooling_required_compressor_overrun_kicks_in(self):
# Setup
self.subject.zoneTemp = 21
self.subject.zoneTarget = 20
self.subject.zoneDownTarget = 20.5
self.subject.fermCoolActiveFor = time.time() - 2000
self.subject._safety_check_for_missing_readings = Mock()
self.subject._safety_check_for_unrealistic_readings = Mock()
self.subject._turn_heating_off = Mock()
self.subject._turn_cooling_on = Mock()
self.subject._turn_cooling_off = Mock()
self.subject._turn_recirc_fan_off = Mock()
self.subject._safety_check_will_starting_the_fridge_damage_the_compressor = Mock()
self.subject._safety_check_will_starting_the_fridge_damage_the_compressor.return_value = False
self.subject._safety_check_has_fridge_been_running_too_long_if_so_turn_off = Mock()
self.subject._safety_check_has_fridge_been_running_too_long_if_so_turn_off.return_value = True
# Action
self.subject._zone_ferm()
# Assert
self.subject._safety_check_for_missing_readings.assert_called_once()
self.subject._safety_check_for_unrealistic_readings.assert_called_once()
self.subject._turn_heating_off.assert_called_once()
self.subject._turn_recirc_fan_off.assert_called_once()
def test_zone_ferm_cooling_required_compressor_protection_kicks_in(self):
# Setup
self.subject.zoneTemp = 21
self.subject.zoneTarget = 20
self.subject.zoneDownTarget = 20.5
self.subject.fermCoolActiveFor = -1
self.subject._safety_check_for_missing_readings = Mock()
self.subject._safety_check_for_unrealistic_readings = Mock()
self.subject._turn_heating_off = Mock()
self.subject._turn_cooling_on = Mock()
self.subject._turn_cooling_off = Mock()
self.subject._turn_recirc_fan_off = Mock()
self.subject._safety_check_will_starting_the_fridge_damage_the_compressor = Mock()
self.subject._safety_check_will_starting_the_fridge_damage_the_compressor.return_value = True
self.subject._safety_check_has_fridge_been_running_too_long_if_so_turn_off = Mock()
self.subject._safety_check_has_fridge_been_running_too_long_if_so_turn_off.return_value = False
# Action
self.subject._zone_ferm()
# Assert
self.subject._safety_check_for_missing_readings.assert_called_once()
self.subject._safety_check_for_unrealistic_readings.assert_called_once()
self.subject._turn_heating_off.assert_called_once()
self.subject._turn_recirc_fan_off.assert_called_once()
def test_zone_ferm_target_reached_when_heating(self):
self.subject.zoneTemp = 20.45210002
self.subject.fridgeHeat = True
self.subject.zoneTarget = 20
self.subject._safety_check_for_missing_readings = Mock()
self.subject._safety_check_for_unrealistic_readings = Mock()
self.subject._turn_heating_off = Mock()
self.subject._turn_cooling_off = Mock()
self.subject._turn_recirc_fan_off = Mock()
self.subject._safety_check_will_starting_the_fridge_damage_the_compressor = Mock()
self.subject._safety_check_has_fridge_been_running_too_long_if_so_turn_off = Mock()
# Action
self.subject._zone_ferm()
# Assert
self.assertEqual(self.subject._turn_cooling_off.call_count > 0, True)
self.assertEqual(self.subject._turn_heating_off.call_count > 0, True)
self.assertEqual(self.subject._turn_recirc_fan_off.call_count > 0, True)
def test_zone_ferm_target_reached_when_cooling(self):
self.subject.zoneTemp = 19.299
self.subject.fridgeCool = True
self.subject.zoneTarget = 20
self.subject._safety_check_for_missing_readings = Mock()
self.subject._safety_check_for_unrealistic_readings = Mock()
self.subject._turn_heating_off = Mock()
self.subject._turn_cooling_off = Mock()
self.subject._turn_recirc_fan_off = Mock()
self.subject._safety_check_will_starting_the_fridge_damage_the_compressor = Mock()
self.subject._safety_check_has_fridge_been_running_too_long_if_so_turn_off = Mock()
# Action
self.subject._zone_ferm()
# Assert
self.assertEqual(self.subject._turn_cooling_off.call_count > 0, True)
self.assertEqual(self.subject._turn_heating_off.call_count > 0, True)
self.assertEqual(self.subject._turn_recirc_fan_off.call_count > 0, True)
def test_callback_set_mode(self):
# Setup
self.subject._mode = 'that_other_mode'
cm = {'_mode' : 'thismode' }
# Action
self.subject.callback_set_mode(cm)
# Assert
self.assertEqual(self.subject._mode, 'thismode')
    def test_callback_set_mode_without_a_mode(self):
# Setup
self.subject._mode = 'that_other_mode'
cm = {}
# Action
self.subject.callback_set_mode(cm)
# Assert
self.assertEqual(self.subject._mode, 'that_other_mode')
if __name__ == '__main__':
unittest.main()
def handle_dict_data():
pass
def handle_str_to_bytes(data):
return bytes(data, 'utf-8')
def handle_bytes_to_str(data):
    return str(data, 'utf-8')
# Generated by Django 3.1.7 on 2021-03-09 22:07
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('quiz', '0006_auto_20210309_1843'),
]
operations = [
migrations.AlterField(
model_name='quiz_model',
name='Question_10',
field=models.TextField(blank=True, max_length=1024, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_10_Correct_Answer',
field=models.PositiveSmallIntegerField(blank=True, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_10_Option_1',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_10_Option_2',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_10_Option_3',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_10_Option_4',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_2',
field=models.TextField(blank=True, max_length=1024, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_2_Correct_Answer',
field=models.PositiveSmallIntegerField(blank=True, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_2_Option_1',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_2_Option_2',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_2_Option_3',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_2_Option_4',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_3',
field=models.TextField(blank=True, max_length=1024, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_3_Correct_Answer',
field=models.PositiveSmallIntegerField(blank=True, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_3_Option_1',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_3_Option_2',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_3_Option_3',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_3_Option_4',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_4',
field=models.TextField(blank=True, max_length=1024, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_4_Correct_Answer',
field=models.PositiveSmallIntegerField(blank=True, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_4_Option_1',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_4_Option_2',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_4_Option_3',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_4_Option_4',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_5',
field=models.TextField(blank=True, max_length=1024, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_5_Correct_Answer',
field=models.PositiveSmallIntegerField(blank=True, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_5_Option_1',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_5_Option_2',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_5_Option_3',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_5_Option_4',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_6',
field=models.TextField(blank=True, max_length=1024, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_6_Correct_Answer',
field=models.PositiveSmallIntegerField(blank=True, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_6_Option_1',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_6_Option_2',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_6_Option_3',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_6_Option_4',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_7',
field=models.TextField(blank=True, max_length=1024, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_7_Correct_Answer',
field=models.PositiveSmallIntegerField(blank=True, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_7_Option_1',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_7_Option_2',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_7_Option_3',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_7_Option_4',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_8',
field=models.TextField(blank=True, max_length=1024, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_8_Correct_Answer',
field=models.PositiveSmallIntegerField(blank=True, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_8_Option_1',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_8_Option_2',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_8_Option_3',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_8_Option_4',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_9',
field=models.TextField(blank=True, max_length=1024, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_9_Correct_Answer',
field=models.PositiveSmallIntegerField(blank=True, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_9_Option_1',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_9_Option_2',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_9_Option_3',
field=models.CharField(blank=True, max_length=2048, null=True),
),
migrations.AlterField(
model_name='quiz_model',
name='Question_9_Option_4',
field=models.CharField(blank=True, max_length=2048, null=True),
),
]
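The migration above repeats the same three field shapes (question text, correct answer, four options) for questions 2 through 10. A sketch of enumerating those field names programmatically — `field_specs` is a hypothetical helper, not part of the generated file:

```python
def field_specs(question_numbers):
    # Enumerate the (field name, field kind) pairs the migration alters
    # for each question; names mirror the quiz_model schema above.
    specs = []
    for n in question_numbers:
        specs.append(('Question_%d' % n, 'TextField'))
        specs.append(('Question_%d_Correct_Answer' % n, 'PositiveSmallIntegerField'))
        for opt in range(1, 5):
            specs.append(('Question_%d_Option_%d' % (n, opt), 'CharField'))
    return specs

# Questions 2..10 with 6 fields each: 54 AlterField operations,
# matching the 54 operations spelled out in the migration.
specs = field_specs(range(2, 11))
```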
| 37.78169 | 75 | 0.587418 | 1,118 | 10,730 | 5.36941 | 0.045617 | 0.161919 | 0.224888 | 0.26087 | 0.975512 | 0.975512 | 0.975512 | 0.975512 | 0.975512 | 0.96685 | 0 | 0.040939 | 0.301118 | 10,730 | 283 | 76 | 37.915194 | 0.759568 | 0.004194 | 0 | 0.779783 | 1 | 0 | 0.14715 | 0.023308 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.00361 | 0 | 0.01444 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
362c0733d90d3d177ee1245061c9f57a68da1ef0 | 22,554 | py | Python | CalculePA/Main.py | UnrealSaltyy/Math-Perimeter-Aire | b4e1932f2855671a2011084fd99db1e4660e46ec | [
"Apache-2.0"
] | null | null | null | CalculePA/Main.py | UnrealSaltyy/Math-Perimeter-Aire | b4e1932f2855671a2011084fd99db1e4660e46ec | [
"Apache-2.0"
] | null | null | null | CalculePA/Main.py | UnrealSaltyy/Math-Perimeter-Aire | b4e1932f2855671a2011084fd99db1e4660e46ec | [
"Apache-2.0"
] | null | null | null | import time
import os
import math
from datetime import datetime
version = "1.02"
now = datetime.now()
current_time = now.strftime("%H:%M:%S")
clear = lambda: os.system("cls")
pi = math.pi
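The area/perimeter formulas the menus below compute repeatedly can be factored into small helpers. An illustrative sketch (the script itself inlines these; function names follow the script's French terms):

```python
import math

def rectangle(longueur, largeur):
    # Area = length x width; perimeter = 2 x (length + width).
    return longueur * largeur, 2 * (longueur + largeur)

def carre(cote):
    # Area = side squared; perimeter = 4 x side.
    return cote * cote, 4 * cote

def cercle(rayon):
    # Area = pi * r^2; perimeter (circumference) = 2 * pi * r.
    return math.pi * rayon * rayon, 2 * math.pi * rayon
```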
def load():
print(pi)
print("Loading Assets")
time.sleep(.1)
print("Loading Application")
time.sleep(.1)
print("Loading Extras")
time.sleep(.1)
print("Loading Loops")
time.sleep(.1)
print("Loading Imports")
time.sleep(.1)
print("Loading Variables")
time.sleep(.1)
print("Loading Junk")
time.sleep(1)
print("Loading Responses")
time.sleep(.1)
print("Loading Functions")
time.sleep(.1)
print("Loading Tables")
time.sleep(.1)
print("Loading Modules")
time.sleep(1)
print("Loading Tips")
time.sleep(.1)
print("Loading Code")
time.sleep(.1)
print("Loading Python")
time.sleep(3)
clear()
print(" ")
print("Please Wait")
time.sleep(3)
#Loading Imports
path = os.path.join(os.getcwd(), "Loadup.txt")
print(path)
if os.path.exists(path):
print("Path found")
clear()
else:
print("Loading")
loadup = open('Loadup.txt', 'w')
loadup.write("Current: ")
loadup.write(current_time)
loadup.write(" Version: ")
loadup.write(version)
loadup.close()
load()
clear()
print("Loaded")
clear()
print("Select language:")
print("A: FR")
print("B: ENG (Buggy)")
Language = input(": ")
clear()
if Language == "A":
def start():
print("Veuillez choisir votre option:")
print("A: Calcule perimetre, aire d'un rectangle")
print("B: Calcule perimetre, aire d'un carre")
print("C: Calcule perimetre, aire d'un cercle")
als = input(": ")
if als == "A":
def start1():
longeur = float(input("Longueur: "))
largeur = float(input("Largeur: "))
aire = longeur * largeur
perimetre = (longeur + largeur) * 2
print(" ")
print("Calcul de l'aire d'un rectangle: ", longeur, "x", largeur)
print("Calcul de perimetre d'un rectangle: ", longeur, "+", largeur, "x 2")
print(" ")
print("L'aire: ", aire)
print("Perimetre: ", perimetre)
time.sleep(1)
print(" ")
print("A: Recommencer")
print("B: Retour")
option1 = input(": ")
if option1 == "A":
clear()
start1()
elif option1 == "B":
clear()
start()
elif option1 == "a":
clear()
start1()
elif option1 == "b":
clear()
start()
start1()
elif als == "B":
def start2():
cote = float(input("Cote: "))
print(" ")
print("Calcul de l'aire d'un carre: ", cote, "x", cote)
print("Calcul de perimetre d'un carre: ", "4", "x", cote)
print(" ")
aire = cote*cote
perimetre = 4*cote
print("L'aire: ", aire)
print("Perimetre: ", perimetre)
time.sleep(1)
print(" ")
print("A: Recommencer")
print("B: Retour")
option2 = input(": ")
if option2 == "A":
clear()
start2()
elif option2 == "B":
clear()
start()
elif option2 == "a":
clear()
start2()
elif option2 == "b":
clear()
start()
start2()
elif als == "C":
def start3():
rayon = float(input("Rayon: "))
diametre = float(input("Diametre: "))
print(" ")
print("Calcul de l'aire d'un cercle: ", "π", "x", rayon, "x", rayon)
print("Calcul de perimetre d'un cercle: ","π", "x", diametre)
print(" ")
aire = pi*(rayon*rayon)
perimetre = pi*diametre
print("L'aire: ", aire)
print("Perimetre: ", perimetre)
time.sleep(1)
print(" ")
print("A: Recommencer")
print("B: Retour")
option2 = input(": ")
if option2 == "A":
clear()
start3()
elif option2 == "B":
clear()
start()
elif option2 == "a":
clear()
start3()
elif option2 == "b":
clear()
start()
start3()
if als == "a":
def start1():
longeur = float(input("Longueur: "))
largeur = float(input("Largeur: "))
aire = longeur * largeur
perimetre = (longeur + largeur) * 2
print(" ")
print("Calcul de l'aire d'un rectangle: ", longeur, "x", largeur)
print("Calcul de perimetre d'un rectangle: ", longeur, "+", largeur, "x 2")
print(" ")
print("L'aire: ", aire)
print("Perimetre: ", perimetre)
time.sleep(1)
print(" ")
print("A: Recommencer")
print("B: Retour")
option1 = input(": ")
if option1 == "A":
clear()
start1()
elif option1 == "B":
clear()
start()
elif option1 == "a":
clear()
start1()
elif option1 == "b":
clear()
start()
start1()
elif als == "b":
def start2():
cote = float(input("Cote: "))
print(" ")
print("Calcul de l'aire d'un carre: ", cote, "x", cote)
print("Calcul de perimetre d'un carre: ", "4", "x", cote)
print(" ")
aire = cote*cote
perimetre = 4*cote
print("L'aire: ", aire)
print("Perimetre: ", perimetre)
time.sleep(1)
print(" ")
print("A: Recommencer")
print("B: Retour")
option2 = input(": ")
if option2 == "A":
clear()
start2()
elif option2 == "B":
clear()
start()
elif option2 == "a":
clear()
start2()
elif option2 == "b":
clear()
start()
start2()
elif als == "c":
def start3():
rayon = float(input("Rayon: "))
diametre = float(input("Diametre: "))
print(" ")
print("Calcul de l'aire d'un cercle: ", "π", "x", rayon, "x", rayon)
print("Calcul de perimetre d'un cercle: ","π", "x", diametre)
print(" ")
aire = pi*(rayon*rayon)
perimetre = pi*diametre
print("L'aire: ", aire)
print("Perimetre: ", perimetre)
time.sleep(1)
print(" ")
print("A: Recommencer")
print("B: Retour")
option2 = input(": ")
if option2 == "A":
clear()
start3()
elif option2 == "B":
clear()
start()
elif option2 == "a":
clear()
start3()
elif option2 == "b":
clear()
start()
start3()
start()
if Language == "a":
def start():
print("Veuillez choisir votre option:")
print("A: Calcule perimetre, aire d'un rectangle")
print("B: Calcule perimetre, aire d'un carre")
print("C: Calcule perimetre, aire d'un cercle")
als = input(": ")
if als == "A":
def start1():
longeur = float(input("Longueur: "))
largeur = float(input("Largeur: "))
aire = longeur * largeur
perimetre = (longeur + largeur) * 2
print(" ")
print("Calcul de l'aire d'un rectangle: ", longeur, "x", largeur)
print("Calcul de perimetre d'un rectangle: ", longeur, "+", largeur, "x 2")
print(" ")
print("L'aire: ", aire)
print("Perimetre: ", perimetre)
time.sleep(1)
print(" ")
print("A: Recommencer")
print("B: Retour")
option1 = input(": ")
if option1 == "A":
clear()
start1()
elif option1 == "B":
clear()
start()
elif option1 == "a":
clear()
start1()
elif option1 == "b":
clear()
start()
start1()
elif als == "B":
def start2():
cote = float(input("Cote: "))
print(" ")
print("Calcul de l'aire d'un carre: ", cote, "x", cote)
print("Calcul de perimetre d'un carre: ", "4", "x", cote)
print(" ")
aire = cote * cote
perimetre = 4 * cote
print("L'aire: ", aire)
print("Perimetre: ", perimetre)
time.sleep(1)
print(" ")
print("A: Recommencer")
print("B: Retour")
option2 = input(": ")
if option2 == "A":
clear()
start2()
elif option2 == "B":
clear()
start()
elif option2 == "a":
clear()
start2()
elif option2 == "b":
clear()
start()
start2()
elif als == "C":
def start3():
rayon = float(input("Rayon: "))
diametre = float(input("Diametre: "))
print(" ")
print("Calcul de l'aire d'un cercle: ", "π", "x", rayon, "x", rayon)
print("Calcul de perimetre d'un cercle: ", "π", "x", diametre)
print(" ")
aire = pi * (rayon * rayon)
perimetre = pi * diametre
print("L'aire: ", aire)
print("Perimetre: ", perimetre)
time.sleep(1)
print(" ")
print("A: Recommencer")
print("B: Retour")
option2 = input(": ")
if option2 == "A":
clear()
start3()
elif option2 == "B":
clear()
start()
elif option2 == "a":
clear()
start3()
elif option2 == "b":
clear()
start()
start3()
if als == "a":
def start1():
longeur = float(input("Longueur: "))
largeur = float(input("Largeur: "))
aire = longeur * largeur
perimetre = (longeur + largeur) * 2
print(" ")
print("Calcul de l'aire d'un rectangle: ", longeur, "x", largeur)
print("Calcul de perimetre d'un rectangle: ", longeur, "+", largeur, "x 2")
print(" ")
print("L'aire: ", aire)
print("Perimetre: ", perimetre)
time.sleep(1)
print(" ")
print("A: Recommencer")
print("B: Retour")
option1 = input(": ")
if option1 == "A":
clear()
start1()
elif option1 == "B":
clear()
start()
elif option1 == "a":
clear()
start1()
elif option1 == "b":
clear()
start()
start1()
elif als == "b":
def start2():
cote = float(input("Cote: "))
print(" ")
print("Calcul de l'aire d'un carre: ", cote, "x", cote)
print("Calcul de perimetre d'un carre: ", "4", "x", cote)
print(" ")
aire = cote * cote
perimetre = 4 * cote
print("L'aire: ", aire)
print("Perimetre: ", perimetre)
time.sleep(1)
print(" ")
print("A: Recommencer")
print("B: Retour")
option2 = input(": ")
if option2 == "A":
clear()
start2()
elif option2 == "B":
clear()
start()
elif option2 == "a":
clear()
start2()
elif option2 == "b":
clear()
start()
start2()
elif als == "c":
def start3():
rayon = float(input("Rayon: "))
diametre = float(input("Diametre: "))
print(" ")
print("Calcul de l'aire d'un cercle: ", "π", "x", rayon, "x", rayon)
print("Calcul de perimetre d'un cercle: ", "π", "x", diametre)
print(" ")
aire = pi * (rayon * rayon)
perimetre = pi * diametre
print("L'aire: ", aire)
print("Perimetre: ", perimetre)
time.sleep(1)
print(" ")
print("A: Recommencer")
print("B: Retour")
option2 = input(": ")
if option2 == "A":
clear()
start3()
elif option2 == "B":
clear()
start()
elif option2 == "a":
clear()
start3()
elif option2 == "b":
clear()
start()
start3()
start()
if Language == "B":
def start():
print("Please select an option:")
print("A: Calculate the perimeter and area of a rectangle")
print("B: Calculate the perimeter and area of a square")
print("C: Calculate the perimeter and area of a circle")
als = input(": ")
if als == "a":
def start1():
longeur = float(input("Length: "))
largeur = float(input("Width: "))
aire = longeur * largeur
perimetre = (longeur + largeur) * 2
print(" ")
print("Calculation of the area of a rectangle: ", longeur, "x", largeur)
print("Calculation of the perimeter of a rectangle: (", longeur, "+", largeur, ") x 2")
print(" ")
print("The area: ", aire)
print("The perimeter: ", perimetre)
time.sleep(1)
print(" ")
print("A: Restart")
print("B: Back")
option1 = input(": ")
if option1 == "A":
clear()
start1()
elif option1 == "B":
clear()
start()
elif option1 == "a":
clear()
start1()
elif option1 == "b":
clear()
start()
start1()
elif als == "b":
def start2():
cote = float(input("Side: "))
print(" ")
print("Calculation of the area of a square: ", cote, "x", cote)
print("Calculation of the perimeter of a square: ", "4", "x", cote)
print(" ")
aire = cote * cote
perimetre = 4 * cote
print("The area: ", aire)
print("The perimeter: ", perimetre)
time.sleep(1)
print(" ")
print("A: Restart")
print("B: Back")
option2 = input(": ")
if option2 == "A":
clear()
start2()
elif option2 == "B":
clear()
start()
elif option2 == "a":
clear()
start2()
elif option2 == "b":
clear()
start()
start2()
elif als == "c":
def start3():
rayon = float(input("Radius: "))
diametre = float(input("Diameter: "))
print(" ")
print("Calculation of the area of a circle: ", "π", "x", rayon, "x", rayon)
print("Calculation of the perimeter of a circle: ", "π", "x", diametre)
print(" ")
aire = pi * (rayon * rayon)
perimetre = pi * diametre
print("The area: ", aire)
print("The perimeter: ", perimetre)
time.sleep(1)
print(" ")
print("A: Restart")
print("B: Back")
option2 = input(": ")
if option2 == "A":
clear()
start3()
elif option2 == "B":
clear()
start()
elif option2 == "a":
clear()
start3()
elif option2 == "b":
clear()
start()
start3()
elif als == "A":
def start1():
longeur = float(input("Length: "))
largeur = float(input("Width: "))
aire = longeur * largeur
perimetre = (longeur + largeur) * 2
print(" ")
print("Calculation of the area of a rectangle: ", longeur, "x", largeur)
print("Calculation of the perimeter of a rectangle: (", longeur, "+", largeur, ") x 2")
print(" ")
print("The area: ", aire)
print("The perimeter: ", perimetre)
time.sleep(1)
print(" ")
print("A: Restart")
print("B: Back")
option1 = input(": ")
if option1 == "A":
clear()
start1()
elif option1 == "B":
clear()
start()
elif option1 == "a":
clear()
start1()
elif option1 == "b":
clear()
start()
start1()
elif als == "B":
def start2():
cote = float(input("Side: "))
print(" ")
print("Calculation of the area of a square: ", cote, "x", cote)
print("Calculation of the perimeter of a square: ", "4", "x", cote)
print(" ")
aire = cote * cote
perimetre = 4 * cote
print("The area: ", aire)
print("The perimeter: ", perimetre)
time.sleep(1)
print(" ")
print("A: Restart")
print("B: Back")
option2 = input(": ")
if option2 == "A":
clear()
start2()
elif option2 == "B":
clear()
start()
elif option2 == "a":
clear()
start2()
elif option2 == "b":
clear()
start()
start2()
elif als == "C":
def start3():
rayon = float(input("Radius: "))
diametre = float(input("Diameter: "))
print(" ")
print("Calculation of the area of a circle: ", "π", "x", rayon, "x", rayon)
print("Calculation of the perimeter of a circle: ", "π", "x", diametre)
print(" ")
aire = pi * (rayon * rayon)
perimetre = pi * diametre
print("The area: ", aire)
print("The perimeter: ", perimetre)
time.sleep(1)
print(" ")
print("A: Restart")
print("B: Back")
option2 = input(": ")
if option2 == "A":
clear()
start3()
elif option2 == "B":
clear()
start()
elif option2 == "a":
clear()
start3()
elif option2 == "b":
clear()
start()
start3()
start()
| 33.864865 | 98 | 0.35701 | 1,784 | 22,554 | 4.512332 | 0.064462 | 0.053416 | 0.049193 | 0.057764 | 0.918509 | 0.882981 | 0.872919 | 0.872919 | 0.872919 | 0.872919 | 0 | 0.020433 | 0.518267 | 22,554 | 665 | 99 | 33.915789 | 0.720479 | 0.000665 | 0 | 0.916533 | 0 | 0 | 0.149161 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.035313 | false | 0 | 0.008026 | 0 | 0.043339 | 0.317817 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
363bc98f6d15fe012efce230d3b1b69e2bf1b859 | 17,803 | py | Python | sci_analysis/test/test_grouplinregress.py | cmmorrow/sci-analysis | de65ba29fe210eb950daa3dbc2e956963a4770ef | [
"MIT"
] | 17 | 2017-05-10T18:25:36.000Z | 2021-12-23T14:43:49.000Z | sci_analysis/test/test_grouplinregress.py | cmmorrow/sci-analysis | de65ba29fe210eb950daa3dbc2e956963a4770ef | [
"MIT"
] | 57 | 2016-08-22T23:58:05.000Z | 2019-07-31T06:54:22.000Z | sci_analysis/test/test_grouplinregress.py | cmmorrow/sci-analysis | de65ba29fe210eb950daa3dbc2e956963a4770ef | [
"MIT"
] | null | null | null | import unittest
import numpy as np
import pandas as pd
import scipy.stats as st
from ..analysis import GroupLinearRegression
from ..analysis.exc import MinimumSizeError, NoDataError
from ..data import UnequalVectorLengthError, Vector
class MyTestCase(unittest.TestCase):
def test_linregress_four_groups(self):
np.random.seed(987654321)
input_1 = st.norm.rvs(size=100), st.norm.rvs(size=100)
input_2 = st.norm.rvs(size=100), st.norm.rvs(size=100)
input_3 = st.norm.rvs(size=100), st.norm.rvs(size=100)
input_4_x = st.norm.rvs(size=100)
input_4_y = [x + st.norm.rvs(0, 0.5, size=1)[0] for x in input_4_x]
input_4 = input_4_x, input_4_y
cs_x = np.concatenate((input_1[0], input_2[0], input_3[0], input_4[0]))
cs_y = np.concatenate((input_1[1], input_2[1], input_3[1], input_4[1]))
grp = [1] * 100 + [2] * 100 + [3] * 100 + [4] * 100
input_array = pd.DataFrame({'a': cs_x, 'b': cs_y, 'c': grp})
output = """
Linear Regression
-----------------
n Slope Intercept r^2 Std Err p value Group
--------------------------------------------------------------------------------------------------
100 -0.0056 0.0478 0.0000 0.1030 0.9567 1
100 0.0570 -0.1671 0.0037 0.0950 0.5497 2
100 -0.2521 0.1637 0.0506 0.1103 0.0244 3
100 0.9635 0.1043 0.8181 0.0459 0.0000 4 """
exp = GroupLinearRegression(input_array['a'], input_array['b'], groups=input_array['c'], display=False)
self.assertTupleEqual(exp.counts, ('100', '100', '100', '100'))
self.assertAlmostEqual(exp.slope[0], -0.005613130406764816)
self.assertAlmostEqual(exp.slope[1], 0.0570354136308546)
self.assertAlmostEqual(exp.slope[2], -0.2521496921022714)
self.assertAlmostEqual(exp.slope[3], 0.9634599098599703)
self.assertAlmostEqual(exp.intercept[0], 0.04775111565537506)
self.assertAlmostEqual(exp.intercept[1], -0.1670688836199169)
self.assertAlmostEqual(exp.intercept[2], 0.1637132078993005)
self.assertAlmostEqual(exp.intercept[3], 0.10434448563066669)
self.assertAlmostEqual(exp.r_squared[0], 3.030239852495909e-05)
self.assertAlmostEqual(exp.r_squared[1], 0.00366271257512563)
self.assertAlmostEqual(exp.r_squared[2], 0.05062765121282169)
self.assertAlmostEqual(exp.r_squared[3], 0.8180520671815105)
self.assertAlmostEqual(exp.statistic[0], 3.030239852495909e-05)
self.assertAlmostEqual(exp.statistic[1], 0.00366271257512563)
self.assertAlmostEqual(exp.statistic[2], 0.05062765121282169)
self.assertAlmostEqual(exp.statistic[3], 0.8180520671815105)
self.assertAlmostEqual(exp.r_value[0], -0.005504761441239674)
self.assertAlmostEqual(exp.r_value[1], 0.06052034843856759)
self.assertAlmostEqual(exp.r_value[2], -0.2250058915069152)
self.assertAlmostEqual(exp.r_value[3], 0.9044623083255103)
self.assertAlmostEqual(exp.std_err[0], 0.1030023210648352)
self.assertAlmostEqual(exp.std_err[1], 0.09502400478678666)
self.assertAlmostEqual(exp.std_err[2], 0.11029855015697929)
self.assertAlmostEqual(exp.std_err[3], 0.04589905033402483)
self.assertAlmostEqual(exp.p_value[0], 0.956651586890106)
self.assertAlmostEqual(exp.p_value[1], 0.5497443545114141)
self.assertAlmostEqual(exp.p_value[2], 0.024403659194742487)
self.assertAlmostEqual(exp.p_value[3], 4.844813765580163e-38)
self.assertEqual(str(exp), output)
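The per-group slope/intercept rows asserted above amount to running an independent least-squares line fit on each group's (x, y) pairs. A minimal numpy-only sketch of that grouping logic (independent of sci_analysis; `groupwise_fit` and the sample data are illustrative):

```python
import numpy as np

def groupwise_fit(x, y, groups):
    # Fit an independent least-squares line per group label, mirroring
    # the per-group slope/intercept that GroupLinearRegression reports.
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    groups = np.asarray(groups)
    out = {}
    for g in np.unique(groups):
        mask = groups == g
        slope, intercept = np.polyfit(x[mask], y[mask], 1)
        out[g] = (slope, intercept)
    return out

# Example: two groups, each with an exact linear relationship.
x = np.array([0.0, 1.0, 2.0, 0.0, 1.0, 2.0])
y = np.array([1.0, 3.0, 5.0, 0.0, -1.0, -2.0])
res = groupwise_fit(x, y, [1, 1, 1, 2, 2, 2])
```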
def test_linregress_four_groups_string(self):
np.random.seed(987654321)
input_1 = st.norm.rvs(size=100), st.norm.rvs(size=100)
input_2 = st.norm.rvs(size=100), st.norm.rvs(size=100)
input_3 = st.norm.rvs(size=100), st.norm.rvs(size=100)
input_4_x = st.norm.rvs(size=100)
input_4_y = [x + st.norm.rvs(0, 0.5, size=1)[0] for x in input_4_x]
input_4 = input_4_x, input_4_y
cs_x = np.concatenate((input_1[0], input_2[0], input_3[0], input_4[0]))
cs_y = np.concatenate((input_1[1], input_2[1], input_3[1], input_4[1]))
grp = ['a'] * 100 + ['b'] * 100 + ['c'] * 100 + ['d'] * 100
input_array = pd.DataFrame({'a': cs_x, 'b': cs_y, 'c': grp})
output = """
Linear Regression
-----------------
n Slope Intercept r^2 Std Err p value Group
--------------------------------------------------------------------------------------------------
100 -0.0056 0.0478 0.0000 0.1030 0.9567 a
100 0.0570 -0.1671 0.0037 0.0950 0.5497 b
100 -0.2521 0.1637 0.0506 0.1103 0.0244 c
100 0.9635 0.1043 0.8181 0.0459 0.0000 d """
exp = GroupLinearRegression(input_array['a'], input_array['b'], groups=input_array['c'], display=False)
self.assertTupleEqual(exp.counts, ('100', '100', '100', '100'))
self.assertAlmostEqual(exp.slope[0], -0.005613130406764816)
self.assertAlmostEqual(exp.slope[1], 0.0570354136308546)
self.assertAlmostEqual(exp.slope[2], -0.2521496921022714)
self.assertAlmostEqual(exp.slope[3], 0.9634599098599703)
self.assertAlmostEqual(exp.intercept[0], 0.04775111565537506)
self.assertAlmostEqual(exp.intercept[1], -0.1670688836199169)
self.assertAlmostEqual(exp.intercept[2], 0.1637132078993005)
self.assertAlmostEqual(exp.intercept[3], 0.10434448563066669)
self.assertAlmostEqual(exp.r_squared[0], 3.030239852495909e-05)
self.assertAlmostEqual(exp.r_squared[1], 0.00366271257512563)
self.assertAlmostEqual(exp.r_squared[2], 0.05062765121282169)
self.assertAlmostEqual(exp.r_squared[3], 0.8180520671815105)
self.assertAlmostEqual(exp.statistic[0], 3.030239852495909e-05)
self.assertAlmostEqual(exp.statistic[1], 0.00366271257512563)
self.assertAlmostEqual(exp.statistic[2], 0.05062765121282169)
self.assertAlmostEqual(exp.statistic[3], 0.8180520671815105)
self.assertAlmostEqual(exp.r_value[0], -0.005504761441239674)
self.assertAlmostEqual(exp.r_value[1], 0.06052034843856759)
self.assertAlmostEqual(exp.r_value[2], -0.2250058915069152)
self.assertAlmostEqual(exp.r_value[3], 0.9044623083255103)
self.assertAlmostEqual(exp.std_err[0], 0.1030023210648352)
self.assertAlmostEqual(exp.std_err[1], 0.09502400478678666)
self.assertAlmostEqual(exp.std_err[2], 0.11029855015697929)
self.assertAlmostEqual(exp.std_err[3], 0.04589905033402483)
self.assertAlmostEqual(exp.p_value[0], 0.956651586890106)
self.assertAlmostEqual(exp.p_value[1], 0.5497443545114141)
self.assertAlmostEqual(exp.p_value[2], 0.024403659194742487)
self.assertAlmostEqual(exp.p_value[3], 4.844813765580163e-38)
self.assertEqual(str(exp), output)
def test_no_data(self):
"""Test the case where there's no data."""
self.assertRaises(NoDataError, lambda: GroupLinearRegression([], []))
def test_at_minimum_size(self):
"""Test to make sure the case where the length of data is just above the minimum size."""
np.random.seed(987654321)
input_1 = st.norm.rvs(size=2), st.norm.rvs(size=2)
input_2 = st.norm.rvs(size=2), st.norm.rvs(size=2)
input_3 = st.norm.rvs(size=2), st.norm.rvs(size=2)
input_4_x = st.norm.rvs(size=2)
input_4_y = [x + st.norm.rvs(0, 0.5, size=1)[0] for x in input_4_x]
input_4 = input_4_x, input_4_y
cs_x = np.concatenate((input_1[0], input_2[0], input_3[0], input_4[0]))
cs_y = np.concatenate((input_1[1], input_2[1], input_3[1], input_4[1]))
grp = [1] * 2 + [2] * 2 + [3] * 2 + [4] * 2
input_array = pd.DataFrame({'a': cs_x, 'b': cs_y, 'c': grp})
output = """
Linear Regression
-----------------
n Slope Intercept r^2 Std Err p value Group
--------------------------------------------------------------------------------------------------
2 -1.0763 1.2343 1.0000 0.0000 0.0000 1
2 2.0268 0.6799 1.0000 0.0000 0.0000 2
2 1.8891 -2.4800 1.0000 0.0000 0.0000 3
2 0.1931 -0.2963 1.0000 0.0000 0.0000 4 """
exp = GroupLinearRegression(input_array['a'], input_array['b'], groups=input_array['c'], display=False)
self.assertEqual(str(exp), output)
def test_all_below_minimum_size(self):
"""Test the case where all the supplied data is less than the minimum size."""
np.random.seed(987654321)
input_1 = st.norm.rvs(size=1), st.norm.rvs(size=1)
input_2 = st.norm.rvs(size=1), st.norm.rvs(size=1)
input_3 = st.norm.rvs(size=1), st.norm.rvs(size=1)
input_4 = st.norm.rvs(size=1), st.norm.rvs(size=1)
cs_x = np.concatenate((input_1[0], input_2[0], input_3[0], input_4[0]))
cs_y = np.concatenate((input_1[1], input_2[1], input_3[1], input_4[1]))
grp = [1, 2, 3, 4]
input_array = pd.DataFrame({'a': cs_x, 'b': cs_y, 'c': grp})
self.assertRaises(
NoDataError,
lambda: GroupLinearRegression(input_array['a'], input_array['b'], groups=input_array['c'])
)
def test_below_minimum_size(self):
"""Test the case where a group is less than the minimum size."""
np.random.seed(987654321)
input_1 = st.norm.rvs(size=10), st.norm.rvs(size=10)
input_2 = st.norm.rvs(size=10), st.norm.rvs(size=10)
input_3 = st.norm.rvs(size=1), st.norm.rvs(size=1)
input_4 = st.norm.rvs(size=10), st.norm.rvs(size=10)
cs_x = np.concatenate((input_1[0], input_2[0], input_3[0], input_4[0]))
cs_y = np.concatenate((input_1[1], input_2[1], input_3[1], input_4[1]))
grp = [1] * 10 + [2] * 10 + [3] + [4] * 10
input_array = pd.DataFrame({'a': cs_x, 'b': cs_y, 'c': grp})
output = """
Linear Regression
-----------------
n Slope Intercept r^2 Std Err p value Group
--------------------------------------------------------------------------------------------------
10 0.4268 -0.2032 0.2877 0.2374 0.1100 1
10 0.1214 -0.6475 0.0393 0.2123 0.5832 2
10 0.2367 0.2525 0.1131 0.2343 0.3419 4 """
exp = GroupLinearRegression(input_array['a'], input_array['b'], groups=input_array['c'])
self.assertEqual(output, str(exp))
def test_vector_no_data(self):
"""Test the case where there's no data with a vector as input."""
self.assertRaises(NoDataError, lambda: GroupLinearRegression(Vector([], other=[])))
def test_no_ydata(self):
"""Test the case where the ydata argument is None."""
self.assertRaises(AttributeError, lambda: GroupLinearRegression([1, 2, 3, 4]))
def test_unequal_pair_lengths(self):
"""Test the case where the supplied pairs are unequal."""
np.random.seed(987654321)
input_1 = st.norm.rvs(size=100), st.norm.rvs(size=96)
self.assertRaises(UnequalVectorLengthError, lambda: GroupLinearRegression(input_1[0], input_1[1]))
def test_linregress_one_group(self):
np.random.seed(987654321)
input_array = st.norm.rvs(size=100), st.norm.rvs(size=100)
output = """
Linear Regression
-----------------
n Slope Intercept r^2 Std Err p value Group
--------------------------------------------------------------------------------------------------
100 -0.0056 0.0478 0.0000 0.1030 0.9567 1 """
exp = GroupLinearRegression(input_array[0], input_array[1], display=False)
self.assertEqual(str(exp), output)
def test_linregress_vector(self):
np.random.seed(987654321)
input_1 = st.norm.rvs(size=100), st.norm.rvs(size=100)
input_2 = st.norm.rvs(size=100), st.norm.rvs(size=100)
input_3 = st.norm.rvs(size=100), st.norm.rvs(size=100)
input_4_x = st.norm.rvs(size=100)
input_4_y = [x + st.norm.rvs(0, 0.5, size=1)[0] for x in input_4_x]
input_4 = input_4_x, input_4_y
cs_x = np.concatenate((input_1[0], input_2[0], input_3[0], input_4[0]))
cs_y = np.concatenate((input_1[1], input_2[1], input_3[1], input_4[1]))
grp = [1] * 100 + [2] * 100 + [3] * 100 + [4] * 100
input_array = Vector(cs_x, other=cs_y, groups=grp)
output = """
Linear Regression
-----------------
n Slope Intercept r^2 Std Err p value Group
--------------------------------------------------------------------------------------------------
100 -0.0056 0.0478 0.0000 0.1030 0.9567 1
100 0.0570 -0.1671 0.0037 0.0950 0.5497 2
100 -0.2521 0.1637 0.0506 0.1103 0.0244 3
100 0.9635 0.1043 0.8181 0.0459 0.0000 4 """
exp = GroupLinearRegression(input_array, display=False)
self.assertTupleEqual(exp.counts, ('100', '100', '100', '100'))
self.assertAlmostEqual(exp.slope[0], -0.005613130406764816)
self.assertAlmostEqual(exp.slope[1], 0.0570354136308546)
self.assertAlmostEqual(exp.slope[2], -0.2521496921022714)
self.assertAlmostEqual(exp.slope[3], 0.9634599098599703)
self.assertAlmostEqual(exp.intercept[0], 0.04775111565537506)
self.assertAlmostEqual(exp.intercept[1], -0.1670688836199169)
self.assertAlmostEqual(exp.intercept[2], 0.1637132078993005)
self.assertAlmostEqual(exp.intercept[3], 0.10434448563066669)
self.assertAlmostEqual(exp.r_squared[0], 3.030239852495909e-05)
self.assertAlmostEqual(exp.r_squared[1], 0.00366271257512563)
self.assertAlmostEqual(exp.r_squared[2], 0.05062765121282169)
self.assertAlmostEqual(exp.r_squared[3], 0.8180520671815105)
self.assertAlmostEqual(exp.statistic[0], 3.030239852495909e-05)
self.assertAlmostEqual(exp.statistic[1], 0.00366271257512563)
self.assertAlmostEqual(exp.statistic[2], 0.05062765121282169)
self.assertAlmostEqual(exp.statistic[3], 0.8180520671815105)
self.assertAlmostEqual(exp.r_value[0], -0.005504761441239674)
self.assertAlmostEqual(exp.r_value[1], 0.06052034843856759)
self.assertAlmostEqual(exp.r_value[2], -0.2250058915069152)
self.assertAlmostEqual(exp.r_value[3], 0.9044623083255103)
self.assertAlmostEqual(exp.std_err[0], 0.1030023210648352)
self.assertAlmostEqual(exp.std_err[1], 0.09502400478678666)
self.assertAlmostEqual(exp.std_err[2], 0.11029855015697929)
self.assertAlmostEqual(exp.std_err[3], 0.04589905033402483)
self.assertAlmostEqual(exp.p_value[0], 0.956651586890106)
self.assertAlmostEqual(exp.p_value[1], 0.5497443545114141)
self.assertAlmostEqual(exp.p_value[2], 0.024403659194742487)
self.assertAlmostEqual(exp.p_value[3], 4.844813765580163e-38)
self.assertEqual(str(exp), output)
def test_linregress_missing_data(self):
np.random.seed(987654321)
input_1 = st.norm.rvs(size=100), st.norm.rvs(size=100)
input_2 = st.norm.rvs(size=100), st.norm.rvs(size=100)
input_3 = st.norm.rvs(size=100), st.norm.rvs(size=100)
input_4_x = st.norm.rvs(size=100)
input_4_y = [x + st.norm.rvs(0, 0.5, size=1)[0] for x in input_4_x]
input_4 = input_4_x, input_4_y
cs_x = np.concatenate((input_1[0], input_2[0], input_3[0], input_4[0]))
cs_y = np.concatenate((input_1[1], input_2[1], input_3[1], input_4[1]))
grp = [1] * 100 + [2] * 100 + [3] * 100 + [4] * 100
input_array = pd.DataFrame({'a': cs_x, 'b': cs_y, 'c': grp})
input_array['a'][24] = np.nan
input_array['a'][256] = np.nan
input_array['b'][373] = np.nan
input_array['b'][24] = np.nan
input_array['b'][128] = np.nan
output = """
Linear Regression
-----------------
n Slope Intercept r^2 Std Err p value Group
--------------------------------------------------------------------------------------------------
99 -0.0115 0.0340 0.0001 0.1028 0.9114 1
99 0.0281 -0.1462 0.0009 0.0950 0.7681 2
99 -0.2546 0.1653 0.0495 0.1133 0.0269 3
99 0.9635 0.1043 0.8178 0.0462 0.0000 4 """
exp = GroupLinearRegression(input_array['a'], input_array['b'], groups=input_array['c'], display=False)
self.assertEqual(str(exp), output)
if __name__ == '__main__':
unittest.main()
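The regressions checked above can be reproduced from first principles; a minimal sketch of per-pair ordinary least squares in pure Python (an illustration of the math, not the actual `GroupLinearRegression` internals):

```python
def linregress_simple(xs, ys):
    # Ordinary least squares fit for y = slope * x + intercept.
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    ss_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    ss_xx = sum((x - mean_x) ** 2 for x in xs)
    slope = ss_xy / ss_xx
    intercept = mean_y - slope * mean_x
    return slope, intercept

slope, intercept = linregress_simple([0, 1, 2, 3], [1, 3, 5, 7])  # -> (2.0, 1.0)
```

Running this fit once per group (as grouped by the `groups` argument above) yields the per-group slope/intercept columns shown in the expected output strings.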
| 57.244373 | 111 | 0.56923 | 2,311 | 17,803 | 4.259628 | 0.090004 | 0.179195 | 0.204795 | 0.072633 | 0.89232 | 0.866315 | 0.852601 | 0.850975 | 0.842645 | 0.837261 | 0 | 0.211746 | 0.260686 | 17,803 | 310 | 112 | 57.429032 | 0.536165 | 0.023142 | 0 | 0.735294 | 0 | 0 | 0.237223 | 0.039528 | 0 | 0 | 0 | 0 | 0.363971 | 1 | 0.044118 | false | 0 | 0.025735 | 0 | 0.073529 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
365079803a92fb90dc60f35e2b0f9719cf05496a | 31,057 | py | Python | deprecated/models/enet/enet_modules.py | alfrunesiq/SemanticSegmentationActiveLearning | 3f953a22c8fd95828c9bd4c5ce52a53e991391e4 | [
"MIT"
] | 9 | 2019-06-14T07:29:28.000Z | 2021-03-27T09:45:56.000Z | deprecated/models/enet/enet_modules.py | alfrunesiq/SemanticSegmentationActiveLearning | 3f953a22c8fd95828c9bd4c5ce52a53e991391e4 | [
"MIT"
] | 2 | 2020-08-10T10:18:21.000Z | 2021-03-18T20:30:04.000Z | deprecated/models/enet/enet_modules.py | alfrunesiq/SemanticSegmentationActiveLearning | 3f953a22c8fd95828c9bd4c5ce52a53e991391e4 | [
"MIT"
] | 1 | 2020-03-07T08:37:12.000Z | 2020-03-07T08:37:12.000Z | import tensorflow as tf
from ..util import extra_ops as xops
def block_initial(inputs, training,
padding="SAME",
kernel_initializer=tf.initializers.glorot_uniform(),
output_width=16,
name="Initial"):
"""
ENet initial block:
+-------+
| Input |
+-------+
/ \
/ \
/ \
+--------------+ +----------------+
| 3x3 conv x16 | | 2x2/s2 MaxPool |
+--------------+ +----------------+
\ /
\ /
\ /
+--------+
| Concat |
+--------+
:param inputs: Input tensor (format=[N,H,W,C])
:param training: Whether the network is in training mode (unused in this block)
:param kernel_initializer: tf.initializer for the conv kernel
:param padding: Padding for the conv operation
:param output_width: Number of channels for the output tensor
:param name: Name of the scope for the block
:returns: output tensor, trainable variables
:rtype: (tf.Tensor, dict)
"""
params = {}
with tf.variable_scope(name):
with tf.name_scope("ShapeOps"):
# shape(inputs)=[N,H,W,C]
input_shape = inputs.get_shape().as_list()
# Get input channel count
# NOTE: input channel shape need to be deterministic
input_ch = input_shape[3]
# output width is concatenation of max pool and conv
conv_width = output_width - input_ch
conv_kern_shape = [3,3,input_ch,conv_width]
# Get conv. kernel
kern = tf.get_variable(name="Kernel",
shape=conv_kern_shape,
initializer=kernel_initializer,
trainable=True)
out_conv = tf.nn.conv2d(inputs, kern,
strides=[1,2,2,1],
padding=padding,
name="Conv2D")
out_mp = tf.nn.max_pool(inputs,
ksize=[1,2,2,1],
strides=[1,2,2,1],
padding=padding,
name="MaxPool")
out = tf.concat([out_conv, out_mp],
axis=3,
name="Concat")
params["Kernel"] = kern
return out, params
def block_bottleneck(inputs,
training,
padding="SAME",
projection_rate=4,
dilations=[1,1,1,1],
bn_decay=0.90,
asymmetric=False,
kernel_initializer=tf.initializers.glorot_uniform(),
alpha_initializer=tf.initializers.constant(0.25),
drop_rate=0.1,
name="Bottleneck"):
"""
Implements the plain bottleneck module in ENet, including possibility
of dilated convolution and asymmetric (spatially separable) convolution.
+-------+
| Input |
+-------+
||
+------------++----------+
| |
| +-------------------------+
| | 1x1 conv |
| | x(input_ch/proj_rate) |
| | -> BatchNorm -> PReLU |
| +-------------------------+
| | Projection
| V
| +-------------------------+
| | 3x3 conv |
| | x(input_ch/proj_rate) |
| | -> BatchNorm -> PReLU |
| +-------------------------+
| | Convolution
| V
| +-------------------------+
| | 1x1 conv |
| | x(input_ch) |
| | -> BatchNorm |
| +-------------------------+
| | Expansion
+------------++----------+
\/ Residual connection
+-----------+
| Add |
| -> PReLU |
+-----------+
:param inputs: Input tensor.
:param training: Whether to accumulate statistics in batch norm and
apply spatial dropout.
:param padding: Padding for the main convolution.
:param projection_rate: Bottleneck operates on @projection_rate less channels.
:param dilations: Dilation rates in the main convolution block.
:param bn_decay: Decay rate for exp. running mean in batch norm.
:param asymmetric: Use asymmetric (spatially separable) conv.
:param kernel_initializer: tf.initializer for the conv kernels.
:param alpha_initializer: tf.initializer for the PReLU parameters.
:param drop_rate: Dropout probability (training=True)
:param name: Name of the block scope.
:returns: Output tensor, Parameters
NOTE: Parameters are stored in a dictionary indexed by the scopes.
:rtype: (tf.Tensor, dict)
"""
variables = {} # Dict with Variables in the block
out = None
with tf.variable_scope(name):
with tf.name_scope("ShapeOps"):
# Get input channel count
input_shape = inputs.get_shape().as_list()
input_ch = input_shape[-1]
# Number of filters in the bottleneck are reduced by a factor of
# @projection_rate
bneck_filters = input_ch // projection_rate
# Get conv. kernels' shape
proj_kern_shape = [1,1,input_ch,bneck_filters]
if asymmetric:
conv_kern_shape = [ \
[5,1,bneck_filters,bneck_filters],
[1,5,bneck_filters,bneck_filters],
]
else:
conv_kern_shape = [3,3,bneck_filters,bneck_filters]
exp_kern_shape = [1,1,bneck_filters,input_ch]
# END scope ShapeOps
############ Main Branch ############
with tf.variable_scope("DownProject"):
# Bottleneck projection operation
alpha = tf.get_variable(name="Alpha",
dtype=tf.float32,
initializer=alpha_initializer,
shape=[bneck_filters],
trainable=True)
kern = tf.get_variable(name="Kernel",
dtype=tf.float32,
initializer=kernel_initializer,
shape=proj_kern_shape,
trainable=True)
out = tf.nn.conv2d(inputs, kern,
strides=[1,1,1,1],
padding=padding,
name="Conv2D")
out, bn_params = xops.batch_normalization(out, training,
decay=bn_decay)
out = xops.prelu(out, alpha, name="PReLU")
variables["DownProject"] = {}
variables["DownProject"]["Kernel"] = kern
variables["DownProject"]["Alpha"] = alpha
variables["DownProject"]["BatchNorm"] = bn_params
with tf.variable_scope("Conv"):
# Main convolution operation
alpha = tf.get_variable(name="Alpha",
dtype=tf.float32,
initializer=alpha_initializer,
shape=[bneck_filters],
trainable=True)
if asymmetric:
kern = [ \
tf.get_variable(name="KernelCol",
dtype=tf.float32,
initializer=kernel_initializer,
shape=conv_kern_shape[0],
trainable=True),
tf.get_variable(name="KernelRow",
dtype=tf.float32,
initializer=kernel_initializer,
shape=conv_kern_shape[1],
trainable=True)
]
out = tf.nn.conv2d(out, kern[0],
strides=[1,1,1,1],
padding=padding,
dilations=dilations,
name="Conv2D")
out = tf.nn.conv2d(out, kern[1],
strides=[1,1,1,1],
padding=padding,
dilations=dilations,
name="Conv2D")
else:
kern = tf.get_variable(name="Kernel",
dtype=tf.float32,
initializer=kernel_initializer,
shape=conv_kern_shape,
trainable=True)
out = tf.nn.conv2d(out, kern,
strides=[1,1,1,1],
padding=padding,
dilations=dilations,
name="Conv2D")
out, bn_params = xops.batch_normalization(out, training,
decay=bn_decay)
out = xops.prelu(out, alpha, name="PReLU")
variables["Conv"] = {}
variables["Conv"]["Kernel"] = kern
variables["Conv"]["Alpha"] = alpha
variables["Conv"]["BatchNorm"] = bn_params
# END scope Conv
with tf.variable_scope("Expansion"):
# Feature expansion operation
kern = tf.get_variable(name="Kernel",
dtype=tf.float32,
initializer=kernel_initializer,
shape=exp_kern_shape,
trainable=True)
out = tf.nn.conv2d(out, kern,
strides=[1,1,1,1],
padding=padding,
name="Conv2D")
out, bn_params = xops.batch_normalization(out, training,
decay=bn_decay)
if training and drop_rate > 0.0:
out = xops.spatial_dropout(out, drop_rate, name="SpatialDropout")
variables["Expansion"] = {}
variables["Expansion"]["Kernel"] = kern
variables["Expansion"]["BatchNorm"] = bn_params
# NOTE: no prelu here; spatial dropout is applied above when training
#####################################
alpha = tf.get_variable(name="Alpha",
shape=[input_ch],
dtype=tf.float32,
initializer=alpha_initializer,
trainable=True)
# NOTE: out comes from main branch
out = tf.add(inputs, out, name="Residual")
out = xops.prelu(out, alpha, name="PReLU")
variables["Alpha"] = alpha
# END scope @name
return out, variables
# END def block_bottleneck
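The `-> PReLU` steps in the diagrams above apply a parametric ReLU with a learnable per-channel slope `alpha`; a minimal scalar sketch of the activation (the standard definition, standing in for the `xops.prelu` implementation used here):

```python
def prelu(x, alpha):
    # PReLU: identity for positive inputs, alpha-scaled for negative ones.
    return x if x > 0 else alpha * x

# alpha_initializer above defaults to a constant 0.25 per channel.
outputs = [prelu(x, 0.25) for x in [-2.0, -0.5, 0.0, 1.5]]
# -> [-0.5, -0.125, 0.0, 1.5]
```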
def block_bottleneck_upsample(inputs, unpool_argmax, training,
padding="SAME",
projection_rate=4,
dilations=[1,1,1,1],
bn_decay=0.90,
kernel_initializer=tf.initializers.glorot_uniform(),
alpha_initializer=tf.initializers.constant(0.25),
drop_rate=0.1,
name="BottleneckUpsample"):
"""
+-------+
| Input |
+-------+
||
+------------++----------+
| |
| +-------------------------+
| | 2x2/s2 conv |
| | x(input_ch/proj_rate) |
| | -> BatchNorm -> PReLU |
V +-------------------------+
+----------------+ | Projection
| 1x1 conv | V
| x(input_ch/2) | +-------------------------+
+----------------+ | 3x3 conv |
| | x(input_ch/proj_rate) |
V | -> BatchNorm -> PReLU |
+----------------+ +-------------------------+
| 2x2 max_unpool | | Convolution
+----------------+ V
| +-------------------------+
| | 1x1 conv |
| | x(input_ch/2) |
| | -> BatchNorm |
| +-------------------------+
| | Expansion
+------------++----------+
\/ Residual connection
+-----------+
| Add |
| -> PReLU |
+-----------+
:param inputs: Input tensor.
:param unpool_argmax: Switches for the unpool op from the corresponding
downsampling max_pool op in the encoder stage.
:param training: Whether to accumulate statistics in batch norm.
:param padding: Padding for the main convolution.
:param projection_rate: Bottleneck operates on @projection_rate less channels.
:param dilations: Dilation rates in the main convolution block.
:param bn_decay: Decay rate for exp. running mean in batch norm.
:param kernel_initializer: tf.initializer for the conv kernels.
:param alpha_initializer: tf.initializer for the PReLU parameters.
:param drop_rate: Dropout probability (training==True)
:param name: Name of the block scope.
:returns: Output tensor, Parameters
NOTE: Parameters are stored in a dictionary indexed by the scopes.
:rtype: (tf.Tensor, dict)
"""
variables = {} # Dict with Variables in the block
out = None
with tf.variable_scope(name):
with tf.name_scope("ShapeOps"):
# Get input shape (batch dim. is assumed unresolvable)
shape = tf.shape(inputs, name="InputShape")
batch_sz = shape[0]
# Check if height / width / channels are resolvable
input_shape = inputs.get_shape().as_list()
if input_shape[1] is None or input_shape[2] is None \
or input_shape[3] is None:
input_shape = shape
input_ch = input_shape[3]
# Number of filters in the bottleneck are reduced by a factor of
# @projection_rate
bneck_filters = input_ch // projection_rate
# Get conv. kernels' shape
proj_kern_shape = [1,1,input_ch,bneck_filters]
conv_kern_shape = [3,3,bneck_filters,bneck_filters]
conv_out_shape = tf.stack([batch_sz, 2*input_shape[1],
2*input_shape[2], bneck_filters],
name="ConvTsposeOutShape")
# NOTE: upsampling halves the number of output channels following
# VGG-philosophy of preserving computational complexity
exp_kern_shape = [1,1,bneck_filters,input_ch//2]
# TODO: check if 1x1 or 3x3 is actually used
res_kern_shape = [1,1,input_ch,input_ch//2]
# END scope ShapeOps
############ Main Branch ############
with tf.variable_scope("DownProject"):
# Bottleneck projection operation
alpha = tf.get_variable(name="Alpha",
shape=[bneck_filters],
dtype=tf.float32,
initializer=alpha_initializer,
trainable=True)
kern = tf.get_variable(name="Kernel",
shape=proj_kern_shape,
dtype=tf.float32,
initializer=kernel_initializer,
trainable=True)
out = tf.nn.conv2d(inputs, kern,
strides=[1,1,1,1],
padding=padding,
name="Conv2D")
out, bn_params = xops.batch_normalization(out, training,
decay=bn_decay)
out = xops.prelu(out, alpha, name="PReLU")
variables["DownProject"] = {}
variables["DownProject"]["Kernel"] = kern
variables["DownProject"]["Alpha"] = alpha
variables["DownProject"]["BatchNorm"] = bn_params
# END scope DownProject
with tf.variable_scope("Conv"):
# Main convolution operation
alpha = tf.get_variable(name="Alpha",
dtype=tf.float32,
initializer=alpha_initializer,
shape=[bneck_filters],
trainable=True)
kern = tf.get_variable(name="Kernel",
dtype=tf.float32,
initializer=kernel_initializer,
shape=conv_kern_shape,
trainable=True)
out = tf.nn.conv2d_transpose(out, kern, conv_out_shape,
strides=[1,2,2,1],
name="Conv2DTranspose")
out, bn_params = xops.batch_normalization(out, training,
decay=bn_decay)
out = xops.prelu(out, alpha, name="PReLU")
variables["Conv"] = {}
variables["Conv"]["Kernel"] = kern
variables["Conv"]["Alpha"] = alpha
variables["Conv"]["BatchNorm"] = bn_params
# END scope Conv
with tf.variable_scope("Expansion"):
# Feature expansion operation
kern = tf.get_variable(name="Kernel",
dtype=tf.float32,
initializer=kernel_initializer,
shape=exp_kern_shape,
trainable=True)
out = tf.nn.conv2d(out, kern,
strides=[1,1,1,1],
padding=padding,
name="Conv2D")
out, bn_params = xops.batch_normalization(out, training,
decay=bn_decay)
if training and drop_rate > 0.0:
out = xops.spatial_dropout(out, drop_rate, name="SpatialDropout")
# NOTE: no prelu here
variables["Expansion"] = {}
variables["Expansion"]["Kernel"] = kern
variables["Expansion"]["BatchNorm"] = bn_params
# END scope Expansion
#####################################
########## Residual Branch ##########
kern = tf.get_variable(name="Kernel",
dtype=tf.float32,
initializer=kernel_initializer,
shape=res_kern_shape,
trainable=True)
res_out = tf.nn.conv2d(inputs, kern,
strides=[1,1,1,1],
padding=padding,
name="Conv2D")
res_out = xops.unpool_2d(res_out, unpool_argmax,
strides=[1,2,2,1])
variables["Kernel"] = kern
#####################################
alpha = tf.get_variable(name="Alpha",
shape=[input_ch//2],
dtype=tf.float32,
initializer=alpha_initializer,
trainable=True)
# NOTE: out comes from main branch
out = tf.add(res_out, out, name="Residual")
out = xops.prelu(out, alpha, name="PReLU")
variables["Alpha"] = alpha
# end with tf.variable_scope(name)
return out, variables
# END def block_bottleneck_upsample
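The residual branch above restores spatial resolution with the argmax switches recorded by the encoder's max pool; a minimal pure-Python sketch of that pool/unpool pairing (a toy stand-in for `tf.nn.max_pool_with_argmax` and `xops.unpool_2d`, using flat row-major indices as TensorFlow does):

```python
def max_pool_argmax_2x2(img):
    # 2x2/stride-2 max pool on a 2D list; records the flat index of each max
    # so a later unpool can scatter values back to their original positions.
    h, w = len(img), len(img[0])
    pooled, argmax = [], []
    for i in range(0, h, 2):
        prow, arow = [], []
        for j in range(0, w, 2):
            window = [(img[i + di][j + dj], (i + di) * w + (j + dj))
                      for di in (0, 1) for dj in (0, 1)]
            val, idx = max(window, key=lambda t: t[0])
            prow.append(val); arow.append(idx)
        pooled.append(prow); argmax.append(arow)
    return pooled, argmax

def unpool_2x2(pooled, argmax, h, w):
    # Scatter pooled values to their argmax positions; zeros everywhere else.
    out = [[0] * w for _ in range(h)]
    for prow, arow in zip(pooled, argmax):
        for val, idx in zip(prow, arow):
            out[idx // w][idx % w] = val
    return out

img = [[1, 2, 3, 4],
       [5, 6, 7, 8],
       [9, 1, 2, 3],
       [4, 5, 6, 7]]
pooled, argmax = max_pool_argmax_2x2(img)   # pooled -> [[6, 8], [9, 7]]
restored = unpool_2x2(pooled, argmax, 4, 4)
```

This is why the decoder blocks take `unpool_argmax` as an argument: without the switches, the upsample could not place each value back where the encoder found it.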
def block_bottleneck_downsample(inputs, training,
padding="SAME",
projection_rate=4,
bn_decay=0.90,
dilations=[1,1,1,1],
kernel_initializer=tf.initializers.glorot_uniform(),
alpha_initializer=tf.initializers.constant(0.25),
drop_rate=0.1,
name="BottleneckDownsample"):
"""
+-------+
| Input |
+-------+
||
+------------++----------+
| V
| +-------------------------+
| | 2x2/s2 conv |
| | x(input_ch/proj_rate) |
| | -> BatchNorm -> PReLU |
V +-------------------------+
+----------------+ | Projection
| 2x2/s2 MaxPool | V
+----------------+ +-------------------------+
| | 3x3 conv |
| | x(input_ch/proj_rate) |
V | -> BatchNorm -> PReLU |
+--------------+ +-------------------------+
| Zero padding | | Convolution
+--------------+ V
| +-------------------------+
| | 1x1 conv |
| | x(2*input_ch) |
| | -> BatchNorm |
| +-------------------------+
| | Expansion
+------------++----------+
\/ Residual connection
+-----------+
| Add |
| -> PReLU |
+-----------+
:param inputs: Input tensor.
:param training: Whether to accumulate statistics in batch norm and
apply spatial dropout
:param padding: Padding for the main convolution.
:param projection_rate: Bottleneck operates on @projection_rate less channels.
:param dilations: Dilation rates in the main convolution block.
:param bn_decay: Decay rate for exp. running mean in batch norm.
:param kernel_initializer: tf.initializer for the conv kernels.
:param alpha_initializer: tf.initializer for the PReLU parameters.
:param name: Name of the block scope.
:param drop_rate: Dropout probability (training==True)
:returns: operation output, parameters, max pool argmax
:rtype: (tf.Tensor, dict, tf.Tensor)
"""
variables = {} # Dict with Variables in the block
out = None
with tf.variable_scope(name):
with tf.name_scope("ShapeOps"):
# Get input channel count
input_ch = inputs.get_shape().as_list()[-1]
# Number of filters in the bottleneck are reduced by a factor of
# @projection_rate
bneck_filters = input_ch // projection_rate
# Get conv. kernels' shape
proj_kern_shape = [2,2,input_ch,bneck_filters]
conv_kern_shape = [3,3,bneck_filters,bneck_filters]
# NOTE: downsampling doubles the output channel count following
# VGG-philosophy of preserving computational complexity
exp_kern_shape = [1,1,bneck_filters,2*input_ch]
zero_padding = [[0,0],[0,0],[0,0],[0,input_ch]]
# END scope ShapeOps
############ Main Branch ############
with tf.variable_scope("DownProject"):
# Bottleneck projection operation
alpha = tf.get_variable(name="Alpha",
dtype=tf.float32,
initializer=alpha_initializer,
shape=[bneck_filters],
trainable=True)
kern = tf.get_variable(name="Kernel",
dtype=tf.float32,
initializer=kernel_initializer,
shape=proj_kern_shape,
trainable=True)
out = tf.nn.conv2d(inputs, kern,
strides=[1,2,2,1],
padding=padding,
name="Conv2D")
out, bn_params = xops.batch_normalization(out, training,
decay=bn_decay)
out = xops.prelu(out, alpha, name="PReLU")
variables["DownProject"] = {}
variables["DownProject"]["Kernel"] = kern
variables["DownProject"]["Alpha"] = alpha
variables["DownProject"]["BatchNorm"] = bn_params
# END scope DownProject
with tf.variable_scope("Conv"):
# Main convolution operation
alpha = tf.get_variable(name="Alpha",
dtype=tf.float32,
initializer=alpha_initializer,
shape=[bneck_filters],
trainable=True)
kern = tf.get_variable(name="Kernel",
dtype=tf.float32,
initializer=kernel_initializer,
shape=conv_kern_shape,
trainable=True)
out = tf.nn.conv2d(out, kern,
strides=[1,1,1,1],
padding=padding,
dilations=dilations,
name="Conv2D")
out, bn_params = xops.batch_normalization(out, training,
decay=bn_decay)
out = xops.prelu(out, alpha, name="PReLU")
variables["Conv"] = {}
variables["Conv"]["Kernel"] = kern
variables["Conv"]["Alpha"] = alpha
variables["Conv"]["BatchNorm"] = bn_params
# END scope Conv
with tf.variable_scope("Expansion"):
# Feature expansion operation
kern = tf.get_variable(name="Kernel",
dtype=tf.float32,
initializer=kernel_initializer,
shape=exp_kern_shape,
trainable=True)
out = tf.nn.conv2d(out, kern,
strides=[1,1,1,1],
padding=padding,
name="Conv2D")
out, bn_params = xops.batch_normalization(out, training,
decay=bn_decay)
variables["Expansion"] = {}
variables["Expansion"]["Kernel"] = kern
variables["Expansion"]["BatchNorm"] = bn_params
# NOTE: no prelu here
if training and drop_rate > 0.0:
out = xops.spatial_dropout(out, drop_rate, name="SpatialDropout")
# END scope Expansion
#####################################
########## Residual Branch ##########
# MAYBE: add Targmax=tf.int32 below? (can still address 4GB)
# BUG: max_pool_with_argmax apparently doesn't support
# Targmax=tf.int32
res_out, max_pool_argmax = \
tf.nn.max_pool_with_argmax(inputs,
ksize=[1,2,2,1],
strides=[1,2,2,1],
Targmax=tf.int64,
padding=padding,
name="MaxPool")
# tf.tile() ?
res_out = tf.pad(res_out,
paddings=zero_padding,
name="ZeroPad")
#####################################
alpha = tf.get_variable(name="Alpha",
shape=[2*input_ch],
dtype=tf.float32,
initializer=alpha_initializer,
trainable=True)
# NOTE: out comes from main branch
out = tf.add(res_out, out, name="Residual")
out = xops.prelu(out, alpha, name="PReLU")
variables["Alpha"] = alpha
# END scope @name
return out, variables, max_pool_argmax
# END def block_bottleneck_downsample
def block_final(inputs, num_classes,
kernel_initializer=tf.initializers.glorot_uniform(),
name="Final"):
variables = {}
with tf.variable_scope(name):
with tf.name_scope("ShapeOps"):
shape = tf.shape(inputs, name="InputShape")
batch_sz = shape[0]
# Check if height / width is resolvable
input_shape = inputs.get_shape().as_list()
if input_shape[1] is None or input_shape[2] is None:
input_shape = shape
out_shape = tf.stack([batch_sz,2*input_shape[1],
2*input_shape[2],num_classes])
kern_shape = [3,3,num_classes,inputs.shape[-1]]
kern = tf.get_variable(name="Kernel",
dtype=tf.float32,
initializer=kernel_initializer,
shape=kern_shape,
trainable=True)
out = tf.nn.conv2d_transpose(inputs, kern, out_shape,
strides=[1,2,2,1],
name="Conv2DTranspose")
variables["Kernel"] = kern
#END scope @name
return out, variables
| 46.28465 | 84 | 0.420646 | 2,541 | 31,057 | 4.998426 | 0.09209 | 0.007086 | 0.006141 | 0.030785 | 0.825998 | 0.805212 | 0.787025 | 0.764349 | 0.748209 | 0.72262 | 0 | 0.018411 | 0.459607 | 31,057 | 670 | 85 | 46.353731 | 0.738366 | 0.299514 | 0 | 0.808786 | 0 | 0 | 0.051198 | 0 | 0 | 0 | 0 | 0.004478 | 0 | 1 | 0.01292 | false | 0 | 0.005168 | 0 | 0.031008 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
367e6db6b9ea89aa1aa4af090f338dfe525e4eca | 192 | py | Python | tests/io/test_get_absolute_fpath.py | safurrier/data_science_utils | 842b025ea3197e8a9946401257b2fa22ef1bf82d | [
"MIT"
] | null | null | null | tests/io/test_get_absolute_fpath.py | safurrier/data_science_utils | 842b025ea3197e8a9946401257b2fa22ef1bf82d | [
"MIT"
] | null | null | null | tests/io/test_get_absolute_fpath.py | safurrier/data_science_utils | 842b025ea3197e8a9946401257b2fa22ef1bf82d | [
"MIT"
] | 1 | 2020-03-30T20:59:04.000Z | 2020-03-30T20:59:04.000Z | # %%
import os
import shutil
from data_science_toolbox.io.get_absolute_fpath import get_absolute_fpath
def test_get_absolute_fpath():
assert get_absolute_fpath() == os.path.abspath('.')
| 21.333333 | 73 | 0.786458 | 28 | 192 | 5 | 0.571429 | 0.314286 | 0.457143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114583 | 192 | 8 | 74 | 24 | 0.823529 | 0.010417 | 0 | 0 | 0 | 0 | 0.005319 | 0 | 0 | 0 | 0 | 0 | 0.2 | 1 | 0.2 | true | 0 | 0.6 | 0 | 0.8 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
368202ba555b156e3cd6ba2c37ee5712daa00127 | 1,453 | py | Python | gdsfactory/tests/test_ports_select.py | jorgepadilla19/gdsfactory | 68e1c18257a75d4418279851baea417c8899a165 | [
"MIT"
] | 42 | 2020-05-25T09:33:45.000Z | 2022-03-29T03:41:19.000Z | gdsfactory/tests/test_ports_select.py | jorgepadilla19/gdsfactory | 68e1c18257a75d4418279851baea417c8899a165 | [
"MIT"
] | 133 | 2020-05-28T18:29:04.000Z | 2022-03-31T22:21:42.000Z | gdsfactory/tests/test_ports_select.py | jorgepadilla19/gdsfactory | 68e1c18257a75d4418279851baea417c8899a165 | [
"MIT"
] | 17 | 2020-06-30T07:07:50.000Z | 2022-03-17T15:45:27.000Z | import gdsfactory as gf
def test_get_ports() -> None:
c = gf.components.mzi_phase_shifter_top_heater_metal(length_x=123)
p = c.get_ports_dict()
assert len(p) == 4, len(p)
p_dc = c.get_ports_dict(width=11.0)
p_dc_layer = c.get_ports_dict(layer=(49, 0))
assert len(p_dc) == 2, f"{len(p_dc)}"
assert len(p_dc_layer) == 2, f"{len(p_dc_layer)}"
p_optical = c.get_ports_dict(width=0.5)
assert len(p_optical) == 2, f"{len(p_optical)}"
p_optical_west = c.get_ports_dict(orientation=180, width=0.5)
p_optical_east = c.get_ports_dict(orientation=0, width=0.5)
assert len(p_optical_east) == 1, f"{len(p_optical_east)}"
assert len(p_optical_west) == 1, f"{len(p_optical_west)}"
if __name__ == "__main__":
test_get_ports()
# c = gf.components.mzi_phase_shifter()
# c.show()
# p = c.get_ports_dict()
# assert len(p) == 4, len(p)
# p_dc = c.get_ports_dict(width=11.)
# p_dc_layer = c.get_ports_dict(layer=(49, 0))
# assert len(p_dc) == 2, f"{len(p_dc)}"
# assert len(p_dc_layer) == 2, f"{len(p_dc_layer)}"
# p_optical = c.get_ports_dict(width=0.5)
# assert len(p_optical) == 2, f"{len(p_optical)}"
# p_optical_west = c.get_ports_dict(orientation=180, width=0.5)
# p_optical_east = c.get_ports_dict(orientation=0, width=0.5)
# assert len(p_optical_east) == 1, f"{len(p_optical_east)}"
# assert len(p_optical_west) == 1, f"{len(p_optical_west)}"
| 33.022727 | 70 | 0.653131 | 262 | 1,453 | 3.270992 | 0.167939 | 0.112019 | 0.126021 | 0.18203 | 0.893816 | 0.893816 | 0.828471 | 0.828471 | 0.828471 | 0.828471 | 0 | 0.038558 | 0.17894 | 1,453 | 43 | 71 | 33.790698 | 0.679799 | 0.406056 | 0 | 0 | 0 | 0 | 0.110849 | 0.049528 | 0 | 0 | 0 | 0 | 0.352941 | 1 | 0.058824 | false | 0 | 0.058824 | 0 | 0.117647 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
36a2d3a4f26b68ae001c2b73aaec4b2a6d5be6e6 | 244 | py | Python | src/sage/game_theory/all.py | bopopescu/Sage-8 | 71be00ad5f25ca95381fae7cce96421ffdd43425 | [
"BSL-1.0"
] | null | null | null | src/sage/game_theory/all.py | bopopescu/Sage-8 | 71be00ad5f25ca95381fae7cce96421ffdd43425 | [
"BSL-1.0"
] | null | null | null | src/sage/game_theory/all.py | bopopescu/Sage-8 | 71be00ad5f25ca95381fae7cce96421ffdd43425 | [
"BSL-1.0"
] | null | null | null | from sage.misc.lazy_import import lazy_import
lazy_import('sage.game_theory.cooperative_game', 'CooperativeGame')
lazy_import('sage.game_theory.normal_form_game', 'NormalFormGame')
lazy_import('sage.game_theory.matching_game', 'MatchingGame')
| 40.666667 | 67 | 0.836066 | 33 | 244 | 5.818182 | 0.424242 | 0.260417 | 0.21875 | 0.28125 | 0.375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.045082 | 244 | 5 | 68 | 48.8 | 0.824034 | 0 | 0 | 0 | 0 | 0 | 0.561475 | 0.393443 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
36cf7ce9a1d7426082d26e6555c7993435ca1500 | 129 | py | Python | IntroProPython/aula9-Arquivos/listagem09-25.py | SweydAbdul/estudos-python | b052708d0566a0afb9a1c04d035467d45f820879 | [
"MIT"
] | null | null | null | IntroProPython/aula9-Arquivos/listagem09-25.py | SweydAbdul/estudos-python | b052708d0566a0afb9a1c04d035467d45f820879 | [
"MIT"
] | null | null | null | IntroProPython/aula9-Arquivos/listagem09-25.py | SweydAbdul/estudos-python | b052708d0566a0afb9a1c04d035467d45f820879 | [
"MIT"
] | null | null | null | import os.path
print(os.path.join('c:', 'dados', 'programas'))
print(os.path.abspath(os.path.join('c:', 'dados', 'programas')))
| 25.8 | 64 | 0.658915 | 20 | 129 | 4.25 | 0.45 | 0.282353 | 0.258824 | 0.258824 | 0.588235 | 0.588235 | 0 | 0 | 0 | 0 | 0 | 0 | 0.069767 | 129 | 4 | 65 | 32.25 | 0.708333 | 0 | 0 | 0 | 0 | 0 | 0.248062 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0.666667 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 8 |
36e474f12ff07687312c5b71136f61e04d9eb440 | 6,816 | py | Python | Data_processing/post_process_master_table.py | maynard242/Database-Design-and-Analysis | 01006fce5b4fdca3915b2a1dc9b3e3c170c892af | [
"MIT"
] | 2 | 2018-11-17T05:44:26.000Z | 2018-11-17T07:16:25.000Z | Data_processing/post_process_master_table.py | maynard242/Database-Design-and-Analysis | 01006fce5b4fdca3915b2a1dc9b3e3c170c892af | [
"MIT"
] | null | null | null | Data_processing/post_process_master_table.py | maynard242/Database-Design-and-Analysis | 01006fce5b4fdca3915b2a1dc9b3e3c170c892af | [
"MIT"
] | null | null | null | #/usr/bin/env python3
# Program to do some processing of the master-table (new -> master_table_process)
import psycopg2
conn = psycopg2.connect(database="awesome", user = "awesome_admin",
password="w205.Awesome", host = "34.193.7.196", port="5432")
cur = conn.cursor()
cur.execute('''CREATE TABLE IF NOT EXISTS master_table_process AS SELECT * FROM master_table;''')
conn.commit()
cur.execute('''ALTER TABLE master_table_process ADD same_eth_mentee INT;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET same_eth_mentee=1 WHERE mentee_eth=mentor_eth AND mentor_eth !=0 AND mentee_eth != 0;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET same_eth_mentee=0 WHERE mentee_eth != mentor_eth;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET same_eth_mentee=0 WHERE mentor_eth IS NULL OR mentee_eth IS NULL;''')
conn.commit()
cur.execute('''ALTER TABLE master_table_process ADD touchpoints INT,
ADD mentor_online_freq INT,
ADD pair_online_freq INT,
ADD pair_meetings INT,
ADD x1 INT,
ADD x2 INT,
ADD x3 INT,
ADD x4 INT,
ADD x5 INT,
ADD x6 INT,
ADD x7 INT,
ADD x8 INT;''')
conn.commit()
# Creating touchpoints
cur.execute('''UPDATE master_table_process SET touchpoints= ROUND((mentor_tp_1314 + mentor_tp_1415 + mentor_tp_1516),0);''')
conn.commit()
# Creating overall mentor_online_freq
cur.execute('''UPDATE master_table_process SET x1=0 WHERE mentor_online_freq_1213=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x1=1 WHERE mentor_online_freq_1213!=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x2=0 WHERE mentor_online_freq_1415=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x2=1 WHERE mentor_online_freq_1415!=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x3=0 WHERE mentor_online_freq_1516=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x3=1 WHERE mentor_online_freq_1516!=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x4=0 WHERE mentor_online_freq_1213=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x4=mentor_online_freq_1213 WHERE mentor_online_freq_1213!=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x5=0 WHERE mentor_online_freq_1415=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x5=mentor_online_freq_1415 WHERE mentor_online_freq_1415!=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x6=0 WHERE mentor_online_freq_1516=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x6=mentor_online_freq_1516 WHERE mentor_online_freq_1516!=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET mentor_online_freq= ROUND( (x4+x5+x6) /(x1+x2+x3),0 );''')
conn.commit()
# Creating pair_online_freq
cur.execute('''UPDATE master_table_process SET x1=0 WHERE pair_online_freq_1213=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x1=1 WHERE pair_online_freq_1213!=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x2=0 WHERE pair_online_freq_1314_A=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x2=1 WHERE pair_online_freq_1314_A!=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x3=0 WHERE pair_online_freq_1415=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x3=1 WHERE pair_online_freq_1415!=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x4=0 WHERE pair_online_freq_1516=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x4=1 WHERE pair_online_freq_1516!=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x5=0 WHERE pair_online_freq_1213=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x5=pair_online_freq_1213 WHERE pair_online_freq_1213!=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x6=0 WHERE pair_online_freq_1314_A=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x6=pair_online_freq_1314_A WHERE pair_online_freq_1314_A!=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x7=0 WHERE pair_online_freq_1415=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x7=pair_online_freq_1415 WHERE pair_online_freq_1415!=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x8=0 WHERE pair_online_freq_1516=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x8=pair_online_freq_1516 WHERE pair_online_freq_1516!=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET pair_online_freq= ROUND( (x5+x6+x7+x8) /(x1+x2+x3+x4),0 );''')
conn.commit()
# Creating pair_meetings
cur.execute('''UPDATE master_table_process SET x1=0 WHERE pair_meetings_1213=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x1=pair_meetings_1213 WHERE pair_meetings_1213!=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x2=0 WHERE pair_meetings_1314_A=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x2=pair_meetings_1314_A WHERE pair_meetings_1314_A!=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x3=0 WHERE pair_total_mtgs_1415=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x3=pair_total_mtgs_1415 WHERE pair_total_mtgs_1415!=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x4=0 WHERE pair_curriculum_mtgs_1516=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x4=pair_curriculum_mtgs_1516 WHERE pair_curriculum_mtgs_1516!=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x5=0 WHERE pair_other_mtgs_1516=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET x5=pair_other_mtgs_1516 WHERE pair_other_mtgs_1516!=9999;''')
conn.commit()
cur.execute('''UPDATE master_table_process SET pair_meetings= ROUND( x1+x2+x3+x4+x5,0 );''')
conn.commit()
cur.execute('''ALTER TABLE master_table_process DROP COLUMN x1,
DROP COLUMN x2,
DROP COLUMN x3,
DROP COLUMN x4,
DROP COLUMN x5,
DROP COLUMN x6,
DROP COLUMN x7,
DROP COLUMN x8;''')
conn.commit()
conn.close()
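The script above simulates missing-data handling by copying each column into x1..x8 scratch columns with paired UPDATEs, then dividing. The same NULL-safe average can be done per row in one statement with `NULLIF`, which turns the 9999 sentinel into NULL so it drops out of the sum. A minimal sketch against SQLite as a stand-in for the Postgres table (`t` and its columns are illustrative; `NULLIF`/`COALESCE` work the same in Postgres):

```python
import sqlite3

db = sqlite3.connect(":memory:")
c = db.cursor()
# Three yearly frequency columns; 9999 marks "no data", as in the script above.
c.execute("CREATE TABLE t (f1213 INT, f1415 INT, f1516 INT, avg_freq REAL)")
c.execute("INSERT INTO t VALUES (4, 9999, 8, NULL)")
# NULLIF(col, 9999) turns the sentinel into NULL so it drops out of the sum,
# and the boolean (col != 9999) terms count the usable years -- replacing the
# x1..x8 scratch columns and their paired UPDATE statements.
# (If every year is missing the divisor is 0: SQLite yields NULL there,
# Postgres would need NULLIF(divisor, 0) to avoid a division error.)
c.execute("""
    UPDATE t SET avg_freq = ROUND(
        (COALESCE(NULLIF(f1213, 9999), 0)
       + COALESCE(NULLIF(f1415, 9999), 0)
       + COALESCE(NULLIF(f1516, 9999), 0)) * 1.0
      / ((f1213 != 9999) + (f1415 != 9999) + (f1516 != 9999)), 0)
""")
avg = c.execute("SELECT avg_freq FROM t").fetchone()[0]
print(avg)  # (4 + 8) / 2 -> 6.0
```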
| 36.645161 | 137 | 0.70892 | 1,000 | 6,816 | 4.541 | 0.089 | 0.125963 | 0.198194 | 0.218014 | 0.7963 | 0.763488 | 0.763488 | 0.75534 | 0.75534 | 0.74477 | 0 | 0.086642 | 0.168574 | 6,816 | 185 | 138 | 36.843243 | 0.714664 | 0.03037 | 0 | 0.4 | 0 | 0.016667 | 0.711658 | 0.219919 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.008333 | 0.008333 | 0 | 0.008333 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
7fd1e247da5bb26d223ad829bd63cd3c83c03857 | 98 | py | Python | ver1_0/openassembly/pirate_core/__init__.py | fragro/Open-Assembly | e9679ff5e7ae9881fa5781d763288ed2f40b014d | [
"BSD-3-Clause"
] | 1 | 2015-11-05T08:22:19.000Z | 2015-11-05T08:22:19.000Z | ver1_0/openassembly/pirate_core/__init__.py | fragro/Open-Assembly | e9679ff5e7ae9881fa5781d763288ed2f40b014d | [
"BSD-3-Clause"
] | null | null | null | ver1_0/openassembly/pirate_core/__init__.py | fragro/Open-Assembly | e9679ff5e7ae9881fa5781d763288ed2f40b014d | [
"BSD-3-Clause"
] | 1 | 2018-02-03T18:25:41.000Z | 2018-02-03T18:25:41.000Z | from pirate_core.views import *
from pirate_core.models import *
from pirate_core.forms import *
| 19.6 | 32 | 0.806122 | 15 | 98 | 5.066667 | 0.466667 | 0.394737 | 0.552632 | 0.526316 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.132653 | 98 | 4 | 33 | 24.5 | 0.894118 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
7fe045067296e77cc74cf013bdac319f3ba484a2 | 1,194 | py | Python | tests/picklers_test.py | rafidka/adawat | 81828bbea2c3d06d560d6bdbea698b2427dd9917 | [
"MIT"
] | null | null | null | tests/picklers_test.py | rafidka/adawat | 81828bbea2c3d06d560d6bdbea698b2427dd9917 | [
"MIT"
] | 4 | 2020-08-02T23:50:50.000Z | 2020-08-29T02:19:34.000Z | tests/picklers_test.py | rafidka/adawat | 81828bbea2c3d06d560d6bdbea698b2427dd9917 | [
"MIT"
] | null | null | null | from adawat.picklers import MemoryPickler, FilePickler, ObjectNotFoundError
def test_MemoryPickler():
pickler = MemoryPickler()
some_obj = {
'some_field': 'some_value'
}
some_obj_id = 'this-is-a-relatively-unique-id'
# Dump the object
pickler.dump(some_obj_id, some_obj)
# Load the object
some_obj_loaded = pickler.load(some_obj_id)
assert some_obj == some_obj_loaded
# Delete the object
pickler.delete(some_obj_id)
# Reload the object and
try:
pickler.load(some_obj_id)
assert False
except ObjectNotFoundError:
# We are good.
pass
def test_FilePickler():
pickler = FilePickler()
some_obj = {
'some_field': 'some_value'
}
some_obj_id = 'this-is-a-relatively-unique-id'
# Dump the object
pickler.dump(some_obj_id, some_obj)
# Load the object
some_obj_loaded = pickler.load(some_obj_id)
assert some_obj == some_obj_loaded
# Delete the object
pickler.delete(some_obj_id)
# Reload the object and
try:
pickler.load(some_obj_id)
assert False
except ObjectNotFoundError:
# We are good.
pass
| 19.9 | 75 | 0.652429 | 153 | 1,194 | 4.830065 | 0.235294 | 0.189445 | 0.121786 | 0.097429 | 0.806495 | 0.806495 | 0.806495 | 0.806495 | 0.806495 | 0.806495 | 0 | 0 | 0.268007 | 1,194 | 59 | 76 | 20.237288 | 0.845538 | 0.141541 | 0 | 0.774194 | 0 | 0 | 0.098619 | 0.059172 | 0 | 0 | 0 | 0 | 0.129032 | 1 | 0.064516 | false | 0.064516 | 0.032258 | 0 | 0.096774 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
7fe18b49fca73d1081762689f0e633fa1228ec06 | 42,506 | py | Python | Yukki/Plugins/callback.py | vaibhav252/YukkiMusic-Old | 06aa5ecbdbf62fac461836601dbcd5854a8c55fb | [
"MIT"
] | null | null | null | Yukki/Plugins/callback.py | vaibhav252/YukkiMusic-Old | 06aa5ecbdbf62fac461836601dbcd5854a8c55fb | [
"MIT"
] | null | null | null | Yukki/Plugins/callback.py | vaibhav252/YukkiMusic-Old | 06aa5ecbdbf62fac461836601dbcd5854a8c55fb | [
"MIT"
] | null | null | null | import os
import re
import yt_dlp
import aiohttp
import random
import asyncio
import shutil
import aiofiles
import requests
import time as sedtime
from os import path
from time import time
from .. import converter
from asyncio import QueueEmpty
from pyrogram import Client, filters
from pyrogram.types import (
Message,
CallbackQuery,
InlineKeyboardButton,
InlineKeyboardMarkup,
InputMediaPhoto,
)
from pytgcalls import StreamType
from pytgcalls.types.input_stream import InputAudioStream, InputStream
from youtubesearchpython import VideosSearch
from aiohttp import ClientResponseError, ServerTimeoutError, TooManyRedirects
from pykeyboard import InlineKeyboard
from Yukki import (app, dbb, BOT_USERNAME, BOT_ID, ASSID, ASSNAME,
                   ASSUSERNAME, ASSMENTION, SUDOERS)
from Yukki import aiohttpsession as session
from Yukki.config import LOG_GROUP_ID, DURATION_LIMIT, ASS_ID
from Yukki.YukkiUtilities.tgcallsrun import (ASS_ACC, yukki, convert, download,
                                             clear, get, is_empty, put,
                                             task_done, smexy)
from Yukki.YukkiUtilities.helpers.decorators import errors
from Yukki.YukkiUtilities.helpers.filters import command, other_filters
from Yukki.YukkiUtilities.helpers.paste import paste
from Yukki.YukkiUtilities.helpers.gets import (get_url, themes,
                                               random_assistant, ass_det)
from Yukki.YukkiUtilities.helpers.thumbnails import gen_thumb
from Yukki.YukkiUtilities.helpers.chattitle import CHAT_TITLE
from Yukki.YukkiUtilities.helpers.ytdl import ytdl_opts
from Yukki.YukkiUtilities.helpers.inline import (play_keyboard, search_markup,
                                                 play_markup, playlist_markup,
                                                 audio_markup, confirm_keyboard,
                                                 play_list_keyboard,
                                                 close_keyboard,
                                                 confirm_group_keyboard)
from Yukki.YukkiUtilities.database.queue import (is_active_chat,
                                                 add_active_chat,
                                                 remove_active_chat, music_on,
                                                 is_music_playing, music_off)
from Yukki.YukkiUtilities.database.playlist import (get_playlist_count,
                                                    _get_playlists,
                                                    get_note_names,
                                                    get_playlist, save_playlist,
                                                    delete_playlist)
from Yukki.YukkiUtilities.database.assistant import (_get_assistant,
                                                     get_assistant,
                                                     save_assistant)
from Yukki.YukkiUtilities.database.onoff import is_on_off, add_on, add_off
from Yukki.YukkiUtilities.database.blacklistchat import (blacklisted_chats,
                                                         blacklist_chat,
                                                         whitelist_chat)
from Yukki.YukkiUtilities.database.gbanned import (get_gbans_count,
                                                   is_gbanned_user,
                                                   add_gban_user)
from Yukki.YukkiUtilities.database.theme import _get_theme, get_theme, save_theme
pattern = re.compile(
r"^text/|json$|yaml$|xml$|toml$|x-sh$|x-shellscript$"
)
flex = {}
async def isPreviewUp(preview: str) -> bool:
for _ in range(7):
try:
async with session.head(preview, timeout=2) as resp:
status = resp.status
size = resp.content_length
except asyncio.exceptions.TimeoutError:
return False
if status == 404 or (status == 200 and size == 0):
await asyncio.sleep(0.4)
else:
            return status == 200
return False
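`isPreviewUp` above polls a paste preview with HEAD until it stops returning 404 (or an empty 200). The retry loop can be separated from aiohttp so it is testable; a minimal sketch with an injected `probe` coroutine standing in for `session.head` (names here are illustrative, not from the bot):

```python
import asyncio

async def wait_until_up(probe, attempts=7, delay=0.0):
    # probe() resolves to (status, content_length); retry while the target
    # 404s or answers 200 with an empty body, like isPreviewUp above.
    for _ in range(attempts):
        try:
            status, size = await probe()
        except asyncio.TimeoutError:
            return False
        if status == 404 or (status == 200 and size == 0):
            await asyncio.sleep(delay)
        else:
            return status == 200
    return False

async def demo():
    # The paste is "not ready" twice, then comes up with a non-empty body.
    responses = iter([(404, 0), (200, 0), (200, 512)])

    async def probe():
        return next(responses)

    return await wait_until_up(probe)

result = asyncio.run(demo())
print(result)  # True
```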
@Client.on_callback_query(filters.regex(pattern=r"ppcl"))
async def closesmex(_,CallbackQuery):
callback_data = CallbackQuery.data.strip()
chat_id = CallbackQuery.message.chat.id
callback_request = callback_data.split(None, 1)[1]
userid = CallbackQuery.from_user.id
try:
smex, user_id = callback_request.split("|")
except Exception as e:
        await CallbackQuery.message.edit(f"❌ an error occurred\n\n**reason:** `{e}`")
return
if CallbackQuery.from_user.id != int(user_id):
await CallbackQuery.answer("💡 sorry this is not your request", show_alert=True)
return
await CallbackQuery.message.delete()
await CallbackQuery.answer()
@Client.on_callback_query(filters.regex("pausevc"))
async def pausevc(_,CallbackQuery):
a = await app.get_chat_member(CallbackQuery.message.chat.id , CallbackQuery.from_user.id)
if not a.can_manage_voice_chats:
return await CallbackQuery.answer("You must be admin with permissions:\n\n❌ » manage_video_chats", show_alert=True)
checking = CallbackQuery.from_user.first_name
chat_id = CallbackQuery.message.chat.id
if await is_active_chat(chat_id):
if await is_music_playing(chat_id):
await yukki.pytgcalls.pause_stream(chat_id)
await music_off(chat_id)
await CallbackQuery.answer("streaming paused")
user_id = CallbackQuery.from_user.id
user_name = CallbackQuery.from_user.first_name
rpk = "["+user_name+"](tg://user?id="+str(user_id)+")"
await CallbackQuery.edit_message_text(f"⏸ music playback has paused", reply_markup=play_keyboard)
else:
await CallbackQuery.answer(f"❌ no music is currently playing", show_alert=True)
return
else:
await CallbackQuery.answer(f"❌ no music is currently playing", show_alert=True)
@Client.on_callback_query(filters.regex("resumevc"))
async def resumevc(_,CallbackQuery):
a = await app.get_chat_member(CallbackQuery.message.chat.id , CallbackQuery.from_user.id)
if not a.can_manage_voice_chats:
return await CallbackQuery.answer("You must be admin with permissions:\n\n❌ » manage_video_chats", show_alert=True)
checking = CallbackQuery.from_user.first_name
chat_id = CallbackQuery.message.chat.id
if await is_active_chat(chat_id):
if await is_music_playing(chat_id):
await CallbackQuery.answer("❌ no music is paused", show_alert=True)
return
else:
await music_on(chat_id)
await yukki.pytgcalls.resume_stream(chat_id)
await CallbackQuery.answer("streaming resumed")
user_id = CallbackQuery.from_user.id
user_name = CallbackQuery.from_user.first_name
rpk = "["+user_name+"](tg://user?id="+str(user_id)+")"
await CallbackQuery.edit_message_text(f"▶ music playback has resumed", reply_markup=play_keyboard)
else:
await CallbackQuery.answer(f"❌ no music is currently playing", show_alert=True)
@Client.on_callback_query(filters.regex("skipvc"))
async def skipvc(_,CallbackQuery):
a = await app.get_chat_member(CallbackQuery.message.chat.id , CallbackQuery.from_user.id)
if not a.can_manage_voice_chats:
return await CallbackQuery.answer("You must be admin with permissions:\n\n❌ » manage_video_chats", show_alert=True)
checking = CallbackQuery.from_user.first_name
chat_id = CallbackQuery.message.chat.id
chat_title = CallbackQuery.message.chat.title
if await is_active_chat(chat_id):
task_done(CallbackQuery.message.chat.id)
if is_empty(CallbackQuery.message.chat.id):
user_id = CallbackQuery.from_user.id
await remove_active_chat(chat_id)
user_name = CallbackQuery.from_user.first_name
rpk = "["+user_name+"](tg://user?id="+str(user_id)+")"
await remove_active_chat(chat_id)
await CallbackQuery.answer()
            await CallbackQuery.message.reply(f"{rpk} **wants to skip the music**\n\n❌ There's no more music in the queue!\n\n» userbot leaving the video chat.")
await yukki.pytgcalls.leave_group_call(chat_id)
return
else:
await CallbackQuery.answer("You've skipped to the next song", show_alert=True)
afk = get(chat_id)['file']
        # Entries whose first three characters are "raw" are pre-downloaded
        # local files; anything else is treated as a YouTube video id.
        finxx = str(afk)[:3]
        if finxx != "raw":
            mystic = await CallbackQuery.message.reply("💡 currently playing a playlist\n\n📥 downloading the next track from the playlist...")
url = (f"https://www.youtube.com/watch?v={afk}")
try:
with yt_dlp.YoutubeDL(ytdl_opts) as ytdl:
x = ytdl.extract_info(url, download=False)
except Exception as e:
return await mystic.edit(f"failed to download this video.\n\n**reason:** `{e}`")
title = (x["title"])
videoid = afk
def my_hook(d):
if d['status'] == 'downloading':
percentage = d['_percent_str']
per = (str(percentage)).replace(".","", 1).replace("%","", 1)
per = int(per)
eta = d['eta']
speed = d['_speed_str']
size = d['_total_bytes_str']
bytesx = d['total_bytes']
if str(bytesx) in flex:
pass
else:
flex[str(bytesx)] = 1
if flex[str(bytesx)] == 1:
flex[str(bytesx)] += 1
sedtime.sleep(1)
mystic.edit(f"Downloading {title[:50]}\n\n**FileSize:** {size}\n**Downloaded:** {percentage}\n**Speed:** {speed}\n**ETA:** {eta} sec")
if per > 500:
if flex[str(bytesx)] == 2:
flex[str(bytesx)] += 1
sedtime.sleep(0.5)
mystic.edit(f"Downloading {title[:50]}...\n\n**FileSize:** {size}\n**Downloaded:** {percentage}\n**Speed:** {speed}\n**ETA:** {eta} sec")
print(f"[{videoid}] Downloaded {percentage} at a speed of {speed} in {chat_title} | ETA: {eta} seconds")
if per > 800:
if flex[str(bytesx)] == 3:
flex[str(bytesx)] += 1
sedtime.sleep(0.5)
mystic.edit(f"Downloading {title[:50]}....\n\n**FileSize:** {size}\n**Downloaded:** {percentage}\n**Speed:** {speed}\n**ETA:** {eta} sec")
print(f"[{videoid}] Downloaded {percentage} at a speed of {speed} in {chat_title} | ETA: {eta} seconds")
if per == 1000:
if flex[str(bytesx)] == 4:
flex[str(bytesx)] = 1
sedtime.sleep(0.5)
mystic.edit(f"Downloading {title[:50]}.....\n\n**FileSize:** {size}\n**Downloaded:** {percentage}\n**Speed:** {speed}\n**ETA:** {eta} sec")
print(f"[{videoid}] Downloaded {percentage} at a speed of {speed} in {chat_title} | ETA: {eta} seconds")
loop = asyncio.get_event_loop()
xx = await loop.run_in_executor(None, download, url, my_hook)
file = await convert(xx)
await yukki.pytgcalls.change_stream(
chat_id,
InputStream(
InputAudioStream(
file,
),
),
)
thumbnail = (x["thumbnail"])
duration = (x["duration"])
duration = round(x["duration"] / 60)
theme = random.choice(themes)
ctitle = (await app.get_chat(chat_id)).title
ctitle = await CHAT_TITLE(ctitle)
f2 = open(f'search/{afk}id.txt', 'r')
userid =(f2.read())
thumb = await gen_thumb(thumbnail, title, userid, theme, ctitle)
user_id = userid
buttons = play_markup(videoid, user_id)
await mystic.delete()
semx = await app.get_users(userid)
user_id = CallbackQuery.from_user.id
user_name = CallbackQuery.from_user.first_name
rpk = "["+user_name+"](tg://user?id="+str(user_id)+")"
            await app.send_photo(
chat_id,
photo=thumb,
reply_markup=InlineKeyboardMarkup(buttons),
caption=(f"⏭ **Skipped to the next track**\n\n🗂 **Name** {title[:80]}\n⏱ **Duration:** `{duration}`\n🧸 **Request by:** {semx.mention}")
)
os.remove(thumb)
else:
await yukki.pytgcalls.change_stream(
chat_id,
InputStream(
InputAudioStream(
afk,
),
),
)
_chat_ = ((str(afk)).replace("_","", 1).replace("/","", 1).replace(".","", 1))
f2 = open(f'search/{_chat_}title.txt', 'r')
title =(f2.read())
f3 = open(f'search/{_chat_}duration.txt', 'r')
duration =(f3.read())
f4 = open(f'search/{_chat_}username.txt', 'r')
username =(f4.read())
f4 = open(f'search/{_chat_}videoid.txt', 'r')
videoid =(f4.read())
user_id = 1
videoid = str(videoid)
if videoid == "smex1":
buttons = audio_markup(videoid, user_id)
else:
buttons = play_markup(videoid, user_id)
user_id = CallbackQuery.from_user.id
user_name = CallbackQuery.from_user.first_name
rpk = "["+user_name+"](tg://user?id="+str(user_id)+")"
        await app.send_photo(
chat_id,
photo=f"downloads/{_chat_}final.png",
reply_markup=InlineKeyboardMarkup(buttons),
caption=f"⏭ **Skipped to the next track**\n\n🗂 **Name:** {title[:80]}\n⏱ **Duration:** `{duration}`\n🧸 **Request by:** {username}",
)
return
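`skipvc` dispatches on the first three characters of the queued entry: entries beginning with `"raw"` are already-downloaded local files, anything else is taken as a YouTube video id to fetch. A hypothetical helper making that dispatch explicit (the function name and labels are illustrative, not part of the bot):

```python
def classify_queue_entry(entry) -> str:
    # Mirrors the check in skipvc: queue items whose first three characters
    # are "raw" are local files; anything else is a YouTube video id.
    return "local_file" if str(entry)[:3] == "raw" else "youtube_id"

print(classify_queue_entry("raw1234.mp3"))   # local_file
print(classify_queue_entry("dQw4w9WgXcQ"))   # youtube_id
```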
@Client.on_callback_query(filters.regex("stopvc"))
async def stopvc(_,CallbackQuery):
a = await app.get_chat_member(CallbackQuery.message.chat.id , CallbackQuery.from_user.id)
if not a.can_manage_voice_chats:
return await CallbackQuery.answer("You must be admin with permissions:\n\n❌ » manage_video_chats", show_alert=True)
checking = CallbackQuery.from_user.first_name
chat_id = CallbackQuery.message.chat.id
if await is_active_chat(chat_id):
try:
clear(chat_id)
except QueueEmpty:
pass
try:
await yukki.pytgcalls.leave_group_call(chat_id)
except Exception as e:
pass
await remove_active_chat(CallbackQuery.message.chat.id)
user_id = CallbackQuery.from_user.id
user_name = CallbackQuery.from_user.first_name
rpk = "["+user_name+"](tg://user?id="+str(user_id)+")"
await CallbackQuery.edit_message_text("✅ This music playback has ended", reply_markup=close_keyboard)
else:
await CallbackQuery.answer(f"❌ no music is currently playing", show_alert=True)
@Client.on_callback_query(filters.regex("play_playlist"))
async def play_playlist(_,CallbackQuery):
callback_data = CallbackQuery.data.strip()
chat_id = CallbackQuery.message.chat.id
callback_request = callback_data.split(None, 1)[1]
userid = CallbackQuery.from_user.id
try:
user_id,smex = callback_request.split("|")
except Exception as e:
await CallbackQuery.answer()
        return await CallbackQuery.message.edit(f"an error occurred\n**reason**: `{e}`")
Name = CallbackQuery.from_user.first_name
chat_title = CallbackQuery.message.chat.title
if str(smex) == "personal":
if CallbackQuery.from_user.id != int(user_id):
return await CallbackQuery.answer("💡 This is not your personal playlist", show_alert=True)
_playlist = await get_note_names(CallbackQuery.from_user.id)
if not _playlist:
            return await CallbackQuery.answer(f"❌ You don't have a personal playlist in the database", show_alert=True)
else:
await CallbackQuery.message.delete()
logger_text=f"""💡 starting playlist
Group: {chat_title}
Request by: {Name}
▶ personal playlist playback"""
mystic = await CallbackQuery.message.reply_text(f"💡 Starting {Name}'s personal playlist.\n\n🎧 Request by: {CallbackQuery.from_user.first_name}")
checking = f"[{CallbackQuery.from_user.first_name}](tg://user?id={userid})"
msg = f"Queued Playlist:\n\n"
j = 0
for note in _playlist:
_note = await get_playlist(CallbackQuery.from_user.id, note)
title = _note["title"]
videoid = _note["videoid"]
url = (f"https://www.youtube.com/watch?v={videoid}")
duration = _note["duration"]
if await is_active_chat(chat_id):
position = await put(chat_id, file=videoid)
j += 1
msg += f"{j}- {title[:50]}\n"
msg += f"Queued Position: {position}\n\n"
f20 = open(f'search/{videoid}id.txt', 'w')
f20.write(f"{user_id}")
f20.close()
else:
try:
with yt_dlp.YoutubeDL(ytdl_opts) as ytdl:
x = ytdl.extract_info(url, download=False)
except Exception as e:
return await mystic.edit(f"failed to download this video.\n\n**reason:** `{e}`")
title = (x["title"])
thumbnail = (x["thumbnail"])
def my_hook(d):
if d['status'] == 'downloading':
percentage = d['_percent_str']
per = (str(percentage)).replace(".","", 1).replace("%","", 1)
per = int(per)
eta = d['eta']
speed = d['_speed_str']
size = d['_total_bytes_str']
bytesx = d['total_bytes']
if str(bytesx) in flex:
pass
else:
flex[str(bytesx)] = 1
if flex[str(bytesx)] == 1:
flex[str(bytesx)] += 1
try:
if eta > 2:
mystic.edit(f"Downloading {title[:50]}\n\n**FileSize:** {size}\n**Downloaded:** {percentage}\n**Speed:** {speed}\n**ETA:** {eta} sec")
except Exception as e:
pass
if per > 250:
if flex[str(bytesx)] == 2:
flex[str(bytesx)] += 1
if eta > 2:
mystic.edit(f"Downloading {title[:50]}..\n\n**FileSize:** {size}\n**Downloaded:** {percentage}\n**Speed:** {speed}\n**ETA:** {eta} sec")
print(f"[{videoid}] Downloaded {percentage} at a speed of {speed} | ETA: {eta} seconds")
if per > 500:
if flex[str(bytesx)] == 3:
flex[str(bytesx)] += 1
if eta > 2:
mystic.edit(f"Downloading {title[:50]}...\n\n**FileSize:** {size}\n**Downloaded:** {percentage}\n**Speed:** {speed}\n**ETA:** {eta} sec")
print(f"[{videoid}] Downloaded {percentage} at a speed of {speed} | ETA: {eta} seconds")
if per > 800:
if flex[str(bytesx)] == 4:
flex[str(bytesx)] += 1
if eta > 2:
mystic.edit(f"Downloading {title[:50]}....\n\n**FileSize:** {size}\n**Downloaded:** {percentage}\n**Speed:** {speed}\n**ETA:** {eta} sec")
print(f"[{videoid}] Downloaded {percentage} at a speed of {speed} | ETA: {eta} seconds")
if d['status'] == 'finished':
try:
taken = d['_elapsed_str']
except Exception as e:
taken = "00:00"
size = d['_total_bytes_str']
mystic.edit(f"**Downloaded {title[:50]}.....**\n\n**FileSize:** {size}\n**Time Taken:** {taken} sec\n\n**Converting File** [__FFmpeg processing__]")
print(f"[{videoid}] Downloaded | Elapsed: {taken} seconds")
loop = asyncio.get_event_loop()
xx = await loop.run_in_executor(None, download, url, my_hook)
file = await convert(xx)
await music_on(chat_id)
await add_active_chat(chat_id)
await yukki.pytgcalls.join_group_call(
chat_id,
InputStream(
InputAudioStream(
file,
),
),
stream_type=StreamType().local_stream,
)
theme = random.choice(themes)
ctitle = CallbackQuery.message.chat.title
ctitle = await CHAT_TITLE(ctitle)
thumb = await gen_thumb(thumbnail, title, userid, theme, ctitle)
buttons = play_markup(videoid, user_id)
m = await CallbackQuery.message.reply_photo(
photo=thumb,
reply_markup=InlineKeyboardMarkup(buttons),
caption=(f"🗂 **Name:** [{title[:80]}]({url})\n⏱ **Duration:** `{duration}`\n🧸 **Request by:** {checking}")
)
os.remove(thumb)
await CallbackQuery.message.delete()
await mystic.delete()
m = await CallbackQuery.message.reply_text("🔄 pasting queued playlist to bin...")
link = await paste(msg)
preview = link + "/preview.png"
urlxp = link + "/index.txt"
a1 = InlineKeyboardButton(text=f"Checkout Queued Playlist", url=urlxp)
key = InlineKeyboardMarkup(
[
[
InlineKeyboardButton(text="▶️", callback_data=f'resumevc2'),
InlineKeyboardButton(text="⏸️", callback_data=f'pausevc2'),
InlineKeyboardButton(text="⏭️", callback_data=f'skipvc2'),
InlineKeyboardButton(text="⏹️", callback_data=f'stopvc2')
],
[
a1,
],
[
InlineKeyboardButton(text="🗑 Close", callback_data=f'close2')
]
]
)
if await isPreviewUp(preview):
try:
await CallbackQuery.message.reply_photo(
photo=preview, caption=f"This is queued personal playlist of {Name}.\n\nIf you want to delete any music from playlist use: /delmyplaylist", quote=False, reply_markup=key
)
await m.delete()
except Exception:
pass
else:
await CallbackQuery.message.reply_text(
text=msg, reply_markup=key
)
await m.delete()
if str(smex) == "group":
_playlist = await get_note_names(CallbackQuery.message.chat.id)
if not _playlist:
            return await CallbackQuery.answer(f"This Group has no playlist in the database; try adding some music to the playlist.", show_alert=True)
else:
await CallbackQuery.message.delete()
logger_text=f"""💡 starting playlist
Group: {chat_title}
Request By: {Name}
▶ Group's playlist playback"""
mystic = await CallbackQuery.message.reply_text(f"💡 Starting Groups's playlist.\n\n🎧 Request by: {CallbackQuery.from_user.first_name}")
checking = f"[{CallbackQuery.from_user.first_name}](tg://user?id={userid})"
msg = f"Queued Playlist:\n\n"
j = 0
for note in _playlist:
_note = await get_playlist(CallbackQuery.message.chat.id, note)
title = _note["title"]
videoid = _note["videoid"]
url = (f"https://www.youtube.com/watch?v={videoid}")
duration = _note["duration"]
if await is_active_chat(chat_id):
position = await put(chat_id, file=videoid)
j += 1
msg += f"{j}- {title[:50]}\n"
msg += f"Queued Position: {position}\n\n"
f20 = open(f'search/{videoid}id.txt', 'w')
f20.write(f"{user_id}")
f20.close()
else:
try:
with yt_dlp.YoutubeDL(ytdl_opts) as ytdl:
x = ytdl.extract_info(url, download=False)
except Exception as e:
return await mystic.edit(f"failed to download this video.\n\n**reason:** `{e}`")
title = (x["title"])
thumbnail = (x["thumbnail"])
def my_hook(d):
if d['status'] == 'downloading':
percentage = d['_percent_str']
per = (str(percentage)).replace(".","", 1).replace("%","", 1)
per = int(per)
eta = d['eta']
speed = d['_speed_str']
size = d['_total_bytes_str']
bytesx = d['total_bytes']
if str(bytesx) in flex:
pass
else:
flex[str(bytesx)] = 1
if flex[str(bytesx)] == 1:
flex[str(bytesx)] += 1
try:
if eta > 2:
mystic.edit(f"Downloading {title[:50]}\n\n**FileSize:** {size}\n**Downloaded:** {percentage}\n**Speed:** {speed}\n**ETA:** {eta} sec")
except Exception as e:
pass
if per > 250:
if flex[str(bytesx)] == 2:
flex[str(bytesx)] += 1
if eta > 2:
mystic.edit(f"Downloading {title[:50]}..\n\n**FileSize:** {size}\n**Downloaded:** {percentage}\n**Speed:** {speed}\n**ETA:** {eta} sec")
print(f"[{videoid}] Downloaded {percentage} at a speed of {speed} | ETA: {eta} seconds")
if per > 500:
if flex[str(bytesx)] == 3:
flex[str(bytesx)] += 1
if eta > 2:
mystic.edit(f"Downloading {title[:50]}...\n\n**FileSize:** {size}\n**Downloaded:** {percentage}\n**Speed:** {speed}\n**ETA:** {eta} sec")
print(f"[{videoid}] Downloaded {percentage} at a speed of {speed} | ETA: {eta} seconds")
if per > 800:
if flex[str(bytesx)] == 4:
flex[str(bytesx)] += 1
if eta > 2:
mystic.edit(f"Downloading {title[:50]}....\n\n**FileSize:** {size}\n**Downloaded:** {percentage}\n**Speed:** {speed}\n**ETA:** {eta} sec")
print(f"[{videoid}] Downloaded {percentage} at a speed of {speed} | ETA: {eta} seconds")
if d['status'] == 'finished':
try:
taken = d['_elapsed_str']
except Exception as e:
taken = "00:00"
size = d['_total_bytes_str']
mystic.edit(f"**Downloaded: {title[:50]}...**\n\n**Size:** {size}\n**Time:** `{taken}` sec\n\n**Converting File** [__FFmpeg processing__]")
print(f"[{videoid}] Downloaded | Elapsed: {taken} seconds")
loop = asyncio.get_event_loop()
xx = await loop.run_in_executor(None, download, url, my_hook)
file = await convert(xx)
await music_on(chat_id)
await add_active_chat(chat_id)
await yukki.pytgcalls.join_group_call(
chat_id,
InputStream(
InputAudioStream(
file,
),
),
stream_type=StreamType().local_stream,
)
theme = random.choice(themes)
ctitle = CallbackQuery.message.chat.title
ctitle = await CHAT_TITLE(ctitle)
thumb = await gen_thumb(thumbnail, title, userid, theme, ctitle)
buttons = play_markup(videoid, user_id)
m = await CallbackQuery.message.reply_photo(
photo=thumb,
reply_markup=InlineKeyboardMarkup(buttons),
caption=(f"🗂 **Name:** [{title[:80]}]({url})\n⏱ **Duration:** `{duration}`\n🧸 **Request by:** {checking}")
)
os.remove(thumb)
await CallbackQuery.message.delete()
await asyncio.sleep(1)
await mystic.delete()
m = await CallbackQuery.message.reply_text("🔄 pasting queued playlist to bin...")
link = await paste(msg)
preview = link + "/preview.png"
urlxp = link + "/index.txt"
a1 = InlineKeyboardButton(text=f"Checkout Queued Playlist", url=urlxp)
key = InlineKeyboardMarkup(
[
[
InlineKeyboardButton(text="▶️", callback_data=f'resumevc2'),
InlineKeyboardButton(text="⏸️", callback_data=f'pausevc2'),
InlineKeyboardButton(text="⏭️", callback_data=f'skipvc2'),
InlineKeyboardButton(text="⏹️", callback_data=f'stopvc2')
],
[
a1,
],
[
InlineKeyboardButton(text="🗑 Close", callback_data=f'close2')
]
]
)
if await isPreviewUp(preview):
try:
await CallbackQuery.message.reply_photo(
photo=preview, caption="This is the queued playlist of this Group.\n\nTo delete a track from the playlist, use: /delchatplaylist", quote=False, reply_markup=key
)
await m.delete()
except Exception:
pass
else:
await CallbackQuery.message.reply_text(
text=msg, reply_markup=key
)
await m.delete()
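The handler above offloads the blocking yt-dlp download to a worker thread via `loop.run_in_executor(None, download, url, my_hook)`, so the event loop stays responsive while the progress hook fires. A minimal, self-contained sketch of that pattern — `blocking_download` below is a hypothetical stand-in, not the bot's real downloader:

```python
import asyncio
import time

def blocking_download(url):
    # Stand-in for the real yt-dlp call: it blocks the calling thread,
    # which is exactly why it must not run on the event loop directly.
    time.sleep(0.05)
    return url + ".mp3"

async def fetch(url):
    loop = asyncio.get_running_loop()
    # None -> default ThreadPoolExecutor; extra positional args are
    # forwarded to the callable, mirroring run_in_executor(None, download, url, my_hook)
    return await loop.run_in_executor(None, blocking_download, url)

print(asyncio.run(fetch("dQw4w9WgXcQ")))  # dQw4w9WgXcQ.mp3
```

While the worker thread sleeps, other coroutines (such as other callback handlers) can still be scheduled.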
@Client.on_callback_query(filters.regex("group_playlist"))
async def group_playlist(_,CallbackQuery):
await CallbackQuery.answer()
a = await app.get_chat_member(CallbackQuery.message.chat.id , CallbackQuery.from_user.id)
if not a.can_manage_voice_chats:
return await CallbackQuery.answer("You must be admin with permissions:\n\n❌ » manage_video_chats", show_alert=True)
callback_data = CallbackQuery.data.strip()
chat_id = CallbackQuery.message.chat.id
callback_request = callback_data.split(None, 1)[1]
userid = CallbackQuery.from_user.id
try:
url, _ = callback_request.split("|")
except Exception as e:
return await CallbackQuery.message.edit(f"❌ an error occurred\n\n**reason:** `{e}`")
name = CallbackQuery.from_user.mention
_count = await get_note_names(chat_id)
count = 0
if not _count:
await CallbackQuery.answer("💡 Generating Group's playlist from database...", show_alert=True)
await asyncio.sleep(2)
else:
for smex in _count:
count += 1
if count >= 30:
return await CallbackQuery.answer("💡 Sorry, a Group's playlist can only hold 30 tracks", show_alert=True)
try:
url = (f"https://www.youtube.com/watch?v={url}")
results = VideosSearch(url, limit=1)
for result in results.result()["result"]:
title = (result["title"])
duration = (result["duration"])
videoid = (result["id"])
except Exception as e:
return await CallbackQuery.message.reply_text(f"❌ an error occurred.\n\nPlease forward to @VeezSupportGroup\n\n**reason:** `{e}`")
_check = await get_playlist(chat_id, videoid)
title = title[:50]
if _check:
return await CallbackQuery.answer(f"{name} this track is already added to Group's playlist !", show_alert=True)
assis = {
"videoid": videoid,
"title": title,
"duration": duration,
}
await save_playlist(chat_id, videoid, assis)
return await CallbackQuery.answer("✅ Track added to Group's playlist !", show_alert=True)
@Client.on_callback_query(filters.regex("playlist"))
async def personal_playlist(_, CallbackQuery):
await CallbackQuery.answer()
callback_data = CallbackQuery.data.strip()
chat_id = CallbackQuery.message.chat.id
callback_request = callback_data.split(None, 1)[1]
userid = CallbackQuery.from_user.id
try:
url, _ = callback_request.split("|")
except Exception as e:
return await CallbackQuery.message.edit(f"❌ an error occurred\n\n**reason:** `{e}`")
name = CallbackQuery.from_user.mention
_count = await get_note_names(userid)
count = 0
if not _count:
await CallbackQuery.answer("💡 Generating your personal playlist from database...", show_alert=True)
await asyncio.sleep(2)
else:
for smex in _count:
count += 1
if count >= 30:
if userid in SUDOERS:
pass
else:
return await CallbackQuery.answer("💡 Sorry, a personal playlist can only hold 30 tracks!", show_alert=True)
try:
url = (f"https://www.youtube.com/watch?v={url}")
results = VideosSearch(url, limit=1)
for result in results.result()["result"]:
title = (result["title"])
duration = (result["duration"])
videoid = (result["id"])
except Exception as e:
return await CallbackQuery.message.reply_text(f"❌ an error occurred.\n\nPlease forward to @VeezSupportGroup\n\n**reason:** `{e}`")
_check = await get_playlist(userid, videoid)
if _check:
return await CallbackQuery.answer(f"{name} this track is already added to your personal playlist !", show_alert=True)
title = title[:50]
assis = {
"videoid": videoid,
"title": title,
"duration": duration,
}
await save_playlist(userid, videoid, assis)
return await CallbackQuery.answer("✅ Track added to your personal playlist !", show_alert=True)
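Both playlist handlers count entries with a manual loop and then gate on a 30-track cap (with a SUDOERS bypass in the personal variant). That check can be sketched as a small predicate — `can_add_track` and its argument names are illustrative, not part of the bot:

```python
def can_add_track(existing_tracks, user_id, sudo_users, limit=30):
    # len() over the materialized iterable replaces the manual
    # `for ...: count += 1` loop used in the handlers above.
    count = len(list(existing_tracks))
    # >= rather than == so the cap still holds even if the stored
    # playlist ever overshoots the limit.
    return count < limit or user_id in sudo_users

print(can_add_track(["a"] * 30, 1, sudo_users={99}))   # False: cap reached
print(can_add_track(["a"] * 30, 99, sudo_users={99}))  # True: sudo bypass
```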
@Client.on_callback_query(filters.regex("P_list"))
async def P_list(_,CallbackQuery):
_playlist = await get_note_names(CallbackQuery.from_user.id)
if not _playlist:
return await CallbackQuery.answer("❌ You don't have a personal playlist in the database yet. Try adding some music to it first.", show_alert=True)
else:
j = 0
await CallbackQuery.answer()
msg = f"Personal Playlist:\n\n"
for note in _playlist:
j += 1
_note = await get_playlist(CallbackQuery.from_user.id, note)
title = _note["title"]
duration = _note["duration"]
msg += f"{j}- {title[:60]}\n"
msg += f"Duration: {duration} min(s)\n\n"
await CallbackQuery.message.delete()
m = await CallbackQuery.message.reply_text("🔄 pasting playlist to bin...")
link = await paste(msg)
preview = link + "/preview.png"
urlxp = link + "/index.txt"
user_id = CallbackQuery.from_user.id
user_name = CallbackQuery.from_user.first_name
a2 = InlineKeyboardButton(text=f"💡 Start {user_name[:17]}'s playlist", callback_data=f'play_playlist {user_id}|personal')
a3 = InlineKeyboardButton(text=f"🔗 Check Playlist", url=urlxp)
key = InlineKeyboardMarkup(
[
[
a2,
],
[
a3,
InlineKeyboardButton(text="🗑 Close", callback_data=f'close2')
]
]
)
if await isPreviewUp(preview):
try:
await CallbackQuery.message.reply_photo(
photo=preview, quote=False, reply_markup=key
)
await m.delete()
except Exception as e :
print(e)
pass
else:
await CallbackQuery.message.reply_text(
text=msg, quote=False, reply_markup=key
)
await m.delete()
@Client.on_callback_query(filters.regex("G_list"))
async def G_list(_,CallbackQuery):
user_id = CallbackQuery.from_user.id
_playlist = await get_note_names(CallbackQuery.message.chat.id)
if not _playlist:
return await CallbackQuery.answer("❌ This Group has no playlist in the database. Try adding some music to it first.", show_alert=True)
else:
await CallbackQuery.answer()
j = 0
msg = f"Group Playlist:\n\n"
for note in _playlist:
j += 1
_note = await get_playlist(CallbackQuery.message.chat.id, note)
title = _note["title"]
duration = _note["duration"]
msg += f"{j}- {title[:60]}\n"
msg += f"Duration: {duration} min(s)\n\n"
await CallbackQuery.message.delete()
m = await CallbackQuery.message.reply_text("🔄 pasting playlist to bin...")
link = await paste(msg)
preview = link + "/preview.png"
urlxp = link + "/index.txt"
user_id = CallbackQuery.from_user.id
user_name = CallbackQuery.from_user.first_name
a1 = InlineKeyboardButton(text=f"💡 Start Group's playlist", callback_data=f'play_playlist {user_id}|group')
a3 = InlineKeyboardButton(text=f"🔗 Check Playlist", url=urlxp)
key = InlineKeyboardMarkup(
[
[
a1,
],
[
a3,
InlineKeyboardButton(text="🗑 Close", callback_data=f'close2')
]
]
)
if await isPreviewUp(preview):
try:
await CallbackQuery.message.reply_photo(
photo=preview, quote=False, reply_markup=key
)
await m.delete()
except Exception:
pass
else:
await CallbackQuery.message.reply_text(
text=msg, quote=False, reply_markup=key
)
await m.delete()
@Client.on_callback_query(filters.regex("cbgroupdel"))
async def cbgroupdel(_,CallbackQuery):
a = await app.get_chat_member(CallbackQuery.message.chat.id , CallbackQuery.from_user.id)
if not a.can_manage_voice_chats:
return await CallbackQuery.answer("You must be admin with permissions:\n\n❌ » manage_video_chats", show_alert=True)
await CallbackQuery.message.delete()
await CallbackQuery.answer()
_playlist = await get_note_names(CallbackQuery.message.chat.id)
if not _playlist:
return await CallbackQuery.answer("❌ This Group has no playlist in the database.", show_alert=True)
else:
for note in _playlist:
await delete_playlist(CallbackQuery.message.chat.id, note)
await CallbackQuery.answer("✅ Successfully deleted the whole Group's playlist", show_alert=True)
@Client.on_callback_query(filters.regex("cbdel"))
async def delplcb(_,CallbackQuery):
await CallbackQuery.answer()
await CallbackQuery.message.delete()
_playlist = await get_note_names(CallbackQuery.from_user.id)
if not _playlist:
return await CallbackQuery.answer("❌ You don't have a personal playlist in the database.", show_alert=True)
else:
for note in _playlist:
await delete_playlist(CallbackQuery.from_user.id, note)
await CallbackQuery.answer("✅ Successfully deleted your whole personal playlist", show_alert=True)
| 49.5986 | 189 | 0.545194 | 4,621 | 42,506 | 4.891582 | 0.091106 | 0.064502 | 0.043665 | 0.027473 | 0.833569 | 0.810786 | 0.775128 | 0.760883 | 0.736684 | 0.715626 | 0 | 0.00875 | 0.343928 | 42,506 | 856 | 190 | 49.656542 | 0.798472 | 0 | 0 | 0.70438 | 0 | 0.040146 | 0.186915 | 0.039171 | 0 | 0 | 0 | 0 | 0 | 1 | 0.00365 | false | 0.014599 | 0.06326 | 0 | 0.110706 | 0.017032 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3d033568977bf964c7f1f2bc98da49958f417949 | 134 | py | Python | plugins/tenable_io/komand_tenable_io/actions/__init__.py | lukaszlaszuk/insightconnect-plugins | 8c6ce323bfbb12c55f8b5a9c08975d25eb9f8892 | [
"MIT"
] | 46 | 2019-06-05T20:47:58.000Z | 2022-03-29T10:18:01.000Z | plugins/tenable_io/komand_tenable_io/actions/__init__.py | lukaszlaszuk/insightconnect-plugins | 8c6ce323bfbb12c55f8b5a9c08975d25eb9f8892 | [
"MIT"
] | 386 | 2019-06-07T20:20:39.000Z | 2022-03-30T17:35:01.000Z | plugins/tenable_io/komand_tenable_io/actions/__init__.py | lukaszlaszuk/insightconnect-plugins | 8c6ce323bfbb12c55f8b5a9c08975d25eb9f8892 | [
"MIT"
] | 43 | 2019-07-09T14:13:58.000Z | 2022-03-28T12:04:46.000Z | # GENERATED BY KOMAND SDK - DO NOT EDIT
from .download_report.action import DownloadReport
from .launch_scan.action import LaunchScan
| 33.5 | 50 | 0.828358 | 19 | 134 | 5.736842 | 0.842105 | 0.220183 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.126866 | 134 | 3 | 51 | 44.666667 | 0.931624 | 0.276119 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
3d2d9f8b1dc715369f003c7d748a76420481127d | 3,540 | py | Python | latest version/udpserver/udp_sender.py | Young-Lord/Arcaea-server | bf761fada1923f23f793957fd0ec13f3d79a231d | [
"MIT"
] | null | null | null | latest version/udpserver/udp_sender.py | Young-Lord/Arcaea-server | bf761fada1923f23f793957fd0ec13f3d79a231d | [
"MIT"
] | null | null | null | latest version/udpserver/udp_sender.py | Young-Lord/Arcaea-server | bf761fada1923f23f793957fd0ec13f3d79a231d | [
"MIT"
] | null | null | null | import time
from .udp_class import Room, b
class CommandSender:
def __init__(self, room: Room = Room()) -> None:
self.room = room
self.timestamp = round(time.time() * 1000000)
self.random_code = b'\x11\x11\x11\x11\x00\x00\x00\x00'
def command_0c(self):
return b'\x06\x16\x0c\x09' + b(self.room.room_id, 8) + b(self.room.command_queue_length, 4) + self.random_code + b(self.room.state) + b(self.room.countdown, 4) + b(self.timestamp, 8) + b'\x0b\x0b\x0b\x0b\x0b\x0b\x0b\x0b\x0b\x0b\x0b'
def command_0d(self, code: int):
return b'\x06\x16\x0d\x09' + b(self.room.room_id, 8) + b(self.room.command_queue_length, 4) + self.random_code + b(code) + b'\x07\x07\x07\x07\x07\x07\x07'
def command_0e(self, player_index: int):
# Score broadcast
player = self.room.players[player_index]
return b'\x06\x16\x0e\x09' + b(self.room.room_id, 8) + b(self.room.command_queue_length, 4) + b(player.player_id, 8) + b(player.character_id) + b(player.is_uncapped) + b(player.difficulty) + b(player.score, 4) + b(player.timer, 4) + b(player.cleartype) + b(player.player_state) + b(player.download_percent) + b'\x01' + b(player.last_score, 4) + b(player.last_timer, 4) + b(player.online)
def command_0f(self, player_index: int, song_idx: int):
# Song recommendation
player = self.room.players[player_index]
return b'\x06\x16\x0f\x09' + b(self.room.room_id, 8) + b(self.room.command_queue_length, 4) + b(player.player_id, 8) + b(song_idx, 2) + b'\x06\x06\x06\x06\x06\x06'
def command_10(self):
# Host announcement
return b'\x06\x16\x10\x09' + b(self.room.room_id, 8) + b(self.room.command_queue_length, 4) + self.random_code + b(self.room.host_id, 8)
def command_11(self):
return b'\x06\x16\x11\x09' + b(self.room.room_id, 8) + b(self.room.command_queue_length, 4) + self.random_code + self.room.get_players_info() + b'\x08\x08\x08\x08\x08\x08\x08\x08'
def command_12(self, player_index: int):
player = self.room.players[player_index]
return b'\x06\x16\x12\x09' + b(self.room.room_id, 8) + b(self.room.command_queue_length, 4) + self.random_code + b(player_index) + b(player.player_id, 8) + b(player.character_id) + b(player.is_uncapped) + b(player.difficulty) + b(player.score, 4) + b(player.timer, 4) + b(player.cleartype) + b(player.player_state) + b(player.download_percent) + b(player.online)
def command_13(self):
return b'\x06\x16\x13\x09' + b(self.room.room_id, 8) + b(self.room.command_queue_length, 4) + self.random_code + b(self.room.host_id, 8) + b(self.room.state) + b(self.room.countdown, 4) + b(self.timestamp, 8) + b(self.room.song_idx, 2) + b(self.room.interval, 2) + b(self.room.times, 7) + self.room.get_player_last_score() + b(self.room.last_song_idx, 2) + b(self.room.round_switch, 1) + b'\x01'
def command_14(self):
return b'\x06\x16\x14\x09' + b(self.room.room_id, 8) + b(self.room.command_queue_length, 4) + self.random_code + self.room.song_unlock + b'\x08\x08\x08\x08\x08\x08\x08\x08'
def command_15(self):
return b'\x06\x16\x15\x09' + b(self.room.room_id, 8) + b(self.room.command_queue_length, 4) + self.room.get_players_info() + self.room.song_unlock + b(self.room.host_id, 8) + b(self.room.state) + b(self.room.countdown, 4) + b(self.timestamp, 8) + b(self.room.song_idx, 2) + b(self.room.interval, 2) + b(self.room.times, 7) + self.room.get_player_last_score() + b(self.room.last_song_idx, 2) + b(self.room.round_switch, 1) + b'\x09\x09\x09\x09\x09\x09\x09\x09\x09'
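Every command above concatenates a fixed opcode prefix with fields packed by `b(value, length)` from `udp_class`. That helper's implementation isn't shown here; assuming it packs an unsigned integer into `length` bytes (the little-endian byte order below is a guess and may differ from the real protocol), the framing can be sketched as:

```python
def b(value, length=1):
    # Hypothetical reimplementation of udp_class.b: fixed-width
    # integer packing. Little-endian order is an assumption.
    return int(value).to_bytes(length, "little")

# Shape of command_10 (host announcement): 4-byte opcode prefix +
# room_id(8) + command_queue_length(4) + random_code(8) + host_id(8)
packet = b"\x06\x16\x10\x09" + b(42, 8) + b(3, 4) + b"\x11" * 8 + b(7, 8)
print(len(packet))  # 32
```

The fixed trailing byte runs (e.g. `b'\x07' * 7`) in the real commands then act as padding up to each command's expected length.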
| 75.319149 | 471 | 0.674011 | 628 | 3,540 | 3.643312 | 0.133758 | 0.174825 | 0.153409 | 0.061189 | 0.801573 | 0.710664 | 0.701486 | 0.701486 | 0.689685 | 0.689685 | 0 | 0.084689 | 0.149435 | 3,540 | 46 | 472 | 76.956522 | 0.675191 | 0.003955 | 0 | 0.1 | 0 | 0.033333 | 0.112436 | 0.064736 | 0 | 0 | 0 | 0 | 0 | 1 | 0.366667 | false | 0 | 0.066667 | 0.233333 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 8 |
3d36bbca1d5f361145901793cb113ed9cdcdaa23 | 17,137 | py | Python | episcanpy/plotting/_scanpy_plotting.py | AbiriAmir/epiScanpy | 3b7c0d0c91602bc446888e09a0aa6c21f25e5f48 | [
"BSD-3-Clause"
] | null | null | null | episcanpy/plotting/_scanpy_plotting.py | AbiriAmir/epiScanpy | 3b7c0d0c91602bc446888e09a0aa6c21f25e5f48 | [
"BSD-3-Clause"
] | null | null | null | episcanpy/plotting/_scanpy_plotting.py | AbiriAmir/epiScanpy | 3b7c0d0c91602bc446888e09a0aa6c21f25e5f48 | [
"BSD-3-Clause"
] | null | null | null |
# Just a bunch of scanopy functions
# to add: violin, ranking, clustermap, stacked_violin, heatmap, dotplot,
# matrixplot, tracksplot, dendrogram, correlation_matrix
import scanpy as sc
def pca(adata,
color=None,
feature_symbols=None,
use_raw=None,
layer=None,
sort_order=True,
groups=None,
components=None,
projection='2d',
legend_loc='right margin',
legend_fontsize=None,
legend_fontweight=None,
legend_fontoutline=None,
size=None,
color_map=None,
palette=None,
frameon=None,
ncols=None,
wspace=None,
hspace=0.25,
title=None,
return_fig=None,
show=None,
save=None):
"""\
Scatter plot in PCA coordinates.
Parameters
----------
{adata_color_etc}
{scatter_bulk}
{show_save_ax}
Returns
-------
If `show==False` a :class:`~matplotlib.axes.Axes` or a list of it.
"""
sc.pl.pca(adata, color=color, gene_symbols=feature_symbols, use_raw=use_raw,
layer=layer, sort_order=sort_order, groups=groups, components=components,
projection=projection, legend_loc=legend_loc, legend_fontsize=legend_fontsize,
legend_fontweight=legend_fontweight, legend_fontoutline=legend_fontoutline,
size=size, color_map=color_map, palette=palette, frameon=frameon, ncols=ncols,
wspace=wspace, hspace=hspace, title=title, return_fig=return_fig, show=show,
save=save)
def tsne(adata,
color=None,
feature_symbols=None,
use_raw=None,
layer=None,
sort_order=True,
groups=None,
components=None,
projection='2d',
legend_loc='right margin',
legend_fontsize=None,
legend_fontweight=None,
legend_fontoutline=None,
size=None,
color_map=None,
palette=None,
frameon=None,
ncols=None,
wspace=None,
hspace=0.25,
title=None,
return_fig=None,
show=None,
save=None):
"""\
Scatter plot in tSNE basis.
Parameters
----------
{adata_color_etc}
{edges_arrows}
{scatter_bulk}
{show_save_ax}
Returns
-------
If `show==False` a :class:`~matplotlib.axes.Axes` or a list of it.
"""
sc.pl.tsne(adata, color=color, gene_symbols=feature_symbols, use_raw=use_raw,
layer=layer, sort_order=sort_order, groups=groups, components=components,
projection=projection, legend_loc=legend_loc, legend_fontsize=legend_fontsize,
legend_fontweight=legend_fontweight, legend_fontoutline=legend_fontoutline,
size=size, color_map=color_map, palette=palette, frameon=frameon, ncols=ncols,
wspace=wspace, hspace=hspace, title=title, return_fig=return_fig, show=show,
save=save)
def umap(adata,
color=None,
feature_symbols=None,
use_raw=None,
layer=None,
sort_order=True,
groups=None,
components=None,
projection='2d',
legend_loc='right margin',
legend_fontsize=None,
legend_fontweight=None,
legend_fontoutline=None,
size=None,
color_map=None,
palette=None,
frameon=None,
ncols=None,
wspace=None,
hspace=0.25,
title=None,
return_fig=None,
show=None,
save=None):
"""\
Scatter plot in UMAP basis.
Parameters
----------
{adata_color_etc}
{edges_arrows}
{scatter_bulk}
{show_save_ax}
Returns
-------
If `show==False` a :class:`~matplotlib.axes.Axes` or a list of it.
"""
sc.pl.umap(adata, color=color, gene_symbols=feature_symbols, use_raw=use_raw,
layer=layer, sort_order=sort_order, groups=groups, components=components,
projection=projection, legend_loc=legend_loc, legend_fontsize=legend_fontsize,
legend_fontweight=legend_fontweight, legend_fontoutline=legend_fontoutline,
size=size, color_map=color_map, palette=palette, frameon=frameon, ncols=ncols,
wspace=wspace, hspace=hspace, title=title, return_fig=return_fig, show=show,
save=save)
def diffmap(adata,
color=None,
feature_symbols=None,
use_raw=None,
layer=None,
sort_order=True,
groups=None,
components=None,
projection='2d',
legend_loc='right margin',
legend_fontsize=None,
legend_fontweight=None,
legend_fontoutline=None,
size=None,
color_map=None,
palette=None,
frameon=None,
ncols=None,
wspace=None,
hspace=0.25,
title=None,
return_fig=None,
show=None,
save=None):
"""\
Scatter plot in Diffusion Map basis.
Parameters
----------
{adata_color_etc}
{scatter_bulk}
{show_save_ax}
Returns
-------
If `show==False` a :class:`~matplotlib.axes.Axes` or a list of it.
"""
sc.pl.diffmap(adata, color=color, gene_symbols=feature_symbols, use_raw=use_raw,
layer=layer, sort_order=sort_order, groups=groups, components=components,
projection=projection, legend_loc=legend_loc, legend_fontsize=legend_fontsize,
legend_fontweight=legend_fontweight, legend_fontoutline=legend_fontoutline,
size=size, color_map=color_map, palette=palette, frameon=frameon, ncols=ncols,
wspace=wspace, hspace=hspace, title=title, return_fig=return_fig, show=show,
save=save)
def draw_graph(adata,
layout=None,
color=None,
feature_symbols=None,
use_raw=None,
layer=None,
sort_order=True,
groups=None,
components=None,
projection='2d',
legend_loc='right margin',
legend_fontsize=None,
legend_fontweight=None,
legend_fontoutline=None,
size=None,
color_map=None,
palette=None,
frameon=None,
ncols=None,
wspace=None,
hspace=0.25,
title=None,
return_fig=None,
show=None,
save=None):
"""\
Scatter plot in graph-drawing basis.
Parameters
----------
{adata_color_etc}
layout : {{`'fa'`, `'fr'`, `'drl'`, ...}}, optional (default: last computed)
One of the :func:`~scanpy.tl.draw_graph` layouts.
By default, the last computed layout is used.
{edges_arrows}
{scatter_bulk}
{show_save_ax}
Returns
-------
If `show==False` a :class:`~matplotlib.axes.Axes` or a list of it.
"""
sc.pl.draw_graph(adata, layout, color=color, gene_symbols=feature_symbols, use_raw=use_raw,
layer=layer, sort_order=sort_order, groups=groups, components=components,
projection=projection, legend_loc=legend_loc, legend_fontsize=legend_fontsize,
legend_fontweight=legend_fontweight, legend_fontoutline=legend_fontoutline,
size=size, color_map=color_map, palette=palette, frameon=frameon, ncols=ncols,
wspace=wspace, hspace=hspace, title=title, return_fig=return_fig, show=show,
save=save)
def rank_feat_groups_violin(adata,
groups=None,
n_features=20,
feature_names=None,
feature_symbols=None,
use_raw=None,
key='rank_features_groups',
split=True,
scale='width',
strip=True,
jitter=True,
size=1,
ax=None,
show=None,
save=None):
"""\
Plot ranking of features for all tested comparisons.
Parameters
----------
adata
Annotated data matrix.
groups
List of group names.
n_features
Number of features to show. Is ignored if `feature_names` is passed.
feature_names
List of features to plot. Is only useful if interested in a custom feature list,
which is not the result of :func:`epi.tl.rank_features`.
feature_symbols
Key for field in `.var` that stores feature symbols if you do not want to
use `.var_names` displayed in the plot.
use_raw : `bool`, optional (default: `None`)
Use `raw` attribute of `adata` if present. Defaults to the value that
was used in :func:`~scanpy.tl.rank_genes_groups`.
split
Whether to split the violins or not.
scale
See :func:`~seaborn.violinplot`.
strip
Show a strip plot on top of the violin plot.
jitter
If set to 0, no points are drawn. See :func:`~seaborn.stripplot`.
size
Size of the jitter points.
{show_save_ax}
"""
sc.pl.rank_genes_groups_violin(adata, groups, n_genes=n_features,
gene_names=feature_names, gene_symbols=feature_symbols, use_raw=use_raw,
key=key, split=split, scale=scale, strip=strip, jitter=jitter, size=size,
ax=ax, show=show, save=save)
def rank_feat_groups(adata,
groups = None,
n_features= 20,
feature_symbols = None,
key = 'rank_features_groups',
fontsize = 8,
ncols= 4,
sharey= True,
show = None,
save = None,
ax = None,):
"""\
Plot ranking of features.
Parameters
----------
adata
Annotated data matrix.
groups
The groups for which to show the feature ranking.
feature_symbols
Key for field in `.var` that stores feature symbols if you do not want to
use `.var_names`.
n_features
Number of feature to show.
fontsize
Fontsize for feature names.
ncols
Number of panels shown per row.
sharey
Controls if the y-axis of each panels should be shared. But passing
`sharey=False`, each panel has its own y-axis range.
{show_save_ax}
"""
sc.pl.rank_genes_groups(adata, groups, n_genes=n_features, gene_symbols=feature_symbols,
key=key, fontsize=fontsize, ncols=ncols, sharey=sharey, show=show, save=save , ax=ax)
def rank_feat_groups_dotplot(adata,
groups = None,
n_features = 10,
groupby = None,
key ='rank_features_groups',
show = None,
save = None):
"""\
Plot ranking of features using dotplot plot (see :func:`~scanpy.pl.dotplot`)
Parameters
----------
adata
Annotated data matrix.
groups
The groups for which to show the feature ranking.
n_features
Number of features to show.
groupby
The key of the observation grouping to consider. By default,
the groupby is chosen from the rank features groups parameter but
other groupby options can be used. It is expected that
groupby is a categorical. If groupby is not a categorical observation,
it would be subdivided into `num_categories` (see :func:`~scanpy.pl.dotplot`).
key
Key used to store the ranking results in `adata.uns`.
{show_save_ax}
**kwds
Are passed to :func:`~scanpy.pl.dotplot`.
"""
sc.pl.rank_genes_groups_dotplot(adata, groups, n_genes=n_features, groupby=groupby,
key=key, show=show, save=save)
def rank_feat_groups_heatmap(adata,
groups= None,
n_features = 10,
groupby = None,
key = 'rank_features_groups',
show = None,
save = None,):
"""\
Plot ranking of features using heatmap plot (see :func:`~scanpy.pl.heatmap`)
Parameters
----------
adata : :class:`~anndata.AnnData`
Annotated data matrix.
groups : `str` or `list` of `str`
The groups for which to show the feature ranking.
n_features
Number of features to show.
groupby
The key of the observation grouping to consider. By default,
the groupby is chosen from the rank features groups parameter but
other groupby options can be used. It is expected that
groupby is a categorical. If groupby is not a categorical observation,
it would be subdivided into `num_categories` (see :func:`~scanpy.pl.heatmap`).
key
Key used to store the ranking results in `adata.uns`.
**kwds
Are passed to :func:`~scanpy.pl.heatmap`.
{show_save_ax}
"""
sc.pl.rank_genes_groups_heatmap(adata, groups, n_genes=n_features, groupby=groupby,
key=key, show=show, save=save)
def rank_feat_groups_stacked_violin(adata,
groups = None,
n_features = 10,
groupby = None,
key = 'rank_features_groups',
show = None,
save = None):
"""\
Plot ranking of features using stacked_violin plot (see :func:`~scanpy.pl.stacked_violin`)
Parameters
----------
adata
Annotated data matrix.
groups : `str` or `list` of `str`
The groups for which to show the feature ranking.
n_features : `int`, optional (default: 10)
Number of features to show.
groupby : `str` or `None`, optional (default: `None`)
The key of the observation grouping to consider. By default,
the groupby is chosen from the rank features groups parameter but
other groupby options can be used. It is expected that
groupby is a categorical. If groupby is not a categorical observation,
it would be subdivided into `num_categories` (see :func:`~scanpy.pl.stacked_violin`).
key
Key used to store the ranking results in `adata.uns`.
{show_save_ax}
**kwds
Are passed to :func:`~scanpy.pl.stacked_violin`.
"""
sc.pl.rank_genes_groups_stacked_violin(adata, groups, n_genes=n_features, groupby=groupby,
key=key, show=show, save=save)
def rank_feat_groups_matrixplot(adata,
groups = None,
n_features = 10,
groupby = None,
key = 'rank_features_groups',
show = None,
save = None):
"""\
Plot ranking of features using matrixplot plot (see :func:`~scanpy.pl.matrixplot`)
Parameters
----------
adata
Annotated data matrix.
groups
The groups for which to show the feature ranking.
n_features
Number of features to show.
groupby
The key of the observation grouping to consider. By default,
the groupby is chosen from the rank features groups parameter but
other groupby options can be used. It is expected that
groupby is a categorical. If groupby is not a categorical observation,
it would be subdivided into `num_categories` (see :func:`~scanpy.pl.matrixplot`).
key
Key used to store the ranking results in `adata.uns`.
{show_save_ax}
**kwds
Are passed to :func:`~scanpy.pl.matrixplot`.
"""
sc.pl.rank_genes_groups_matrixplot(adata, groups, n_genes=n_features, groupby=groupby,
key=key, show=show, save=save)
def rank_feat_groups_tracksplot(adata,
groups = None,
n_features = 10,
groupby = None,
key = 'rank_features_groups',
show = None,
save = None):
"""\
Plot ranking of features using heatmap plot (see :func:`~scanpy.pl.heatmap`)
Parameters
----------
adata
Annotated data matrix.
groups
The groups for which to show the feature ranking.
n_features
Number of features to show.
groupby
The key of the observation grouping to consider. By default,
the groupby is chosen from the rank features groups parameter but
other groupby options can be used. It is expected that
groupby is a categorical. If groupby is not a categorical observation,
it would be subdivided into `num_categories` (see :func:`~scanpy.pl.heatmap`).
key
Key used to store the ranking results in `adata.uns`.
**kwds
Are passed to :func:`~scanpy.pl.tracksplot`.
{show_save_ax}
"""
sc.pl.rank_genes_groups_tracksplot(adata, groups, n_genes=n_features, groupby=groupby,
key=key, show=show, save=save)
def pca_loadings(adata,
components = None,
include_lowest = True,
show = None,
save = None):
"""\
Rank features according to contributions to PCs.
Parameters
----------
adata
Annotated data matrix.
components
For example, ``'1,2,3'`` means ``[1, 2, 3]``, first, second, third
principal component.
include_lowest
Show the features with both highest and lowest loadings.
show
Show the plot, do not return axis.
save
If `True` or a `str`, save the figure.
A string is appended to the default filename.
Infer the filetype if ending on {`'.pdf'`, `'.png'`, `'.svg'`}.
"""
sc.pl.pca_loadings(adata, components, include_lowest=include_lowest, show=show, save=save)
def pca_overview(adata):
"""\
Plot PCA results.
The parameters are the ones of the scatter plot. Call pca_ranking separately
if you want to change the default settings.
Parameters
----------
adata
Annotated data matrix.
color : string or list of strings, optional (default: `None`)
Keys for observation/cell annotation either as list `["ann1", "ann2"]` or
string `"ann1,ann2,..."`.
use_raw : `bool`, optional (default: `True`)
Use `raw` attribute of `adata` if present.
{scatter_bulk}
show : bool, optional (default: `None`)
Show the plot, do not return axis.
save : `bool` or `str`, optional (default: `None`)
If `True` or a `str`, save the figure.
A string is appended to the default filename.
Infer the filetype if ending on {{`'.pdf'`, `'.png'`, `'.svg'`}}.
"""
sc.pl.pca_overview(adata)
def pca_variance_ratio(adata, n_pcs=30, log=False, show=None, save=None):
"""\
Plot the variance ratio.
Parameters
----------
n_pcs : `int`, optional (default: `30`)
Number of PCs to show.
log : `bool`, optional (default: `False`)
Plot on logarithmic scale.
show : `bool`, optional (default: `None`)
Show the plot, do not return axis.
save : `bool` or `str`, optional (default: `None`)
If `True` or a `str`, save the figure.
A string is appended to the default filename.
Infer the filetype if ending on {`'.pdf'`, `'.png'`, `'.svg'`}.
"""
sc.pl.pca_variance_ratio(adata, n_pcs, log, show, save)
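Every wrapper in this module follows one pattern: accept epiScanpy-style keyword names (`feature_symbols`, `n_features`) and forward them to the scanpy equivalents (`gene_symbols`, `n_genes`). That renaming could also be factored out generically — a sketch under the assumption that only keyword names differ; the helper and the demo callee below are illustrative, not part of epiScanpy:

```python
def forward_renamed(func, renames):
    """Wrap func so selected keyword arguments are renamed before the call."""
    def wrapper(*args, **kwargs):
        forwarded = {renames.get(k, k): v for k, v in kwargs.items()}
        return func(*args, **forwarded)
    return wrapper

# A scanpy-like callee, for demonstration purposes only.
def scanpy_like(adata, gene_symbols=None, n_genes=10):
    return gene_symbols, n_genes

rank_like = forward_renamed(
    scanpy_like,
    {"feature_symbols": "gene_symbols", "n_features": "n_genes"},
)
print(rank_like(None, feature_symbols="var_names", n_features=5))  # ('var_names', 5)
```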
| 30.223986 | 94 | 0.668437 | 2,322 | 17,137 | 4.798019 | 0.111542 | 0.01867 | 0.016157 | 0.020106 | 0.808814 | 0.766807 | 0.750292 | 0.726685 | 0.70272 | 0.70272 | 0 | 0.004056 | 0.223085 | 17,137 | 566 | 95 | 30.277385 | 0.832732 | 0.491393 | 0 | 0.795082 | 0 | 0 | 0.028515 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.061475 | false | 0 | 0.004098 | 0 | 0.065574 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3d4419dc4b581b1dd7c78083c333103d5d359712 | 146 | py | Python | python_lists.py | IamUttamKumarRoy/python-start | f02adfda9eaeff43e64896f5f04c962eef17d9b2 | [
"Apache-2.0"
] | null | null | null | python_lists.py | IamUttamKumarRoy/python-start | f02adfda9eaeff43e64896f5f04c962eef17d9b2 | [
"Apache-2.0"
] | null | null | null | python_lists.py | IamUttamKumarRoy/python-start | f02adfda9eaeff43e64896f5f04c962eef17d9b2 | [
"Apache-2.0"
] | null | null | null | list1 = ['physics', 'chemistry', 1997, 2000]
list2 = [1, 2, 3, 4, 5, 6, 7 ]
print ("list1[0]: ", list1[0])
print ("list2[1:5]: ", list2[1:5]) | 29.2 | 45 | 0.527397 | 25 | 146 | 3.08 | 0.6 | 0.233766 | 0.181818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.228814 | 0.191781 | 146 | 5 | 46 | 29.2 | 0.423729 | 0 | 0 | 0 | 0 | 0 | 0.265734 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
3d45bc5246ed505d0ae1d79faff94e54201f50c5 | 1,557 | py | Python | PytorchCNNModules/modules/plane.py | nktankta/PytorchCNNModules | bc1469ceb37477d3f60062f14a750f272e7ceeb0 | [
"MIT"
] | null | null | null | PytorchCNNModules/modules/plane.py | nktankta/PytorchCNNModules | bc1469ceb37477d3f60062f14a750f272e7ceeb0 | [
"MIT"
] | null | null | null | PytorchCNNModules/modules/plane.py | nktankta/PytorchCNNModules | bc1469ceb37477d3f60062f14a750f272e7ceeb0 | [
"MIT"
] | null | null | null | import torch
import torch.nn as nn
from .base_module import BaseModule
class PlaneResidual(BaseModule):
    def __init__(self, in_feature, out_feature, hidden_feature=None, stride=1, **kwargs):
        super(PlaneResidual, self).__init__(in_feature, out_feature, stride, **kwargs)
        if hidden_feature is None:
            hidden_feature = in_feature
        self.conv1 = nn.Conv2d(in_feature, hidden_feature, 3, stride, padding=1)
        self.conv2 = nn.Conv2d(hidden_feature, out_feature, 3, 1, padding=1)
        self.act1 = self.activation(**self.activation_kwargs)
        self.norm1 = self.norm_layer(hidden_feature)
        self.norm2 = self.norm_layer(out_feature)

    def _forward(self, x):
        x = self.conv1(x)
        x = self.norm1(x)
        x = self.act1(x)
        x = self.conv2(x)
        x = self.norm2(x)
        return x


class PlaneResidual_no_lastBN(BaseModule):
    def __init__(self, in_feature, out_feature, hidden_feature=None, stride=1, **kwargs):
        super(PlaneResidual_no_lastBN, self).__init__(in_feature, out_feature, stride, **kwargs)
        if hidden_feature is None:
            hidden_feature = in_feature
        self.conv1 = nn.Conv2d(in_feature, hidden_feature, 3, stride, padding=1)
        self.conv2 = nn.Conv2d(hidden_feature, out_feature, 3, 1, padding=1)
        self.act1 = self.activation(**self.activation_kwargs)
        self.norm1 = self.norm_layer(hidden_feature)

    def _forward(self, x):
        x = self.conv1(x)
        x = self.norm1(x)
        x = self.act1(x)
        x = self.conv2(x)
        return x
| 37.97561 | 95 | 0.663455 | 217 | 1,557 | 4.506912 | 0.18894 | 0.159509 | 0.055215 | 0.07362 | 0.830266 | 0.830266 | 0.830266 | 0.830266 | 0.830266 | 0.830266 | 0 | 0.028192 | 0.225434 | 1,557 | 40 | 96 | 38.925 | 0.782753 | 0 | 0 | 0.742857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.114286 | false | 0 | 0.085714 | 0 | 0.314286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e9e85cae89e9eb0b1cdc4ec2bb185cbc2ed8cee6 | 47 | py | Python | {{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}.py | lendlsmith/eleventy-cirrus | 26c1cfd045b720af9a0e19f38b64f4b07d09d101 | [
"CC0-1.0"
] | null | null | null | {{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}.py | lendlsmith/eleventy-cirrus | 26c1cfd045b720af9a0e19f38b64f4b07d09d101 | [
"CC0-1.0"
] | null | null | null | {{cookiecutter.project_slug}}/{{cookiecutter.project_slug}}.py | lendlsmith/eleventy-cirrus | 26c1cfd045b720af9a0e19f38b64f4b07d09d101 | [
"CC0-1.0"
] | null | null | null | print("Hello, {{cookiecutter.project_name}}!")
| 23.5 | 46 | 0.723404 | 5 | 47 | 6.6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.042553 | 47 | 1 | 47 | 47 | 0.733333 | 0 | 0 | 0 | 0 | 0 | 0.787234 | 0.638298 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
180434a1b1bb811963c6b5ceda93a8c3f2f2bf1b | 98 | py | Python | youtrackutils/redmine/__init__.py | cedeyn/youtrack-python-scripts | 5aad07a10ef0eb0d4a6d8a879e399e42bc149fb4 | [
"Apache-2.0"
] | null | null | null | youtrackutils/redmine/__init__.py | cedeyn/youtrack-python-scripts | 5aad07a10ef0eb0d4a6d8a879e399e42bc149fb4 | [
"Apache-2.0"
] | null | null | null | youtrackutils/redmine/__init__.py | cedeyn/youtrack-python-scripts | 5aad07a10ef0eb0d4a6d8a879e399e42bc149fb4 | [
"Apache-2.0"
] | null | null | null | from client import RedmineClient
from client import RedmineException
from mapping import Mapping
| 19.6 | 35 | 0.867347 | 12 | 98 | 7.083333 | 0.5 | 0.235294 | 0.376471 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.132653 | 98 | 4 | 36 | 24.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
18733e824e91a487fcda6c4156ff55832b7b5095 | 2,719 | py | Python | sped_correcao/_old/atualizaM.py | teocrono/scripts | 2970192e3184c9e1d3dd67390e544d767b809c23 | [
"MIT"
] | null | null | null | sped_correcao/_old/atualizaM.py | teocrono/scripts | 2970192e3184c9e1d3dd67390e544d767b809c23 | [
"MIT"
] | null | null | null | sped_correcao/_old/atualizaM.py | teocrono/scripts | 2970192e3184c9e1d3dd67390e544d767b809c23 | [
"MIT"
] | null | null | null |
# NOTE: `exec` shadows the builtin; the name is kept for backward compatibility.
def exec(conexao):
    cursor = conexao.cursor()
    # (source column, source row filter, target M505 r2 value) for each total
    cases = [
        ("r7", 'r1 = "C195" and r3 = "50"', "01"),
        ("r3", 'r1 = "C505" and r2 = "50"', "04"),
        ("r5", 'r1 = "D505" and r2 = "50"', "01"),
        ("r6", 'r1 = "D105" and r4 = "50" and r5 = "03"', "03"),
        ("r6", 'r1 = "D105" and r4 = "50" and r5 = "07"', "03"),
    ]
    for column, source_filter, target_r2 in cases:
        update = f'''
        WITH value AS (
            SELECT SUM(CAST(replace({column},',','.') AS FLOAT)) as valor
            FROM principal WHERE {source_filter}
        )
        UPDATE principal SET
            r4 = (SELECT REPLACE(CAST(valor AS TEXT),".",",") FROM value),
            r6 = (SELECT REPLACE(CAST(valor AS TEXT),".",",") FROM value),
            r7 = (SELECT REPLACE(CAST(valor AS TEXT),".",",") FROM value)
        WHERE r1 = "M505" and r2 = "{target_r2}"
        '''
        cursor.execute(update)
        conexao.commit()
| 36.743243 | 74 | 0.511953 | 322 | 2,719 | 4.322981 | 0.121118 | 0.140086 | 0.18319 | 0.237069 | 0.95546 | 0.95546 | 0.95546 | 0.95546 | 0.855603 | 0.855603 | 0 | 0.052033 | 0.321442 | 2,719 | 73 | 75 | 37.246575 | 0.702439 | 0 | 0 | 0.761194 | 0 | 0 | 0.85504 | 0.051508 | 0 | 0 | 0 | 0 | 0 | 1 | 0.014925 | false | 0 | 0 | 0 | 0.014925 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
1883d638f2105b40258d5d43bdbcae393fca273f | 288 | py | Python | brainlit/algorithms/__init__.py | NeuroDataDesign/brainl | fc99f59a9d835039dac713a028ac2521ac217e95 | [
"Apache-2.0"
] | null | null | null | brainlit/algorithms/__init__.py | NeuroDataDesign/brainl | fc99f59a9d835039dac713a028ac2521ac217e95 | [
"Apache-2.0"
] | null | null | null | brainlit/algorithms/__init__.py | NeuroDataDesign/brainl | fc99f59a9d835039dac713a028ac2521ac217e95 | [
"Apache-2.0"
] | null | null | null | import brainlit.algorithms.generate_fragments
import brainlit.algorithms.connect_fragments
import brainlit.algorithms.trace_analysis
from brainlit.algorithms.generate_fragments import *
from brainlit.algorithms.connect_fragments import *
from brainlit.algorithms.trace_analysis import *
| 36 | 52 | 0.881944 | 33 | 288 | 7.515152 | 0.272727 | 0.435484 | 0.290323 | 0.282258 | 0.758065 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.065972 | 288 | 7 | 53 | 41.142857 | 0.921933 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
a110cb83997c323743dcd1d876884e91870da6ad | 1,677 | py | Python | maps/default.py | gwillem/rgkit | 1e9a1b8e77012853f070c9fbe0c26cfde25b355d | [
"Unlicense"
] | 1 | 2021-04-30T20:59:32.000Z | 2021-04-30T20:59:32.000Z | maps/default.py | gwillem/rgkit | 1e9a1b8e77012853f070c9fbe0c26cfde25b355d | [
"Unlicense"
] | null | null | null | maps/default.py | gwillem/rgkit | 1e9a1b8e77012853f070c9fbe0c26cfde25b355d | [
"Unlicense"
] | null | null | null | {'spawn': [(7, 1), (8, 1), (9, 1), (10, 1), (11, 1), (5, 2), (6, 2), (12, 2), (13, 2), (3, 3), (4, 3), (14, 3), (15, 3), (3, 4), (15, 4), (2, 5), (16, 5), (2, 6), (16, 6), (1, 7), (17, 7), (1, 8), (17, 8), (1, 9), (17, 9), (1, 10), (17, 10), (1, 11), (17, 11), (2, 12), (16, 12), (2, 13), (16, 13), (3, 14), (15, 14), (3, 15), (4, 15), (14, 15), (15, 15), (5, 16), (6, 16), (12, 16), (13, 16), (7, 17), (8, 17), (9, 17), (10, 17), (11, 17)],'obstacle': [(0, 0), (1, 0), (2, 0), (3, 0), (4, 0), (5, 0), (6, 0), (7, 0), (8, 0), (9, 0), (10, 0), (11, 0), (12, 0), (13, 0), (14, 0), (15, 0), (16, 0), (17, 0), (18, 0), (0, 1), (1, 1), (2, 1), (3, 1), (4, 1), (5, 1), (6, 1), (12, 1), (13, 1), (14, 1), (15, 1), (16, 1), (17, 1), (18, 1), (0, 2), (1, 2), (2, 2), (3, 2), (4, 2), (14, 2), (15, 2), (16, 2), (17, 2), (18, 2), (0, 3), (1, 3), (2, 3), (16, 3), (17, 3), (18, 3), (0, 4), (1, 4), (2, 4), (16, 4), (17, 4), (18, 4), (0, 5), (1, 5), (17, 5), (18, 5), (0, 6), (1, 6), (17, 6), (18, 6), (0, 7), (18, 7), (0, 8), (18, 8), (0, 9), (18, 9), (0, 10), (18, 10), (0, 11), (18, 11), (0, 12), (1, 12), (17, 12), (18, 12), (0, 13), (1, 13), (17, 13), (18, 13), (0, 14), (1, 14), (2, 14), (16, 14), (17, 14), (18, 14), (0, 15), (1, 15), (2, 15), (16, 15), (17, 15), (18, 15), (0, 16), (1, 16), (2, 16), (3, 16), (4, 16), (14, 16), (15, 16), (16, 16), (17, 16), (18, 16), (0, 17), (1, 17), (2, 17), (3, 17), (4, 17), (5, 17), (6, 17), (12, 17), (13, 17), (14, 17), (15, 17), (16, 17), (17, 17), (18, 17), (0, 18), (1, 18), (2, 18), (3, 18), (4, 18), (5, 18), (6, 18), (7, 18), (8, 18), (9, 18), (10, 18), (11, 18), (12, 18), (13, 18), (14, 18), (15, 18), (16, 18), (17, 18), (18, 18)]}
| 838.5 | 1,676 | 0.334526 | 370 | 1,677 | 1.516216 | 0.056757 | 0.010695 | 0.010695 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.41896 | 0.220036 | 1,677 | 1 | 1,677 | 1,677 | 0.009939 | 0 | 0 | 0 | 0 | 0 | 0.007752 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a185eb811b696034140cc527b8e366ef4100a631 | 7,123 | py | Python | tests/API1/testparamoutput.py | mcepl/param | 34cfce4d5f1d16a192d30c6f6cdeb73d7335add3 | [
"BSD-3-Clause"
] | 123 | 2019-12-11T17:53:05.000Z | 2022-03-16T14:21:15.000Z | tests/API1/testparamoutput.py | mcepl/param | 34cfce4d5f1d16a192d30c6f6cdeb73d7335add3 | [
"BSD-3-Clause"
] | 282 | 2019-11-28T12:40:18.000Z | 2022-03-31T13:25:44.000Z | tests/API1/testparamoutput.py | mcepl/param | 34cfce4d5f1d16a192d30c6f6cdeb73d7335add3 | [
"BSD-3-Clause"
] | 37 | 2019-12-27T09:10:25.000Z | 2022-02-22T16:28:42.000Z | """
Unit test for param.output.
"""
import sys
from unittest import SkipTest
import param
from . import API1TestCase
class TestParamDepends(API1TestCase):

    def test_simple_output(self):
        class P(param.Parameterized):

            @param.output()
            def single_output(self):
                return 1

        p = P()
        outputs = p.param.outputs()
        self.assertEqual(list(outputs), ['single_output'])
        otype, method, idx = outputs['single_output']
        self.assertIs(type(otype), param.Parameter)
        self.assertEqual(method, p.single_output)
        self.assertEqual(idx, None)

    def test_subclass_output(self):
        class A(param.Parameterized):

            @param.output()
            def single_output(self):
                return 1

        class B(param.Parameterized):

            @param.output()
            def another_output(self):
                return 2

        class C(A, B):
            pass

        p = C()
        outputs = p.param.outputs()
        self.assertEqual(sorted(outputs), ['another_output', 'single_output'])
        otype, method, idx = outputs['single_output']
        self.assertIs(type(otype), param.Parameter)
        self.assertEqual(method, p.single_output)
        self.assertEqual(idx, None)
        otype, method, idx = outputs['another_output']
        self.assertIs(type(otype), param.Parameter)
        self.assertEqual(method, p.another_output)
        self.assertEqual(idx, None)

    def test_named_kwarg_output(self):
        class P(param.Parameterized):

            @param.output(value=param.Integer)
            def single_output(self):
                return 1

        p = P()
        outputs = p.param.outputs()
        self.assertEqual(list(outputs), ['value'])
        otype, method, idx = outputs['value']
        self.assertIs(type(otype), param.Integer)
        self.assertEqual(method, p.single_output)
        self.assertEqual(idx, None)

    def test_named_and_typed_arg_output(self):
        class P(param.Parameterized):

            @param.output(('value', param.Integer))
            def single_output(self):
                return 1

        p = P()
        outputs = p.param.outputs()
        self.assertEqual(list(outputs), ['value'])
        otype, method, idx = outputs['value']
        self.assertIs(type(otype), param.Integer)
        self.assertEqual(method, p.single_output)
        self.assertEqual(idx, None)

    def test_named_arg_output(self):
        class P(param.Parameterized):

            @param.output('value')
            def single_output(self):
                return 1

        p = P()
        outputs = p.param.outputs()
        self.assertEqual(list(outputs), ['value'])
        otype, method, idx = outputs['value']
        self.assertIs(type(otype), param.Parameter)
        self.assertEqual(method, p.single_output)
        self.assertEqual(idx, None)

    def test_typed_arg_output(self):
        class P(param.Parameterized):

            @param.output(int)
            def single_output(self):
                return 1

        p = P()
        outputs = p.param.outputs()
        self.assertEqual(list(outputs), ['single_output'])
        otype, method, idx = outputs['single_output']
        self.assertIs(type(otype), param.ClassSelector)
        self.assertIs(otype.class_, int)
        self.assertEqual(method, p.single_output)
        self.assertEqual(idx, None)

    def test_multiple_named_kwarg_output(self):
        py_major = sys.version_info.major
        py_minor = sys.version_info.minor
        if (py_major < 3 or (py_major == 3 and py_minor < 6)):
            raise SkipTest('Multiple keyword output declarations only '
                           'supported in Python >= 3.6, skipping test.')

        class P(param.Parameterized):

            @param.output(value=param.Integer, value2=param.String)
            def multi_output(self):
                return (1, 'string')

        p = P()
        outputs = p.param.outputs()
        self.assertEqual(set(outputs), {'value', 'value2'})
        otype, method, idx = outputs['value']
        self.assertIs(type(otype), param.Integer)
        self.assertEqual(method, p.multi_output)
        self.assertEqual(idx, 0)
        otype, method, idx = outputs['value2']
        self.assertIs(type(otype), param.String)
        self.assertEqual(method, p.multi_output)
        self.assertEqual(idx, 1)

    def test_multi_named_and_typed_arg_output(self):
        class P(param.Parameterized):

            @param.output(('value', param.Integer), ('value2', param.String))
            def multi_output(self):
                return (1, 'string')

        p = P()
        outputs = p.param.outputs()
        self.assertEqual(set(outputs), {'value', 'value2'})
        otype, method, idx = outputs['value']
        self.assertIs(type(otype), param.Integer)
        self.assertEqual(method, p.multi_output)
        self.assertEqual(idx, 0)
        otype, method, idx = outputs['value2']
        self.assertIs(type(otype), param.String)
        self.assertEqual(method, p.multi_output)
        self.assertEqual(idx, 1)

    def test_multi_named_arg_output(self):
        class P(param.Parameterized):

            @param.output('value', 'value2')
            def multi_output(self):
                return (1, 2)

        p = P()
        outputs = p.param.outputs()
        self.assertEqual(set(outputs), {'value', 'value2'})
        otype, method, idx = outputs['value']
        self.assertIs(type(otype), param.Parameter)
        self.assertEqual(method, p.multi_output)
        self.assertEqual(idx, 0)
        otype, method, idx = outputs['value2']
        self.assertIs(type(otype), param.Parameter)
        self.assertEqual(method, p.multi_output)
        self.assertEqual(idx, 1)

    def test_multi_typed_arg_output(self):
        with self.assertRaises(ValueError):
            class P(param.Parameterized):

                @param.output(int, str)
                def single_output(self):
                    return 1

    def test_multi_method_named_and_typed_arg_output(self):
        class P(param.Parameterized):

            @param.output(('value', param.Integer), ('value2', str))
            def multi_output(self):
                return (1, 'string')

            @param.output(('value3', param.Number))
            def single_output(self):
                return 3.0

        p = P()
        outputs = p.param.outputs()
        self.assertEqual(set(outputs), {'value', 'value2', 'value3'})
        otype, method, idx = outputs['value']
        self.assertIs(type(otype), param.Integer)
        self.assertEqual(method, p.multi_output)
        self.assertEqual(idx, 0)
        otype, method, idx = outputs['value2']
        self.assertIs(type(otype), param.ClassSelector)
        self.assertIs(otype.class_, str)
        self.assertEqual(method, p.multi_output)
        self.assertEqual(idx, 1)
        otype, method, idx = outputs['value3']
        self.assertIs(type(otype), param.Number)
        self.assertEqual(method, p.single_output)
        self.assertEqual(idx, None)
| 30.570815 | 78 | 0.591745 | 801 | 7,123 | 5.151061 | 0.09613 | 0.106641 | 0.069801 | 0.081435 | 0.850703 | 0.830587 | 0.809743 | 0.792293 | 0.77872 | 0.77872 | 0 | 0.008858 | 0.286817 | 7,123 | 232 | 79 | 30.702586 | 0.803346 | 0.003791 | 0 | 0.71345 | 0 | 0 | 0.054599 | 0 | 0 | 0 | 0 | 0 | 0.356725 | 1 | 0.140351 | false | 0.005848 | 0.023392 | 0.076023 | 0.321637 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a1b7c7f1b7b76176fef57554600abaf70a8f2841 | 16,615 | py | Python | pyAnVIL/tests/integration/fhir/test_validation.py | mmtmn/client-apis | 215adae0b7f401b4bf62e7bd79b6a8adfe69cf4f | [
"Apache-2.0"
] | 1 | 2022-01-12T21:50:44.000Z | 2022-01-12T21:50:44.000Z | pyAnVIL/tests/integration/fhir/test_validation.py | mmtmn/client-apis | 215adae0b7f401b4bf62e7bd79b6a8adfe69cf4f | [
"Apache-2.0"
] | null | null | null | pyAnVIL/tests/integration/fhir/test_validation.py | mmtmn/client-apis | 215adae0b7f401b4bf62e7bd79b6a8adfe69cf4f | [
"Apache-2.0"
] | null | null | null | import requests
import os
import pytest
@pytest.fixture
def token():
    assert 'TOKEN' in os.environ
    return os.environ['TOKEN']

@pytest.fixture
def base_api():
    default = 'https://healthcare.googleapis.com/v1beta1/projects/fhir-test-11-329119/locations/us-west2/datasets/anvil-test/fhirStores/dev/fhir'
    return os.environ.get('BASE_API', default)

def test_DocumentReference(token, base_api):
    validate_url = f'{base_api}/DocumentReference/$validate'
    headers = {
        "Content-Type": "application/fhir+json;charset=utf-8",
        "Authorization": f"Bearer {token}",
    }
    # missing subject
    invalid_body_no_subject = { "resourceType": "DocumentReference", "id": "44a58180-e4ff-5a8c-9a1f-db4a76ce6f1f", "meta": { "profile": [ "https://ncpi-fhir.github.io/ncpi-fhir-ig/StructureDefinition/ncpi-research-document-reference" ] }, "identifier": [ { "system": "https://anvil.terra.bio/#workspaces/anvil-datastorage/1000G-high-coverage-2019", "value": "gs://fc-56ac46ea-efc4-4683-b6d5-6d95bed41c5e/CCDG_14151/Project_CCDG_14151_B01_GRM_WGS.cram.2020-02-12/Sample_HG00405/analysis/HG00405.final.cram" }, { "system": "urn:ncpi:unique-string", "value": "44a58180-e4ff-5a8c-9a1f-db4a76ce6f1f" } ], "status": "current", "custodian": { "reference": "Organization/1000G-high-coverage-2019" }, "content": [ { "attachment": { "url": "foo://dg.ANV0/a9a4489b-de29-4589-9409-4767860e8e85" }, "format": { "display": "cram" } } ], "context": { "related": [ { "reference": "Task/d0767fea-d6f6-5482-be12-9260e6901c4f" } ] } }
    response = requests.post(validate_url, json=invalid_body_no_subject, headers=headers)
    response.raise_for_status()
    issues = response.json()['issue']
    errors = [issue['expression'][0] for issue in issues if issue['severity'] == 'error']
    assert 'DocumentReference.subject' in errors, f"Should have raised an 'invalid number of elements, min is 1, got 0' {issues}"
    # missing task
    invalid_body_no_task = {"resourceType":"DocumentReference","id":"44a58180-e4ff-5a8c-9a1f-db4a76ce6f1f","meta":{"profile":["http://fhir.ncpi-project-forge.io/StructureDefinition/ncpi-drs-document-reference"]},"identifier":[{"system":"https://anvil.terra.bio/#workspaces/anvil-datastorage/1000G-high-coverage-2019","value":"gs://fc-56ac46ea-efc4-4683-b6d5-6d95bed41c5e/CCDG_14151/Project_CCDG_14151_B01_GRM_WGS.cram.2020-02-12/Sample_HG00405/analysis/HG00405.final.cram"},{"system":"urn:ncpi:unique-string","value":"44a58180-e4ff-5a8c-9a1f-db4a76ce6f1f"}],"status":"current","custodian":{"reference":"Organization/1000G-high-coverage-2019"},"subject":{"reference":"Patient/e87f0936-319b-5b86-bc27-3e3fdde77c29"},"content":[{"attachment":{"url":"drs://dg.ANV0/a9a4489b-de29-4589-9409-4767860e8e85"},"format":{"display":"cram"}}]}
    response = requests.post(validate_url, json=invalid_body_no_task, headers=headers)
    response.raise_for_status()
    issues = response.json()['issue']
    errors = [issue['expression'][0] for issue in issues if issue['severity'] == 'error']
    assert 'DocumentReference.context' in errors, f"Should have raised an 'invalid number of elements, min is 1, got 0' {issues}"
    # missing custodian
    invalid_body_no_custodian = {"resourceType":"DocumentReference","id":"44a58180-e4ff-5a8c-9a1f-db4a76ce6f1f","meta":{"profile":["http://fhir.ncpi-project-forge.io/StructureDefinition/ncpi-drs-document-reference"]},"identifier":[{"system":"https://anvil.terra.bio/#workspaces/anvil-datastorage/1000G-high-coverage-2019","value":"gs://fc-56ac46ea-efc4-4683-b6d5-6d95bed41c5e/CCDG_14151/Project_CCDG_14151_B01_GRM_WGS.cram.2020-02-12/Sample_HG00405/analysis/HG00405.final.cram"},{"system":"urn:ncpi:unique-string","value":"44a58180-e4ff-5a8c-9a1f-db4a76ce6f1f"}],"status":"current", "subject":{"reference":"Patient/e87f0936-319b-5b86-bc27-3e3fdde77c29"},"content":[{"attachment":{"url":"foo://dg.ANV0/a9a4489b-de29-4589-9409-4767860e8e85"},"format":{"display":"cram"}}],"context":{"related":[{"reference":"Task/d0767fea-d6f6-5482-be12-9260e6901c4f"}]}}
    response = requests.post(validate_url, json=invalid_body_no_custodian, headers=headers)
    response.raise_for_status()
    issues = response.json()['issue']
    errors = [issue['expression'][0] for issue in issues if issue['severity'] == 'error']
    assert 'DocumentReference.custodian' in errors, f"Should have raised an 'invalid number of elements, min is 1, got 0' {issues}"
    # all OK
    valid_body = {"resourceType":"DocumentReference","id":"44a58180-e4ff-5a8c-9a1f-db4a76ce6f1f","meta":{"profile":["http://fhir.ncpi-project-forge.io/StructureDefinition/ncpi-drs-document-reference"]},"identifier":[{"system":"https://anvil.terra.bio/#workspaces/anvil-datastorage/1000G-high-coverage-2019","value":"gs://fc-56ac46ea-efc4-4683-b6d5-6d95bed41c5e/CCDG_14151/Project_CCDG_14151_B01_GRM_WGS.cram.2020-02-12/Sample_HG00405/analysis/HG00405.final.cram"},{"system":"urn:ncpi:unique-string","value":"44a58180-e4ff-5a8c-9a1f-db4a76ce6f1f"}],"status":"current","custodian":{"reference":"Organization/1000G-high-coverage-2019"},"subject":{"reference":"Patient/e87f0936-319b-5b86-bc27-3e3fdde77c29"},"content":[{"attachment":{"url":"drs://dg.ANV0/a9a4489b-de29-4589-9409-4767860e8e85"},"format":{"display":"cram"}}],"context":{"related":[{"reference":"Task/d0767fea-d6f6-5482-be12-9260e6901c4f"}]}}
    response = requests.post(validate_url, json=valid_body, headers=headers)
    response.raise_for_status()
    issues = response.json()['issue']
    errors = [issue for issue in issues if issue['severity'] == 'error']
    assert len(errors) == 0, f"Should not have any errors {errors}"
    # https://github.com/FirelyTeam/firely-net-sdk/issues/1055
    warnings = [issue for issue in issues if issue['severity'] == 'warning' and issue['code'] != 'informational']
    assert len(warnings) == 0, f"Should not have any warnings {warnings}"

def test_DocumentReferenceAttachment(token, base_api):
    validate_url = f'{base_api}/DocumentReference/$validate'
    invalid_body_bad_attachment = {"resourceType":"DocumentReference","id":"44a58180-e4ff-5a8c-9a1f-db4a76ce6f1f","meta":{"profile":["http://fhir.ncpi-project-forge.io/StructureDefinition/ncpi-drs-document-reference"]},"identifier":[{"system":"https://anvil.terra.bio/#workspaces/anvil-datastorage/1000G-high-coverage-2019","value":"gs://fc-56ac46ea-efc4-4683-b6d5-6d95bed41c5e/CCDG_14151/Project_CCDG_14151_B01_GRM_WGS.cram.2020-02-12/Sample_HG00405/analysis/HG00405.final.cram"},{"system":"urn:ncpi:unique-string","value":"44a58180-e4ff-5a8c-9a1f-db4a76ce6f1f"}],"status":"current","custodian":{"reference":"Organization/1000G-high-coverage-2019"},"subject":{"reference":"Patient/e87f0936-319b-5b86-bc27-3e3fdde77c29"},"content":[{"attachment":{"url":"foo://dg.ANV0/a9a4489b-de29-4589-9409-4767860e8e85"},"format":{"display":"cram"}}],"context":{"related":[{"reference":"Task/d0767fea-d6f6-5482-be12-9260e6901c4f"}]}}
    headers = {
        "Content-Type": "application/fhir+json;charset=utf-8",
        "Authorization": f"Bearer {token}",
    }
    # bad attachment
    response = requests.post(validate_url, json=invalid_body_bad_attachment, headers=headers)
    response.raise_for_status()
    issues = response.json()['issue']
    errors = [issue['expression'][0] for issue in issues if issue['severity'] == 'error']
    assert 'DocumentReference.attachment' in errors, f"Should have raised an invalid url {issues}\nExpected error for Google FHIR service. see https://groups.google.com/g/gcp-healthcare-discuss/c/dOKuFXqPlXo for more"

def test_Patient(token, base_api):
    validate_url = f'{base_api}/Patient/$validate'
    valid_body = {"resourceType":"Patient","id":"e87f0936-319b-5b86-bc27-3e3fdde77c29","meta":{"profile":["http://hl7.org/fhir/StructureDefinition/Patient"]},"identifier":[{"system":"https://anvil.terra.bio/#workspaces/anvil-datastorage/1000G-high-coverage-2019","value":"ERS4367795"},{"system":"urn:ncpi:unique-string","value":"1000G-high-coverage-2019/Patient/ERS4367795"}],"managingOrganization":{"reference":"Organization/1000G-high-coverage-2019"}}
    headers = {
        "Content-Type": "application/fhir+json;charset=utf-8",
        "Authorization": f"Bearer {token}",
    }
    # all OK
    response = requests.post(validate_url, json=valid_body, headers=headers)
    response.raise_for_status()
    issues = response.json()['issue']
    errors = [issue for issue in issues if issue['severity'] == 'error']
    assert len(errors) == 0, f"Should not have any errors {errors}"
    warnings = [issue for issue in issues if issue['severity'] == 'warning']
    assert len(warnings) == 0, f"Should not have any warnings {warnings}"
    # TODO - if we implement it in schema
    # # missing managingOrganization
    # valid_body_managingOrganization = {"resourceType":"Patient","id":"e87f0936-319b-5b86-bc27-3e3fdde77c29","meta":{"profile":["http://hl7.org/fhir/StructureDefinition/Patient"]},"identifier":[{"system":"https://anvil.terra.bio/#workspaces/anvil-datastorage/1000G-high-coverage-2019","value":"ERS4367795"},{"system":"urn:ncpi:unique-string","value":"1000G-high-coverage-2019/Patient/ERS4367795"}]}
    # response = requests.post(validate_url, json=valid_body_managingOrganization, headers=headers)
    # response.raise_for_status()
    # issues = response.json()['issue']
    # errors = [issue for issue in issues if issue['severity'] == 'error']
    # assert len(errors) > 0, f"Should have at least 1 error {errors}"
    # warnings = [issue for issue in issues if issue['severity'] == 'warning']
    # assert len(warnings) == 0, f"Should not have any warnings {warnings}"

def test_Task(token, base_api):
validate_url = f'{base_api}/Task/$validate'
headers = {
"Content-Type": "application/fhir+json;charset=utf-8",
"Authorization": f"Bearer {token}",
}
# all OK
valid_body = {"resourceType":"Task","id":"d0767fea-d6f6-5482-be12-9260e6901c4f","meta":{"profile":["https://ncpi-fhir.github.io/ncpi-fhir-ig/StructureDefinition/ncpi-specimen-task"]},"identifier":[{"system":"https://anvil.terra.bio/#workspaces/anvil-datastorage/1000G-high-coverage-2019","value":"ERS4367795/Task/AnVILInjest"},{"system":"urn:ncpi:unique-string","value":"1000G-high-coverage-2019/Patient/ERS4367795/Specimen/ERS4367795/Task/AnVILInjest"}],"status":"accepted","intent":"unknown","input":[{"type":{"coding":[{"code":"Specimen"}]},"valueReference":{"reference":"Specimen/e87f0936-319b-5b86-bc27-3e3fdde77c29"}}],"output":[{"type":{"coding":[{"code":"DocumentReference"}]},"valueReference":{"reference":"DocumentReference/44a58180-e4ff-5a8c-9a1f-db4a76ce6f1f"}},{"type":{"coding":[{"code":"DocumentReference"}]},"valueReference":{"reference":"DocumentReference/ed123e4e-f8f2-565c-a7f6-c492c9c4ab60"}},{"type":{"coding":[{"code":"DocumentReference"}]},"valueReference":{"reference":"DocumentReference/ccb094a2-567b-52dd-92e3-5f16b3b9f7da"}}],"focus":{"reference":"Specimen/e87f0936-319b-5b86-bc27-3e3fdde77c29"},"for":{"reference":"Patient/e87f0936-319b-5b86-bc27-3e3fdde77c29"},"owner":{"reference":"Organization/1000G-high-coverage-2019"}}
response = requests.post(validate_url, json=valid_body, headers=headers)
response.raise_for_status()
issues = response.json()['issue']
errors = [issue for issue in issues if issue['severity'] == 'error']
assert len(errors) == 0, f"Should not have any errors {errors}"
warnings = [issue for issue in issues if issue['severity'] == 'warning']
assert len(warnings) == 0, f"Should not have any warnings {warnings}"
# no focus
invalid_body_no_focus = {"resourceType":"Task","id":"d0767fea-d6f6-5482-be12-9260e6901c4f","meta":{"profile":["https://ncpi-fhir.github.io/ncpi-fhir-ig/StructureDefinition/ncpi-specimen-task"]},"identifier":[{"system":"https://anvil.terra.bio/#workspaces/anvil-datastorage/1000G-high-coverage-2019","value":"ERS4367795/Task/AnVILInjest"},{"system":"urn:ncpi:unique-string","value":"1000G-high-coverage-2019/Patient/ERS4367795/Specimen/ERS4367795/Task/AnVILInjest"}],"status":"accepted","intent":"unknown","input":[{"type":{"coding":[{"code":"Specimen"}]},"valueReference":{"reference":"Specimen/e87f0936-319b-5b86-bc27-3e3fdde77c29"}}],"output":[{"type":{"coding":[{"code":"DocumentReference"}]},"valueReference":{"reference":"DocumentReference/44a58180-e4ff-5a8c-9a1f-db4a76ce6f1f"}},{"type":{"coding":[{"code":"DocumentReference"}]},"valueReference":{"reference":"DocumentReference/ed123e4e-f8f2-565c-a7f6-c492c9c4ab60"}},{"type":{"coding":[{"code":"DocumentReference"}]},"valueReference":{"reference":"DocumentReference/ccb094a2-567b-52dd-92e3-5f16b3b9f7da"}}],"for":{"reference":"Patient/e87f0936-319b-5b86-bc27-3e3fdde77c29"},"owner":{"reference":"Organization/1000G-high-coverage-2019"}}
response = requests.post(validate_url, json=invalid_body_no_focus, headers=headers)
response.raise_for_status()
issues = response.json()['issue']
errors = [issue['expression'][0] for issue in issues if issue['severity'] == 'error']
assert 'Task.focus' in errors, f"Should have raised missing focus {issues}"
warnings = [issue for issue in issues if issue['severity'] == 'warning']
assert len(warnings) == 0, f"Should not have any warnings {warnings}"
# no for
invalid_body_no_for = {"resourceType":"Task","id":"d0767fea-d6f6-5482-be12-9260e6901c4f","meta":{"profile":["https://ncpi-fhir.github.io/ncpi-fhir-ig/StructureDefinition/ncpi-specimen-task"]},"identifier":[{"system":"https://anvil.terra.bio/#workspaces/anvil-datastorage/1000G-high-coverage-2019","value":"ERS4367795/Task/AnVILInjest"},{"system":"urn:ncpi:unique-string","value":"1000G-high-coverage-2019/Patient/ERS4367795/Specimen/ERS4367795/Task/AnVILInjest"}],"status":"accepted","intent":"unknown","input":[{"type":{"coding":[{"code":"Specimen"}]},"valueReference":{"reference":"Specimen/e87f0936-319b-5b86-bc27-3e3fdde77c29"}}],"output":[{"type":{"coding":[{"code":"DocumentReference"}]},"valueReference":{"reference":"DocumentReference/44a58180-e4ff-5a8c-9a1f-db4a76ce6f1f"}},{"type":{"coding":[{"code":"DocumentReference"}]},"valueReference":{"reference":"DocumentReference/ed123e4e-f8f2-565c-a7f6-c492c9c4ab60"}},{"type":{"coding":[{"code":"DocumentReference"}]},"valueReference":{"reference":"DocumentReference/ccb094a2-567b-52dd-92e3-5f16b3b9f7da"}}],"focus":{"reference":"Specimen/e87f0936-319b-5b86-bc27-3e3fdde77c29"},"owner":{"reference":"Organization/1000G-high-coverage-2019"}}
response = requests.post(validate_url, json=invalid_body_no_for, headers=headers)
response.raise_for_status()
issues = response.json()['issue']
errors = [issue['expression'][0] for issue in issues if issue['severity'] == 'error']
assert 'Task.for' in errors, f"Should have raised missing for {issues}"
warnings = [issue for issue in issues if issue['severity'] == 'warning']
assert len(warnings) == 0, f"Should not have any warnings {warnings}"
# no owner
invalid_body_no_owner = {"resourceType":"Task","id":"d0767fea-d6f6-5482-be12-9260e6901c4f","meta":{"profile":["https://ncpi-fhir.github.io/ncpi-fhir-ig/StructureDefinition/ncpi-specimen-task"]},"identifier":[{"system":"https://anvil.terra.bio/#workspaces/anvil-datastorage/1000G-high-coverage-2019","value":"ERS4367795/Task/AnVILInjest"},{"system":"urn:ncpi:unique-string","value":"1000G-high-coverage-2019/Patient/ERS4367795/Specimen/ERS4367795/Task/AnVILInjest"}],"status":"accepted","intent":"unknown","input":[{"type":{"coding":[{"code":"Specimen"}]},"valueReference":{"reference":"Specimen/e87f0936-319b-5b86-bc27-3e3fdde77c29"}}],"output":[{"type":{"coding":[{"code":"DocumentReference"}]},"valueReference":{"reference":"DocumentReference/44a58180-e4ff-5a8c-9a1f-db4a76ce6f1f"}},{"type":{"coding":[{"code":"DocumentReference"}]},"valueReference":{"reference":"DocumentReference/ed123e4e-f8f2-565c-a7f6-c492c9c4ab60"}},{"type":{"coding":[{"code":"DocumentReference"}]},"valueReference":{"reference":"DocumentReference/ccb094a2-567b-52dd-92e3-5f16b3b9f7da"}}],"focus":{"reference":"Specimen/e87f0936-319b-5b86-bc27-3e3fdde77c29"},"for":{"reference":"Patient/e87f0936-319b-5b86-bc27-3e3fdde77c29"}}
response = requests.post(validate_url, json=invalid_body_no_owner, headers=headers)
response.raise_for_status()
issues = response.json()['issue']
errors = [issue['expression'][0] for issue in issues if issue['severity'] == 'error']
assert 'Task.owner' in errors, f"Should have raised missing owner {issues}"
warnings = [issue for issue in issues if issue['severity'] == 'warning']
assert len(warnings) == 0, f"Should not have any warnings {warnings}"
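The three negative-validation blocks above repeat the same extract-and-assert steps against the `$validate` OperationOutcome. A small helper could factor that out; this is a sketch, assuming the response body is an OperationOutcome dict shaped like the ones parsed above (`missing_field_errors` and `assert_missing_field` are hypothetical names, not part of this suite):

```python
def missing_field_errors(outcome):
    """Collect the FHIRPath expressions of all error-severity issues."""
    return [issue['expression'][0]
            for issue in outcome.get('issue', [])
            if issue['severity'] == 'error' and issue.get('expression')]


def assert_missing_field(outcome, field):
    """Assert the validator flagged `field` and emitted no warnings."""
    errors = missing_field_errors(outcome)
    assert field in errors, f"Should have raised missing {field}: {outcome}"
    warnings = [i for i in outcome.get('issue', []) if i['severity'] == 'warning']
    assert len(warnings) == 0, f"Should not have any warnings: {warnings}"
```

Each block would then reduce to a single call such as `assert_missing_field(response.json(), 'Task.focus')`.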
# Source file: test/services/policy_engine/engine/policy/gates/test_passwd_content.py
# Repository: roachmd/anchore-engine (license: Apache-2.0)
from test.services.policy_engine.engine.policy.gates import GateUnitTest
from anchore_engine.services.policy_engine.engine.policy.gate import ExecutionContext
from anchore_engine.db import get_thread_scoped_session, Image
from anchore_engine.services.policy_engine.engine.policy.gates.passwd_file import FileparsePasswordGate
from anchore_engine.services.policy_engine.engine.policy.gates.passwd_file import FileNotStoredTrigger
from anchore_engine.services.policy_engine.engine.policy.gates.passwd_file import UsernameMatchTrigger
from anchore_engine.services.policy_engine.engine.policy.gates.passwd_file import UserIdMatchTrigger
from anchore_engine.services.policy_engine.engine.policy.gates.passwd_file import GroupIdMatchTrigger
from anchore_engine.services.policy_engine.engine.policy.gates.passwd_file import ShellMatchTrigger
from anchore_engine.services.policy_engine.engine.policy.gates.passwd_file import PEntryMatchTrigger
class FileparsePasswordGateTest(GateUnitTest):
gate_clazz = FileparsePasswordGate
def test_filenotstored(self):
db = get_thread_scoped_session()
image = db.query(Image).get((self.test_env.get_images_named('centos7_verify')[0][0], '0'))
t, gate, test_context = self.get_initialized_trigger(FileNotStoredTrigger.__trigger_name__)
db.refresh(self.test_image)
test_context = gate.prepare_context(image, test_context)
t.evaluate(self.test_image, test_context)
print('Fired: {}'.format(t.fired))
self.assertEqual(0, len(t.fired))
db.rollback()
db = get_thread_scoped_session()
t, gate, test_context = self.get_initialized_trigger(FileNotStoredTrigger.__trigger_name__)
db.refresh(self.test_image)
test_context = gate.prepare_context(self.test_image, test_context)
t.evaluate(self.test_image, test_context)
print('Fired: {}'.format(t.fired))
self.assertEqual(1, len(t.fired))
db.rollback()
def test_userblacklist(self):
db = get_thread_scoped_session()
image = db.query(Image).get((self.test_env.get_images_named('centos7_verify')[0][0], '0'))
t, gate, test_context = self.get_initialized_trigger(UsernameMatchTrigger.__trigger_name__, user_names='mail,ftp,foobar')
db.refresh(self.test_image)
test_context = gate.prepare_context(image, test_context)
t.evaluate(self.test_image, test_context)
print('Fired: {}'.format(t.fired))
self.assertEqual(2, len(t.fired))
db.rollback()
def test_uidblacklist(self):
db = get_thread_scoped_session()
image = db.query(Image).get((self.test_env.get_images_named('centos7_verify')[0][0], '0'))
t, gate, test_context = self.get_initialized_trigger(UserIdMatchTrigger.__trigger_name__, user_ids='5,100')
db.refresh(self.test_image)
test_context = gate.prepare_context(image, test_context)
t.evaluate(self.test_image, test_context)
print('Fired: {}'.format(t.fired))
self.assertEqual(1, len(t.fired))
db.rollback()
def test_gidblacklist(self):
db = get_thread_scoped_session()
image = db.query(Image).get((self.test_env.get_images_named('centos7_verify')[0][0], '0'))
t, gate, test_context = self.get_initialized_trigger(GroupIdMatchTrigger.__trigger_name__, group_ids='100,10000')
db.refresh(self.test_image)
test_context = gate.prepare_context(image, test_context)
t.evaluate(self.test_image, test_context)
print('Fired: {}'.format(t.fired))
self.assertEqual(1, len(t.fired))
db.rollback()
def test_shellblacklist(self):
db = get_thread_scoped_session()
image = db.query(Image).get((self.test_env.get_images_named('centos7_verify')[0][0], '0'))
t, gate, test_context = self.get_initialized_trigger(ShellMatchTrigger.__trigger_name__, shells='/bin/bash,/bin/ksh')
db.refresh(self.test_image)
test_context = gate.prepare_context(image, test_context)
t.evaluate(self.test_image, test_context)
print('Fired: {}'.format(t.fired))
self.assertEqual(1, len(t.fired))
db.rollback()
def test_pentryblacklist(self):
db = get_thread_scoped_session()
image = db.query(Image).get((self.test_env.get_images_named('centos7_verify')[0][0], '0'))
t, gate, test_context = self.get_initialized_trigger(PEntryMatchTrigger.__trigger_name__, entry='mail:x:8:12:mail:/var/spool/mail:/sbin/nologin')
db.refresh(self.test_image)
test_context = gate.prepare_context(image, test_context)
t.evaluate(self.test_image, test_context)
print('Fired: {}'.format(t.fired))
self.assertEqual(1, len(t.fired))
db.rollback()
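The five blacklist tests above differ only in the trigger class, its keyword arguments, and the expected fire count, which makes them a natural fit for a table-driven test with `unittest.subTest`. Below is a minimal, self-contained sketch of that pattern, using a toy passwd table and a `fake_evaluate` stand-in for the real gate machinery (both hypothetical, for illustration only):

```python
import unittest

# Toy /etc/passwd-style table standing in for the image's stored file:
# (username, uid, gid, shell)
PASSWD_ENTRIES = [
    ('mail', '8', '12', '/sbin/nologin'),
    ('ftp', '14', '50', '/sbin/nologin'),
    ('root', '0', '0', '/bin/bash'),
]


def fake_evaluate(field_index, blacklist):
    """Return the entries whose selected field appears in the blacklist."""
    return [e for e in PASSWD_ENTRIES if e[field_index] in blacklist]


class BlacklistTableTest(unittest.TestCase):
    # (case name, field index, blacklist, expected number of fires)
    CASES = [
        ('usernames', 0, {'mail', 'ftp', 'foobar'}, 2),
        ('uids', 1, {'5', '0'}, 1),
        ('shells', 3, {'/bin/bash', '/bin/ksh'}, 1),
    ]

    def test_blacklists(self):
        for name, idx, blacklist, expected in self.CASES:
            with self.subTest(name):
                self.assertEqual(expected, len(fake_evaluate(idx, blacklist)))
```

The real suite could take the same shape, with each table row holding the trigger class, its kwargs, and the expected `len(t.fired)`.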
# coding: utf-8
# Source file: swagger_client/api/endpoint_groups_api.py
# Repository: swat5421/swagger_portainer (license: RSA-MD)
"""
Portainer API
    Portainer API is an HTTP API served by Portainer. It is used by the Portainer UI and everything you can do with the UI can be done using the HTTP API. Examples are available at https://gist.github.com/deviantony/77026d402366b4b43fa5918d41bc42f8 You can find out more about Portainer at [http://portainer.io](http://portainer.io) and get some support on [Slack](http://portainer.io/slack/). # Authentication Most of the API endpoints require you to be authenticated, as well as some level of authorization, to be used. Portainer API uses JSON Web Tokens to manage authentication and thus requires you to provide a token in the **Authorization** header of each request with the **Bearer** authentication mechanism. Example: ``` Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpZCI6MSwidXNlcm5hbWUiOiJhZG1pbiIsInJvbGUiOjEsImV4cCI6MTQ5OTM3NjE1NH0.NJ6vE8FY1WG6jsRQzfMqeatJ4vh2TWAeeYfDhP71YEE ``` # Security Each API endpoint has an associated access policy; it is documented in the description of each endpoint. Different access policies are available: * Public access * Authenticated access * Restricted access * Administrator access ### Public access No authentication is required to access the endpoints with this access policy. ### Authenticated access Authentication is required to access the endpoints with this access policy. ### Restricted access Authentication is required to access the endpoints with this access policy. Extra checks might be added to ensure access to the resource is granted. Returned data might also be filtered. ### Administrator access Authentication as well as an administrator role are required to access the endpoints with this access policy. # Execute Docker requests Portainer **DOES NOT** expose specific endpoints to manage your Docker resources (create a container, remove a volume, etc.). Instead, it acts as a reverse proxy to the Docker HTTP API. This means that you can execute Docker requests **via** the Portainer HTTP API. To do so, you can use the `/endpoints/{id}/docker` Portainer API endpoint (which is not documented below due to Swagger limitations). This endpoint has a restricted access policy, so you still need to be authenticated to be able to query it. Any query on this endpoint will be proxied to the Docker API of the associated endpoint (request and response objects are the same as documented in the Docker API). **NOTE**: You can find more information on how to query the Docker API in the [Docker official documentation](https://docs.docker.com/engine/api/v1.30/) as well as in [this Portainer example](https://gist.github.com/deviantony/77026d402366b4b43fa5918d41bc42f8). # noqa: E501
OpenAPI spec version: 1.24.1
Contact: info@portainer.io
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from swagger_client.api_client import ApiClient
class EndpointGroupsApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def endpoint_group_add_endpoint(self, id, endpoint_id, **kwargs): # noqa: E501
"""Add an endpoint to an endpoint group # noqa: E501
Add an endpoint to an endpoint group **Access policy**: administrator # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.endpoint_group_add_endpoint(id, endpoint_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: EndpointGroup identifier (required)
:param int endpoint_id: Endpoint identifier (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.endpoint_group_add_endpoint_with_http_info(id, endpoint_id, **kwargs) # noqa: E501
else:
(data) = self.endpoint_group_add_endpoint_with_http_info(id, endpoint_id, **kwargs) # noqa: E501
return data
def endpoint_group_add_endpoint_with_http_info(self, id, endpoint_id, **kwargs): # noqa: E501
"""Add an endpoint to an endpoint group # noqa: E501
Add an endpoint to an endpoint group **Access policy**: administrator # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.endpoint_group_add_endpoint_with_http_info(id, endpoint_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: EndpointGroup identifier (required)
:param int endpoint_id: Endpoint identifier (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'endpoint_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method endpoint_group_add_endpoint" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `endpoint_group_add_endpoint`") # noqa: E501
# verify the required parameter 'endpoint_id' is set
if ('endpoint_id' not in params or
params['endpoint_id'] is None):
raise ValueError("Missing the required parameter `endpoint_id` when calling `endpoint_group_add_endpoint`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
if 'endpoint_id' in params:
path_params['endpointId'] = params['endpoint_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['jwt'] # noqa: E501
return self.api_client.call_api(
'/endpoint_groups/{id}/endpoints/{endpointId}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def endpoint_group_create(self, body, **kwargs): # noqa: E501
"""Create a new endpoint # noqa: E501
Create a new endpoint group. **Access policy**: administrator # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.endpoint_group_create(body, async_req=True)
>>> result = thread.get()
:param async_req bool
        :param EndpointGroupCreateRequest body: Endpoint group details (required)
:return: EndpointGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.endpoint_group_create_with_http_info(body, **kwargs) # noqa: E501
else:
(data) = self.endpoint_group_create_with_http_info(body, **kwargs) # noqa: E501
return data
def endpoint_group_create_with_http_info(self, body, **kwargs): # noqa: E501
"""Create a new endpoint # noqa: E501
Create a new endpoint group. **Access policy**: administrator # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.endpoint_group_create_with_http_info(body, async_req=True)
>>> result = thread.get()
:param async_req bool
        :param EndpointGroupCreateRequest body: Endpoint group details (required)
:return: EndpointGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method endpoint_group_create" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `endpoint_group_create`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['jwt'] # noqa: E501
return self.api_client.call_api(
'/endpoint_groups', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EndpointGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def endpoint_group_delete(self, id, **kwargs): # noqa: E501
"""Remove an endpoint group # noqa: E501
Remove an endpoint group. **Access policy**: administrator # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.endpoint_group_delete(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: EndpointGroup identifier (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.endpoint_group_delete_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.endpoint_group_delete_with_http_info(id, **kwargs) # noqa: E501
return data
def endpoint_group_delete_with_http_info(self, id, **kwargs): # noqa: E501
"""Remove an endpoint group # noqa: E501
Remove an endpoint group. **Access policy**: administrator # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.endpoint_group_delete_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: EndpointGroup identifier (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method endpoint_group_delete" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `endpoint_group_delete`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = ['jwt'] # noqa: E501
return self.api_client.call_api(
'/endpoint_groups/{id}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def endpoint_group_delete_endpoint(self, id, endpoint_id, **kwargs): # noqa: E501
"""Remove an endpoint group # noqa: E501
        Remove an endpoint from an endpoint group. **Access policy**: administrator  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.endpoint_group_delete_endpoint(id, endpoint_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: EndpointGroup identifier (required)
:param int endpoint_id: Endpoint identifier (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.endpoint_group_delete_endpoint_with_http_info(id, endpoint_id, **kwargs) # noqa: E501
else:
(data) = self.endpoint_group_delete_endpoint_with_http_info(id, endpoint_id, **kwargs) # noqa: E501
return data
def endpoint_group_delete_endpoint_with_http_info(self, id, endpoint_id, **kwargs): # noqa: E501
"""Remove an endpoint group # noqa: E501
        Remove an endpoint from an endpoint group. **Access policy**: administrator  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.endpoint_group_delete_endpoint_with_http_info(id, endpoint_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: EndpointGroup identifier (required)
:param int endpoint_id: Endpoint identifier (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'endpoint_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method endpoint_group_delete_endpoint" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `endpoint_group_delete_endpoint`") # noqa: E501
# verify the required parameter 'endpoint_id' is set
if ('endpoint_id' not in params or
params['endpoint_id'] is None):
raise ValueError("Missing the required parameter `endpoint_id` when calling `endpoint_group_delete_endpoint`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
if 'endpoint_id' in params:
path_params['endpointId'] = params['endpoint_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = ['jwt'] # noqa: E501
return self.api_client.call_api(
'/endpoint_groups/{id}/endpoints/{endpointId}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def endpoint_group_inspect(self, id, **kwargs): # noqa: E501
"""Inspect an endpoint group # noqa: E501
        Retrieve details about an endpoint group. **Access policy**: administrator  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.endpoint_group_inspect(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: Endpoint group identifier (required)
:return: EndpointGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.endpoint_group_inspect_with_http_info(id, **kwargs) # noqa: E501
else:
(data) = self.endpoint_group_inspect_with_http_info(id, **kwargs) # noqa: E501
return data
def endpoint_group_inspect_with_http_info(self, id, **kwargs): # noqa: E501
"""Inspect an endpoint group # noqa: E501
        Retrieve details about an endpoint group. **Access policy**: administrator  # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.endpoint_group_inspect_with_http_info(id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int id: Endpoint group identifier (required)
:return: EndpointGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method endpoint_group_inspect" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params or
params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `endpoint_group_inspect`") # noqa: E501
collection_formats = {}
path_params = {}
if 'id' in params:
path_params['id'] = params['id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['jwt'] # noqa: E501
return self.api_client.call_api(
'/endpoint_groups/{id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EndpointGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def endpoint_group_list(self, **kwargs): # noqa: E501
"""List endpoint groups # noqa: E501
List all endpoint groups based on the current user authorizations. Will return all endpoint groups if using an administrator account otherwise it will only return authorized endpoint groups. **Access policy**: restricted # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.endpoint_group_list(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: EndpointGroupListResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.endpoint_group_list_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.endpoint_group_list_with_http_info(**kwargs) # noqa: E501
return data
def endpoint_group_list_with_http_info(self, **kwargs): # noqa: E501
"""List endpoint groups # noqa: E501
List all endpoint groups based on the current user authorizations. Will return all endpoint groups if using an administrator account otherwise it will only return authorized endpoint groups. **Access policy**: restricted # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.endpoint_group_list_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: EndpointGroupListResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = [] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method endpoint_group_list" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['jwt'] # noqa: E501
return self.api_client.call_api(
'/endpoint_groups', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EndpointGroupListResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def endpoint_group_update(self, id, body, **kwargs):  # noqa: E501
        """Update an endpoint group  # noqa: E501

        Update an endpoint group. **Access policy**: administrator  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.endpoint_group_update(id, body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: EndpointGroup identifier (required)
        :param EndpointGroupUpdateRequest body: EndpointGroup details (required)
        :return: EndpointGroup
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.endpoint_group_update_with_http_info(id, body, **kwargs)  # noqa: E501
        else:
            (data) = self.endpoint_group_update_with_http_info(id, body, **kwargs)  # noqa: E501
            return data

    def endpoint_group_update_with_http_info(self, id, body, **kwargs):  # noqa: E501
        """Update an endpoint group  # noqa: E501

        Update an endpoint group. **Access policy**: administrator  # noqa: E501
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.endpoint_group_update_with_http_info(id, body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int id: EndpointGroup identifier (required)
        :param EndpointGroupUpdateRequest body: EndpointGroup details (required)
        :return: EndpointGroup
                 If the method is called asynchronously,
                 returns the request thread.
        """

        all_params = ['id', 'body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method endpoint_group_update" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params or
                params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `endpoint_group_update`")  # noqa: E501
        # verify the required parameter 'body' is set
        if ('body' not in params or
                params['body'] is None):
            raise ValueError("Missing the required parameter `body` when calling `endpoint_group_update`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['jwt']  # noqa: E501

        return self.api_client.call_api(
            '/endpoint_groups/{id}', 'PUT',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='EndpointGroup',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
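# The sync/async dispatch above is the usual swagger-codegen pattern: with
# async_req=True the method returns the request thread, and the caller gets the
# data via .get(). A minimal stdlib sketch of that dispatch pattern; the ToyApi
# class and its methods are hypothetical, not part of the generated client.

```python
from multiprocessing.pool import ThreadPool


class ToyApi(object):
    """Hypothetical client illustrating the async_req dispatch pattern."""

    def __init__(self):
        self._pool = ThreadPool(1)

    def endpoint_group_get(self, id, **kwargs):
        if kwargs.get('async_req'):
            # Asynchronous: hand the work to the pool and return the
            # AsyncResult; the caller retrieves the data with .get()
            return self._pool.apply_async(self._do_request, (id,))
        # Synchronous: perform the request and return the data directly
        return self._do_request(id)

    def _do_request(self, id):
        # Stand-in for the real HTTP round trip
        return {'Id': id, 'Name': 'group-%d' % id}


api = ToyApi()
sync_result = api.endpoint_group_get(7)
thread = api.endpoint_group_get(7, async_req=True)
async_result = thread.get()
assert sync_result == async_result
```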
# tests/sampleresponse/payout.py (adyaksaw/xendit-python, MIT license)


def payout_response():
    return {
        "id": "a6ee1bf1-ffcd-4bda-a7ab-99c1d5cd0472",
        "external_id": "payout-1595405117",
        "amount": 50000,
        "merchant_name": "Xendit&#x27;s Intern",
        "status": "PENDING",
        "expiration_timestamp": "2020-07-23T08:05:19.815Z",
        "created": "2020-07-22T08:05:18.421Z",
        "email": "test@email.co",
        "payout_url": "https://payout-staging.xendit.co/web/a6ee1bf1-ffcd-4bda-a7ab-99c1d5cd0472",
    }


def void_payout_response():
    return {
        "id": "a6ee1bf1-ffcd-4bda-a7ab-99c1d5cd0472",
        "external_id": "payout-1595405117",
        "amount": 50000,
        "merchant_name": "Xendit&#x27;s Intern",
        "status": "VOIDED",
        "expiration_timestamp": "2020-07-23T08:05:19.815Z",
        "created": "2020-07-22T08:05:18.421Z",
        "email": "test@email.co",
    }
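# The two fixtures differ only in "status" (PENDING vs VOIDED), with the voided
# response also dropping "payout_url". A sketch of a comparison helper a test
# might use; the fixture data is inlined here so the snippet is self-contained,
# and assert_voided is a hypothetical helper, not part of the xendit-python suite.

```python
# Minimal inline stand-ins for payout_response()/void_payout_response();
# a real test would import them from tests.sampleresponse.payout.
pending = {"id": "a6ee1bf1-ffcd-4bda-a7ab-99c1d5cd0472", "status": "PENDING", "amount": 50000}
voided = {"id": "a6ee1bf1-ffcd-4bda-a7ab-99c1d5cd0472", "status": "VOIDED", "amount": 50000}


def assert_voided(before, after):
    """Check that voiding changed only the status field."""
    assert before["id"] == after["id"]
    assert before["amount"] == after["amount"]
    assert before["status"] == "PENDING" and after["status"] == "VOIDED"


assert_voided(pending, voided)
```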
# src/diamond/test/testcollector.py (devanshukoyalkar-rubrik/Diamond, MIT license)

#!/usr/bin/python
# coding=utf-8
##########################################################################
from mock import patch
from test import unittest
import configobj
from diamond.collector import Collector
class BaseCollectorTest(unittest.TestCase):
    def test_SetCustomHostname(self):
        config = configobj.ConfigObj()
        config['server'] = {}
        config['server']['collectors_config_path'] = ''
        config['collectors'] = {}
        config['collectors']['default'] = {
            'hostname': 'custom.localhost',
        }
        c = Collector(config, [])
        self.assertEqual('custom.localhost', c.get_hostname())

    def test_SetHostnameViaShellCmd(self):
        config = configobj.ConfigObj()
        config['server'] = {}
        config['server']['collectors_config_path'] = ''
        config['collectors'] = {}
        config['collectors']['default'] = {
            'hostname': 'echo custom.localhost',
            'hostname_method': 'shell',
        }
        c = Collector(config, [])
        self.assertEqual('custom.localhost', c.get_hostname())

    @patch('diamond.collector.get_hostname')
    def test_get_metric_path_no_prefix(self, get_hostname_mock):
        config = configobj.ConfigObj()
        config['collectors'] = {}
        config['collectors']['default'] = {}
        config['collectors']['default']['path_prefix'] = ''
        config['collectors']['default']['path'] = 'bar'

        get_hostname_mock.return_value = None

        result = Collector(config, []).get_metric_path('foo')
        self.assertEqual('bar.foo', result)

    @patch('diamond.collector.get_hostname')
    def test_get_metric_path_no_prefix_no_path(self, get_hostname_mock):
        config = configobj.ConfigObj()
        config['collectors'] = {}
        config['collectors']['default'] = {}
        config['collectors']['default']['path_prefix'] = ''
        config['collectors']['default']['path'] = ''

        get_hostname_mock.return_value = None

        result = Collector(config, []).get_metric_path('foo')
        self.assertEqual('foo', result)

    @patch('diamond.collector.get_hostname')
    def test_get_metric_path_no_path(self, get_hostname_mock):
        config = configobj.ConfigObj()
        config['collectors'] = {}
        config['collectors']['default'] = {}
        config['collectors']['default']['path_prefix'] = 'bar'
        config['collectors']['default']['path'] = ''

        get_hostname_mock.return_value = None

        result = Collector(config, []).get_metric_path('foo')
        self.assertEqual('bar.foo', result)

    @patch('diamond.collector.get_hostname')
    def test_get_metric_path_dot_path(self, get_hostname_mock):
        config = configobj.ConfigObj()
        config['collectors'] = {}
        config['collectors']['default'] = {}
        config['collectors']['default']['path_prefix'] = 'bar'
        config['collectors']['default']['path'] = '.'

        get_hostname_mock.return_value = None

        result = Collector(config, []).get_metric_path('foo')
        self.assertEqual('bar.foo', result)

    @patch('diamond.collector.get_hostname')
    def test_get_metric_path(self, get_hostname_mock):
        config = configobj.ConfigObj()
        config['collectors'] = {}
        config['collectors']['default'] = {}
        config['collectors']['default']['path_prefix'] = 'poof'
        config['collectors']['default']['path'] = 'xyz'

        get_hostname_mock.return_value = 'bar'

        result = Collector(config, []).get_metric_path('foo')
        self.assertEqual('poof.bar.xyz.foo', result)
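# The tests above stub diamond.collector.get_hostname via the patch decorator;
# the same decorator pattern can be sketched self-contained against a stdlib
# target (using stdlib unittest.mock here, whereas the file itself imports the
# external mock package; fake_exists_demo is a hypothetical illustration).

```python
import os.path
from unittest import mock


@mock.patch('os.path.exists')
def fake_exists_demo(exists_mock):
    # While this function runs, os.path.exists is replaced by the MagicMock
    # that patch injected as the first argument.
    exists_mock.return_value = True
    return os.path.exists('/no/such/path')


result = fake_exists_demo()
assert result is True
```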
# WallArchetypeObjects.py (nassermarafi/SRCSWArchetypes, MIT license)

# Coded by Marafi
# To Do List:
# Add Concrete Wall Weight
# Add Basement Floors
# Add Using 2.5' Deep by 6' Coupling Beams
# Distribute Forces using MRSA, include 0.85 factor for 2008 designs
####################################################################################
#region Defining Classes
####################################################################################
from __future__ import absolute_import
import numpy as np
import ATCWallArchetypeHelpers as ATCWallHelper
from ATCWallArchetypeObjects import ArchetypeData
from ATCWallArchetypeObjects import CouplingBeam
from ATCWallArchetypeObjects import PlanarWallSection
from ATCWallArchetypeObjects import TWallSection
from ATCWallArchetypeObjects import IWallSection
from six.moves import filter
class Basement:
    def __init__(self, FloorStiffnesses, WallStiffnesses, BasementMass, **kwargs):
        self.FloorStiffnesses = list(FloorStiffnesses)
        self.WallStiffnesses = list(WallStiffnesses)
        self.BasementMass = list(BasementMass)
        self.__dict__.update(kwargs)
####################################################################################
# endregion
####################################################################################
####################################################################################
#region Defining Functions
####################################################################################
####################################################################################
# endregion
####################################################################################
def GetSeattle2008Hazard(Height, R=6, Period=None, IgnoreMinBaseShear=False, Overstrength=1.0):
    Sds = 0.91 * Overstrength
    S1 = 0.529 * Overstrength
    Sd1 = 0.458 * Overstrength
    TL = 6
    I = 1.0
    Cd = 5
    if Period is None:
        CuTa = 1.4 * ASCEHelper.ComputeCuTa((Height / 12.), 0.02, 0.75)
    else:
        CuTa = Period
    Periods = [0.01, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.75, 1, 2, 3, 4, 5, 7, 8, 9, 10]
    BasinFactors = [1.235, 1.231, 1.249, 1.340, 1.351, 1.428, 1.477, 1.551, 1.557, 1.583, 1.541, 1.576, 1.581, 1.728, 1.744, 1.703, 1.662]
    # if Height / 12. > 240:
    #     FactorS = np.exp(np.interp(np.log(0.2), np.log(Periods), np.log(BasinFactors)))  # Add basin effects
    #     Factor1 = np.exp(np.interp(np.log(1.0), np.log(Periods), np.log(BasinFactors)))  # Add basin effects
    # else:
    #     FactorS = 1.0
    #     Factor1 = 1.0
    FactorS = 1.0
    Factor1 = 1.0
    SaDesign = ASCEHelper.GetDesignSa(CuTa, S1 * Factor1, Sds * FactorS, Sd1 * Factor1, TL, R, I, IgnoreMinBaseShear)
    return SaDesign, Sds, CuTa
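# ASCEHelper.ComputeCuTa presumably implements the ASCE 7 approximate
# fundamental period Ta = Ct * h^x (here Ct = 0.02, x = 0.75, h in feet),
# which the hazard functions scale by Cu = 1.4. A self-contained numeric
# sketch of that assumed formula; the helper's actual implementation may differ.

```python
def approx_cuta(height_in, ct=0.02, x=0.75, cu=1.4):
    """ASCE 7-style upper-bound period estimate: Cu * Ct * h^x, h in feet.

    Mirrors the assumed behaviour of 1.4 * ASCEHelper.ComputeCuTa(h, 0.02, 0.75);
    this is an illustrative stand-in, not the helper itself.
    """
    h_ft = height_in / 12.  # wall height is carried in inches in this module
    return cu * ct * h_ft ** x


# e.g. a 120 ft tall wall (1440 in) gives a design period of roughly 1.0 s
cuta = approx_cuta(1440.)
```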
def GetSeattle2014Hazard(Height, R=6, Period=None, IgnoreMinBaseShear=False, Overstrength=1.0):
    Sds = 1.12 * Overstrength
    S1 = 0.488 * Overstrength
    Sd1 = 0.488 * Overstrength
    TL = 6
    I = 1.0
    Cd = 5
    if Period is None:
        CuTa = 1.4 * ASCEHelper.ComputeCuTa((Height / 12.), 0.02, 0.75)
    else:
        CuTa = Period
    Periods = [0.01, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.75, 1, 2, 3, 4, 5, 7, 8, 9, 10]
    BasinFactors = [1.235, 1.231, 1.249, 1.340, 1.351, 1.428, 1.477, 1.551, 1.557, 1.583, 1.541, 1.576, 1.581, 1.728, 1.744, 1.703, 1.662]
    # if Height / 12. > 240:
    #     FactorS = np.exp(np.interp(np.log(0.2), np.log(Periods), np.log(BasinFactors)))  # Add basin effects
    #     Factor1 = np.exp(np.interp(np.log(1.0), np.log(Periods), np.log(BasinFactors)))  # Add basin effects
    # else:
    #     FactorS = 1.0
    #     Factor1 = 1.0
    FactorS = 1.0
    Factor1 = 1.0
    SaDesign = ASCEHelper.GetDesignSa(CuTa, S1 * Factor1, Sds * FactorS, Sd1 * Factor1, TL, R, I, IgnoreMinBaseShear)
    return SaDesign, Sds, CuTa
# Global Variables
# Loading
# DL 150psf Floors #### All Floors have the same load
# LL 65psf Floors and 20psf Roof
DL_Basements = [155, 155, 155, 230] # Include Basement Wall Loads
LL_Basements = [40, 40, 40, 100] # Check LL
DL = 130 # psf
LL = 50 # psf
DL_Roof = 200 # psf
LL_Roof = 20 # psf
BasementFloorArea = 160. * 160. / 2.
FloorArea = 100. * 100. / 2.
PercentageFloorAreaResistedByWall = 0.5
FirstFloorHeight = 10 * 12.
FloorHeights = 10 * 12.
BasementFloorHeights = 10 * 12.
# Pick Out Prelim. Section Size using Shear
fy = 60.; fu = 105.
fpc_core = 8.
fpc_slabs = 5.
ConcreteDensity = 150.
def CreateArchetype(Basement=None, Use2008Maps = True, Overstrength = 1.0):
    # Defining Story Levels
    YGrids = [0] + np.array(np.arange(0, (NoOfStories) * FloorHeights, FloorHeights) + FloorHeights).tolist()

    # Defining Gravity Loads
    DeadLoads = np.ones(NoOfStories) * DL / 1000.
    DeadLoads[-1] = DeadLoads[-1] * DL_Roof / DL
    LiveLoads = np.ones(NoOfStories) * LL / 1000.
    LiveLoads[-1] = LiveLoads[-1] * LL_Roof / LL

    # Computing Mass of Wall
    WallSelfWeight = []
    i = -1
    for section in Sections:
        i += 1
        if isinstance(section, IWallSection):
            CoreVolume = (section.b_w * section.t_w * 2. + (section.l_w - section.t_w * 2.) * section.t_w) * (
                YGrids[i + 1] - YGrids[i]) / 12. ** 3.
            EquivalentDL = CoreVolume * ConcreteDensity / 1000.
            WallSelfWeight.append(EquivalentDL)
        elif isinstance(section, PlanarWallSection):
            CoreVolume = section.l_w * section.t_w * (
                YGrids[i + 1] - YGrids[i]) / 12. ** 3.
            EquivalentDL = CoreVolume * ConcreteDensity / 1000.
            WallSelfWeight.append(EquivalentDL)

    # Defining Mass
    Mass = (DeadLoads + 0.5 * LiveLoads) * FloorArea  # Compute
    Mass = Mass + np.array(WallSelfWeight)  # Adding Wall Self Weight

    WallTribArea = FloorArea * PercentageFloorAreaResistedByWall
    WallGravityLoad = WallTribArea * DeadLoads + np.array(WallSelfWeight)
    WallDeadLoads = DeadLoads * FloorArea + np.array(WallSelfWeight)
    WallLiveLoads = LiveLoads * FloorArea
    PDeltaGravityLoad = Mass - WallGravityLoad

    if Basement is not None:
        Height = YGrids[-1] - YGrids[len(Basement.FloorStiffnesses)]
    else:
        Height = YGrids[-1]

    # Seismic Hazard
    R = 6; Cd = 5
    if Use2008Maps:
        SaDesign, Sds, CuTa = GetSeattle2008Hazard(Height, R=R, Overstrength=Overstrength)
    else:
        SaDesign, Sds, CuTa = GetSeattle2014Hazard(Height, R=R, Overstrength=Overstrength)

    if Basement is not None:
        archetypename = ArchetypeData(Name, YGrids, R, CuTa, Length, Thickness, None, None, None,
                                      fpc_core, fy, fu, PDeltaGravityLoad, Mass, WallGravityLoad,
                                      None, None, None, Sections, CuTa=CuTa, SaDesign=SaDesign,
                                      Cd=Cd, BasementProperties=Basement,
                                      WallDeadLoads=list(WallDeadLoads), WallLiveLoads=list(WallLiveLoads), Sds=Sds)
    else:
        archetypename = ArchetypeData(Name, YGrids, R, CuTa, Length, Thickness, None, None, None,
                                      fpc_core, fy, fu, PDeltaGravityLoad, Mass, WallGravityLoad,
                                      None, None, None, Sections, CuTa=CuTa, SaDesign=SaDesign,
                                      Cd=Cd, WallDeadLoads=list(WallDeadLoads), WallLiveLoads=list(WallLiveLoads), Sds=Sds)

    return archetypename
BasementFloorStiffnesses = np.array([8200, 8200, 8200, 10100]) * 0.5
BasementWallStiffnesses = np.array([0.0496e9, 0.0496e9, 0.0496e9, 0.0496e9, ]) * 0.5
BasementMass = (np.array(DL_Basements) + 0.5 * np.array(LL_Basements)) * ( BasementFloorArea - FloorArea ) / 1000.
Basements = Basement(BasementFloorStiffnesses, BasementWallStiffnesses, BasementMass)
BasementFloorStiffnesses = np.array([8200, 8200, 10100]) * 0.5
BasementWallStiffnesses = np.array([0.0496e9, 0.0496e9, 0.0496e9, ]) * 0.5
BasementMass = (np.array(DL_Basements[1:]) + 0.5 * np.array(LL_Basements[1:])) * ( BasementFloorArea - FloorArea ) / 1000.
Basements3Levels = Basement(BasementFloorStiffnesses, BasementWallStiffnesses, BasementMass)
BasementFloorStiffnesses = np.array([8200, 10100]) * 0.5
BasementWallStiffnesses = np.array([0.0496e9, 0.0496e9 ]) * 0.5
BasementMass = (np.array(DL_Basements[2:]) + 0.5 * np.array(LL_Basements[2:])) * ( BasementFloorArea - FloorArea ) / 1000.
Basements2Levels = Basement(BasementFloorStiffnesses, BasementWallStiffnesses, BasementMass)
####################################################################################
#region Defining Archetype
####################################################################################
Archetypes = []
import ASCEHelper
############################### Performance Group #1 ###############################
# 2008 Maps
#region Archetype S4H08SEA and S4H08SEAWB
Name = 'S4H08SEA'
# print 'Importing Archetype: ' + Name
# Compute Seismic Weight
NoOfStories = 4
YGrids = [0] + np.array(np.arange(0,(NoOfStories)*13*12, 13*12)+15*12).tolist()
DeadLoads = np.ones(NoOfStories) * DL / 1000.
DeadLoads[-1] = DeadLoads[-1] * DL_Roof / DL
LiveLoads = np.ones(NoOfStories) * LL / 1000.
LiveLoads[-1] = LiveLoads[-1] * LL_Roof / LL
MassPerSqFt = DL / 1000.
Mass = np.ones(NoOfStories) * MassPerSqFt * FloorArea
Mass[-1] = FloorArea * DL_Roof / 1000. # Adjust for Roof Weight
WallTribArea = FloorArea * 0.5
WeightPerSqFt = DL
BuildingWeight = np.ones(NoOfStories) * WeightPerSqFt * FloorArea
BuildingWeight[-1] = 152. / 1000. * FloorArea # Adjust for Roof Weight
# Seismic Hazard
R = 6; Cd = 5
SaDesign, Sds, CuTa = GetSeattle2008Hazard(YGrids[-1], R=R)
Thickness = 14.
Length = 14. * 12.
Long_Spacing = 4
NoOfCols = 10
BarSize = 8.
Ag = ( (NoOfCols - 1) * Long_Spacing + 6 ) * Thickness
Rho = ( NoOfCols * 2 + 2 ) * np.pi * ( BarSize / 2. / 8.) ** 2. / Ag
# print Rho
Section1 = PlanarWallSection(Length, Thickness,
(NoOfCols - 1) * Long_Spacing + 6,
(NoOfCols - 1) * Long_Spacing + 6, BarSize,
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
0.255, 4.037, fpc_core, fy, fu, 3, 4., NoOfCols, 3)
NoOfCols = 6
BarSize = 8.0
Ag = ( (NoOfCols - 1) * Long_Spacing + 6 ) * Thickness
Rho = ( NoOfCols * 2 + 2 ) * np.pi * ( BarSize / 2. / 8.) ** 2. / Ag
# print Rho
Section2 = PlanarWallSection(Length, Thickness,
(NoOfCols - 1) * Long_Spacing + 6,
(NoOfCols - 1) * Long_Spacing + 6, BarSize,
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
0.255, 4.037, fpc_core, fy, fu, 3, 4., 8, 3)
Section3 = PlanarWallSection(Length, Thickness, 0, 0, 10.173,
[],
[],
0.255, 4.037, fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section2, Section2]
S4H08SEA = CreateArchetype()
Archetypes.append(S4H08SEA)
Name = 'S4H08SEAWB'
NoOfStories = 6
Sections = [
Section1, Section1,
Section1, Section1, Section2, Section2
]
S4H08SEAWB = CreateArchetype(Basements2Levels)
Archetypes.append(S4H08SEAWB)
#endregion
#region Archetype S8H08SEA and S8H08SEAWB
Name = 'S8H08SEA'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 8
Thickness = 14.
Length = 16. * 12.
Flange_Thickness = 8*12. # Assume 6' Long Core
Long_Spacing = 4
BarSize = 7.0
Rho = 0.9 #In Fraction
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 5.0
Rho = 0.55 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3,
]
S8H08SEA = CreateArchetype()
Archetypes.append(S8H08SEA)
Name = 'S8H08SEAWB'
NoOfStories = 11
Sections = [
Section1, Section1, Section1,
Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3,
]
S8H08SEAWB = CreateArchetype(Basements3Levels)
Archetypes.append(S8H08SEAWB)
#endregion
#region Archetype S12H08SEA and S12H08SEAWB
Name = 'S12H08SEA'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 12
Thickness = 14.
Length = 20. * 12.
Flange_Thickness = 10.0*12. # Assume 6' Long Core
Long_Spacing = 4
BarSize = 5.0
Rho = 0.50 #In Fraction
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 5.0
Rho = 0.50 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 14.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.35 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3, Section3,
Section4, Section4, Section4,
]
S12H08SEA = CreateArchetype()
Archetypes.append(S12H08SEA)
Name = 'S12H08SEAWB'
NoOfStories = 16
Sections = [
Section1, Section1, Section1, Section1,
Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3, Section3,
Section4, Section4, Section4,
]
S12H08SEAWB = CreateArchetype(Basements)
Archetypes.append(S12H08SEAWB)
#endregion
#region Archetype S16H08SEA and S16H08SEAWB
Name = 'S16H08SEA'
#### Input Variables
NoOfStories = 16
Thickness = 14.
Length = 22. * 12.
Flange_Thickness = 11.*12. # Assume 6' Long Core
Long_Spacing = 4
BarSize = 6.0
Rho = 0.5 #In Fraction
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 5.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 14.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
]
S16H08SEA = CreateArchetype()
Archetypes.append(S16H08SEA)
Name = 'S16H08SEAWB'
NoOfStories = 20 # Include Basement Floors Here
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
]
S16H08SEAWB = CreateArchetype(Basements)
Archetypes.append(S16H08SEAWB)
#endregion
#region Archetype S20H08SEA and S20H08SEAWB
Name = 'S20H08SEA'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 20
Thickness = 14.
Length = 24. * 12.
Flange_Thickness = 12*12. # Assume 6' Long Core
Long_Spacing = 4
BarSize = 6.0
Rho = 0.5 #In Fraction
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 5.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 14.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho =0.35 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
ThicknessBelow = float(Thickness)
Thickness = 14.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
]
S20H08SEA = CreateArchetype()
Archetypes.append(S20H08SEA)
Name = 'S20H08SEAWB'
NoOfStories = 24 # Include Basement Floors Here
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
]
S20H08SEAWB = CreateArchetype(Basements)
Archetypes.append(S20H08SEAWB)
#endregion
#region Archetype S24H08SEA and S24H08SEAWB
Name = 'S24H08SEA'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 24
Thickness = 18
Length = 26. * 12.
Flange_Thickness = 13.*12. # Assume 6' Long Core
Long_Spacing = 4
BarSize = 7.0
Rho = 1.0 #In Fraction
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 6.0
Rho = 0.75 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2., Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 18.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.60 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2., Thickness - 3.5)
BarSize = 5.0
Rho = 0.50 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2., Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 14.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 5.0
Rho = 0.50 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2., Thickness - 3.5)
BarSize = 5.0
Rho = 0.50 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section6 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
]
S24H08SEA = CreateArchetype()
Archetypes.append(S24H08SEA)
Name = 'S24H08SEAWB'
NoOfStories = 28
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
]
S24H08SEAWB = CreateArchetype(Basements)
Archetypes.append(S24H08SEAWB)
#endregion
#region Archetype S28H08SEA and S28H08SEAWB
Name = 'S28H08SEA'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 28
Thickness = 18.
Length = 28. * 12.
Flange_Thickness = 14*12. # Assume 6' Long Core
Long_Spacing = 4
BarSize = 7.0
Rho = 0.85 #In Fraction
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 6.0
Rho = 0.6 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 18.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 16.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section6 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 16.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section7 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
Sections = [
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
Section7, Section7, Section7, Section7,
]
S28H08SEA = CreateArchetype()
Archetypes.append(S28H08SEA)
Name = 'S28H08SEAWB'
NoOfStories = 32
Sections = [
Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
Section7, Section7, Section7, Section7,
]
S28H08SEAWB = CreateArchetype(Basements)
Archetypes.append(S28H08SEAWB)
#endregion
#region Archetype S32H08SEA and S32H08SEAWB
Name = 'S32H08SEA'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 32
Thickness = 20.
Length = 30. * 12.
Flange_Thickness = 15*12. # Assume 15' Long Core
Long_Spacing = 4
BarSize = 6.0
Rho = 0.75 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 5.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2., Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 20.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 5.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 5.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 18.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 5.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 5.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section6 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 18.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 5.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section7 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 5.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section8 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
Sections = [
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
Section7, Section7, Section7, Section7,
Section8, Section8, Section8, Section8,
]
S32H08SEA = CreateArchetype()
Archetypes.append(S32H08SEA)
Name = 'S32H08SEAWB'
NoOfStories = 36
Sections = [
Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
Section7, Section7, Section7, Section7,
Section8, Section8, Section8, Section8,
]
S32H08SEAWB = CreateArchetype(Basements)
Archetypes.append(S32H08SEAWB)
#endregion
#region Archetype S36H08SEA and S36H08SEAWB
Name = 'S36H08SEA'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 36
Thickness = 22.
Length = 32. * 12.
Flange_Thickness = 16*12. # Assume 16' Long Core
Long_Spacing = 4
BarSize = 6.0
Rho = 0.6 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 22.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 16.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section6 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 16.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section7 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section8 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 16.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section9 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
Sections = [
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
Section7, Section7, Section7, Section7,
Section8, Section8, Section8, Section8,
Section9, Section9, Section9, Section9,
]
S36H08SEA = CreateArchetype()
Archetypes.append(S36H08SEA)
Name = 'S36H08SEAWB'
NoOfStories = 40
Sections = [
Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
Section7, Section7, Section7, Section7,
Section8, Section8, Section8, Section8,
Section9, Section9, Section9, Section9,
]
S36H08SEAWB = CreateArchetype(Basements)
Archetypes.append(S36H08SEAWB)
#endregion
#region Archetype S40H08SEA and S40H08SEAWB
Name = 'S40H08SEA'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 40
Thickness = 24.
Length = 34. * 12.
Flange_Thickness = 17.*12. # Assume 17' Long Core
Long_Spacing = 4
BarSize = 6.0
Rho = 0.6 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 6.0
Rho = 0.6 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 24.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 18.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section6 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 18.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section7 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section8 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 16.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section9 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section10 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
Sections = [
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
Section7, Section7, Section7, Section7,
Section8, Section8, Section8, Section8,
Section9, Section9, Section9, Section9,
Section10, Section10, Section10, Section10,
]
S40H08SEA = CreateArchetype()
Archetypes.append(S40H08SEA)
Name = 'S40H08SEAWB'
NoOfStories = 44
Sections = [
Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
Section7, Section7, Section7, Section7,
Section8, Section8, Section8, Section8,
Section9, Section9, Section9, Section9,
Section10, Section10, Section10, Section10,
]
S40H08SEAWB = CreateArchetype(Basements)
Archetypes.append(S40H08SEAWB)
#endregion
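# Where the core wall steps down in thickness between section groups, the
# definitions above shorten the web length by twice the thickness change:
# Length = Length - (ThicknessBelow - Thickness) * 2. A restatement of that
# arithmetic as a sketch (hypothetical helper name, not used above):
def _length_at_step(length_below, thickness_below, thickness_above):
    # each of the two wall faces shifts by the reduction in thickness
    return length_below - (thickness_below - thickness_above) * 2.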
# 2014 Maps
#region Archetype S4H14SEA and S4H14SEAWB
Name = 'S4H14SEA'
# print 'Importing Archetype: ' + Name
# Compute Seismic Weight
NoOfStories = 4
YGrids = [0] + np.array(np.arange(0,(NoOfStories)*13*12, 13*12)+15*12).tolist()
DeadLoads = np.ones(NoOfStories) * DL / 1000.
DeadLoads[-1] = DeadLoads[-1] * DL_Roof / DL
LiveLoads = np.ones(NoOfStories) * LL / 1000.
LiveLoads[-1] = LiveLoads[-1] * LL_Roof / LL
MassPerSqFt = DL / 1000.
Mass = np.ones(NoOfStories) * MassPerSqFt * FloorArea
Mass[-1] = FloorArea * DL_Roof / 1000. # Adjust for Roof Weight
WallTribArea = FloorArea * 0.5
WeightPerSqFt = DL
BuildingWeight = np.ones(NoOfStories) * WeightPerSqFt * FloorArea
BuildingWeight[-1] = 152. / 1000. * FloorArea # Adjust for Roof Weight
# Seismic Hazard
R = 6; Cd = 5
SaDesign, Sds, CuTa = GetSeattle2008Hazard(YGrids[-1], R=R)
Thickness = 18.
Length = 16. * 12.
Long_Spacing = 4
NoOfCols = 13
BarSize = 8.
Ag = ( (NoOfCols - 1) * Long_Spacing + 6 ) * Thickness
Rho = ( NoOfCols * 2 + 2 ) * np.pi * ( BarSize / 2. / 8.) ** 2. / Ag
# print Rho
Section1 = PlanarWallSection(Length, Thickness,
(NoOfCols - 1) * Long_Spacing + 6,
(NoOfCols - 1) * Long_Spacing + 6, BarSize,
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
0.255, 4.037, fpc_core, fy, fu, 3, 4., NoOfCols, 3)
NoOfCols = 8
BarSize = 8.0
Ag = ( (NoOfCols - 1) * Long_Spacing + 6 ) * Thickness
Rho = ( NoOfCols * 2 + 2 ) * np.pi * ( BarSize / 2. / 8.) ** 2. / Ag
# print Rho
Section2 = PlanarWallSection(Length, Thickness,
(NoOfCols - 1) * Long_Spacing + 6,
(NoOfCols - 1) * Long_Spacing + 6, BarSize,
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
0.255, 4.037, fpc_core, fy, fu, 3, 4., 8, 3)
Section3 = PlanarWallSection(Length, Thickness, 0, 0, 10.173,
[],
[],
0.255, 4.037, fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section2, Section2]
S4H14SEA = CreateArchetype(Use2008Maps = False)
Archetypes.append(S4H14SEA)
Name = 'S4H14SEAWB'
NoOfStories = 6
Sections = [
Section1, Section1,
Section1, Section1, Section2, Section2
]
S4H14SEAWB = CreateArchetype(Basements2Levels, Use2008Maps = False)
Archetypes.append(S4H14SEAWB)
#endregion
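# The planar sections above back-compute the boundary-element reinforcement
# ratio from the bar layout: (NoOfCols * 2 + 2) bars over a gross area of
# ((NoOfCols - 1) * Long_Spacing + 6) * Thickness, as checked by the
# commented-out `# print Rho` lines. An illustrative sketch of that check
# (hypothetical helper name):
def _planar_boundary_rho(no_of_cols, bar_size_eighths, long_spacing, thickness_in):
    ag = ((no_of_cols - 1) * long_spacing + 6.) * thickness_in  # gross area, in^2
    n_bars = no_of_cols * 2 + 2  # two bars per column plus one extra pair
    return n_bars * np.pi * (bar_size_eighths / 2. / 8.) ** 2. / ag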
#region Archetype S8H14SEA and S8H14SEAWB
Name = 'S8H14SEA'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 8
Thickness = 16.
Length = 18. * 12.
Flange_Thickness = 9.*12. # Assume 9' Long Core
Long_Spacing = 4
BarSize = 7.0
Rho = 0.95 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 5.0
Rho = 0.70 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3,
]
S8H14SEA = CreateArchetype(Use2008Maps = False)
Archetypes.append(S8H14SEA)
Name = 'S8H14SEAWB'
NoOfStories = 11
Sections = [
Section1, Section1, Section1,
Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3,
]
S8H14SEAWB = CreateArchetype(Basements3Levels, Use2008Maps = False)
Archetypes.append(S8H14SEAWB)
#endregion
#region Archetype S12H14SEA and S12H14SEAWB
Name = 'S12H14SEA'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 12
Thickness = 18.
Length = 20. * 12.
Flange_Thickness = 10.*12. # Assume 10' Long Core
Long_Spacing = 4
BarSize = 6.0
Rho = 0.85 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 6.0
Rho = 0.6 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 18.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.40 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3, Section3,
Section4, Section4, Section4,
]
S12H14SEA = CreateArchetype(Use2008Maps = False)
Archetypes.append(S12H14SEA)
Name = 'S12H14SEAWB'
NoOfStories = 16
Sections = [
Section1, Section1, Section1, Section1,
Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3, Section3,
Section4, Section4, Section4,
]
S12H14SEAWB = CreateArchetype(Basements, Use2008Maps = False)
Archetypes.append(S12H14SEAWB)
#endregion
#region Archetype S16H14SEA and S16H14SEAWB
Name = 'S16H14SEA'
#### Input Variables
NoOfStories = 16
Thickness = 22.
Length = 24. * 12.
Flange_Thickness = 12.*12. # Assume 12' Long Core
Long_Spacing = 4
BarSize = 7.0
Rho = 0.6 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 22.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 5.0
Rho = 0.40 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
]
S16H14SEA = CreateArchetype(Use2008Maps = False)
Archetypes.append(S16H14SEA)
Name = 'S16H14SEAWB'
NoOfStories = 20 # Include Basement Floors Here
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
]
S16H14SEAWB = CreateArchetype(Basements, Use2008Maps = False)
Archetypes.append(S16H14SEAWB)
#endregion
#region Archetype S20H14SEA and S20H14SEAWB
Name = 'S20H14SEA'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 20
Thickness = 24.
Length = 26. * 12.
Flange_Thickness = 13*12. # Assume 13' Long Core
Long_Spacing = 4
BarSize = 7.0
Rho = 0.55 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 5.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 24.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 5.0
Rho = 0.45 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
ThicknessBelow = float(Thickness)
Thickness = 20.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
]
S20H14SEA = CreateArchetype(Use2008Maps = False)
Archetypes.append(S20H14SEA)
Name = 'S20H14SEAWB'
NoOfStories = 24 # Include Basement Floors Here
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
]
S20H14SEAWB = CreateArchetype(Basements, Use2008Maps = False)
Archetypes.append(S20H14SEAWB)
#endregion
#region Archetype S24H14SEA and S24H14SEAWB
Name = 'S24H14SEA'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 24
Thickness = 26.
Length = 28. * 12.
Flange_Thickness = 14.*12. # Assume 14' Long Core
Long_Spacing = 4
BarSize = 8.0
Rho = 1.1 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 7.0
Rho = 0.75 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 26.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.6 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 22.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section6 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
]
S24H14SEA = CreateArchetype(Use2008Maps = False)
Archetypes.append(S24H14SEA)
Name = 'S24H14SEAWB'
NoOfStories = 28
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
]
S24H14SEAWB = CreateArchetype(Basements, Use2008Maps = False)
Archetypes.append(S24H14SEAWB)
#endregion
#region Archetype S28H14SEA and S28H14SEAWB
Name = 'S28H14SEA'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 28
Thickness = 28.
Length = 30. * 12.
Flange_Thickness = 15*12. # Assume 15' Long Core
Long_Spacing = 4
BarSize = 7.0
Rho = 0.95 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 7.0
Rho = 0.7 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 28.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.6 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 24.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section6 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section7 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
Sections = [
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
Section7, Section7, Section7, Section7,
]
S28H14SEA = CreateArchetype(Use2008Maps = False)
Archetypes.append(S28H14SEA)
Name = 'S28H14SEAWB'
NoOfStories = 32
Sections = [
Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
Section7, Section7, Section7, Section7,
]
S28H14SEAWB = CreateArchetype(Basements, Use2008Maps = False)
Archetypes.append(S28H14SEAWB)
#endregion
#region Archetype S32H14SEA and S32H14SEAWB
Name = 'S32H14SEA'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 32
Thickness = 30.
Length = 32. * 12.
Flange_Thickness = 16.*12. # Assume 16' Long Core
Long_Spacing = 4
BarSize = 7.0
Rho = 0.95 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 7.0
Rho = 0.8 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 30.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 7.0
Rho = 0.7 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 5.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 26.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section6 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 26.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section7 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section8 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
Sections = [
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
Section7, Section7, Section7, Section7,
Section8, Section8, Section8, Section8,
]
S32H14SEA = CreateArchetype(Use2008Maps = False)
Archetypes.append(S32H14SEA)
Name = 'S32H14SEAWB'
NoOfStories = 36
Sections = [
Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
Section7, Section7, Section7, Section7,
Section8, Section8, Section8, Section8,
]
S32H14SEAWB = CreateArchetype(Basements, Use2008Maps = False)
Archetypes.append(S32H14SEAWB)
#endregion
#region Archetype S36H14SEA and S36H14SEAWB
Name = 'S36H14SEA'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 36
Thickness = 32.
Length = 34. * 12.
Flange_Thickness = 17.*12. # Assume 17' Long Core
Long_Spacing = 4
BarSize = 8.0
Rho = 1.1 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 8.0
Rho = 0.8 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 32.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 7.0
Rho = 0.7 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 7.0
Rho = 0.6 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 28.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section6 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 28.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section7 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section8 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section9 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
Sections = [
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
Section7, Section7, Section7, Section7,
Section8, Section8, Section8, Section8,
Section9, Section9, Section9, Section9,
]
S36H14SEA = CreateArchetype(Use2008Maps = False)
Archetypes.append(S36H14SEA)
Name = 'S36H14SEAWB'
NoOfStories = 40
Sections = [
Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
Section7, Section7, Section7, Section7,
Section8, Section8, Section8, Section8,
Section9, Section9, Section9, Section9,
]
S36H14SEAWB = CreateArchetype(Basements, False)
Archetypes.append(S36H14SEAWB)
#endregion
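# The Abar/Spacing arithmetic above repeats verbatim for every section. A
# minimal sketch of it as a single helper, for reference only: the name
# `bar_spacing` is hypothetical and this function is not used by the script.

```python
import math

def bar_spacing(bar_size, thickness, rho_percent):
    """Spacing (in) of two curtains of longitudinal bars that give a target
    reinforcement ratio. bar_size is in eighths of an inch (8.0 -> a #8 bar);
    rho_percent is the ratio in percent, matching the Rho values above."""
    abar = math.pi * (bar_size / 8. / 2.) ** 2.   # area of one bar, in^2
    return abar * 2. / thickness / rho_percent * 100.
```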
#region Archetype S40H14SEA and S40H14SEAWB
Name = 'S40H14SEA'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 40
Thickness = 34.
Length = 36. * 12.
Flange_Thickness = 18.*12. # Flange length taken as half the wall length
Long_Spacing = 4
BarSize = 8.0
Rho = 1.2 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 8.0
Rho = 1.0 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 34.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 7.0
Rho = 0.8 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.8 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 28.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 5.0
Rho = 0.7 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section6 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 28.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section7 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section8 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 24.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section9 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section10 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
Sections = [
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
Section7, Section7, Section7, Section7,
Section8, Section8, Section8, Section8,
Section9, Section9, Section9, Section9,
Section10, Section10, Section10, Section10,
]
S40H14SEA = CreateArchetype(Use2008Maps = False)
Archetypes.append(S40H14SEA)
Name = 'S40H14SEAWB'
NoOfStories = 44
Sections = [
Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
Section7, Section7, Section7, Section7,
Section8, Section8, Section8, Section8,
Section9, Section9, Section9, Section9,
Section10, Section10, Section10, Section10,
]
S40H14SEAWB = CreateArchetype(Basements, False)
Archetypes.append(S40H14SEAWB)
#endregion
############################### Performance Group #2 ###############################
##### 2008 Maps ######
#region Archetype S4H08SEAPG2 and S4H08SEAWBPG2
Name = 'S4H08SEAPG2'
# print 'Importing Archetype: ' + Name
# Compute Seismic Weight
NoOfStories = 4
YGrids = [0] + np.array(np.arange(0,(NoOfStories)*13*12, 13*12)+15*12).tolist()
DeadLoads = np.ones(NoOfStories) * DL / 1000.
DeadLoads[-1] = DeadLoads[-1] * DL_Roof / DL
LiveLoads = np.ones(NoOfStories) * LL / 1000.
LiveLoads[-1] = LiveLoads[-1] * LL_Roof / LL
MassPerSqFt = DL / 1000.
Mass = np.ones(NoOfStories) * MassPerSqFt * FloorArea
Mass[-1] = FloorArea * DL_Roof / 1000. # Adjust for Roof Weight
WallTribArea = FloorArea * 0.5
WeightPerSqFt = DL
BuildingWeight = np.ones(NoOfStories) * WeightPerSqFt * FloorArea
BuildingWeight[-1] = 152. / 1000. * FloorArea # Adjust for Roof Weight
# Seismic Hazard
R = 6; Cd = 5
SaDesign, Sds, CuTa = GetSeattle2008Hazard(YGrids[-1], R=R)
Thickness = 14.
Length = 10. * 12.
Long_Spacing = 4
NoOfCols = 14
BarSize = 8.
Ag = ( (NoOfCols - 1) * Long_Spacing + 6 ) * Thickness
Rho = ( NoOfCols * 2 + 2 ) * np.pi * ( BarSize / 2. / 8.) ** 2. / Ag
# print Rho
Section1 = PlanarWallSection(Length, Thickness,
(NoOfCols - 1) * Long_Spacing + 6,
(NoOfCols - 1) * Long_Spacing + 6, BarSize,
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
0.255, 4.037, fpc_core, fy, fu, 3, 4., NoOfCols, 3)
NoOfCols = 6
BarSize = 8.0
Ag = ( (NoOfCols - 1) * Long_Spacing + 6 ) * Thickness
Rho = ( NoOfCols * 2 + 2 ) * np.pi * ( BarSize / 2. / 8.) ** 2. / Ag
# print Rho
Section2 = PlanarWallSection(Length, Thickness,
(NoOfCols - 1) * Long_Spacing + 6,
(NoOfCols - 1) * Long_Spacing + 6, BarSize,
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
0.255, 4.037, fpc_core, fy, fu, 3, 4., 8, 3)
Section3 = PlanarWallSection(Length, Thickness, 0, 0, 10.173,
[],
[],
0.255, 4.037, fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section2, Section2]
S4H08SEAPG2 = CreateArchetype()
Archetypes.append(S4H08SEAPG2)
Name = 'S4H08SEAWBPG2'
NoOfStories = 6
Sections = [
Section1, Section1,
Section1, Section1, Section2, Section2
]
S4H08SEAWBPG2 = CreateArchetype(Basements2Levels)
Archetypes.append(S4H08SEAWBPG2)
#endregion
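# The boundary-element reinforcement ratio printed (commented) above follows
# one formula for every planar section. A sketch as a helper; `boundary_rho`
# is a hypothetical name and is not used by this script.

```python
import math

def boundary_rho(no_of_cols, long_spacing, bar_size, thickness):
    """Reinforcement ratio of a planar-wall boundary element: no_of_cols
    columns of bars at long_spacing (in) plus 3 in total edge distance,
    two curtains plus one extra bar per curtain; bar_size is in eighths
    of an inch (8.0 -> a #8 bar)."""
    ag = ((no_of_cols - 1) * long_spacing + 6) * thickness  # gross area, in^2
    n_bars = no_of_cols * 2 + 2
    return n_bars * math.pi * (bar_size / 2. / 8.) ** 2. / ag
```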
#region Archetype S8H08SEAPG2 and S8H08SEAWBPG2
Name = 'S8H08SEAPG2'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 8
Thickness = 20.
Length = 11. * 12.
Flange_Thickness = 5.5*12. # Flange length taken as half the wall length
Long_Spacing = 4
BarSize = 8.0
Rho = 2.0 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 8.0
Rho = 1.1 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3,
]
S8H08SEAPG2 = CreateArchetype()
Archetypes.append(S8H08SEAPG2)
Name = 'S8H08SEAWBPG2'
NoOfStories = 11
Sections = [
Section1, Section1, Section1,
Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3,
]
S8H08SEAWBPG2 = CreateArchetype(Basements3Levels)
Archetypes.append(S8H08SEAWBPG2)
#endregion
#region Archetype S12H08SEAPG2 and S12H08SEAWBPG2
Name = 'S12H08SEAPG2'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 12
Thickness = 20.
Length = 14. * 12.
Flange_Thickness = 7*12. # Flange length taken as half the wall length
Long_Spacing = 4
BarSize = 8.0
Rho = 1.6 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 8.0
Rho = 1.0 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 16.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.45 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3, Section3,
Section4, Section4, Section4,
]
S12H08SEAPG2 = CreateArchetype()
Archetypes.append(S12H08SEAPG2)
Name = 'S12H08SEAWBPG2'
NoOfStories = 16
Sections = [
Section1, Section1, Section1, Section1,
Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3, Section3,
Section4, Section4, Section4,
]
S12H08SEAWBPG2 = CreateArchetype(Basements)
Archetypes.append(S12H08SEAWBPG2)
#endregion
#region Archetype S16H08SEAPG2 and S16H08SEAWBPG2
Name = 'S16H08SEAPG2'
#### Input Variables
NoOfStories = 16
Thickness = 22.
Length = 16. * 12.
Flange_Thickness = 8.*12. # Flange length taken as half the wall length
Long_Spacing = 4
BarSize = 8.0
Rho = 1.4 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 8.0
Rho = 1.0 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 16.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.35 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
]
S16H08SEAPG2 = CreateArchetype()
Archetypes.append(S16H08SEAPG2)
Name = 'S16H08SEAWBPG2'
NoOfStories = 20 # Include Basement Floors Here
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
]
S16H08SEAWBPG2 = CreateArchetype(Basements)
Archetypes.append(S16H08SEAWBPG2)
#endregion
#region Archetype S20H08SEAPG2 and S20H08SEAWBPG2
Name = 'S20H08SEAPG2'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 20
Thickness = 24.
Length = 18. * 12.
Flange_Thickness = 9*12. # Flange length taken as half the wall length
Long_Spacing = 4
BarSize = 8.0
Rho = 1.2 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 8.0
Rho = 0.9 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 18.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
ThicknessBelow = float(Thickness)
Thickness = 18.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
]
S20H08SEAPG2 = CreateArchetype()
Archetypes.append(S20H08SEAPG2)
Name = 'S20H08SEAWBPG2'
NoOfStories = 24 # Include Basement Floors Here
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
]
S20H08SEAWBPG2 = CreateArchetype(Basements)
Archetypes.append(S20H08SEAWBPG2)
#endregion
#region Archetype S24H08SEAPG2 and S24H08SEAWBPG2
Name = 'S24H08SEAPG2'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 24
Thickness = 28.
Length = 21. * 12.
Flange_Thickness = 10.5*12. # Flange length taken as half the wall length
Long_Spacing = 4
BarSize = 5.0
Rho = 0.7 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 5.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2., Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 18.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2., Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
ThicknessBelow = float(Thickness)
Thickness = 18.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section6 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
]
S24H08SEAPG2 = CreateArchetype()
Archetypes.append(S24H08SEAPG2)
Name = 'S24H08SEAWBPG2'
NoOfStories = 28
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
]
S24H08SEAWBPG2 = CreateArchetype(Basements)
Archetypes.append(S24H08SEAWBPG2)
#endregion
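# Every Sections list above repeats each section once per story it governs
# (four stories per section here), and the basement ("WB") variants prepend
# extra Section1 floors. A sketch of that expansion; `repeat_sections` is a
# hypothetical helper, not used by this script.

```python
def repeat_sections(sections, per_section, extra_top=0):
    """Flat story list, bottom to top: sections[0] an extra extra_top times
    (basement levels), then each section repeated per_section times."""
    stories = [sections[0]] * extra_top
    for s in sections:
        stories += [s] * per_section
    return stories
```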
##### 2014 Maps ######
#region Archetype S4H14SEAPG2 and S4H14SEAWBPG2
Name = 'S4H14SEAPG2'
# print 'Importing Archetype: ' + Name
# Compute Seismic Weight
NoOfStories = 4
YGrids = [0] + np.array(np.arange(0,(NoOfStories)*13*12, 13*12)+15*12).tolist()
DeadLoads = np.ones(NoOfStories) * DL / 1000.
DeadLoads[-1] = DeadLoads[-1] * DL_Roof / DL
LiveLoads = np.ones(NoOfStories) * LL / 1000.
LiveLoads[-1] = LiveLoads[-1] * LL_Roof / LL
MassPerSqFt = DL / 1000.
Mass = np.ones(NoOfStories) * MassPerSqFt * FloorArea
Mass[-1] = FloorArea * DL_Roof / 1000. # Adjust for Roof Weight
WallTribArea = FloorArea * 0.5
WeightPerSqFt = DL
BuildingWeight = np.ones(NoOfStories) * WeightPerSqFt * FloorArea
BuildingWeight[-1] = 152. / 1000. * FloorArea # Adjust for Roof Weight
# Seismic Hazard
R = 6; Cd = 5
SaDesign, Sds, CuTa = GetSeattle2008Hazard(YGrids[-1], R=R)
Thickness = 18.
Length = 12. * 12.
Long_Spacing = 4
NoOfCols = 12
BarSize = 10.
Ag = ( (NoOfCols - 1) * Long_Spacing + 6 ) * Thickness
Rho = ( NoOfCols * 2 + 2 ) * np.pi * ( BarSize / 2. / 8.) ** 2. / Ag
# print Rho
Section1 = PlanarWallSection(Length, Thickness,
(NoOfCols - 1) * Long_Spacing + 6,
(NoOfCols - 1) * Long_Spacing + 6, BarSize,
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
0.255, 4.037, fpc_core, fy, fu, 3, 4., NoOfCols, 3)
NoOfCols = 10
BarSize = 8.0
Ag = ( (NoOfCols - 1) * Long_Spacing + 6 ) * Thickness
Rho = ( NoOfCols * 2 + 2 ) * np.pi * ( BarSize / 2. / 8.) ** 2. / Ag
# print Rho
Section2 = PlanarWallSection(Length, Thickness,
(NoOfCols - 1) * Long_Spacing + 6,
(NoOfCols - 1) * Long_Spacing + 6, BarSize,
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
0.255, 4.037, fpc_core, fy, fu, 3, 4., 8, 3)
Section3 = PlanarWallSection(Length, Thickness, 0, 0, 10.173,
[],
[],
0.255, 4.037, fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section2, Section2]
S4H14SEAPG2 = CreateArchetype(Use2008Maps = False)
Archetypes.append(S4H14SEAPG2)
Name = 'S4H14SEAWBPG2'
NoOfStories = 6
Sections = [
Section1, Section1,
Section1, Section1, Section2, Section2
]
S4H14SEAWBPG2 = CreateArchetype(Basements2Levels, Use2008Maps = False)
Archetypes.append(S4H14SEAWBPG2)
#endregion
#region Archetype S8H14SEAPG2 and S8H14SEAWBPG2
Name = 'S8H14SEAPG2'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 8
Thickness = 24.
Length = 12. * 12.
Flange_Thickness = 6*12. # Flange length taken as half the wall length
Long_Spacing = 4
BarSize = 8.0
Rho = 2.0 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 8.0
Rho = 1.0 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3,
]
S8H14SEAPG2 = CreateArchetype(Use2008Maps = False)
Archetypes.append(S8H14SEAPG2)
Name = 'S8H14SEAWBPG2'
NoOfStories = 11
Sections = [
Section1, Section1, Section1,
Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3,
]
S8H14SEAWBPG2 = CreateArchetype(Basements3Levels, Use2008Maps = False)
Archetypes.append(S8H14SEAWBPG2)
#endregion
#region Archetype S12H14SEAPG2 and S12H14SEAWBPG2
Name = 'S12H14SEAPG2'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 12
Thickness = 24.
Length = 15. * 12.
Flange_Thickness = 7.5*12. # Flange length taken as half the wall length
Long_Spacing = 4
BarSize = 8.0
Rho = 1.6 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 8.0
Rho = 1.2 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 18.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.7 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3, Section3,
Section4, Section4, Section4,
]
S12H14SEAPG2 = CreateArchetype(Use2008Maps = False)
Archetypes.append(S12H14SEAPG2)
Name = 'S12H14SEAWBPG2'
NoOfStories = 16
Sections = [
Section1, Section1, Section1, Section1,
Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3, Section3,
Section4, Section4, Section4,
]
S12H14SEAWBPG2 = CreateArchetype(Basements, Use2008Maps = False)
Archetypes.append(S12H14SEAWBPG2)
#endregion
#region Archetype S16H14SEAPG2 and S16H14SEAWBPG2
Name = 'S16H14SEAPG2'
#### Input Variables
NoOfStories = 16
Thickness = 28.
Length = 17. * 12.
Flange_Thickness = 8.5*12. # Flange length taken as half the wall length
Long_Spacing = 4
BarSize = 8.0
Rho = 1.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 8.0
Rho = 1.0 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 20.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 5.0
Rho = 0.60 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
]
S16H14SEAPG2 = CreateArchetype(Use2008Maps = False)
Archetypes.append(S16H14SEAPG2)
Name = 'S16H14SEAWBPG2'
NoOfStories = 20 # Include Basement Floors Here
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
]
S16H14SEAWBPG2 = CreateArchetype(Basements, False)
Archetypes.append(S16H14SEAWBPG2)
#endregion
#region Archetype S20H14SEAPG2 and S20H14SEAWBPG2
Name = 'S20H14SEAPG2'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 20
Thickness = 30.
Length = 19. * 12.
Flange_Thickness = 9.5*12. # Flange length taken as half the wall length
Long_Spacing = 4
BarSize = 9.0
Rho = 1.4 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 8.0
Rho = 0.95 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 22.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.7 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
ThicknessBelow = float(Thickness)
Thickness = 22.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
]
S20H14SEAPG2 = CreateArchetype(Use2008Maps = False)
Archetypes.append(S20H14SEAPG2)
Name = 'S20H14SEAWBPG2'
NoOfStories = 24 # Include Basement Floors Here
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
]
S20H14SEAWBPG2 = CreateArchetype(Basements, False)
Archetypes.append(S20H14SEAWBPG2)
#endregion
#region Archetype S24H14SEAPG2 and S24H14SEAWBPG2
Name = 'S24H14SEAPG2'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 24
Thickness = 32.
Length = 21. * 12.
Flange_Thickness = 10.5*12. # Flange length taken as half the wall length
Long_Spacing = 4
BarSize = 9.0
Rho = 1.3 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 8.0
Rho = 1.1 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2., Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 26.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 7.0
Rho = 0.8 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2., Thickness - 3.5)
BarSize = 4.0
Rho = 0.35 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
ThicknessBelow = float(Thickness)
Thickness = 26.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section6 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
]
S24H14SEAPG2 = CreateArchetype(Use2008Maps = False)
Archetypes.append(S24H14SEAPG2)
Name = 'S24H14SEAWBPG2'
NoOfStories = 28
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
]
S24H14SEAWBPG2 = CreateArchetype(Basements, False)
Archetypes.append(S24H14SEAWBPG2)
#endregion
#region Archetype S24H14SEAPG2TEST and S24H14SEAWBPG2TEST
Name = 'S24H14SEAPG2TEST'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 24
Thickness = 32.
Length = 21. * 12.
Flange_Thickness = 10.5*12. # Flange length taken as half the wall length
Long_Spacing = 4
BarSize = 9.0
Rho = 1.3 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 8.0
Rho = 1.1 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2., Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 26.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 7.0
Rho = 0.8 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2., Thickness - 3.5)
BarSize = 4.0
Rho = 0.35 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
ThicknessBelow = float(Thickness)
Thickness = 26.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section6 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
]
S24H14SEAPG2TEST = CreateArchetype(Use2008Maps = False)
Archetypes.append(S24H14SEAPG2TEST)
Name = 'S24H14SEAWBPG2TEST'
NoOfStories = 28
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
]
S24H14SEAWBPG2TEST = CreateArchetype(Basements, Use2008Maps = False)
Archetypes.append(S24H14SEAWBPG2TEST)
#endregion
##### 2014 Maps ######
# PG3 : 25% Over-strength on ASCE 7 loads
#region Archetype S4H14SEAPG3 and S4H14SEAWBPG3
Name = 'S4H14SEAPG3'
# print 'Importing Archetype: ' + Name
# Compute Seismic Weight
NoOfStories = 4
YGrids = [0] + np.array(np.arange(0,(NoOfStories)*13*12, 13*12)+15*12).tolist()
DeadLoads = np.ones(NoOfStories) * DL / 1000.
DeadLoads[-1] = DeadLoads[-1] * DL_Roof / DL
LiveLoads = np.ones(NoOfStories) * LL / 1000.
LiveLoads[-1] = LiveLoads[-1] * LL_Roof / LL
MassPerSqFt = DL / 1000.
Mass = np.ones(NoOfStories) * MassPerSqFt * FloorArea
Mass[-1] = FloorArea * DL_Roof / 1000. # Adjust for Roof Weight
WallTribArea = FloorArea * 0.5
WeightPerSqFt = DL
BuildingWeight = np.ones(NoOfStories) * WeightPerSqFt * FloorArea
BuildingWeight[-1] = 152. / 1000. * FloorArea # Adjust for Roof Weight
# Seismic Hazard
R = 6; Cd = 5
SaDesign, Sds, CuTa = GetSeattle2014Hazard(YGrids[-1], R=R, Overstrength = 1.25)
Thickness = 20.
Length = 13. * 12.
Long_Spacing = 4
NoOfCols = 16
BarSize = 10.
Ag = ( (NoOfCols - 1) * Long_Spacing + 6 ) * Thickness
Rho = ( NoOfCols * 2 + 2 ) * np.pi * ( BarSize / 2. / 8.) ** 2. / Ag
# print Rho
Section1 = PlanarWallSection(Length, Thickness,
(NoOfCols - 1) * Long_Spacing + 6,
(NoOfCols - 1) * Long_Spacing + 6, BarSize,
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
0.255, 4.037, fpc_core, fy, fu, 3, 4., NoOfCols, 3)
NoOfCols = 16
BarSize = 8.0
Ag = ( (NoOfCols - 1) * Long_Spacing + 6 ) * Thickness
Rho = ( NoOfCols * 2 + 2 ) * np.pi * ( BarSize / 2. / 8.) ** 2. / Ag
# print Rho
Section2 = PlanarWallSection(Length, Thickness,
(NoOfCols - 1) * Long_Spacing + 6,
(NoOfCols - 1) * Long_Spacing + 6, BarSize,
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
0.255, 4.037, fpc_core, fy, fu, 3, 4., 8, 3)
Section3 = PlanarWallSection(Length, Thickness, 0, 0, 10.173,
[],
[],
0.255, 4.037, fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section2, Section2]
S4H14SEAPG3 = CreateArchetype(Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S4H14SEAPG3)
Name = 'S4H14SEAWBPG3'
NoOfStories = 6
Sections = [
Section1, Section1,
Section1, Section1, Section2, Section2
]
S4H14SEAWBPG3 = CreateArchetype(Basements2Levels, Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S4H14SEAWBPG3)
#endregion
#region Archetype S8H14SEAPG3 and S8H14SEAWBPG3
Name = 'S8H14SEAPG3'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 8
Thickness = 24.
Length = 14. * 12.
Flange_Thickness = 7*12. # Assume 6' Long Core
Long_Spacing = 4
BarSize = 8.0
Rho = 2.0 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 8.0
Rho = 1.1 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.30 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3,
]
S8H14SEAPG3 = CreateArchetype(Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S8H14SEAPG3)
Name = 'S8H14SEAWBPG3'
NoOfStories = 11
Sections = [
Section1, Section1, Section1,
Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3,
]
S8H14SEAWBPG3 = CreateArchetype(Basements3Levels, Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S8H14SEAWBPG3)
#endregion
#region Archetype S12H14SEAPG3 and S12H14SEAWBPG3
Name = 'S12H14SEAPG3'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 12
Thickness = 24.
Length = 18. * 12.
Flange_Thickness = 9*12. # Assume 6' Long Core
Long_Spacing = 4
BarSize = 8.0
Rho = 1.55 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 8.0
Rho = 1.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 18.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.75 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3, Section3,
Section4, Section4, Section4,
]
S12H14SEAPG3 = CreateArchetype(Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S12H14SEAPG3)
Name = 'S12H14SEAWBPG3'
NoOfStories = 16
Sections = [
Section1, Section1, Section1, Section1,
Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3, Section3,
Section4, Section4, Section4,
]
S12H14SEAWBPG3 = CreateArchetype(Basements, Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S12H14SEAWBPG3)
#endregion
#region Archetype S16H14SEAPG3 and S16H14SEAWBPG3
Name = 'S16H14SEAPG3'
#### Input Variables
NoOfStories = 16
Thickness = 34.
Length = 22. * 12.
Flange_Thickness = 11*12. # Assume 6' Long Core
Long_Spacing = 4
BarSize = 8.0
Rho = 0.9 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 7.0
Rho = 0.8 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 24.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 5.0
Rho = 0.60 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
]
S16H14SEAPG3 = CreateArchetype(Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S16H14SEAPG3)
Name = 'S16H14SEAWBPG3'
NoOfStories = 20 # Include Basement Floors Here
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
]
S16H14SEAWBPG3 = CreateArchetype(Basements, Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S16H14SEAWBPG3)
#endregion
#region Archetype S20H14SEAPG3 and S20H14SEAWBPG3
Name = 'S20H14SEAPG3'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 20
Thickness = 40.
Length = 26. * 12.
Flange_Thickness = 13.*12. # Assume 6' Long Core
Long_Spacing = 4
BarSize = 8.0
Rho = 0.675 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 7.0
Rho = 0.6 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 26.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.6 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
ThicknessBelow = float(Thickness)
Thickness = 22.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
]
S20H14SEAPG3 = CreateArchetype(Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S20H14SEAPG3)
Name = 'S20H14SEAWBPG3'
NoOfStories = 24 # Include Basement Floors Here
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
]
S20H14SEAWBPG3 = CreateArchetype(Basements, Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S20H14SEAWBPG3)
#endregion
#region Archetype S24H14SEAPG3 and S24H14SEAWBPG3
Name = 'S24H14SEAPG3'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 24
Thickness = 44.
Length = 30. * 12.
Flange_Thickness = 15*12. # Assume 6' Long Core
Long_Spacing = 4
BarSize = 7.0
Rho = 0.525 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 7.0
Rho = 0.525 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2., Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 30.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.55 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2., Thickness - 3.5)
BarSize = 4.0
Rho = 0.30 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
ThicknessBelow = float(Thickness)
Thickness = 24.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section6 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
]
S24H14SEAPG3 = CreateArchetype(Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S24H14SEAPG3)
Name = 'S24H14SEAWBPG3'
NoOfStories = 28
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
]
S24H14SEAWBPG3 = CreateArchetype(Basements, Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S24H14SEAWBPG3)
#endregion
# PG4 : 50% Over-strength on ASCE 7 loads
#region Archetype S4H14SEAPG4 and S4H14SEAWBPG4
Name = 'S4H14SEAPG4'
# print 'Importing Archetype: ' + Name
# Compute Seismic Weight
NoOfStories = 4
YGrids = [0] + np.array(np.arange(0,(NoOfStories)*13*12, 13*12)+15*12).tolist()
DeadLoads = np.ones(NoOfStories) * DL / 1000.
DeadLoads[-1] = DeadLoads[-1] * DL_Roof / DL
LiveLoads = np.ones(NoOfStories) * LL / 1000.
LiveLoads[-1] = LiveLoads[-1] * LL_Roof / LL
MassPerSqFt = DL / 1000.
Mass = np.ones(NoOfStories) * MassPerSqFt * FloorArea
Mass[-1] = FloorArea * DL_Roof / 1000. # Adjust for Roof Weight
WallTribArea = FloorArea * 0.5
WeightPerSqFt = DL
BuildingWeight = np.ones(NoOfStories) * WeightPerSqFt * FloorArea
BuildingWeight[-1] = 152. / 1000. * FloorArea # Adjust for Roof Weight
# Seismic Hazard
R = 6; Cd = 5
SaDesign, Sds, CuTa = GetSeattle2014Hazard(YGrids[-1], R=R, Overstrength = 1.50)
Thickness = 22.
Length = 15. * 12.
Long_Spacing = 4
NoOfCols = 18
BarSize = 10.
Ag = ( (NoOfCols - 1) * Long_Spacing + 6 ) * Thickness
Rho = ( NoOfCols * 2 + 2 ) * np.pi * ( BarSize / 2. / 8.) ** 2. / Ag
# print Rho
Section1 = PlanarWallSection(Length, Thickness,
(NoOfCols - 1) * Long_Spacing + 6,
(NoOfCols - 1) * Long_Spacing + 6, BarSize,
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
0.255, 4.037, fpc_core, fy, fu, 3, 4., NoOfCols, 3)
NoOfCols = 18
BarSize = 8.0
Ag = ( (NoOfCols - 1) * Long_Spacing + 6 ) * Thickness
Rho = ( NoOfCols * 2 + 2 ) * np.pi * ( BarSize / 2. / 8.) ** 2. / Ag
# print Rho
Section2 = PlanarWallSection(Length, Thickness,
(NoOfCols - 1) * Long_Spacing + 6,
(NoOfCols - 1) * Long_Spacing + 6, BarSize,
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
0.255, 4.037, fpc_core, fy, fu, 3, 4., 8, 3)
Section3 = PlanarWallSection(Length, Thickness, 0, 0, 10.173,
[],
[],
0.255, 4.037, fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section2, Section2]
S4H14SEAPG4 = CreateArchetype(Use2008Maps = False, Overstrength = 1.50)
Archetypes.append(S4H14SEAPG4)
Name = 'S4H14SEAWBPG4'
NoOfStories = 6
Sections = [
Section1, Section1,
Section1, Section1, Section2, Section2
]
S4H14SEAWBPG4 = CreateArchetype(Basements2Levels, Use2008Maps = False, Overstrength = 1.50)
Archetypes.append(S4H14SEAWBPG4)
#endregion
#region Archetype S8H14SEAPG4 and S8H14SEAWBPG4
Name = 'S8H14SEAPG4'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 8
Thickness = 26.
Length = 15. * 12.
Flange_Thickness = 7.5*12. # Assume 6' Long Core
Long_Spacing = 4
BarSize = 10.0
Rho = 2.0 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 9.0
Rho = 1.3 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 5.0
Rho = 0.40 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3,
]
S8H14SEAPG4 = CreateArchetype(Use2008Maps = False, Overstrength = 1.50)
Archetypes.append(S8H14SEAPG4)
Name = 'S8H14SEAWBPG4'
NoOfStories = 11
Sections = [
Section1, Section1, Section1,
Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3,
]
S8H14SEAWBPG4 = CreateArchetype(Basements3Levels, Use2008Maps = False, Overstrength = 1.50)
Archetypes.append(S8H14SEAWBPG4)
#endregion
#region Archetype S12H14SEAPG4 and S12H14SEAWBPG4
Name = 'S12H14SEAPG4'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 12
Thickness = 30.
Length = 18. * 12.
Flange_Thickness = 9.0*12. # Assume 6' Long Core
Long_Spacing = 4
BarSize = 9.0
Rho = 1.70 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 8.0
Rho = 1.35 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 22.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.9 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.35 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3, Section3,
Section4, Section4, Section4,
]
S12H14SEAPG4 = CreateArchetype(Use2008Maps = False, Overstrength = 1.50)
Archetypes.append(S12H14SEAPG4)
Name = 'S12H14SEAWBPG4'
NoOfStories = 16
Sections = [
Section1, Section1, Section1, Section1,
Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3, Section3,
Section4, Section4, Section4,
]
S12H14SEAWBPG4 = CreateArchetype(Basements, Use2008Maps = False, Overstrength = 1.50)
Archetypes.append(S12H14SEAWBPG4)
#endregion
#region Archetype S16H14SEAPG4 and S16H14SEAWBPG4
Name = 'S16H14SEAPG4'
#### Input Variables
NoOfStories = 16
Thickness = 34.
Length = 23. * 12.
Flange_Thickness = 11.5*12. # Assume 6' Long Core
Long_Spacing = 4
BarSize = 9.0
Rho = 1.2 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 8.0
Rho = 0.9 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 26.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 5.0
Rho = 0.65 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
]
S16H14SEAPG4 = CreateArchetype(Use2008Maps = False, Overstrength = 1.50)
Archetypes.append(S16H14SEAPG4)
Name = 'S16H14SEAWBPG4'
NoOfStories = 20 # Include Basement Floors Here
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
]
S16H14SEAWBPG4 = CreateArchetype(Basements, Use2008Maps = False, Overstrength = 1.50)
Archetypes.append(S16H14SEAWBPG4)
#endregion
#region Archetype S20H14SEAPG4 and S20H14SEAWBPG4
Name = 'S20H14SEAPG4'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 20
Thickness = 44.
Length = 27. * 12.
Flange_Thickness = 13.5*12. # Assume 6' Long Core
Long_Spacing = 4
BarSize = 8.0
Rho = 0.825 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 8.0
Rho = 0.7 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 30.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 8.0
Rho = 0.70 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 5.0
Rho = 0.4 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
ThicknessBelow = float(Thickness)
Thickness = 22.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
]
S20H14SEAPG4 = CreateArchetype(Use2008Maps = False, Overstrength = 1.50)
Archetypes.append(S20H14SEAPG4)
Name = 'S20H14SEAWBPG4'
NoOfStories = 24 # Include Basement Floors Here
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
]
S20H14SEAWBPG4 = CreateArchetype(Basements, Use2008Maps = False, Overstrength = 1.50)
Archetypes.append(S20H14SEAWBPG4)
#endregion
#region Archetype S24H14SEAPG4 and S24H14SEAWBPG4
Name = 'S24H14SEAPG4'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 24
Thickness = 50.
Length = 31. * 12.
Flange_Thickness = 15.5*12. # Assume 6' Long Core
Long_Spacing = 4
BarSize = 8.0
Rho = 0.7 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 8.0
Rho = 0.6 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2., Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 36.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 7.0
Rho = 0.65 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2., Thickness - 3.5)
BarSize = 5.0
Rho = 0.40 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
ThicknessBelow = float(Thickness)
Thickness = 26.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section6 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
]
S24H14SEAPG4 = CreateArchetype(Use2008Maps = False, Overstrength = 1.50)
Archetypes.append(S24H14SEAPG4)
Name = 'S24H14SEAWBPG4'
NoOfStories = 28
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
]
S24H14SEAWBPG4 = CreateArchetype(Basements, Use2008Maps = False, Overstrength = 1.50)
Archetypes.append(S24H14SEAWBPG4)
#endregion
# PG5: 1.5% Drift Limit
#region Archetype S4H14SEAPG5 and S4H14SEAWBPG5
Name = 'S4H14SEAPG5'
# print 'Importing Archetype: ' + Name
# Compute Seismic Weight
NoOfStories = 4
YGrids = [0] + np.array(np.arange(0,(NoOfStories)*13*12, 13*12)+15*12).tolist()
DeadLoads = np.ones(NoOfStories) * DL / 1000.
DeadLoads[-1] = DeadLoads[-1] * DL_Roof / DL
LiveLoads = np.ones(NoOfStories) * LL / 1000.
LiveLoads[-1] = LiveLoads[-1] * LL_Roof / LL
MassPerSqFt = DL / 1000.
Mass = np.ones(NoOfStories) * MassPerSqFt * FloorArea
Mass[-1] = FloorArea * DL_Roof / 1000. # Adjust for Roof Weight
WallTribArea = FloorArea * 0.5
WeightPerSqFt = DL
BuildingWeight = np.ones(NoOfStories) * WeightPerSqFt * FloorArea
BuildingWeight[-1] = 152. / 1000. * FloorArea # Adjust for Roof Weight
# Seismic Hazard
R = 6; Cd = 5
SaDesign, Sds, CuTa = GetSeattle2014Hazard(YGrids[-1], R=R)
Thickness = 24.
Length = 13. * 12.
Long_Spacing = 4
NoOfCols = 9
BarSize = 10.
Ag = ( (NoOfCols - 1) * Long_Spacing + 6 ) * Thickness
Rho = ( NoOfCols * 2 + 2 ) * np.pi * ( BarSize / 2. / 8.) ** 2. / Ag
# print Rho
Section1 = PlanarWallSection(Length, Thickness,
(NoOfCols - 1) * Long_Spacing + 6,
(NoOfCols - 1) * Long_Spacing + 6, BarSize,
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
0.255, 4.037, fpc_core, fy, fu, 3, 4., NoOfCols, 3)
NoOfCols = 9
BarSize = 8.0
Ag = ( (NoOfCols - 1) * Long_Spacing + 6 ) * Thickness
Rho = ( NoOfCols * 2 + 2 ) * np.pi * ( BarSize / 2. / 8.) ** 2. / Ag
# print Rho
Section2 = PlanarWallSection(Length, Thickness,
(NoOfCols - 1) * Long_Spacing + 6,
(NoOfCols - 1) * Long_Spacing + 6, BarSize,
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
0.255, 4.037, fpc_core, fy, fu, 3, 4., 8, 3)
Section3 = PlanarWallSection(Length, Thickness, 0, 0, 10.173,
[],
[],
0.255, 4.037, fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section2, Section2]
S4H14SEAPG5 = CreateArchetype(Use2008Maps = False)
Archetypes.append(S4H14SEAPG5)
Name = 'S4H14SEAWBPG5'
NoOfStories = 6
Sections = [
Section1, Section1,
Section1, Section1, Section2, Section2
]
S4H14SEAWBPG5 = CreateArchetype(Basements2Levels, Use2008Maps = False)
Archetypes.append(S4H14SEAWBPG5)
#endregion
#region Archetype S8H14SEAPG5 and S8H14SEAWBPG5
Name = 'S8H14SEAPG5'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 8
Thickness = 24.
Length = 14. * 12.
Flange_Thickness = 7*12. # Assume 6' Long Core
Long_Spacing = 4
BarSize = 8.0
Rho = 1.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 8.0
Rho = 0.8 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3,
]
S8H14SEAPG5 = CreateArchetype(Use2008Maps = False)
Archetypes.append(S8H14SEAPG5)
Name = 'S8H14SEAWBPG5'
NoOfStories = 11
Sections = [
Section1, Section1, Section1,
Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3,
]
S8H14SEAWBPG5 = CreateArchetype(Basements3Levels, Use2008Maps = False)
Archetypes.append(S8H14SEAWBPG5)
#endregion
#region Archetype S12H14SEAPG5 and S12H14SEAWBPG5
Name = 'S12H14SEAPG5'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 12
Thickness = 26.
Length = 17. * 12.
Flange_Thickness = 8.5*12. # Assume 6' Long Core
Long_Spacing = 4
BarSize = 8.0
Rho = 1.025 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 7.0
Rho = 0.80 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 18.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 5.0
Rho = 0.60 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3, Section3,
Section4, Section4, Section4,
]
S12H14SEAPG5 = CreateArchetype(Use2008Maps = False)
Archetypes.append(S12H14SEAPG5)
Name = 'S12H14SEAWBPG5'
NoOfStories = 16
Sections = [
Section1, Section1, Section1, Section1,
Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3, Section3,
Section4, Section4, Section4,
]
S12H14SEAWBPG5 = CreateArchetype(Basements, Use2008Maps = False)
Archetypes.append(S12H14SEAWBPG5)
#endregion
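# At every wall-thickness transition the script applies the same update:
# Length = Length - (ThicknessBelow - Thickness) * 2. Stepping the thickness
# down by d while shortening the overall length by 2*d keeps the clear
# interior dimension (Length - 2 * Thickness) unchanged. A small sketch of
# that bookkeeping (the function name is illustrative):

```python
def step_wall_length(length, thickness_below, thickness_above):
    """New overall wall length when thickness steps down, preserving the
    clear interior dimension length - 2 * thickness."""
    return length - (thickness_below - thickness_above) * 2.0

# e.g. a 204 in (17 ft) wall stepping from 26 in to 18 in walls shortens to
# 188 in, and the clear dimension 204 - 2*26 = 188 - 2*18 = 152 in is kept.
```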
#region Archetype S16H14SEAPG5 and S16H14SEAWBPG5
Name = 'S16H14SEAPG5'
#### Input Variables
NoOfStories = 16
Thickness = 32.
Length = 20. * 12.
Flange_Thickness = 10*12. # Flange width = half the wall length
Long_Spacing = 4
BarSize = 8.0
Rho = 0.725 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 8.0
Rho = 0.6 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 18.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 5.0
Rho = 0.50 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
]
S16H14SEAPG5 = CreateArchetype(Use2008Maps = False)
Archetypes.append(S16H14SEAPG5)
Name = 'S16H14SEAWBPG5'
NoOfStories = 20 # Include Basement Floors Here
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
]
S16H14SEAWBPG5 = CreateArchetype(Basements, Use2008Maps = False)
Archetypes.append(S16H14SEAWBPG5)
#endregion
#region Archetype S20H14SEAPG5 and S20H14SEAWBPG5
Name = 'S20H14SEAPG5'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 20
Thickness = 36.
Length = 23. * 12.
Flange_Thickness = 11.5*12. # Flange width = half the wall length
Long_Spacing = 4
BarSize = 7.0
Rho = 0.525 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 7.0
Rho = 0.525 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 20.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
ThicknessBelow = float(Thickness)
Thickness = 22.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
]
S20H14SEAPG5 = CreateArchetype(Use2008Maps = False)
Archetypes.append(S20H14SEAPG5)
Name = 'S20H14SEAWBPG5'
NoOfStories = 24 # Include Basement Floors Here
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
]
S20H14SEAWBPG5 = CreateArchetype(Basements, Use2008Maps = False)
Archetypes.append(S20H14SEAWBPG5)
#endregion
#region Archetype S24H14SEAPG5 and S24H14SEAWBPG5
Name = 'S24H14SEAPG5'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 24
Thickness = 40.
Length = 25. * 12.
Flange_Thickness = 12.5*12. # Flange width = half the wall length
Long_Spacing = 4
BarSize = 7.0
Rho = 0.50 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 7.0
Rho = 0.501 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2., Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 26.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 7.0
Rho = 0.55 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2., Thickness - 3.5)
BarSize = 4.0
Rho = 0.35 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
ThicknessBelow = float(Thickness)
Thickness = 20.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section6 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
]
S24H14SEAPG5 = CreateArchetype(Use2008Maps = False)
Archetypes.append(S24H14SEAPG5)
Name = 'S24H14SEAWBPG5'
NoOfStories = 28
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
]
S24H14SEAWBPG5 = CreateArchetype(Basements, Use2008Maps = False)
Archetypes.append(S24H14SEAWBPG5)
#endregion
# PG6: 1.25% Drift Limit
#region Archetype S4H14SEAPG6 and S4H14SEAWBPG6
Name = 'S4H14SEAPG6'
# print 'Importing Archetype: ' + Name
# Compute Seismic Weight
NoOfStories = 4
YGrids = [0] + np.array(np.arange(0,(NoOfStories)*13*12, 13*12)+15*12).tolist()
DeadLoads = np.ones(NoOfStories) * DL / 1000.
DeadLoads[-1] = DeadLoads[-1] * DL_Roof / DL
LiveLoads = np.ones(NoOfStories) * LL / 1000.
LiveLoads[-1] = LiveLoads[-1] * LL_Roof / LL
MassPerSqFt = DL / 1000.
Mass = np.ones(NoOfStories) * MassPerSqFt * FloorArea
Mass[-1] = FloorArea * DL_Roof / 1000. # Adjust for Roof Weight
WallTribArea = FloorArea * 0.5
WeightPerSqFt = DL
BuildingWeight = np.ones(NoOfStories) * WeightPerSqFt * FloorArea
BuildingWeight[-1] = 152. / 1000. * FloorArea # Adjust for Roof Weight
# Seismic Hazard
R = 6; Cd = 5
SaDesign, Sds, CuTa = GetSeattle2008Hazard(YGrids[-1], R=R)
Thickness = 28.
Length = 14. * 12.
Long_Spacing = 4
NoOfCols = 10
BarSize = 9.
Ag = ( (NoOfCols - 1) * Long_Spacing + 6 ) * Thickness
Rho = ( NoOfCols * 2 + 2 ) * np.pi * ( BarSize / 2. / 8.) ** 2. / Ag
# print Rho
Section1 = PlanarWallSection(Length, Thickness,
(NoOfCols - 1) * Long_Spacing + 6,
(NoOfCols - 1) * Long_Spacing + 6, BarSize,
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
0.255, 4.037, fpc_core, fy, fu, 3, 4., NoOfCols, 3)
NoOfCols = 8
BarSize = 8.0
Ag = ( (NoOfCols - 1) * Long_Spacing + 6 ) * Thickness
Rho = ( NoOfCols * 2 + 2 ) * np.pi * ( BarSize / 2. / 8.) ** 2. / Ag
# print Rho
Section2 = PlanarWallSection(Length, Thickness,
(NoOfCols - 1) * Long_Spacing + 6,
(NoOfCols - 1) * Long_Spacing + 6, BarSize,
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
0.255, 4.037, fpc_core, fy, fu, 3, 4., 8, 3)
Section3 = PlanarWallSection(Length, Thickness, 0, 0, 10.173,
[],
[],
0.255, 4.037, fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section2, Section2]
S4H14SEAPG6 = CreateArchetype(Use2008Maps = False)
Archetypes.append(S4H14SEAPG6)
Name = 'S4H14SEAWBPG6'
NoOfStories = 6
Sections = [
Section1, Section1,
Section1, Section1, Section2, Section2
]
S4H14SEAWBPG6 = CreateArchetype(Basements2Levels, Use2008Maps = False)
Archetypes.append(S4H14SEAWBPG6)
#endregion
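# The first archetype of each PG group rebuilds the floor elevations and
# seismic mass from scratch. YGrids puts grade at 0, a 15 ft first story, and
# 13 ft typical stories above; the roof entries are then scaled by the roof
# loads. A self-contained sketch of the elevation bookkeeping only (the
# function name and default story heights are taken from the pattern above,
# not from a documented API):

```python
import numpy as np

def floor_elevations(no_of_stories, first_story_ft=15.0, typical_story_ft=13.0):
    """Floor elevations in inches: grade at 0, a taller first story, then
    typical stories, mirroring the YGrids construction in the script."""
    typ = typical_story_ft * 12.0
    first = first_story_ft * 12.0
    return [0] + (np.arange(0, no_of_stories * typ, typ) + first).tolist()

# For 4 stories this gives [0, 180, 336, 492, 648] inches.
```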
#region Archetype S8H14SEAPG6 and S8H14SEAWBPG6
Name = 'S8H14SEAPG6'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 8
Thickness = 24.
Length = 15. * 12.
Flange_Thickness = 7.5*12. # Flange width = half the wall length
Long_Spacing = 4
BarSize = 7.0
Rho = 0.975 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 7.0
Rho = 0.65 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3,
]
S8H14SEAPG6 = CreateArchetype(Use2008Maps = False)
Archetypes.append(S8H14SEAPG6)
Name = 'S8H14SEAWBPG6'
NoOfStories = 11
Sections = [
Section1, Section1, Section1,
Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3,
]
S8H14SEAWBPG6 = CreateArchetype(Basements3Levels, Use2008Maps = False)
Archetypes.append(S8H14SEAWBPG6)
#endregion
#region Archetype S12H14SEAPG6 and S12H14SEAWBPG6
Name = 'S12H14SEAPG6'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 12
Thickness = 28.
Length = 19. * 12.
Flange_Thickness = 9.5*12. # Flange width = half the wall length
Long_Spacing = 4
BarSize = 7.0
Rho = 0.55 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 6.0
Rho = 0.50 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 16.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 5.0
Rho = 0.50 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3, Section3,
Section4, Section4, Section4,
]
S12H14SEAPG6 = CreateArchetype(Use2008Maps = False)
Archetypes.append(S12H14SEAPG6)
Name = 'S12H14SEAWBPG6'
NoOfStories = 16
Sections = [
Section1, Section1, Section1, Section1,
Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3, Section3,
Section4, Section4, Section4,
]
S12H14SEAWBPG6 = CreateArchetype(Basements, Use2008Maps = False)
Archetypes.append(S12H14SEAWBPG6)
#endregion
#region Archetype S16H14SEAPG6 and S16H14SEAWBPG6
Name = 'S16H14SEAPG6'
#### Input Variables
NoOfStories = 16
Thickness = 32.
Length = 22. * 12.
Flange_Thickness = 11.*12. # Flange width = half the wall length
Long_Spacing = 4
BarSize = 6.0
Rho = 0.50 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 16.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 5.0
Rho = 0.50 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
]
S16H14SEAPG6 = CreateArchetype(Use2008Maps = False)
Archetypes.append(S16H14SEAPG6)
Name = 'S16H14SEAWBPG6'
NoOfStories = 20 # Include Basement Floors Here
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
]
S16H14SEAWBPG6 = CreateArchetype(Basements, Use2008Maps = False)
Archetypes.append(S16H14SEAWBPG6)
#endregion
#region Archetype S20H14SEAPG6 and S20H14SEAWBPG6
Name = 'S20H14SEAPG6'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 20
Thickness = 24.
Length = 25. * 12.
Flange_Thickness = 12.5*12. # Flange width = half the wall length
Long_Spacing = 4
BarSize = 7.0
Rho = 0.501 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 7.0
Rho = 0.501 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 20.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.501 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.30 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
ThicknessBelow = float(Thickness)
Thickness = 16.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
]
S20H14SEAPG6 = CreateArchetype(Use2008Maps = False)
Archetypes.append(S20H14SEAPG6)
Name = 'S20H14SEAWBPG6'
NoOfStories = 24 # Include Basement Floors Here
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
]
S20H14SEAWBPG6 = CreateArchetype(Basements, Use2008Maps = False)
Archetypes.append(S20H14SEAWBPG6)
#endregion
#region Archetype S24H14SEAPG6 and S24H14SEAWBPG6
Name = 'S24H14SEAPG6'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 24
Thickness = 20.
Length = 28. * 12.
Flange_Thickness = 14.*12. # Flange width = half the wall length
Long_Spacing = 4
BarSize = 5.0
Rho = 0.50 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 5.0
Rho = 0.501 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2., Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 20.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 5.0
Rho = 0.501 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2., Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
ThicknessBelow = float(Thickness)
Thickness = 16.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section6 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
]
S24H14SEAPG6 = CreateArchetype(Use2008Maps = False)
Archetypes.append(S24H14SEAPG6)
Name = 'S24H14SEAWBPG6'
NoOfStories = 28
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
Section6, Section6, Section6, Section6,
]
S24H14SEAWBPG6 = CreateArchetype(Basements, Use2008Maps = False)
Archetypes.append(S24H14SEAWBPG6)
#endregion
# PG7: 25% Over-strength on ASCE 7 loads
#region Archetype S4H14SEAPG7 and S4H14SEAWBPG7
Name = 'S4H14SEAPG7'
# print 'Importing Archetype: ' + Name
# Compute Seismic Weight
NoOfStories = 4
YGrids = [0] + np.array(np.arange(0,(NoOfStories)*13*12, 13*12)+15*12).tolist()
DeadLoads = np.ones(NoOfStories) * DL / 1000.
DeadLoads[-1] = DeadLoads[-1] * DL_Roof / DL
LiveLoads = np.ones(NoOfStories) * LL / 1000.
LiveLoads[-1] = LiveLoads[-1] * LL_Roof / LL
MassPerSqFt = DL / 1000.
Mass = np.ones(NoOfStories) * MassPerSqFt * FloorArea
Mass[-1] = FloorArea * DL_Roof / 1000. # Adjust for Roof Weight
WallTribArea = FloorArea * 0.5
WeightPerSqFt = DL
BuildingWeight = np.ones(NoOfStories) * WeightPerSqFt * FloorArea
BuildingWeight[-1] = 152. / 1000. * FloorArea # Adjust for Roof Weight
# Seismic Hazard
R = 6; Cd = 5
SaDesign, Sds, CuTa = GetSeattle2014Hazard(YGrids[-1], R=R, Overstrength = 1.25)
Thickness = 24.
Length = 16. * 12.
Long_Spacing = 4
NoOfCols = 10
BarSize = 10.
Ag = ( (NoOfCols - 1) * Long_Spacing + 6 ) * Thickness
Rho = ( NoOfCols * 2 + 2 ) * np.pi * ( BarSize / 2. / 8.) ** 2. / Ag
# print Rho
Section1 = PlanarWallSection(Length, Thickness,
(NoOfCols - 1) * Long_Spacing + 6,
(NoOfCols - 1) * Long_Spacing + 6, BarSize,
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
0.255, 4.037, fpc_core, fy, fu, 3, 4., NoOfCols, 3)
NoOfCols = 10
BarSize = 9.0
Ag = ( (NoOfCols - 1) * Long_Spacing + 6 ) * Thickness
Rho = ( NoOfCols * 2 + 2 ) * np.pi * ( BarSize / 2. / 8.) ** 2. / Ag
# print Rho
Section2 = PlanarWallSection(Length, Thickness,
(NoOfCols - 1) * Long_Spacing + 6,
(NoOfCols - 1) * Long_Spacing + 6, BarSize,
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
[3] + (np.ones(NoOfCols - 2) * 2.).tolist() + [3],
0.255, 4.037, fpc_core, fy, fu, 3, 4., 8, 3)
Section3 = PlanarWallSection(Length, Thickness, 0, 0, 10.173,
[],
[],
0.255, 4.037, fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section2, Section2]
S4H14SEAPG7 = CreateArchetype(Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S4H14SEAPG7)
Name = 'S4H14SEAWBPG7'
NoOfStories = 6
Sections = [
Section1, Section1,
Section1, Section1, Section2, Section2
]
S4H14SEAWBPG7 = CreateArchetype(Basements2Levels, Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S4H14SEAWBPG7)
#endregion
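# In the planar-wall blocks above, Rho is recomputed as the boundary-element
# reinforcement ratio: two curtains of NoOfCols bars plus two corner bars, over
# the gross boundary area Ag = ((NoOfCols - 1) * Long_Spacing + 6) * Thickness.
# A hedged sketch of that expression as a function (the name and signature are
# illustrative, not part of this script's API):

```python
import numpy as np

def boundary_rho(no_of_cols, bar_size, thickness, long_spacing=4.0):
    """Reinforcement ratio (as a fraction) of a planar-wall boundary element:
    two curtains of no_of_cols bars plus two extra bars over the gross area."""
    boundary_length = (no_of_cols - 1) * long_spacing + 6.0
    ag = boundary_length * thickness
    n_bars = no_of_cols * 2 + 2
    abar = np.pi * (bar_size / 2.0 / 8.0) ** 2  # bar_size is in eighths of an inch
    return n_bars * abar / ag

# With 10 columns of #10 bars in a 24 in wall this is roughly 2.7% steel.
```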
#region Archetype S8H14SEAPG7 and S8H14SEAWBPG7
Name = 'S8H14SEAPG7'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 8
Thickness = 24.
Length = 16. * 12.
Flange_Thickness = 8*12. # Flange width = half the wall length
Long_Spacing = 4
BarSize = 7.0
Rho = 1.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 7.0
Rho = 0.85 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.30 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3,
]
S8H14SEAPG7 = CreateArchetype(Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S8H14SEAPG7)
Name = 'S8H14SEAWBPG7'
NoOfStories = 11
Sections = [
Section1, Section1, Section1,
Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3,
]
S8H14SEAWBPG7 = CreateArchetype(Basements3Levels, Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S8H14SEAWBPG7)
#endregion
#region Archetype S12H14SEAPG7 and S12H14SEAWBPG7
Name = 'S12H14SEAPG7'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 12
Thickness = 26.
Length = 20. * 12.
Flange_Thickness = 10*12. # Flange width = half the wall length
Long_Spacing = 4
BarSize = 8.0
Rho = 1.0 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 8.0
Rho = 0.8 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 18.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.75 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3, Section3,
Section4, Section4, Section4,
]
S12H14SEAPG7 = CreateArchetype(Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S12H14SEAPG7)
Name = 'S12H14SEAWBPG7'
NoOfStories = 16
Sections = [
Section1, Section1, Section1, Section1,
Section1, Section1, Section1,
Section2, Section2, Section2,
Section3, Section3, Section3,
Section4, Section4, Section4,
]
S12H14SEAWBPG7 = CreateArchetype(Basements, Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S12H14SEAWBPG7)
#endregion
#region Archetype S16H14SEAPG7 and S16H14SEAWBPG7
Name = 'S16H14SEAPG7'
#### Input Variables
NoOfStories = 16
Thickness = 32.
Length = 24. * 12.
Flange_Thickness = 12*12. # Flange width = half the wall length
Long_Spacing = 4
BarSize = 8.0
Rho = 0.75 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 7.0
Rho = 0.55 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 22.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 5.0
Rho = 0.50 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
]
S16H14SEAPG7 = CreateArchetype(Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S16H14SEAPG7)
Name = 'S16H14SEAWBPG7'
NoOfStories = 20 # Include Basement Floors Here
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
]
S16H14SEAWBPG7 = CreateArchetype(Basements, Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S16H14SEAWBPG7)
#endregion
#region Archetype S20H14SEAPG7 and S20H14SEAWBPG7
Name = 'S20H14SEAPG7'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 20
Thickness = 40.
Length = 28. * 12.
Flange_Thickness = 14.*12. # Flange width = half the wall length
Long_Spacing = 4
BarSize = 8.0
Rho = 0.50 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 7.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 24.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.5 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing*2, Thickness - 3.5)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
ThicknessBelow = float(Thickness)
Thickness = 22.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
]
S20H14SEAPG7 = CreateArchetype(Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S20H14SEAPG7)
Name = 'S20H14SEAWBPG7'
NoOfStories = 24 # Include Basement Floors Here
Sections = [Section1, Section1, Section1, Section1,
Section1, Section1, Section1, Section1,
Section2, Section2, Section2, Section2,
Section3, Section3, Section3, Section3,
Section4, Section4, Section4, Section4,
Section5, Section5, Section5, Section5,
]
S20H14SEAWBPG7 = CreateArchetype(Basements, Use2008Maps = False, Overstrength = 1.25)
Archetypes.append(S20H14SEAWBPG7)
#endregion
#region Archetype S24H14SEAPG7 and S24H14SEAWBPG7
Name = 'S24H14SEAPG7'
# print 'Importing Archetype: ' + Name
#### Input Variables
NoOfStories = 24
Thickness = 24.
Length = 35. * 12.
Flange_Thickness = 17.5*12. # Flange width = half the wall length
Long_Spacing = 4
BarSize = 6.0
Rho = 0.501 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section1 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
fpc_core, fy, fu, 3., 4., Spacing, Thickness - 3.5)
BarSize = 6.0
Rho = 0.501 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section2 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
                        fpc_core, fy, fu, 3., 4., Spacing*2., Thickness - 3.5)
ThicknessBelow = float(Thickness)
Thickness = 24.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 6.0
Rho = 0.501 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section3 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
                        fpc_core, fy, fu, 3., 4., Spacing*2., Thickness - 3.5)
BarSize = 4.0
Rho = 0.30 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section4 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
                        fpc_core, fy, fu, None, None, None, None)
ThicknessBelow = float(Thickness)
Thickness = 24.
Length = Length - (ThicknessBelow - Thickness) * 2.
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section5 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
                        fpc_core, fy, fu, None, None, None, None)
BarSize = 4.0
Rho = 0.25 #In percentages
Abar = np.pi*(BarSize / 8. / 2.)**2.
Spacing = Abar * 2. / Thickness / Rho * 100
Section6 = IWallSection(Length, Flange_Thickness, Thickness, Rho, BarSize,
                        fpc_core, fy, fu, None, None, None, None)
Sections = [Section1, Section1, Section1, Section1,
            Section2, Section2, Section2, Section2,
            Section3, Section3, Section3, Section3,
            Section4, Section4, Section4, Section4,
            Section5, Section5, Section5, Section5,
            Section6, Section6, Section6, Section6,
            ]
S24H14SEAPG7 = CreateArchetype(Use2008Maps=False, Overstrength=1.25)
Archetypes.append(S24H14SEAPG7)
Name = 'S24H14SEAWBPG7'
NoOfStories = 28
Sections = [Section1, Section1, Section1, Section1,
            Section1, Section1, Section1, Section1,
            Section2, Section2, Section2, Section2,
            Section3, Section3, Section3, Section3,
            Section4, Section4, Section4, Section4,
            Section5, Section5, Section5, Section5,
            Section6, Section6, Section6, Section6,
            ]
S24H14SEAWBPG7 = CreateArchetype(Basements, False, Overstrength=1.25)
Archetypes.append(S24H14SEAWBPG7)
#endregion
####################################################################################
# endregion
####################################################################################
# import ATCWallArchetypeHelpers as WallHelper
# for arch in Archetypes:
# print WallHelper.GetAxialLoadRatio(arch)
def GetArchetypeByName(Name):
    return [x for x in Archetypes if x.Name == Name][0]
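# A minimal standalone sketch of the lookup pattern used by GetArchetypeByName
# above. The Archetype namedtuple here is a hypothetical stand-in: the real
# archetype objects built by CreateArchetype carry many more attributes.

```python
from collections import namedtuple

# Hypothetical stand-in for the real archetype objects.
Archetype = namedtuple("Archetype", ["Name", "NoOfStories"])

archetypes = [
    Archetype("S20H14SEAPG7", 20),
    Archetype("S24H14SEAPG7", 24),
]

def get_archetype_by_name(name):
    # Same pattern as GetArchetypeByName: first match wins, and a
    # missing name raises IndexError from the empty list.
    return [x for x in archetypes if x.Name == name][0]

print(get_archetype_by_name("S24H14SEAPG7").NoOfStories)  # 24
```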
####################################################################################
# tests/test_filters/test_requests.py (repo: OvalMoney/horus, license: MIT)
####################################################################################
import rapidjson
import pytest
from nephthys import RequestLogRecord, LogRecord
from nephthys.filters.requests import HeaderFilter, BodyTypeFilter, JsonBodyFilter, QueryStringFilter
from nephthys.filters.requests import (
    RequestType,
    QS_FILTERED,
    HEADER_FILTERED,
    BODY_NOT_LOGGABLE,
    JSON_BODY_FILTERED,
)
def rec_generator():
    return LogRecord(extra_tags=["my_tag"])
def req_rec_generator(
    request_headers=None, response_headers=None, request_body=None, response_body=None, qs=None
):
    req_rec = RequestLogRecord()
    for name, value in request_headers or []:
        req_rec.add_request_header(name, value)
    for name, value in response_headers or []:
        req_rec.add_response_header(name, value)
    for name, value in qs or []:
        req_rec.add_request_querystring(name, value)
    req_rec.request_body = request_body
    req_rec.response_body = response_body
    return req_rec
@pytest.mark.parametrize(
    "filters,in_record,out_record",
    [
        (
            ["QSFiltered", "QSfiltered2"],
            req_rec_generator(
                qs=[
                    ("QSFiltered", "value1"),
                    ("QSfiltered", "value2"),
                    ("QSNotFiltered", "value3"),
                    ("QSfiltered2", "value4"),
                ],
            ),
            req_rec_generator(
                qs=[
                    ("QSFiltered", QS_FILTERED),
                    ("QSfiltered", "value2"),
                    ("QSNotFiltered", "value3"),
                    ("QSfiltered2", QS_FILTERED),
                ],
            ),
        ),
        (
            [],
            req_rec_generator(
                qs=[
                    ("QSFiltered", "value1"),
                    ("QSfiltered", "value2"),
                    ("QSNotFiltered", "value3"),
                    ("QSfiltered2", "value4"),
                ],
            ),
            req_rec_generator(
                qs=[
                    ("QSFiltered", "value1"),
                    ("QSfiltered", "value2"),
                    ("QSNotFiltered", "value3"),
                    ("QSfiltered2", "value4"),
                ],
            ),
        ),
        (["QSFiltered"], rec_generator(), rec_generator()),
    ],
)
def test_qs_filter(filters, in_record, out_record):
    qs_filter = QueryStringFilter(filters)
    qs_filter.filter(in_record)
    assert in_record.asdict() == out_record.asdict()
@pytest.mark.parametrize(
    "filters,req_type,in_record,out_record",
    [
        (
            ["X-Filter-me"],
            RequestType.REQUEST,
            req_rec_generator(
                request_headers=[
                    ("X-Filter-Me", "value1"),
                    ("X-Not-Filter-Me", "value2"),
                ],
                response_headers=[
                    ("X-Filter-Me", "value1"),
                    ("X-Not-Filter-Me", "value2"),
                ],
            ),
            req_rec_generator(
                request_headers=[
                    ("X-Filter-Me", HEADER_FILTERED),
                    ("X-Not-Filter-Me", "value2"),
                ],
                response_headers=[
                    ("X-Filter-Me", "value1"),
                    ("X-Not-Filter-Me", "value2"),
                ],
            ),
        ),
        (
            ["X-Filter-Me"],
            RequestType.RESPONSE,
            req_rec_generator(
                request_headers=[
                    ("X-Filter-Me", "value1"),
                    ("X-Not-Filter-Me", "value2"),
                ],
                response_headers=[
                    ("X-Filter-Me", "value1"),
                    ("X-Not-Filter-Me", "value2"),
                ],
            ),
            req_rec_generator(
                request_headers=[
                    ("X-Filter-Me", "value1"),
                    ("X-Not-Filter-Me", "value2"),
                ],
                response_headers=[
                    ("X-Filter-Me", HEADER_FILTERED),
                    ("X-Not-Filter-Me", "value2"),
                ],
            ),
        ),
        (
            ["X-Filter-Me"],
            RequestType.ALL,
            req_rec_generator(
                request_headers=[
                    ("X-Filter-Me", "value1"),
                    ("X-Not-Filter-Me", "value2"),
                ],
                response_headers=[
                    ("X-Filter-Me", "value1"),
                    ("X-Not-Filter-Me", "value2"),
                ],
            ),
            req_rec_generator(
                request_headers=[
                    ("X-Filter-Me", HEADER_FILTERED),
                    ("X-Not-Filter-Me", "value2"),
                ],
                response_headers=[
                    ("X-Filter-Me", HEADER_FILTERED),
                    ("X-Not-Filter-Me", "value2"),
                ],
            ),
        ),
        (
            ["X-Filter-Me"],
            RequestType.ALL,
            req_rec_generator(
                request_headers=[
                    ("X-Filter-Me", "value1"),
                    ("X-Not-Filter-Me", "value2"),
                ]
            ),
            req_rec_generator(
                request_headers=[
                    ("X-Filter-Me", HEADER_FILTERED),
                    ("X-Not-Filter-Me", "value2"),
                ]
            ),
        ),
        (
            [],
            RequestType.ALL,
            req_rec_generator(
                request_headers=[
                    ("X-Filter-Me", "value1"),
                    ("X-Not-Filter-Me", "value2"),
                ],
                response_headers=[
                    ("X-Filter-Me", "value1"),
                    ("X-Not-Filter-Me", "value2"),
                ],
            ),
            req_rec_generator(
                request_headers=[
                    ("X-Filter-Me", "value1"),
                    ("X-Not-Filter-Me", "value2"),
                ],
                response_headers=[
                    ("X-Filter-Me", "value1"),
                    ("X-Not-Filter-Me", "value2"),
                ],
            ),
        ),
        (["X-Filter-Me"], RequestType.ALL, rec_generator(), rec_generator()),
    ],
)
def test_header_filter(filters, req_type, in_record, out_record):
    head_filter = HeaderFilter(filters, req_type)
    head_filter.filter(in_record)
    assert in_record.asdict() == out_record.asdict()
@pytest.mark.parametrize(
    "filters,req_type,in_record,out_record",
    [
        (
            None,
            RequestType.REQUEST,
            req_rec_generator(
                request_headers=[("Content-Type", "multipart/form-data")],
                response_headers=[("Content-Type", "image/jpeg")],
                request_body="Filter-Me",
                response_body="Not-Filter-Me",
            ),
            req_rec_generator(
                request_headers=[("Content-Type", "multipart/form-data")],
                response_headers=[("Content-Type", "image/jpeg")],
                request_body=BODY_NOT_LOGGABLE.format("multipart/form-data"),
                response_body="Not-Filter-Me",
            ),
        ),
        (
            None,
            RequestType.RESPONSE,
            req_rec_generator(
                request_headers=[("Content-Type", "image/jpeg")],
                response_headers=[("Content-Type", "multipart/form-data")],
                request_body="Not-Filter-Me",
                response_body="Filter-Me",
            ),
            req_rec_generator(
                request_headers=[("Content-Type", "image/jpeg")],
                response_headers=[("Content-Type", "multipart/form-data")],
                request_body="Not-Filter-Me",
                response_body=BODY_NOT_LOGGABLE.format("multipart/form-data"),
            ),
        ),
        (
            None,
            RequestType.ALL,
            req_rec_generator(
                request_headers=[("Content-Type", "multipart/form-data")],
                response_headers=[("Content-Type", "image/jpeg")],
                request_body="Filter-Me",
                response_body="Filter-Me",
            ),
            req_rec_generator(
                request_headers=[("Content-Type", "multipart/form-data")],
                response_headers=[("Content-Type", "image/jpeg")],
                request_body=BODY_NOT_LOGGABLE.format("multipart/form-data"),
                response_body=BODY_NOT_LOGGABLE.format("image/jpeg"),
            ),
        ),
        (
            ["application/json"],
            RequestType.ALL,
            req_rec_generator(
                request_headers=[("Content-Type", "application/json")],
                response_headers=[("Content-Type", "text/plain")],
                request_body="Not-Filter-Me",
                response_body="Filter-Me",
            ),
            req_rec_generator(
                request_headers=[("Content-Type", "application/json")],
                response_headers=[("Content-Type", "text/plain")],
                request_body="Not-Filter-Me",
                response_body=BODY_NOT_LOGGABLE.format("text/plain"),
            ),
        ),
        (
            ["application/json"],
            RequestType.ALL,
            req_rec_generator(
                request_headers=[("Content-Type", "application/json")],
                request_body="Not-Filter-Me",
            ),
            req_rec_generator(
                request_headers=[("Content-Type", "application/json")],
                request_body="Not-Filter-Me",
            ),
        ),
        (["application/json"], RequestType.ALL, rec_generator(), rec_generator()),
    ],
)
def test_body_type_filter(filters, req_type, in_record, out_record):
    head_filter = BodyTypeFilter(filters, req_type)
    head_filter.filter(in_record)
    assert in_record.asdict() == out_record.asdict()
@pytest.mark.parametrize(
    "filters,req_type,in_record,out_record",
    [
        (
            {"key": True},
            RequestType.REQUEST,
            req_rec_generator(
                request_headers=[("Content-Type", "application/json")],
                response_headers=[("Content-Type", "application/json")],
                request_body=rapidjson.dumps(
                    {"key": "Filter-Me", "key2": "Not-Filter-Me"}
                ),
                response_body=rapidjson.dumps({"key": "Not-filter-Me"}),
            ),
            req_rec_generator(
                request_headers=[("Content-Type", "application/json")],
                response_headers=[("Content-Type", "application/json")],
                request_body=rapidjson.dumps(
                    {"key": JSON_BODY_FILTERED, "key2": "Not-Filter-Me"}
                ),
                response_body=rapidjson.dumps({"key": "Not-filter-Me"}),
            ),
        ),
        (
            {"key": True},
            RequestType.RESPONSE,
            req_rec_generator(
                request_headers=[("Content-Type", "application/json")],
                response_headers=[("Content-Type", "application/json")],
                request_body=rapidjson.dumps({"key": "Not-filter-Me"}),
                response_body=rapidjson.dumps(
                    {"key": "Filter-Me", "key2": "Not-Filter-Me"}
                ),
            ),
            req_rec_generator(
                request_headers=[("Content-Type", "application/json")],
                response_headers=[("Content-Type", "application/json")],
                request_body=rapidjson.dumps({"key": "Not-filter-Me"}),
                response_body=rapidjson.dumps(
                    {"key": JSON_BODY_FILTERED, "key2": "Not-Filter-Me"}
                ),
            ),
        ),
        (
            {"key": {"key": True}},
            RequestType.ALL,
            req_rec_generator(
                request_headers=[("Content-Type", "application/json")],
                response_headers=[("Content-Type", "application/json")],
                request_body=rapidjson.dumps({"key": {"key": "Filter-Me"}}),
                response_body=rapidjson.dumps(
                    {"key": {"key": "Filter-Me"}, "key2": "Not-Filter-Me"}
                ),
            ),
            req_rec_generator(
                request_headers=[("Content-Type", "application/json")],
                response_headers=[("Content-Type", "application/json")],
                request_body=rapidjson.dumps({"key": {"key": JSON_BODY_FILTERED}}),
                response_body=rapidjson.dumps(
                    {"key": {"key": JSON_BODY_FILTERED}, "key2": "Not-Filter-Me"}
                ),
            ),
        ),
        (
            {"key": {"key": True}},
            RequestType.ALL,
            req_rec_generator(
                request_headers=[("Content-Type", "application/json")],
                response_headers=[("Content-Type", "application/json")],
                request_body=rapidjson.dumps({"key": "Not-Filter-Me"}),
                response_body=rapidjson.dumps({"key2": "Not-Filter-Me"}),
            ),
            req_rec_generator(
                request_headers=[("Content-Type", "application/json")],
                response_headers=[("Content-Type", "application/json")],
                request_body=rapidjson.dumps({"key": "Not-Filter-Me"}),
                response_body=rapidjson.dumps({"key2": "Not-Filter-Me"}),
            ),
        ),
        (
            {"key": {"key": True}},
            RequestType.ALL,
            req_rec_generator(
                request_headers=[("Content-Type", "text/plain")],
                response_headers=[("Content-Type", "application/json")],
                request_body="Not-Filter-Me",
                response_body=rapidjson.dumps({"key2": "Not-Filter-Me"}),
            ),
            req_rec_generator(
                request_headers=[("Content-Type", "text/plain")],
                response_headers=[("Content-Type", "application/json")],
                request_body="Not-Filter-Me",
                response_body=rapidjson.dumps({"key2": "Not-Filter-Me"}),
            ),
        ),
        ({"key": {"key": True}}, RequestType.ALL, rec_generator(), rec_generator()),
    ],
)
def test_json_body_filter(filters, req_type, in_record, out_record):
    head_filter = JsonBodyFilter(filters, req_type)
    head_filter.filter(in_record)
    assert in_record.asdict() == out_record.asdict()
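# The query-string redaction behavior exercised by the parametrized cases above
# can be illustrated without nephthys. This hypothetical helper mirrors the
# pattern the tests imply (exact, case-sensitive name matching); it is not the
# library's API, and the placeholder value is an assumption.

```python
QS_FILTERED = "*FILTERED*"  # placeholder; the library defines its own marker

def redact_qs(pairs, filtered_names):
    # Replace the value of any (name, value) pair whose name appears in
    # filtered_names; leave all other pairs untouched.
    blocked = set(filtered_names)
    return [
        (name, QS_FILTERED if name in blocked else value)
        for name, value in pairs
    ]

pairs = [("QSFiltered", "value1"), ("QSNotFiltered", "value3")]
print(redact_qs(pairs, ["QSFiltered"]))
# [('QSFiltered', '*FILTERED*'), ('QSNotFiltered', 'value3')]
```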
#!/usr/bin/env python
####################################################################################
# logic/fast_shift.py (repo: aledelmo/FuzzyTracts, license: Apache-2.0)
####################################################################################
import numpy as np
def fast_shift3d(in_array, positions, padding=0):
    """Shift a 3-D array by the integer offsets in ``positions`` (one per
    axis), filling the vacated region with ``padding``."""
    out_array = np.zeros_like(in_array)
    pos0 = positions[0]
    pos1 = positions[1]
    pos2 = positions[2]
    if pos0 >= 0 and pos1 >= 0 and pos2 >= 0:
        if pos0 == 0 and pos1 > 0 and pos2 > 0:
            out_array[:, pos1:, pos2:] = in_array[:, :-pos1, :-pos2]
            out_array[:, :pos1, :pos2] = padding
        elif pos0 == 0 and pos1 > 0 and pos2 == 0:
            out_array[:, pos1:, :] = in_array[:, :-pos1, :]
            out_array[:, :pos1, :] = padding
        elif pos0 == 0 and pos1 == 0 and pos2 > 0:
            out_array[:, :, pos2:] = in_array[:, :, :-pos2]
            out_array[:, :, :pos2] = padding
        elif pos0 == 0 and pos1 == 0 and pos2 == 0:
            # Zero offset: the result is an unchanged copy of the input.
            out_array[:, :, :] = in_array[:, :, :]
        elif pos0 > 0 and pos1 == 0 and pos2 > 0:
            out_array[pos0:, :, pos2:] = in_array[:-pos0, :, :-pos2]
            out_array[:pos0, :, :pos2] = padding
        elif pos0 > 0 and pos1 == 0 and pos2 == 0:
            out_array[pos0:, :, :] = in_array[:-pos0, :, :]
            out_array[:pos0, :, :] = padding
        elif pos0 > 0 and pos1 > 0 and pos2 == 0:
            out_array[pos0:, pos1:, :] = in_array[:-pos0, :-pos1, :]
            out_array[:pos0, :pos1, :] = padding
        else:
            out_array[pos0:, pos1:, pos2:] = in_array[:-pos0, :-pos1, :-pos2]
            out_array[:pos0, :pos1, :pos2] = padding
    elif pos0 >= 0 and pos1 >= 0 > pos2:
        if pos0 == 0 and pos1 != 0:
            out_array[:, pos1:, :pos2] = in_array[:, :-pos1, -pos2:]
            out_array[:, :pos1, pos2:] = padding
        elif pos0 != 0 and pos1 == 0:
            out_array[pos0:, :, :pos2] = in_array[:-pos0, :, -pos2:]
            out_array[:pos0, :, pos2:] = padding
        elif pos0 == 0 and pos1 == 0:
            out_array[:, :, :pos2] = in_array[:, :, -pos2:]
            out_array[:, :, pos2:] = padding
        else:
            out_array[pos0:, pos1:, :pos2] = in_array[:-pos0, :-pos1, -pos2:]
            out_array[:pos0, :pos1, pos2:] = padding
    elif pos0 >= 0 > pos1 and pos2 >= 0:
        if pos0 == 0 and pos2 != 0:
            out_array[:, :pos1, pos2:] = in_array[:, -pos1:, :-pos2]
            out_array[:, pos1:, :pos2] = padding
        elif pos0 != 0 and pos2 == 0:
            out_array[pos0:, :pos1, :] = in_array[:-pos0, -pos1:, :]
            out_array[:pos0, pos1:, :] = padding
        elif pos0 == 0 and pos2 == 0:
            out_array[:, :pos1, :] = in_array[:, -pos1:, :]
            out_array[:, pos1:, :] = padding
        else:
            out_array[pos0:, :pos1, pos2:] = in_array[:-pos0, -pos1:, :-pos2]
            out_array[:pos0, pos1:, :pos2] = padding
    elif pos0 >= 0 > pos1 and pos2 < 0:
        if pos0 == 0:
            out_array[:, :pos1, :pos2] = in_array[:, -pos1:, -pos2:]
            out_array[:, pos1:, pos2:] = padding
        else:
            out_array[pos0:, :pos1, :pos2] = in_array[:-pos0, -pos1:, -pos2:]
            out_array[:pos0, pos1:, pos2:] = padding
    elif pos0 < 0 <= pos1 and pos2 >= 0:
        if pos1 == 0 and pos2 != 0:
            out_array[:pos0, :, pos2:] = in_array[-pos0:, :, :-pos2]
            out_array[pos0:, :, :pos2] = padding
        elif pos1 != 0 and pos2 == 0:
            out_array[:pos0, pos1:, :] = in_array[-pos0:, :-pos1, :]
            out_array[pos0:, :pos1, :] = padding
        elif pos1 == 0 and pos2 == 0:
            out_array[:pos0, :, :] = in_array[-pos0:, :, :]
            out_array[pos0:, :, :] = padding
        else:
            out_array[:pos0, pos1:, pos2:] = in_array[-pos0:, :-pos1, :-pos2]
            out_array[pos0:, :pos1, :pos2] = padding
    elif pos0 < 0 <= pos1 and pos2 < 0:
        if pos1 == 0:
            out_array[:pos0, :, :pos2] = in_array[-pos0:, :, -pos2:]
            out_array[pos0:, :, pos2:] = padding
        else:
            out_array[:pos0, pos1:, :pos2] = in_array[-pos0:, :-pos1, -pos2:]
            out_array[pos0:, :pos1, pos2:] = padding
    elif pos0 < 0 and pos1 < 0 <= pos2:
        if pos2 == 0:
            out_array[:pos0, :pos1, :] = in_array[-pos0:, -pos1:, :]
            out_array[pos0:, pos1:, :] = padding
        else:
            out_array[:pos0, :pos1, pos2:] = in_array[-pos0:, -pos1:, :-pos2]
            out_array[pos0:, pos1:, :pos2] = padding
    else:
        out_array[:pos0, :pos1, :pos2] = in_array[-pos0:, -pos1:, -pos2:]
        out_array[pos0:, pos1:, pos2:] = padding
    return out_array
def fast_shift3d_parallel(in_array, positions, padding=0):
    """Shift the array once per offset in ``positions`` and accumulate the
    element-wise maximum. Despite the name, the loop runs serially."""
    cone = np.zeros(in_array.shape)
    for position in positions:
        out_array = fast_shift3d(in_array, position, padding=padding)
        np.maximum(cone, out_array, out=cone)
    return cone
####################################################################################
# temboo/core/Library/Zendesk/Requests/__init__.py
# (repo: jordanemedlock/psychtruths, license: Apache-2.0)
####################################################################################
from temboo.Library.Zendesk.Requests.CreateRequest import CreateRequest, CreateRequestInputSet, CreateRequestResultSet, CreateRequestChoreographyExecution
from temboo.Library.Zendesk.Requests.GetComment import GetComment, GetCommentInputSet, GetCommentResultSet, GetCommentChoreographyExecution
from temboo.Library.Zendesk.Requests.GetRequest import GetRequest, GetRequestInputSet, GetRequestResultSet, GetRequestChoreographyExecution
from temboo.Library.Zendesk.Requests.ListAllRequests import ListAllRequests, ListAllRequestsInputSet, ListAllRequestsResultSet, ListAllRequestsChoreographyExecution
from temboo.Library.Zendesk.Requests.ListComments import ListComments, ListCommentsInputSet, ListCommentsResultSet, ListCommentsChoreographyExecution
from temboo.Library.Zendesk.Requests.ListOrganizationRequests import ListOrganizationRequests, ListOrganizationRequestsInputSet, ListOrganizationRequestsResultSet, ListOrganizationRequestsChoreographyExecution
from temboo.Library.Zendesk.Requests.ListUserRequests import ListUserRequests, ListUserRequestsInputSet, ListUserRequestsResultSet, ListUserRequestsChoreographyExecution
from temboo.Library.Zendesk.Requests.UpdateRequest import UpdateRequest, UpdateRequestInputSet, UpdateRequestResultSet, UpdateRequestChoreographyExecution
####################################################################################
# tests/cases/lambda.py (repo: wisn/py2many, license: MIT)
####################################################################################
from typing import Callable
def main():
    # myfunc: Callable[[int, int], int] = lambda x, y: x + y
    myfunc = lambda x, y: x + y
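# The commented-out annotation in the test case above can be written out in
# full. This standalone check is illustrative only and not part of the
# py2many test case itself.

```python
from typing import Callable

# Annotated form of the lambda from the test case: a callable taking two
# ints and returning an int.
myfunc: Callable[[int, int], int] = lambda x, y: x + y
print(myfunc(2, 3))  # 5
```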
| 21.666667 | 58 | 0.615385 | 22 | 130 | 3.636364 | 0.5 | 0.1 | 0.2 | 0.225 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.238462 | 130 | 5 | 59 | 26 | 0.808081 | 0.415385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.